EP4161427A1 - Robotic reference frames for navigation - Google Patents

Robotic reference frames for navigation

Info

Publication number
EP4161427A1
Authority
EP
European Patent Office
Prior art keywords
tracking markers
robotic
tracking
robotic arm
information
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
EP21737848.8A
Other languages
German (de)
French (fr)
Inventor
Aviv ELLMAN
Dany JUNIO
Katherine M. PUCKETT
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Mazor Robotics Ltd
Original Assignee
Mazor Robotics Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Mazor Robotics Ltd filed Critical Mazor Robotics Ltd
Publication of EP4161427A1 publication Critical patent/EP4161427A1/en
Pending legal-status Critical Current

Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 34/00: Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B 34/20: Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B 2034/2046: Tracking techniques
    • A61B 2034/2051: Electromagnetic tracking systems
    • A61B 2034/2055: Optical tracking systems
    • A61B 90/00: Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B 1/00 - A61B 50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B 90/39: Markers, e.g. radio-opaque or breast lesions markers
    • A61B 2090/3937: Visible markers
    • A61B 2090/3945: Active visible markers, e.g. light emitting diodes
    • A61B 2090/3983: Reference marker arrangements for use with image guided surgery
    • A61B 90/50: Supports for surgical instruments, e.g. articulated arms

Definitions

  • the present technology generally relates to robotic surgery, and relates more particularly to navigation during robotic surgery.
  • Surgical navigation systems are used to track the position of one or more objects during surgery.
  • Surgical robots are useful for holding one or more tools or devices during a surgery, and may operate autonomously (e.g., without any human input during operation), semi-autonomously (e.g., with some human input during operation), or non-autonomously (e.g., only as directed by human input).
  • a robotic navigation system comprises: a robot base; a robotic arm comprising a proximal end secured to the robot base, a distal end movable relative to the proximal end, and one or more arm segments between the proximal end and the distal end; and a plurality of tracking markers secured to the robotic arm.
  • Each of the one or more arm segments may support at least one of the plurality of tracking markers.
  • the plurality of tracking markers may comprise at least three tracking markers.
  • Each of the plurality of tracking markers may be an LED.
  • Each of the plurality of tracking markers may be configured to emit or reflect light at a different wavelength than at least another one of the plurality of tracking markers.
  • Each of the plurality of tracking markers may be configured to emit light in pulses at a first frequency different than a second frequency at which at least another one of the plurality of tracking markers is configured to emit light in pulses.
  • Each of the plurality of tracking markers may be a geometric pattern.
  • At least one of the plurality of tracking markers may be moveably secured to the robotic arm.
  • the at least one of the plurality of tracking markers may be selectively moveable between a first position and a second position.
  • At least two of the plurality of tracking markers may be circumferentially spaced about one of the one or more arm segments of the robotic arm.
  • the robotic navigation system may further comprise at least one processor and a memory.
  • the memory may store instructions for execution by the at least one processor that, when executed, cause the at least one processor to determine, based on a pose of the robotic arm and a known location of each of the plurality of tracking markers on the robotic arm, a predicted arrangement of the plurality of tracking markers.
  • the memory may store additional instructions for execution by the at least one processor that, when executed, further cause the at least one processor to: receive, from a camera configured to detect the plurality of tracking markers, information about a detected arrangement of the plurality of tracking markers; and compare the detected arrangement to the predicted arrangement.
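As a hedged illustration of the predicted-versus-detected comparison described in the two preceding bullets, the following Python sketch checks whether each detected marker lies within a tolerance of its predicted position. The array shapes, the tolerance value, and the assumption that both arrangements are expressed in a common coordinate system with known marker correspondence are illustrative choices, not details drawn from the patent.

```python
# Minimal sketch of the compare step; not the patented implementation.
# Both arrays hold N corresponding 3-D marker positions in one coordinate
# system; `tolerance_mm` is an illustrative threshold.
import numpy as np

def arrangements_match(predicted: np.ndarray,
                       detected: np.ndarray,
                       tolerance_mm: float = 1.0) -> bool:
    """True if every detected marker lies within tolerance_mm of its
    predicted position."""
    if predicted.shape != detected.shape:
        return False  # a marker is missing, or an extra one was detected
    errors = np.linalg.norm(predicted - detected, axis=1)
    return bool(np.all(errors <= tolerance_mm))
```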
  • a method of utilizing a robotic reference frame for navigation comprises: receiving, from a tracking marker sensor, first information about a plurality of tracking markers fixedly secured to a robotic arm of a robot; receiving, from the robot, second information corresponding to a position of the robotic arm in a robotic coordinate system; and registering the robotic coordinate system to a navigation coordinate system based on the first information and the second information.
  • Each of the plurality of tracking markers may comprise an LED.
  • Each of the plurality of tracking markers may comprise a geometric pattern.
  • the first information may comprise information about a position of the plurality of tracking markers in the navigation coordinate system.
  • the second information may comprise information about a position at which each of the plurality of tracking markers is fixedly secured to the robotic arm.
  • a device for surgical navigation utilizing a robotic reference frame comprises: at least one communication interface for receiving information from a robot; at least one tracking marker sensor for detecting a plurality of tracking markers on a robotic arm of the robot; at least one processor; and at least one memory.
  • the memory stores instructions for execution by the at least one processor that, when executed, cause the at least one processor to: receive, from the robot, information corresponding to a predicted arrangement of the plurality of tracking markers, the predicted arrangement defining a custom, one-time reference frame; receive, from the at least one tracking marker sensor, data corresponding to a detected arrangement of the plurality of tracking markers; and compare the predicted arrangement to the detected arrangement to determine whether the custom, one-time reference frame has been created.
  • the memory may store additional instructions for execution by the processor that, when executed, further cause the processor to: confirm a position of an object in a predetermined coordinate space based on creation of the custom, one-time reference frame.
  • Each of the plurality of tracking markers may be a geometric pattern.
  • the at least one tracking marker sensor may be a camera, and each of the plurality of tracking markers may be an LED. At least two of the plurality of tracking markers may be configured to emit light at different wavelengths or to pulse at different frequencies.
  • each of the expressions “at least one of A, B and C”, “at least one of A, B, or C”, “one or more of A, B, and C”, “one or more of A, B, or C” and “A, B, and/or C” means A alone, B alone, C alone, A and B together, A and C together, B and C together, or A, B and C together.
  • each one of A, B, and C in the above expressions refers to an element, such as X, Y, and Z, or a class of elements, such as X1-Xn, Y1-Ym, and Z1-Zo.
  • the phrase is intended to refer to a single element selected from X, Y, and Z, a combination of elements selected from the same class (e.g., X1 and X2), as well as a combination of elements selected from two or more classes (e.g., Y1 and Zo).
  • FIG. 1A is a block diagram of a system according to at least one embodiment of the present disclosure.
  • Fig. 1B depicts a robot according to at least one embodiment of the present disclosure.
  • Fig. 2A is a flowchart of a method according to at least one embodiment of the present disclosure.
  • Fig. 2B depicts a predicted arrangement of tracking markers according to at least one embodiment of the present disclosure.
  • Fig. 2C depicts a detected arrangement of tracking markers according to at least one embodiment of the present disclosure.
  • FIG. 3 is another flowchart of a method according to at least one embodiment of the present disclosure.
  • Fig. 4 is another flowchart of a method according to at least one embodiment of the present disclosure.
  • the described methods, processes, and techniques may be implemented in hardware, software, firmware, or any combination thereof. If implemented in software, the functions may be stored as one or more instructions or code on a computer-readable medium and executed by a hardware-based processing unit.
  • Computer-readable media may include non-transitory computer-readable media, which corresponds to a tangible medium such as data storage media (e.g., RAM, ROM, EEPROM, flash memory, or any other medium that can be used to store desired program code in the form of instructions or data structures and that can be accessed by a computer).
  • Instructions may be executed by one or more processors, such as one or more digital signal processors (DSPs), general purpose microprocessors (e.g., Intel Core i3, i5, i7, or i9 processors; Intel Celeron processors; Intel Xeon processors; Intel Pentium processors; AMD Ryzen processors; AMD Athlon processors; AMD Phenom processors; Apple A10 or A10X Fusion processors; Apple A11, A12, A12X, A12Z, or A13 Bionic processors; or any other general purpose microprocessors), application specific integrated circuits (ASICs), field programmable logic arrays (FPGAs), or other equivalent integrated or discrete logic circuitry.
  • Navigated robotic procedures often involve reference frames and trackers whose positions are detected by a tracking marker sensor.
  • a navigation system could use a camera as a tracking marker sensor, which can detect optical tracking markers on a reference frame attached to a robotic arm. With this information, the coordinate system of the robotic system can be correlated to the coordinate system of the navigation system. In other circumstances, the information from the tracking marker sensor, which can accurately determine the position and orientation of the robotic arm, can be used to calibrate the robotic system.
  • By placing tracking markers (e.g., LEDs or reference frame indicia) on the robotic arm, a camera or other tracking marker sensor can more easily detect the position and orientation of the robotic arm despite potential line-of-sight issues or other difficulties.
  • unique geometry can be created by LEDs on different parts of the robotic arm.
  • the actual tracked geometry changes according to the relative motion of the robotic arm joints.
  • the location of the robotic arm can be used to correlate navigation space, robotic space, and patient space.
  • Embodiments of the present disclosure eliminate the need for a physical reference frame mounted to a robotic arm, as well as for the use of snapshot reference frames and, more generally, for the snapshot process of matching navigation and robotic coordinate spaces.
  • Embodiments of the present disclosure further enable verification of navigation system integrity in real time by comparing sensed positions of robotic arm tracking markers (sensed, for example, by a navigation system) to predicted positions of the tracking markers (predicted, for example, based on information from the robot about a position and/or orientation (e.g., a pose) of the robotic arm).
  • embodiments of the present disclosure enable verification of robotic integrity in real time by comparing robotic arm position and/or orientation information (e.g., pose information) based on actual encoder readings to robotic arm position and/or orientation information determined based on the sensed positions of robotic arm tracking markers.
  • embodiments of the present disclosure therefore increase ease of use and decrease operation times over known navigation systems, registration procedures, and calibration operations.
  • Turning to Fig. 1A, a block diagram of a system 100 according to at least one embodiment of the present disclosure is shown.
  • the system 100 may be used, for example: to carry out one or more aspects of one or more of the methods disclosed herein; for navigation purposes; for registration purposes; for calibration operations; to verify operational integrity of a navigation system (such as the navigation system 160) or of a robot (such as a robot 136); or for any other useful purpose.
  • the system 100 comprises a computing device 102, a tracking marker sensor 132, a robot 136, a navigation system 160, a database 164, and a cloud 168.
  • systems according to other embodiments of the present disclosure may omit any one or more of the computing device 102, the tracking marker sensor 132, the robot 136, the navigation system 160, the database 164, and/or the cloud 168. Additionally, systems according to other embodiments of the present disclosure may arrange one or more components of the system 100 differently (e.g., one or more of the tracking marker sensor 132, the robot 136, and the navigation system 160 may comprise the components shown in Fig. 1A as being part of the computing device 102).
  • the computing device 102 comprises at least one processor 104, at least one communication interface 108, at least one user interface 112, and at least one memory 116.
  • a computing device according to other embodiments of the present disclosure may omit one or both of the communication interface(s) 108 and the user interface(s) 112.
  • the at least one processor 104 of the computing device 102 may be any processor identified or described herein or any similar processor.
  • the at least one processor 104 may be configured to execute instructions stored in the at least one memory 116, which instructions may cause the at least one processor 104 to carry out one or more computing steps utilizing or based on data received, for example, from the tracking marker sensor 132, the robot 136, the navigation system 160, the database 164, and/or the cloud 168.
  • the computing device 102 may also comprise at least one communication interface 108.
  • the at least one communication interface 108 may be used for receiving image data or other information from an external source (such as the tracking marker sensor 132, the robot 136, the navigation system 160, the database 164, the cloud 168, and/or a portable storage medium (e.g., a USB drive, a DVD, a CD)), and/or for transmitting instructions, images, or other information from the at least one processor 104 and/or the computing device 102 more generally to an external system or device (e.g., another computing device 102, the tracking marker sensor 132, the robot 136, the navigation system 160, the database 164, the cloud 168, and/or a portable storage medium (e.g., a USB drive, a DVD, a CD)).
  • the at least one communication interface 108 may comprise one or more wired interfaces (e.g., a USB port, an ethernet port, a Firewire port) and/or one or more wireless interfaces (configured, for example, to transmit information via one or more wireless communication protocols such as 802.11a/b/g/n, Bluetooth, Bluetooth low energy, NFC, ZigBee, and so forth).
  • the at least one communication interface 108 may be useful for enabling the device 102 to communicate with one or more other processors 104 or computing devices 102, whether to reduce the time needed to accomplish a computing-intensive task or for any other reason.
  • the at least one user interface 112 may be or comprise a keyboard, mouse, trackball, monitor, television, touchscreen, button, joystick, switch, lever, and/or any other device for receiving information from a user and/or for providing information to a user of the computing device 102.
  • the at least one user interface 112 may be used, for example, to receive a user selection or other user input in connection with any step of any method described herein; to receive a user selection or other user input regarding one or more configurable settings of the computing device 102 and/or of another component of the system 100; to receive a user selection or other user input regarding how and/or where to store and/or transfer data received, modified, and/or generated by the computing device 102; and/or to display information (e.g., text, images) and/or play a sound to a user based on data received, modified, and/or generated by the computing device 102. Notwithstanding the inclusion of the at least one user interface 112 in the system 100, the system 100 may automatically (e.g., without any input via the at least one user interface 112 or otherwise) carry out one or more, or all, of the steps of any method described herein.
  • the computing device 102 may utilize a user interface 112 that is housed separately from one or more remaining components of the computing device 102.
  • the user interface 112 may be located proximate one or more other components of the computing device 102, while in other embodiments, the user interface 112 may be located remotely from one or more other components of the computing device 102.
  • the at least one memory 116 may be or comprise RAM, DRAM, SDRAM, other solid-state memory, any memory described herein, or any other tangible non-transitory memory for storing computer-readable data and/or instructions.
  • the at least one memory 116 may store information or data useful for completing, for example, any step of the methods 200 or 300 described herein.
  • the at least one memory 116 may store, for example, information about one or more predetermined coordinate systems 120 (e.g., information about a robotic coordinate system or space, information about a navigation coordinate system or space, information about a patient coordinate system or space); instructions 124 for execution by the at least one processor 104, for example to cause the at least one processor 104 to carry out one or more of the steps of the method 200 and/or of the method 300; and/or one or more algorithms 128 for use by the processor in carrying out any calculations necessary to complete one or more of the steps of the method 200 and/or of the method 300, or for any other calculations.
  • Such predetermined coordinate system(s) 120, instructions 124, and/or algorithms 128 may, in some embodiments, be organized into one or more applications, modules, packages, layers, or engines, and may cause the at least one processor 104 to manipulate data stored in the at least one memory 116 and/or received from or via another component of the system 100.
  • the tracking marker sensor 132 is operable to detect one or more tracking markers 156 (described below).
  • the tracking marker sensor 132 may be, for example, an optical camera; an infrared camera; a 3D camera system; a stereoscopic vision system; another imaging device; or any other sensor that can detect one or more tracking markers 156.
  • the tracking marker sensor 132 may comprise a dedicated processor for executing instructions stored in a dedicated memory of the tracking marker sensor 132, or the tracking marker sensor 132 may simply be configured to transmit data collected therewith to the computing device 102 or to another component of the system 100.
  • Although shown in Fig. 1A as a standalone component of the system 100, the tracking marker sensor 132 may be in communication with any one or more of the computing device 102, the robot 136, the navigation system 160, the database 164, and/or the cloud 168.
  • the computing device 102 may comprise the tracking marker sensor 132
  • the navigation system 160 may comprise the tracking marker sensor 132.
  • the robot 136 may comprise the tracking marker sensor 132.
  • the tracking marker sensor 132 may be positioned directly above an operating table or portion thereof, or above and to one side of an operating table or portion thereof, or in another convenient position within an operating room or other room housing the robot 136.
  • the tracking marker sensor 132 may be positioned at a location selected to provide the tracking marker sensor 132 with a clear and/or unobstructed view of the robotic arm 144 of the robot 136 (and thus of one or more tracking markers 156 fixedly secured to the robotic arm 144) during operation thereof.
  • the tracking marker sensor 132 is fixed, while in other embodiments, the tracking marker sensor 132 may be precisely movable (whether manually or automatically) in one or more directions.
  • the tracking marker sensor 132 may be configured to capture data regarding sensed tracking markers 156 only at a given moment in time.
  • the tracking marker sensor 132 may be configured to capture still images comprising one or more tracking markers 156.
  • the tracking marker sensor 132 may be configured to capture such data at periodic intervals, or when commanded by a user (e.g., via a user interface 112), or upon a signal (generated either autonomously or in response to user input) from the computing device 102, the robot 136, and/or the navigation system 160.
  • the tracking marker sensor 132 may additionally or alternatively be operable to capture data corresponding to one or more tracking markers 156 continuously, in real-time.
  • the tracking marker sensor 132 may provide a stream of real-time sensor data to the computing device 102, which may continuously process the sensor data to detect one or more tracking markers 156 therein.
  • the tracking marker sensor 132 may comprise more than one tracking marker sensor 132.
  • the robot 136 may be any surgical robot or surgical robotic system.
  • the robot 136 may be or comprise, for example, the Mazor X™ Stealth Edition robotic guidance system.
  • the robot 136 may comprise a base 140 that supports a robotic arm 144.
  • the robot 136 may comprise one or more robotic arms 144.
  • the robotic arm 144 may comprise a first robotic arm and a second robotic arm. In other embodiments, the robot 136 may comprise more than two robotic arms 144.
  • the robotic arm 144 may, in some embodiments, assist with a surgical procedure (e.g., by holding a tool in a desired trajectory or pose and/or supporting the weight of a tool while a surgeon or other user operates the tool, or otherwise) and/or automatically carry out a surgical procedure.
  • the robotic arm 144 may have three, four, five, six, or more degrees of freedom.
  • the robotic arm 144 may comprise one or more segments 152.
  • Each segment 152 may comprise a member 176 and a joint 172 to which the member 176 is attached and/or from which the member 176 extends.
  • the joint 172 may be secured, for example, to the base 140 or to the member 176 of another segment 152.
  • the joint 172 may be any type of joint that enables selective movement of the member 176 relative to the structure to which the joint 172 is attached.
  • the joint 172 may be a pivot joint, a hinge joint, a saddle joint, or a ball-and-socket joint.
  • the joint 172 may allow movement of the member 176 in one dimension or in multiple dimensions, and/or along one axis or along multiple axes.
  • the joint 172 of the segment 152 may be secured to the base 140, and the member 176 of the segment 152 may comprise a proximal end secured to the joint 172 and a distal end supporting an end effector.
  • the end effector may be, for example, a tool (e.g., a drill, saw, imaging device) or a tool guide (e.g., for guiding a biopsy needle, ablation probe, or other tool along a desired trajectory).
  • a first segment 152 may comprise a joint 172 secured to the base 140, and the member 176 of the first segment 152 may comprise a proximal end secured to the joint 172 and a distal end supporting a joint of a second segment 152.
  • the member 176 of the second segment 152 may comprise a proximal end secured to the joint 172 of the second segment 152, and a distal end supporting a joint 172 of a third segment 152, and so on.
  • the member 176 of the final segment 152 may comprise a distal end that supports an end effector 180, which may be the same as or similar to the end effector described above.
  • the joints 172 of the various segments 152 may or may not be of the same type, and the members 176 of the various segments 152 may or may not be identical.
  • All or some of the joints 172 of the segments 152 of the robotic arm 144 may be powered (so as to be selectively controllable without physical manipulation by a human). Any one or more of electric, pneumatic, hydraulic, and/or other means may be used to selectively control movement of a member 176 about the joint 172.
  • each segment 152 may comprise a servo for selectively moving the member 176 of that segment 152 relative to the joint 172 of that segment 152.
  • the robotic arm 144 also comprises one or more sensors 148. Each sensor 148 may be positioned to detect a position of a member 176 of a given segment 152 relative to the joint 172 of the segment 152.
  • Where the joint 172 of a given segment 152 is or comprises a hinge joint, a sensor 148 may detect an angular position of the member 176 relative to an axis of the hinge joint.
  • Where the joint 172 of a given segment 152 is or comprises a rotary joint (e.g., configured to allow rotation of the member 176 about an axis that extends through the member 176 and the joint 172), the sensor 148 may detect an angular position of the member 176 relative to the axis that extends through the member 176 and the joint 172.
  • Each sensor 148 may be, for example, a rotary encoder, a linear encoder, or an incremental encoder.
  • Data from the sensors 148 may be provided to a processor of the robot 136, to the processor 104 of the computing device 102, and/or to the navigation system 160.
  • the data may be used to calculate a position in space of the robotic arm 144 relative to a predetermined coordinate system.
  • the robot 136 may calculate a position in space of the robotic arm 144 relative to a coordinate system with an origin at the position where the joint 172 of the first segment 152 of the robotic arm 144 is secured to the base 140.
  • the calculation may be based not just on data received from the sensor(s) 148, but also on data or information (such as, for example, physical dimensions) corresponding to each segment 152 and/or corresponding to an end effector secured to the final segment 152.
  • For example, a known location of the proximal end of the robotic arm 144 (e.g., where a joint 172 of the first segment 152 is secured to the base 140), known dimensions of each segment 152, and data from the sensor(s) 148 about an orientation of the member 176 of each segment 152 relative to the joint 172 of each segment 152 may be used to calculate the path of the robotic arm 144 through space, as sketched below.
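The calculation described in the preceding bullet can be illustrated as a chain of homogeneous transforms. The geometry assumed here (each joint revolute about a local z-axis, each member a rigid link of known length along the local x-axis) is a simplification for illustration; a real arm would use its full kinematic model.

```python
# Simplified forward-kinematics sketch: base location + segment dimensions
# + encoder angles -> pose of the arm's distal end in the robotic
# coordinate system. The joint and link conventions are assumptions.
import numpy as np

def segment_transform(angle_rad: float, link_length: float) -> np.ndarray:
    """Rotate about the local z-axis by the encoder angle, then translate
    along the rotated x-axis by the member length."""
    c, s = np.cos(angle_rad), np.sin(angle_rad)
    return np.array([[c, -s, 0.0, link_length * c],
                     [s,  c, 0.0, link_length * s],
                     [0.0, 0.0, 1.0, 0.0],
                     [0.0, 0.0, 0.0, 1.0]])

def arm_pose(encoder_angles, link_lengths, base_pose=np.eye(4)) -> np.ndarray:
    """Compose segment transforms from the base outward."""
    pose = base_pose
    for angle, length in zip(encoder_angles, link_lengths):
        pose = pose @ segment_transform(angle, length)
    return pose
```

The intermediate products of the same composition give the pose of each segment, and hence the position of any tracking marker whose mounting point on a segment is known.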
  • a plurality of tracking markers 156 are fixedly secured to or positioned on the robotic arm 144.
  • “fixedly secured” does not mean “permanently secured,” and indeed the tracking markers 156 may be detachable from the robotic arm 144.
  • the tracking markers 156 may be light-emitting diodes (LEDs).
  • the tracking markers 156 may all be identical, or one or more of the tracking markers 156 may be different than another one or more of the tracking markers 156.
  • one or more of the tracking markers 156 may be configured to emit light at a first wavelength, and another one or more of the tracking markers 156 may be configured to emit light at a second wavelength different than the first wavelength.
  • one or more of the tracking markers 156 may be configured to reflect light at a first wavelength, while another one or more of the tracking markers may be configured to reflect light at a second wavelength that is different than the first wavelength.
  • the emitted and/or reflected wavelengths of light of the embodiments described above may be wavelengths within a particular spectrum (e.g., wavelengths corresponding to red light versus wavelengths corresponding to blue light in the visible spectrum, or different wavelengths in the infrared spectrum) as well as wavelengths from different spectrums (e.g., a wavelength in the visible spectrum versus a wavelength in the infrared spectrum).
  • one or more of the tracking markers 156 may be or comprise an LED that pulses at a first frequency, and another one or more of the tracking markers 156 may be or comprise an LED that pulses at a second frequency different than the first frequency.
  • the tracking markers 156 may be or comprise reflective spheres, geometric patterns (such as, for example, QR codes), or other items or features that may be readily distinguished by the tracking marker sensor 132.
  • the tracking markers may be configured to be detectable by a tracking marker sensor 132 even when covered by a drape or other covering that may be arranged on or over the robotic arm 144 to maintain a sterile operating room environment.
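For the pulsed-LED variant described a few bullets above, one plausible (but assumed) way to tell markers apart is to sample the brightness of each detected blob over several frames and match the dominant blink frequency to a configured marker. The frame rate, tolerance, and function names below are illustrative.

```python
# Sketch: identify a pulsed LED marker by its blink frequency.
import numpy as np

def dominant_pulse_frequency(intensity: np.ndarray, frame_rate_hz: float) -> float:
    """Strongest non-DC frequency in an intensity-over-time trace."""
    spectrum = np.abs(np.fft.rfft(intensity - intensity.mean()))
    freqs = np.fft.rfftfreq(len(intensity), d=1.0 / frame_rate_hz)
    return float(freqs[np.argmax(spectrum)])

def identify_marker(intensity, frame_rate_hz, marker_frequencies, tol_hz=0.5):
    """Match the measured blink rate to the closest configured marker
    frequency; returns None if nothing is within tolerance."""
    f = dominant_pulse_frequency(intensity, frame_rate_hz)
    best = min(marker_frequencies, key=lambda k: abs(k - f))
    return best if abs(best - f) <= tol_hz else None
```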
  • At least one tracking marker 156 is fixedly secured to or positioned on each segment 152.
  • One or more groups of tracking markers 156 may be secured to one or more of the segments 152.
  • the number of tracking markers 156 fixedly secured to or positioned on the robotic arm 144 may be at least three.
  • each segment 152 may have fixedly secured thereto, or positioned thereon, a plurality of tracking markers 156, arranged so that at least one tracking marker 156 is visible (e.g., to the tracking sensor 132) from any one of a plurality of possible orientations of the segment 152.
  • the tracking markers 156 are arranged on the robotic arm 144 so that at least three tracking markers 156 are visible (e.g., to the tracking sensor 132) from any one of a plurality of possible orientations of the robotic arm 144.
  • the tracking markers 156 may be circumferentially spaced about the robotic arm 144 or about a particular segment of the robotic arm 144.
  • the tracking markers 156 may also be radially and/or longitudinally spaced about the robotic arm 144 or about a particular segment of the robotic arm 144.
  • the plurality of tracking markers 156 may be moveably secured to the robotic arm 144, and may further be selectively moveable relative to the robotic arm 144.
  • one or more of the plurality of tracking markers 156 may be configured to move (or to be moved automatically) from a first position on the robotic arm 144 to a second position on the robotic arm 144 when the robotic arm 144 moves into or out of a certain position or set of positions.
  • the purpose of such movement of the one or more of the plurality of tracking markers 156 may be to facilitate maintenance of a line of sight between each (or at least a subset) of the plurality of tracking markers 156 and the tracking sensor 132.
  • the robot 136 (and/or another component of the system 100) may be configured to track whether each of the plurality of tracking markers 156 is in its respective first position or second position, and to provide such information to the navigation system 160 (or to any other component of the system 100) to enable correlation of a robotic coordinate system with a navigation coordinate system based on a position of the tracking markers 156 relative to the robotic arm 144 as known by the robot 136 (and/or another component of the system 100), and further based on a position of the tracking markers 156 as detected by the navigation system 160 (e.g., using a tracking sensor 132).
  • the number of tracking markers 156 on the robotic arm 144 may be selected based on a minimum number of tracking markers 156 needed to determine a position in space of the robotic arm 144 based on the relative orientation of the detected tracking markers 156, as described in more detail below. For example, if the minimum number of tracking markers 156 needed to determine a position of the robotic arm 144 is 3, then the total number of tracking markers 156 on the robotic arm 144 may be 3 or more. Alternatively, if the minimum number of tracking markers 156 needed to determine a position of the robotic arm 144 is 4, then the total number of tracking markers 156 on the robotic arm 144 may be 4 or more.
  • the minimum number of tracking markers 156 needed to determine a position of the robotic arm 144 may be the minimum number needed to ensure that the position of the tracking markers 156 in space (e.g., as detected by the tracking marker sensor 132) is unique for every possible orientation of the robotic arm 144. For example, if only one tracking marker 156 were positioned on a distal end of a robotic arm 144 having a plurality of segments 152, then a plurality of orientations or poses of the robotic arm 144 could likely be assumed without moving the tracking marker 156.
  • Adding additional tracking markers 156 to the robotic arm 144 reduces the number of possible positions of the robotic arm 144 associated with each arrangement of the tracking markers 156, until a threshold number of tracking markers 156 is reached that represents the minimum number of tracking markers 156 needed to determine a position of the robotic arm 144.
  • the minimum number of tracking markers 156 needed to determine a position of the robotic arm 144 may be two.
  • the minimum number of tracking markers 156 needed to determine a position of the robotic arm 144 may be three or more.
  • additional tracking markers 156 may be provided to ensure redundancy should one or more of the tracking markers 156 have an obstructed line-of-sight to the tracking marker sensor 132.
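The reasoning in the preceding bullets about a minimum marker count can be made concrete with a small, simulation-style check (an assumption, not the patent's method): sample many arm poses, compute a rigid-motion-invariant signature of each resulting marker arrangement, and flag any two distinct poses that produce the same signature. A layout that passes has enough markers to make each arrangement unique.

```python
# Sketch: test whether a marker layout yields a unique arrangement per pose.
# `marker_points_for_pose` is an assumed callback (e.g., built on forward
# kinematics) returning the Nx3 marker positions for a given joint vector.
import numpy as np

def arrangement_signature(points: np.ndarray, decimals: int = 1) -> tuple:
    """Sorted inter-marker distances, rounded: invariant under rigid motion,
    so arrangements can be compared without knowing the camera's frame."""
    n = len(points)
    dists = [np.linalg.norm(points[i] - points[j])
             for i in range(n) for j in range(i + 1, n)]
    return tuple(np.round(sorted(dists), decimals))

def layout_is_unambiguous(marker_points_for_pose, sampled_poses) -> bool:
    """False if two distinct sampled poses look identical to the sensor."""
    seen = {}
    for pose in sampled_poses:
        sig = arrangement_signature(marker_points_for_pose(pose))
        previous = seen.get(sig)
        if previous is not None and not np.allclose(previous, pose):
            return False
        seen[sig] = np.asarray(pose)
    return True
```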
  • the navigation system 160 may provide navigation for a surgeon and/or for the robot 136 during an operation.
  • the navigation system 160 may be any now-known or future-developed navigation system, including, for example, the Medtronic StealthStation™ S8 surgical navigation system.
  • the navigation system 160 may include a camera or other sensor(s) for detecting and/or tracking one or more reference markers, navigated trackers, or other objects within an operating room or other room where a surgical procedure takes place.
  • the navigation system 160 may comprise the tracking marker sensor 132.
  • the navigation system 160 may be used to track a position of the robotic arm 144 (or, more particularly, of tracking markers 156 attached to the robotic arm 144).
  • the navigation system 160 may be used to track a position of one or more reference markers or arrays or other structures useful for detection by a camera or other sensor of the navigation system 160.
  • the navigation system 160 may include a display for displaying one or more images from an external source (e.g., the computing device 102, tracking marker sensor 132, or other source) or a video stream from the camera or other sensor of the navigation system 160.
  • the system 100 may operate without the use of the navigation system 160.
  • the database 164 may store information that correlates each particular arrangement of tracking markers 156 to a corresponding position and orientation, or pose, of the robotic arm 144.
  • information from the tracking marker sensor 132 about the position of each of a plurality of detected tracking markers 156 may be used to look up, in the database 164, a corresponding position of the robotic arm 144.
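A toy version of that lookup, under the assumption that the database stores precomputed arrangement signatures (such as the sorted inter-marker distances sketched earlier) paired with arm poses, might be:

```python
# Sketch: match a detected arrangement to the closest stored record.
# The (signature, pose) schema and the tolerance are assumptions.
import numpy as np

def lookup_pose(detected_signature, arrangement_db, tol=0.5):
    """arrangement_db: iterable of (signature_array, pose) records.
    Returns the pose of the best-matching record, or None."""
    best_pose, best_err = None, float("inf")
    for stored_signature, pose in arrangement_db:
        err = float(np.max(np.abs(np.asarray(stored_signature)
                                  - np.asarray(detected_signature))))
        if err < best_err:
            best_pose, best_err = pose, err
    return best_pose if best_err <= tol else None
```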
  • the database 164 may additionally or alternatively store, for example, information about or corresponding to one or more characteristics of the tracking markers 156; one or more surgical plans for use by the robot 136, the navigation system 160, and/or a user of the computing device 102 or of the system 100; one or more images useful in connection with a surgery to be completed by or with the assistance of one or more other components of the system 100; and/or any other useful information.
  • the database 164 may be configured to provide any such information to the computing device 102 or to any other device of the system 100 or external to the system 100, whether directly or via the cloud 168.
  • the database 164 may be or comprise part of a hospital image storage system, such as a picture archiving and communication system (PACS), a health information system (HIS), and/or another system for collecting, storing, managing, and/or transmitting electronic medical records including image data.
  • the cloud 168 may be or represent the Internet or any other wide area network.
  • the computing device 102 may be connected to the cloud 168 via the communication interface 108, using a wired connection, a wireless connection, or both.
  • the computing device 102 may communicate with the database 164 and/or an external device (e.g., a computing device) via the cloud 168.
  • a method 200 for utilizing a robotic reference frame for navigation may be performed, for example, by at least one processor.
  • the at least one processor may be the same as or similar to the processor(s) 104 of the computing device 102 described above.
  • the at least one processor may be part of a robot (such as the robot 136) or part of a navigation system (such as the navigation system 160).
  • a processor other than any processor described herein may also be used to execute the method 200.
  • the at least one processor may perform the method 200 by executing instructions stored in a memory, such as the instructions 124 of the memory 116.
  • the instructions may correspond to one or more steps of the method 200 described below.
  • the instructions may cause the processor to execute one or more algorithms, such as the algorithms 128.
  • the method 200 comprises determining, based on a planned pose of a robotic arm, a predicted arrangement of a plurality of tracking markers (step 204).
  • the planned pose of the robotic arm may be a current pose of the robotic arm.
  • the robotic arm may be the same as or similar to the robotic arm 144 of the robot 136.
  • the plurality of tracking markers may be the same as or similar to the tracking markers 156.
  • the determining may comprise accessing stored information — whether from a memory such as the memory 116, a database such as the database 164, or elsewhere — about a position of each of the plurality of tracking markers relative to the robotic arm or a portion thereof.
  • the determining may further comprise calculating a predicted position of each of the plurality of tracking markers based on the planned position of the robotic arm and information about a position of each of the plurality of tracking markers relative to the robotic arm (or a portion thereof).
  • the determining may comprise calculating a predicted position of every tracking marker fixedly secured to the robotic arm, or only of a subset of every tracking marker fixedly secured to the robotic arm.
  • the determining may further comprise compiling the calculated predicted positions of each of the plurality of tracking markers into a predicted arrangement of the plurality of tracking markers.
  • the predicted arrangement of the plurality of tracking markers may be determined relative to a coordinate system.
  • the coordinate system may be a robotic coordinate system, a navigation coordinate system, or another coordinate system.
  • the tracking markers may be the same as or similar to the tracking markers 156 described above.
  • the tracking markers may be LEDs, or reflective spheres, or geometric patterns such as QR codes.
  • the tracking markers may all be identical, or each may be distinguishable from the others by a unique characteristic (e.g., a unique wavelength, a unique pulsating frequency).
  • some tracking markers may be distinguishable from one or more other tracking markers by a shared characteristic. For example, a first set of tracking markers may emit light at a first wavelength, and a second set of tracking markers may emit light at a second wavelength.
  • Fig. 2B illustrates a predicted arrangement 250 of a plurality of tracking markers according to at least one embodiment of the present disclosure, with the corresponding position of the robot (including the robotic arm) shown in dotted lines.
  • the method 200 also comprises receiving, from a tracking marker sensor, information about a detected arrangement of the plurality of tracking markers (step 208).
  • the tracking marker sensor may be the same as or similar to the tracking marker sensor 132.
  • the tracking marker sensor may be an optical camera, an infrared camera, or any other sensor configured to detect the tracking markers.
  • the tracking marker sensor in some embodiments, may be part of a robot such as the robot 136, or part of a navigation system such as the navigation system 160, or part of a computing device such as the computing device 102. In some embodiments, the tracking marker sensor may be independent of any of the foregoing components, but may be in electronic communication with one or more of the foregoing components.
  • the information may comprise a position of each of a plurality of tracking markers in a detected arrangement of tracking markers.
  • the position of one or more of the plurality of tracking markers may be defined based on a coordinate system (such as a robotic coordinate system or a navigation coordinate system) or relative to another one or more of the plurality of tracking markers.
  • the information may comprise only information about a position of each of the plurality of tracking markers relative to each other, and may be useful for calculating a position of each of the plurality of tracking markers relative to a coordinate system (such as a robotic coordinate system or a navigation coordinate system).
  • the information may be received, for example, via a communication interface such as the communication interface 108.
  • Fig. 2C shows a detected arrangement 260 of the plurality of tracking markers, according to at least one embodiment of the present disclosure, as might be shown or reflected in the information about the detected arrangement of the plurality of tracking markers received in step 208.
  • the method 200 also comprises comparing the detected arrangement of the plurality of tracking markers to the predicted arrangement of the plurality of tracking markers (step 212).
  • the comparing may comprise translating or otherwise correlating the predicted position of each of the plurality of tracking markers from one coordinate space (e.g., a robotic coordinate space) to another coordinate space (e.g., a navigation coordinate space).
  • the comparing may comprise translating or otherwise correlating the detected position of each of the plurality of tracking markers from one coordinate space (e.g., a navigation coordinate space) to another coordinate space (e.g., a robotic coordinate space).
  • the comparing may comprise simply comparing the relative positions of the plurality of tracking markers in the predicted arrangement to the relative positions of the plurality of tracking markers in the detected arrangement.
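Comparing relative positions, as described in the preceding bullet, can sidestep coordinate transformation entirely because inter-marker distances are unchanged by rigid motion. A brief sketch, assuming known marker correspondence:

```python
# Sketch: frame-free comparison via inter-marker distance matrices.
import numpy as np

def relative_arrangements_match(predicted, detected, tol_mm=1.0) -> bool:
    """Compare all pairwise distances between corresponding markers."""
    p, d = np.asarray(predicted, float), np.asarray(detected, float)
    dp = np.linalg.norm(p[:, None, :] - p[None, :, :], axis=-1)
    dd = np.linalg.norm(d[:, None, :] - d[None, :, :], axis=-1)
    return bool(np.max(np.abs(dp - dd)) <= tol_mm)
```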
  • the robotic arm (by virtue of the tracking markers fixedly secured thereto) constitutes a custom, one-time reference frame useful for the same purposes as any known reference frame — including, for example, to register a robotic coordinate system to a navigation coordinate system, and/or to determine or confirm a position of a given object in a particular coordinate system.
  • the robotic arm may, for example, be positioned such that an end thereof is in contact with an anatomical feature, a surgical tool, or another object, such that the robotic arm comprises a reference frame based upon which the position of the anatomical feature, the surgical tool, or the other object may be determined or confirmed.
  • Embodiments of the present disclosure thus advantageously eliminate a need for sometimes-bulky reference frames.
  • some navigation systems require certain minimum distances between tracking markers (or require reference frames holding tracking markers to be of a minimum size) to provide accurate navigation within a volume applicable to a given application. Larger volumes in particular may require larger reference frames than smaller volumes.
  • the reference frames holding tracking markers for surgeries or other medical procedures can extend from three inches to ten inches in multiple dimensions in order to achieve a minimum necessary size.
  • these reference frames tend to be bulky, are easily bumped (which can, for example, cause undesired movement of the objects to which they are affixed), can get in the way of movement of a surgeon or of any movable object in an operating room environment, and can be difficult to use.
  • the use of a robotic reference frame as described in several embodiments herein enables a somewhat bulky object already in the operating room (the robot) to be used, when needed, to create an instantaneous reference frame, thus eliminating the need for dedicated reference frames and the problems associated therewith.
  • If the comparison yields a conclusion that the detected arrangement matches the predicted arrangement, the operational integrity of the robot and of the navigation system can be confirmed. This can be useful during surgical procedures as well as for initial calibration operations for the robotic system.
  • If the comparison yields a conclusion that the detected arrangement does not match the predicted arrangement, even though the robotic arm is in the pose (e.g., the position and/or orientation) used to determine the predicted arrangement, a further conclusion can be reached that one or both of the robot and the navigation system lack operational integrity.
  • a warning may be displayed to an operator of the robot and/or of the navigation system, and/or an audible sound may be played, via a user interface (such as, for example, the user interface 112 of the computing device 102, or a user interface specific to the robot or to the navigation system). Provision of such a warning to an operator of the robot and/or of the navigation system helps to ensure that the suspect operational integrity of the robot and/or of the navigation system can be investigated, and any errors corrected, before the robot and/or the navigation system are used further.
  • If the detected arrangement of the plurality of tracking markers is only slightly different than the predicted arrangement of the plurality of tracking markers, the robotic arm may be moved into another arrangement of the plurality of tracking markers (e.g., based on a different pose of the robotic arm), and the camera or other tracking marker sensor may provide additional information about a second detected arrangement of the tracking markers (detected, for example, when the robotic arm is in the different pose). If the second detected arrangement of the plurality of tracking markers is again only slightly different than the second predicted arrangement of the plurality of tracking markers, then an error calculation and/or calibration process may be undertaken to determine an adjustment to be applied to any further predicted arrangement so that it matches the corresponding detected arrangement, or vice versa.
  • If an offset between a predicted arrangement of the plurality of tracking markers and a corresponding detected arrangement of the plurality of tracking markers can be characterized by a constant or a derived equation, such that the offset can be incorporated into further comparisons of a predicted arrangement and a detected arrangement of the plurality of tracking markers, then the operational integrity of the robot and/or of the navigation system may be confirmed.
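One simple (assumed) form of such an offset is a constant translation estimated from one predicted/detected pair and applied to subsequent comparisons; a fuller treatment would also fit a rotation, as in the rigid-registration sketch later in this section.

```python
# Sketch: characterize a constant offset and fold it into later comparisons.
import numpy as np

def constant_offset(predicted: np.ndarray, detected: np.ndarray) -> np.ndarray:
    """Mean translation from the predicted to the detected arrangement."""
    return (detected - predicted).mean(axis=0)

def corrected_match(predicted, detected, offset, tol_mm=1.0) -> bool:
    """Re-run the tolerance check after applying the characterized offset."""
    errors = np.linalg.norm(predicted + offset - detected, axis=1)
    return bool(np.all(errors <= tol_mm))
```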
  • the method 200 also comprises determining a pose of the robotic arm based on the predicted and/or detected arrangement (step 216).
  • the determining may comprise receiving, from the robot having the robotic arm, information about the pose of the robotic arm that corresponds to the predicted and/or detected arrangements of the tracking markers.
  • the determining may alternatively comprise calculating — based on information about the position of each tracking marker (and/or about the relative positions of the tracking markers in the detected arrangement of tracking markers) as well as information about a position of each tracking marker relative to the robotic arm or a segment thereof, and information about the robotic arm and/or the segment(s) thereof — a pose of the robotic arm.
  • the determining may comprise determining a pose of the robotic arm relative to a robotic coordinate system, a navigation coordinate system, or some other coordinate system.
  • the determining may further comprise combining the calculated poses of the one or more arm segments into a calculated pose of the robotic arm as a whole.
  • information needed to calculate a pose of the robotic arm based on the predicted and/or detected arrangement of the tracking markers may be accessed from a memory such as the memory 116 or a database such as the database 164.
  • Such information may include, for example, information about the dimensions, arrangement, range of motion, and/or other characteristics of the segments of the robotic arm and/or of the robotic arm as a whole.
  • One or more algorithms (such as the algorithms 128) stored in a memory (such as the memory 116) may be used when calculating the pose of the one or more segments of the robotic arm and/or of the robotic arm as a whole.
  • the present disclosure encompasses embodiments of the method 200 that comprise more or fewer steps than those described above.
  • a method 300 of utilizing a robotic reference frame for registration comprises receiving, from a sensor, first information about a plurality of tracking markers fixedly secured to a robotic arm (step 304).
  • the sensor may be, for example, a tracking marker sensor as described elsewhere herein, including, for example, a tracking marker sensor 132, or any other sensor suitable for detecting the plurality of tracking markers.
  • the tracking markers may be any tracking markers described herein, including for example, tracking markers 156.
  • the robotic arm may be any robotic arm described herein, including, for example, a robotic arm 144 of a robot 136, or any other robotic arm.
  • the first information may be or comprise information about a position of the plurality of tracking markers in a navigation coordinate system.
  • the first information may be or comprise information about a position of the plurality of tracking markers relative to each other.
  • the first information may be or comprise information about one or more characteristics of one or more specific tracking markers, such as information about a wavelength of one or more specific tracking markers, a pulsating frequency of one or more tracking markers, and/or a geometric pattern of one or more specific tracking markers.
  • the first information may be received directly from the sensor or via one or more communication interfaces such as the communication interface 108, and/or via a cloud such as the cloud 168, or via any other network, device, or component.
  • The method 300 also comprises receiving, from the robotic arm or a robot that comprises the robotic arm, second information corresponding to a position of the robotic arm in a robotic coordinate system (step 308).
  • The second information may be based, for example, on data obtained from one or more sensors in the robotic arm, such as the sensors 148.
  • The second information may comprise sensor data about a detected position of one or more segments of the robotic arm and/or of the robotic arm as a whole.
  • The second information may be based on one or more settings of one or more components of the robotic arm.
  • The second information may comprise data describing a position (whether an actual position or a commanded position) of one or more motors, servos, gears, or other devices or components used to control a position of the robotic arm and/or one or more segments thereof.
  • The second information may be obtained independently of the first information, and vice versa.
  • The method 300 also comprises correlating a position of the robotic arm in the robotic coordinate system to a position of the plurality of tracking markers in the robotic coordinate system (step 312).
  • The correlating may comprise accessing information from a memory such as the memory 116, a database such as the database 164, a robot such as the robot 136, or another storage location.
  • Such information may include, for example, information regarding a precise position of each of the plurality of tracking markers on the robotic arm, and/or information about the dimensions, arrangement, range of motion, and/or other characteristics of the segments of the robotic arm and/or of the robotic arm as a whole.
  • A position of the plurality of tracking markers in the robotic coordinate system may be calculated from such information.
  • Such calculations may utilize, for example, one or more algorithms such as the algorithms 128.
  • The result of the correlating may be a calculated or otherwise determined position, in the robotic coordinate system, of the plurality of tracking markers fixedly secured to the robotic arm.
  • The method 300 also comprises registering the robotic coordinate system to the navigation coordinate system (step 316). The registering may be accomplished based on the first information and the second information. In some embodiments, the registering comprises the correlating described above as step 312 (e.g., step 316 may, in some embodiments, comprise step 312).
  • The registering may comprise determining a relationship between the robotic coordinate system and the navigation coordinate system based on a known position of the plurality of tracking markers in the navigation coordinate system (as included in or determined from the first information) and a known position of the plurality of tracking markers in the robotic coordinate system (as determined from the second information).
  • The registering may utilize one or more algorithms such as the algorithms 128 stored in the memory 116; a minimal sketch of one such computation is given below, following this summary of the method 300.
  • The method 300 also comprises registering a patient coordinate system to the navigation coordinate system (step 320).
  • The registering of this step 320 comprises correlating the robotic coordinate system to a patient coordinate system (or vice versa), and, based on that registration, correlating the patient coordinate system to the navigation coordinate system.
  • The registering may comprise determining a relationship between the patient coordinate system and the navigation coordinate system based on a relationship between the patient coordinate system and the robotic coordinate system and a relationship between the robotic coordinate system and the navigation coordinate system.
  • The registering of this step 320 may utilize one or more algorithms such as the algorithms 128 stored in the memory 116.
  • The present disclosure encompasses a number of variations on the method 300.
  • The registering step 316 may comprise correlating a position of the robotic arm in a robotic coordinate system (as indicated in or determined from the second information received in the step 308) to a position of each of the plurality of tracking markers, as described above with respect to the step 312.
  • The second information may comprise position information for each of the plurality of tracking markers in the robotic coordinate system, such that the correlating step 312 or any similar step is unnecessary.
  • A position of each of the plurality of tracking markers in the robotic coordinate system may be determined based on the second information about a position of the robotic arm in the robotic coordinate system.
  • A position of the robotic arm may be determined based on the first information.
  • The registering step 316 may comprise registering the robotic coordinate system to the navigation coordinate system (or vice versa) based on the position of the robotic arm as determined from the first information and the position of the robotic arm as indicated by or determined from the second information.
  • The method 300 beneficially enables registration of a navigation coordinate system to a robotic coordinate system or vice versa without the use of a reference frame other than a reference frame formed by the robotic arm (including the tracking markers fixedly secured to the robotic arm) itself.
  • The method 300 thus avoids the cost of a separate reference frame, the expenditure of the time needed to secure the separate reference frame to the robotic arm, and, in instances where the reference frame would be secured directly to the patient (e.g., to the patient’s vertebra or pelvis), the incision that would otherwise be needed to secure the separate reference frame to the patient.
  • The method 300 removes the need for a snapshot frame that would otherwise be required during registration of a navigation coordinate system to a robotic coordinate system or vice versa.
  • The present disclosure encompasses embodiments of the method 300 with more or fewer steps than those described above.
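The registering of steps 312 through 320 can be understood as estimating a rigid transform between coordinate systems from corresponding marker positions. The following is a minimal sketch of one way such a computation could look, using the standard SVD-based (Kabsch) point-set alignment; the function names, the NumPy dependency, and the composition helper are illustrative assumptions, not the specific implementation of the algorithms 128.

```python
import numpy as np

def estimate_rigid_transform(points_robot, points_nav):
    """Estimate the rotation R and translation t mapping marker positions
    from the robotic coordinate system into the navigation coordinate
    system (Kabsch/SVD method), given (N, 3) arrays of corresponding
    marker positions in each system (steps 304-316)."""
    centroid_r = points_robot.mean(axis=0)
    centroid_n = points_nav.mean(axis=0)
    H = (points_robot - centroid_r).T @ (points_nav - centroid_n)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:   # guard against a reflection solution
        Vt[-1, :] *= -1
        R = Vt.T @ U.T
    t = centroid_n - R @ centroid_r
    return R, t

def chain_transforms(R_ab, t_ab, R_bc, t_bc):
    """Compose two rigid transforms, e.g., patient -> robotic and
    robotic -> navigation, to register the patient coordinate system
    to the navigation coordinate system (step 320)."""
    return R_ab @ R_bc, R_ab @ t_bc + t_ab
```

With at least three non-collinear markers, the transform is fully determined; chaining it with a known patient-to-robot relationship yields the patient-to-navigation registration of step 320 without any separate reference frame.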
  • A method 400 of utilizing a robotic reference frame comprises receiving information corresponding to a predicted arrangement of a plurality of tracking markers, which arrangement defines a custom, one-time reference frame (step 404).
  • The predicted arrangement may be based on an expected position of a robotic arm to which the plurality of tracking markers are fixedly secured, or a current position of a robotic arm to which the plurality of tracking markers are fixedly secured.
  • The information may comprise the results of one or more calculations carried out (e.g., by a robot such as the robot 136, or by a processor such as the processor 104) to determine the predicted arrangement of the plurality of tracking markers based on an expected or current position of the robotic arm to which the tracking markers are fixedly secured.
  • The expected or current position of the robotic arm may be a position that enables an end of the robotic arm (whether an end effector or otherwise) to be in contact with an object, the position of which needs to be determined or confirmed (whether in a navigation coordinate space or otherwise).
  • The information may comprise information about a position of each individual tracking marker relative to a robotic coordinate system or another coordinate system.
  • The information may comprise information about a characteristic of each individual tracking marker, such as a wavelength, pulsating frequency, or geometric pattern of each individual tracking marker.
  • Each of the plurality of tracking markers may be any tracking marker described herein, including, for example, a tracking marker 156.
  • Each of the tracking markers is fixedly secured to a robotic arm.
  • The plurality of tracking markers may comprise at least three tracking markers, at least four tracking markers, at least five tracking markers, or more than five tracking markers.
  • The method 400 also comprises receiving data corresponding to a detected arrangement of the plurality of tracking markers (step 408).
  • The data may be received, for example, from a tracking marker sensor such as the tracking marker sensor 132.
  • The data may comprise position information of the plurality of tracking markers in a navigation coordinate space or another coordinate space.
  • The data may enable calculation of a position of the plurality of tracking markers in a navigation coordinate space or another coordinate space.
  • The data may comprise information about a characteristic of each individual tracking marker, such as a wavelength, pulsating frequency, or geometric pattern of each individual tracking marker.
  • The data corresponding to the detected arrangement of the plurality of tracking markers is independent of the information corresponding to the predicted arrangement of the plurality of tracking markers.
  • The data is generated without reference to the information, and the information is generated without reference to the data.
  • The method 400 also comprises comparing the predicted arrangement of the plurality of tracking markers to the detected arrangement of the plurality of tracking markers to determine whether the custom, one-time reference frame has been created (step 412).
  • When the detected arrangement matches the predicted arrangement, the exact pose of the robotic arm is known, which enables the robotic arm (and/or the tracking markers fixedly secured to the robotic arm) to be used as a custom, one-time reference frame in place of a separate reference frame that must be secured to or held by the robotic arm or another object or person.
  • The comparing may comprise using one or more algorithms such as the algorithms 128 to translate a position of the tracking markers in one coordinate system (e.g., in a robotic coordinate system) to a position of the tracking markers in another coordinate system (e.g., in a navigation coordinate system).
  • The comparing may also comprise overlaying an image contained in or generated using the data corresponding to the detected arrangement of the plurality of tracking markers on a virtual image generated based on the information corresponding to the predicted arrangement of the plurality of tracking markers to determine whether the tracking markers in both images line up with each other.
  • Other comparison methods may also be used to determine whether the predicted arrangement of the plurality of tracking markers matches the detected arrangement of the plurality of tracking markers.
  • The method 400 also comprises confirming a position of an object based on the creation of the custom, one-time reference frame (step 416). As noted above, when the predicted arrangement matches the detected arrangement, the precise position of the robotic arm is known, such that the robotic arm (and/or the tracking markers fixedly secured to the robotic arm) can be used as a reference frame; a minimal sketch of such a position confirmation appears after this list.
  • If the position (and, in some embodiments, the orientation) of an object can be detected simultaneously with the detection of the plurality of tracking markers, and/or if the robotic arm is in contact with a known surface or feature of the object at the moment the custom, one-time reference frame is created, then the position (and, in some embodiments, the orientation) of the object may be determined using the custom, one-time reference frame, just as the position (and, in some embodiments, the orientation) of an object may be determined using a reference frame separate from the robotic arm.
  • The present disclosure encompasses embodiments of the method 400 with more or fewer steps than those described above.
  • The present disclosure encompasses methods with fewer than all of the steps identified in Figs. 2A, 3, and 4 (and the corresponding description of the methods 200, 300, and 400), as well as methods that include additional steps beyond those identified in Figs. 2A, 3, and 4 (and the corresponding description of the methods 200, 300, and 400).
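As a concrete illustration of step 416, once the one-time reference frame has been confirmed and a robot-to-navigation transform is known (for example, from a registration like the sketch above), confirming an object's position reduces to transforming the arm's contact point into navigation coordinates. The function name, the tolerance, and the inputs below are hypothetical placeholders, not part of the disclosure.

```python
import numpy as np

def confirm_object_position(R_nav_robot, t_nav_robot, tip_in_robot,
                            expected_in_nav, tol_mm=1.0):
    """Express the arm-tip contact point in navigation coordinates and
    compare it with the expected object position (step 416).

    tip_in_robot: (3,) contact point in the robotic coordinate system,
        computed from the arm pose at the moment the custom, one-time
        reference frame was created.
    expected_in_nav: (3,) expected object position in navigation space.
    """
    tip_in_nav = R_nav_robot @ tip_in_robot + t_nav_robot
    error_mm = float(np.linalg.norm(tip_in_nav - expected_in_nav))
    return tip_in_nav, error_mm <= tol_mm
```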

Landscapes

  • Health & Medical Sciences (AREA)
  • Surgery (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Biomedical Technology (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Medical Informatics (AREA)
  • Molecular Biology (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Pathology (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Robotics (AREA)
  • Manipulator (AREA)

Abstract

A robotic navigation system includes a robot base; a robotic arm with a proximal end secured to the robot base, a distal end movable relative to the proximal end, and one or more arm segments between the proximal end and the distal end; and a plurality of tracking markers secured to the robotic arm.

Description

ROBOTIC REFERENCE FRAMES FOR NAVIGATION
FIELD
[0001] The present technology generally relates to robotic surgery, and relates more particularly to navigation during robotic surgery.
BACKGROUND
[0002] Surgical navigation systems are used to track the position of one or more objects during surgery. Surgical robots are useful for holding one or more tools or devices during a surgery, and may operate autonomously (e.g., without any human input during operation), semi-autonomously (e.g., with some human input during operation), or non-autonomously (e.g., only as directed by human input).
SUMMARY
[0003] A robotic navigation system according to one embodiment of the present disclosure comprises: a robot base; a robotic arm comprising a proximal end secured to the robot base, a distal end movable relative to the proximal end, and one or more arm segments between the proximal end and the distal end; and a plurality of tracking markers secured to the robotic arm.
[0004] Each of the one or more arm segments may support at least one of the plurality of tracking markers. The plurality of tracking markers may comprise at least three tracking markers. Each of the plurality of tracking markers may be an LED. Each of the plurality of tracking markers may be configured to emit or reflect light at a different wavelength than at least another one of the plurality of tracking markers. Each of the plurality of tracking markers may be configured to emit light in pulses at a first frequency different than a second frequency at which at least another one of the plurality of tracking markers is configured to emit light in pulses. Each of the plurality of tracking markers may be a geometric pattern. At least one of the plurality of tracking markers may be moveably secured to the robotic arm. The at least one of the plurality of tracking markers may be selectively moveable between a first position and a second position. At least two of the plurality of tracking markers may be circumferentially spaced about one of the one or more arm segments of the robotic arm.
[0005] The robotic navigation system may further comprise at least one processor and a memory. The memory may store instructions for execution by the at least one processor that, when executed, cause the at least one processor to determine, based on a pose of the robotic arm and a known location of each of the plurality of tracking markers on the robotic arm, a predicted arrangement of the plurality of tracking markers. The memory may store additional instructions for execution by the at least one processor that, when executed, further cause the at least one processor to: receive, from a camera configured to detect the plurality of tracking markers, information about a detected arrangement of the plurality of tracking markers; and compare the detected arrangement to the predicted arrangement.
[0006] A method of utilizing a robotic reference frame for navigation according to another embodiment of the present disclosure comprises: receiving, from a tracking marker sensor, first information about a plurality of tracking markers fixedly secured to a robotic arm of a robot; receiving, from the robot, second information corresponding to a position of the robotic arm in a robotic coordinate system; and registering the robotic coordinate system to a navigation coordinate system based on the first information and the second information.
[0007] Each of the plurality of tracking markers may comprise an LED. Each of the plurality of tracking markers may comprise a geometric pattern. The first information may comprise information about a position of the plurality of tracking markers in the navigation coordinate system. The second information may comprise information about a position at which each of the plurality of tracking markers is fixedly secured to the robotic arm.
[0008] A device for surgical navigation utilizing a robotic reference frame according to still another embodiment of the present disclosure comprises: at least one communication interface for receiving information from a robot; at least one tracking marker sensor for detecting a plurality of tracking markers on a robotic arm of the robot; at least one processor; and at least one memory. The memory stores instructions for execution by the at least one processor that, when executed, cause the at least one processor to: receive, from the robot, information corresponding to a predicted arrangement of the plurality of tracking markers, the predicted arrangement defining a custom, one-time reference frame; receive, from the at least one tracking marker sensor, data corresponding to a detected arrangement of the plurality of tracking markers; and compare the predicted arrangement to the detected arrangement to determine whether the custom, one-time reference frame has been created.
[0009] The memory may store additional instructions for execution by the processor that, when executed, further cause the processor to: confirm a position of an object in a predetermined coordinate space based on creation of the custom, one-time reference frame. Each of the plurality of tracking markers may be a geometric pattern. The at least one tracking marker sensor may be a camera, and each of the plurality of tracking markers may be an LED. At least two of the plurality of tracking markers may have different wavelengths or be configured to pulse at different frequencies.
[0010] The details of one or more aspects of the disclosure are set forth in the accompanying drawings and the description below. Other features, objects, and advantages of the techniques described in this disclosure will be apparent from the description and drawings, and from the claims.
[0011] The phrases “at least one”, “one or more”, and “and/or” are open-ended expressions that are both conjunctive and disjunctive in operation. For example, each of the expressions “at least one of A, B and C”, “at least one of A, B, or C”, “one or more of A, B, and C”, “one or more of A, B, or C” and “A, B, and/or C” means A alone, B alone, C alone, A and B together, A and C together, B and C together, or A, B and C together. When each one of A, B, and C in the above expressions refers to an element, such as X, Y, and Z, or class of elements, such as X1-Xn, Y1-Ym, and Z1-Zo, the phrase is intended to refer to a single element selected from X, Y, and Z, a combination of elements selected from the same class (e.g., X1 and X2) as well as a combination of elements selected from two or more classes (e.g., Y1 and Zo).
[0012] The term “a” or “an” entity refers to one or more of that entity. As such, the terms “a” (or “an”), “one or more” and “at least one” can be used interchangeably herein. It is also to be noted that the terms “comprising”, “including”, and “having” can be used interchangeably.
[0013] The preceding is a simplified summary of the disclosure to provide an understanding of some aspects of the disclosure. This summary is neither an extensive nor exhaustive overview of the disclosure and its various aspects, embodiments, and configurations. It is intended neither to identify key or critical elements of the disclosure nor to delineate the scope of the disclosure but to present selected concepts of the disclosure in a simplified form as an introduction to the more detailed description presented below. As will be appreciated, other aspects, embodiments, and configurations of the disclosure are possible utilizing, alone or in combination, one or more of the features set forth above or described in detail below.
[0014] Numerous additional features and advantages of the present invention will become apparent to those skilled in the art upon consideration of the embodiment descriptions provided hereinbelow.
BRIEF DESCRIPTION OF THE DRAWINGS
[0015] The accompanying drawings are incorporated into and form a part of the specification to illustrate several examples of the present disclosure. These drawings, together with the description, explain the principles of the disclosure. The drawings simply illustrate preferred and alternative examples of how the disclosure can be made and used and are not to be construed as limiting the disclosure to only the illustrated and described examples. Further features and advantages will become apparent from the following, more detailed, description of the various aspects, embodiments, and configurations of the disclosure, as illustrated by the drawings referenced below.
[0016] Fig. 1A is a block diagram of a system according to at least one embodiment of the present disclosure;
[0017] Fig. 1B depicts a robot according to at least one embodiment of the present disclosure;
[0018] Fig. 2A is a flowchart of a method according to at least one embodiment of the present disclosure;
[0019] Fig. 2B depicts a predicted arrangement of tracking markers according to at least one embodiment of the present disclosure;
[0020] Fig. 2C depicts a detected arrangement of tracking markers according to at least one embodiment of the present disclosure;
[0021] Fig. 3 is another flowchart of a method according to at least one embodiment of the present disclosure; and
[0022] Fig. 4 is another flowchart of a method according to at least one embodiment of the present disclosure.
DETAILED DESCRIPTION
[0023] It should be understood that various aspects disclosed herein may be combined in different combinations than the combinations specifically presented in the description and accompanying drawings. It should also be understood that, depending on the example or embodiment, certain acts or events of any of the processes or methods described herein may be performed in a different sequence, and/or may be added, merged, or left out altogether (e.g., all described acts or events may not be necessary to carry out the disclosed techniques according to different embodiments of the present disclosure). In addition, while certain aspects of this disclosure are described as being performed by a single module or unit for purposes of clarity, it should be understood that the techniques of this disclosure may be performed by a combination of units or modules associated with, for example, a computing device and/or a medical device.
[0024] In one or more examples, the described methods, processes, and techniques may be implemented in hardware, software, firmware, or any combination thereof. If implemented in software, the functions may be stored as one or more instructions or code on a computer-readable medium and executed by a hardware-based processing unit. Computer-readable media may include non-transitory computer-readable media, which corresponds to a tangible medium such as data storage media (e.g., RAM, ROM, EEPROM, flash memory, or any other medium that can be used to store desired program code in the form of instructions or data structures and that can be accessed by a computer).
[0025] Instructions may be executed by one or more processors, such as one or more digital signal processors (DSPs), general purpose microprocessors (e.g., Intel Core i3, i5, i7, or i9 processors; Intel Celeron processors; Intel Xeon processors; Intel Pentium processors; AMD Ryzen processors; AMD Athlon processors; AMD Phenom processors; Apple A10 or A10X Fusion processors; Apple A11, A12, A12X, A12Z, or A13 Bionic processors; or any other general purpose microprocessors), application specific integrated circuits (ASICs), field-programmable gate arrays (FPGAs), or other equivalent integrated or discrete logic circuitry. Accordingly, the term “processor” as used herein may refer to any of the foregoing structure or any other physical structure suitable for implementation of the described techniques. Also, the techniques could be fully implemented in one or more circuits or logic elements.
[0026] Before any embodiments of the disclosure are explained in detail, it is to be understood that the disclosure is not limited in its application to the details of construction and the arrangement of components set forth in the following description or illustrated in the drawings. The disclosure is capable of other embodiments and of being practiced or of being carried out in various ways. Also, it is to be understood that the phraseology and terminology used herein is for the purpose of description and should not be regarded as limiting. The use of “including,” “comprising,” or “having” and variations thereof herein is meant to encompass the items listed thereafter and equivalents thereof as well as additional items. Further, the present disclosure may use examples to illustrate one or more aspects thereof. Unless explicitly stated otherwise, the use or listing of one or more examples (which may be denoted by “for example,” “by way of example,” “e.g.,” “such as,” or similar language) is not intended to and does not limit the scope of the present disclosure.
[0027] Navigated robotic procedures often involve reference frames and trackers whose positions are detected by a tracking marker sensor. For example, a navigation system could use a camera as a tracking marker sensor, which can detect optical tracking markers on a reference frame attached to a robotic arm. With this information, the coordinate system of the robotic system can be correlated to the coordinate system of the navigation system. In other circumstances, the information from the tracking marker sensor, which can accurately determine the position and orientation of the robotic arm, can be used to calibrate the robotic system.
[0028] However, these optical systems need clear lines of sight between the cameras and the tracking markers, which can restrict robotic arm movement during surgery. To overcome these problems (and others), tracking markers (e.g., LEDs or reference frame indicia) may be incorporated onto or into a robotic arm at various locations and in various patterns. By correlating the LED positions relative to the robotic system, a camera or other tracking marker sensor can more easily detect the position and orientation of the robotic arm despite potential line-of-sight issues or other difficulties.
[0029] For example, unique geometry can be created by LEDs on different parts of the robotic arm. The actual tracked geometry changes according to the relative motion of the robotic arm joints. Furthermore, if the robot is connected to the patient, the location of the robotic arm can be used to correlate navigation space, robotic space, and patient space.
[0030] Embodiments of the present disclosure eliminate the need for a physical reference frame mounted to a robotic arm, as well as for the use of snapshot reference frames and, more generally, for the snapshot process of matching navigation and robotic coordinate spaces. Embodiments of the present disclosure further enable verification of navigation system integrity in real time by comparing sensed positions of robotic arm tracking markers (sensed, for example, by a navigation system) to predicted positions of the tracking markers (predicted, for example, based on information from the robot about a position and/or orientation (e.g., a pose) of the robotic arm). Similarly, embodiments of the present disclosure enable verification of robotic integrity in real time by comparing robotic arm position and/or orientation information (e.g., pose information) based on actual encoder readings to robotic arm position and/or orientation information determined based on the sensed positions of robotic arm tracking markers. Embodiments of the present disclosure therefore increase ease of use and decrease operation times over known navigation systems, registration procedures, and calibration operations.
[0031] Turning first to Fig. 1A, a block diagram of a system 100 according to at least one embodiment of the present disclosure is shown. The system 100 may be used, for example: to carry out one or more aspects of one or more of the methods disclosed herein; for navigation purposes; for registration purposes; for calibration operations; to verify operational integrity of a navigation system (such as the navigation system 160) or of a robot (such as a robot 136); or for any other useful purpose. The system 100 comprises a computing device 102, a tracking marker sensor 132, a robot 136, a navigation system 160, a database 164, and a cloud 168. Notwithstanding the foregoing, systems according to other embodiments of the present disclosure may omit any one or more of the computing device 102, the tracking marker sensor 132, the robot 136, the navigation system 160, the database 164, and/or the cloud 168. Additionally, systems according to other embodiments of the present disclosure may arrange one or more components of the system 100 differently (e.g., one or more of the tracking marker sensor 132, the robot 136, and the navigation system 160 may comprise the components shown in Fig. 1A as being part of the computing device 102).
[0032] The computing device 102 comprises at least one processor 104, at least one communication interface 108, at least one user interface 112, and at least one memory 116. A computing device according to other embodiments of the present disclosure may omit one or both of the communication interface(s) 108 and the user interface(s) 112.
[0033] The at least one processor 104 of the computing device 102 may be any processor identified or described herein or any similar processor. The at least one processor 104 may be configured to execute instructions stored in the at least one memory 116, which instructions may cause the at least one processor 104 to carry out one or more computing steps utilizing or based on data received, for example, from the tracking marker sensor 132, the robot 136, the navigation system 160, the database 164, and/or the cloud 168.
[0034] The computing device 102 may also comprise at least one communication interface 108. The at least one communication interface 108 may be used for receiving image data or other information from an external source (such as the tracking marker sensor 132, the robot 136, the navigation system 160, the database 164, the cloud 168, and/or a portable storage medium (e.g., a USB drive, a DVD, a CD)), and/or for transmitting instructions, images, or other information from the at least one processor 104 and/or the computing device 102 more generally to an external system or device (e.g., another computing device 102, the tracking marker sensor 132, the robot 136, the navigation system 160, the database 164, the cloud 168, and/or a portable storage medium (e.g., a USB drive, a DVD, a CD)). The at least one communication interface 108 may comprise one or more wired interfaces (e.g., a USB port, an Ethernet port, a Firewire port) and/or one or more wireless interfaces (configured, for example, to transmit information via one or more wireless communication protocols such as 802.11a/b/g/n, Bluetooth, Bluetooth low energy, NFC, ZigBee, and so forth). In some embodiments, the at least one communication interface 108 may be useful for enabling the device 102 to communicate with one or more other processors 104 or computing devices 102, whether to reduce the time needed to accomplish a computing-intensive task or for any other reason.
[0035] The at least one user interface 112 may be or comprise a keyboard, mouse, trackball, monitor, television, touchscreen, button, joystick, switch, lever, and/or any other device for receiving information from a user and/or for providing information to a user of the computing device 102. The at least one user interface 112 may be used, for example, to receive a user selection or other user input in connection with any step of any method described herein; to receive a user selection or other user input regarding one or more configurable settings of the computing device 102 and/or of another component of the system 100; to receive a user selection or other user input regarding how and/or where to store and/or transfer data received, modified, and/or generated by the computing device 102; and/or to display information (e.g., text, images) and/or play a sound to a user based on data received, modified, and/or generated by the computing device 102. Notwithstanding the inclusion of the at least one user interface 112 in the system 100, the system 100 may automatically (e.g., without any input via the at least one user interface 112 or otherwise) carry out one or more, or all, of the steps of any method described herein.
[0036] Although the at least one user interface 112 is shown as part of the computing device 102, in some embodiments, the computing device 102 may utilize a user interface 112 that is housed separately from one or more remaining components of the computing device 102. In some embodiments, the user interface 112 may be located proximate one or more other components of the computing device 102, while in other embodiments, the user interface 112 may be located remotely from one or more other components of the computing device 102.
[0037] The at least one memory 116 may be or comprise RAM, DRAM, SDRAM, other solid-state memory, any memory described herein, or any other tangible non-transitory memory for storing computer-readable data and/or instructions. The at least one memory 116 may store information or data useful for completing, for example, any step of the methods 200 or 300 described herein. The at least one memory 116 may store, for example, information about one or more predetermined coordinate systems 120 (e.g., information about a robotic coordinate system or space, information about a navigation coordinate system or space, information about a patient coordinate system or space); instructions 124 for execution by the at least one processor 104, for example to cause the at least one processor 104 to carry out one or more of the steps of the method 200 and/or of the method 300; and/or one or more algorithms 128 for use by the processor in carrying out any calculations necessary to complete one or more of the steps of the method 200 and/or of the method 300, or for any other calculations. Such predetermined coordinate system(s) 120, instructions 124, and/or algorithms 128 may, in some embodiments, be organized into one or more applications, modules, packages, layers, or engines, and may cause the at least one processor 104 to manipulate data stored in the at least one memory 116 and/or received from or via another component of the system 100.
[0038] The tracking marker sensor 132 is operable to detect one or more tracking markers 156 (described below). The tracking marker sensor 132 may be, for example, an optical camera; an infrared camera; a 3D camera system; a stereoscopic vision system; another imaging device; or any other sensor that can detect one or more tracking markers 156. The tracking marker sensor 132 may comprise a dedicated processor for executing instructions stored in a dedicated memory of the tracking marker sensor 132, or the tracking marker sensor 132 may simply be configured to transmit data collected therewith to the computing device 102 or to another component of the system 100. Although shown in Fig. 1A as being in communication only with the computing device 102, in some embodiments, the tracking marker sensor 132 may be in communication with any one or more of the computing device 102, the robot 136, the navigation system 160, the database 164, and/or the cloud 168. Also, in some embodiments, the computing device 102 may comprise the tracking marker sensor 132, while in other embodiments, the navigation system 160 may comprise the tracking marker sensor 132. In still other embodiments, the robot 136 may comprise the tracking marker sensor 132.
[0039] The tracking marker sensor 132 may be positioned directly above an operating table or portion thereof, or above and to one side of an operating table or portion thereof, or in another convenient position within an operating room or other room housing the robot 136. The tracking marker sensor 132 may be positioned at a location selected to provide the tracking marker sensor 132 with a clear and/or unobstructed view of the robotic arm 144 of the robot 136 (and thus of one or more tracking markers 156 fixedly secured to the robotic arm 144) during operation thereof. In some embodiments, the tracking marker sensor 132 is fixed, while in other embodiments, the tracking marker sensor 132 may be precisely movable (whether manually or automatically) in one or more directions.
[0040] The tracking marker sensor 132 may be configured to capture data regarding sensed tracking markers 156 only at a given moment in time. For example, where the tracking marker sensor 132 is a camera, the tracking marker sensor 132 may be configured to capture still images comprising one or more tracking markers 156. The tracking marker sensor 132 may be configured to capture such data at periodic intervals, or when commanded by a user (e.g., via a user interface 112), or upon a signal (generated either autonomously or in response to user input) from the computing device 102, the robot 136, and/or the navigation system 160.
[0041] The tracking marker sensor 132 may additionally or alternatively be operable to capture data corresponding to one or more tracking markers 156 continuously, in real-time. In such embodiments, the tracking marker sensor 132 may provide a stream of real-time sensor data to the computing device 102, which may continuously process the sensor data to detect one or more tracking markers 156 therein. In some embodiments, the tracking marker sensor 132 may comprise more than one tracking marker sensor 132.
[0042] With reference still to Fig. 1A, and also to Fig. 1B, the robot 136 may be any surgical robot or surgical robotic system. The robot 136 may be or comprise, for example, the Mazor X™ Stealth Edition robotic guidance system. The robot 136 may comprise a base 140 that supports a robotic arm 144. The robot 136 may comprise one or more robotic arms 144. In some embodiments, the robotic arm 144 may comprise a first robotic arm and a second robotic arm. In other embodiments, the robot 136 may comprise more than two robotic arms 144. The robotic arm 144 may, in some embodiments, assist with a surgical procedure (e.g., by holding a tool in a desired trajectory or pose and/or supporting the weight of a tool while a surgeon or other user operates the tool, or otherwise) and/or automatically carry out a surgical procedure.
[0043] Referring still to Figs. 1A-1B, the robotic arm 144 may have three, four, five, six, or more degrees of freedom. The robotic arm 144 may comprise one or more segments 152. Each segment 152 may comprise a member 176 and a joint 172 to which the member 176 is attached and/or from which the member 176 extends. The joint 172 may be secured, for example, to the base 140 or to the member 176 of another segment 152. The joint 172 may be any type of joint that enables selective movement of the member 176 relative to the structure to which the joint 172 is attached. For example, the joint 172 may be a pivot joint, a hinge joint, a saddle joint, or a ball-and-socket joint. The joint 172 may allow movement of the member 176 in one dimension or in multiple dimensions, and/or along one axis or along multiple axes.
[0044] In embodiments of the robot 136 comprising a robotic arm 144 with only one segment 152, the joint 172 of the segment 152 may be secured to the base 140, and the member 176 of the segment 152 may comprise a proximal end secured to the joint 172 and a distal end supporting an end effector. The end effector may be, for example, a tool (e.g., a drill, saw, imaging device) or a tool guide (e.g., for guiding a biopsy needle, ablation probe, or other tool along a desired trajectory).
[0045] In embodiments of the robot 136 comprising a robotic arm 144 with a plurality of segments 152, such as that illustrated in Fig. 1B, a first segment 152 may comprise a joint 172 secured to the base 140, and the member 176 of the first segment 152 may comprise a proximal end secured to the joint 172 and a distal end supporting a joint of a second segment 152. The member 176 of the second segment 152 may comprise a proximal end secured to the joint 172 of the second segment 152, and a distal end supporting a joint 172 of a third segment 152, and so on. The member 176 of the final segment 152 may comprise a distal end that supports an end effector 180, which may be the same as or similar to the end effector described above. In such embodiments, the joints 172 of the various segments 152 may or may not be of the same type, and the members 176 of the various segments 152 may or may not be identical.
[0046] All or some of the joints 172 of the segments 152 of the robotic arm 144 may be powered (so as to be selectively controllable without physical manipulation by a human). Any one or more of electric, pneumatic, hydraulic, and/or other means may be used to selectively control movement of a member 176 about the joint 172. For example, each segment 152 may comprise a servo for selectively moving the member 176 of that segment 152 relative to the joint 172 of that segment 152.
[0047] The robotic arm 144 also comprises one or more sensors 148. Each sensor 148 may be positioned to detect a position of a member 176 of a given segment 152 relative to the joint 172 of the segment 152. For example, where the joint 172 of a given segment 152 is or comprises a hinge joint, a sensor 148 may detect an angular position of the member 176 relative to an axis of the hinge joint. Where the joint 172 of a given segment 152 is or comprises a rotary joint (e.g., configured to allow rotation of the member 176 about an axis that extends through the member 176 and the joint 172), the sensor 148 may detect an angular position of the member 176 relative to the axis that extends through the member 176 and the joint 172. Each sensor 148 may be, for example, a rotary encoder, a linear encoder, or an incremental encoder.
[0048] Data from the sensors 148 may be provided to a processor of the robot 136, to the processor 104 of the computing device 102, and/or to the navigation system 160. The data may be used to calculate a position in space of the robotic arm 144 relative to a predetermined coordinate system. For example, the robot 136 may calculate a position in space of the robotic arm 144 relative to a coordinate system with an origin at the position where the joint 172 of the first segment 152 of the robotic arm 144 is secured to the base 140. The calculation may be based not just on data received from the sensor(s) 148, but also on data or information (such as, for example, physical dimensions) corresponding to each segment 152 and/or corresponding to an end effector secured to the final segment 152. By way of example only, a known location of the proximal end of the robotic arm 144 (e.g., where a joint 172 of the first segment 152 is secured to the base 140), known dimensions of each segment 152, and data from the sensor(s) 148 about an orientation of the member 176 of each segment 152 relative to the joint 172 of each segment 152 may be used to calculate the path of the robotic arm through space.
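As a concrete illustration of the computation described in this paragraph (and of the predicted-arrangement determination of step 204 below), the following sketch derives marker positions in the robotic coordinate system from encoder angles, segment dimensions, and known marker offsets. It is deliberately simplified to a planar chain of hinge joints; the function name, the NumPy dependency, and the example values are assumptions for illustration only.

```python
import numpy as np

def marker_positions(joint_angles, link_lengths, marker_offsets):
    """Compute tracking-marker positions in the robotic coordinate
    system for a simplified planar chain of hinge joints.

    joint_angles: encoder readings (radians), one per segment 152.
    link_lengths: length of each member 176.
    marker_offsets: distance of each marker 156 from its joint 172,
        measured along its member.
    """
    positions = []
    origin = np.zeros(2)   # joint 172 of the first segment, at the base 140
    heading = 0.0
    for angle, length, offset in zip(joint_angles, link_lengths, marker_offsets):
        heading += angle   # each hinge rotates relative to the prior member
        direction = np.array([np.cos(heading), np.sin(heading)])
        positions.append(origin + offset * direction)  # marker on this member
        origin = origin + length * direction           # joint of the next segment
    return np.array(positions)

# Hypothetical example: three segments with a marker partway along each member.
# predicted = marker_positions([0.3, -0.5, 0.2], [0.40, 0.30, 0.20], [0.20, 0.15, 0.10])
```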
[0049] Referring still to Figs. 1A-1B, a plurality of tracking markers 156 are fixedly secured to or positioned on the robotic arm 144. As used herein, “fixedly secured” does not mean “permanently secured,” and indeed the tracking markers 156 may be detachable from the robotic arm 144. The tracking markers 156 may be light-emitting diodes (LEDs). The tracking markers 156 may all be identical, or one or more of the tracking markers 156 may be different than another one or more of the tracking markers 156. In some embodiments, one or more of the tracking markers 156 may be configured to emit light at a first wavelength, and another one or more of the tracking markers 156 may be configured to emit light at a second wavelength different than the first wavelength. Also in some embodiments, one or more of the tracking markers 156 may be configured to reflect light at a first wavelength, while another one or more of the tracking markers may be configured to reflect light at a second wavelength that is different than the first wavelength. The emitted and/or reflected wavelengths of light of the embodiments described above may be wavelengths within a particular spectrum (e.g., wavelengths corresponding to red light versus wavelengths corresponding to blue light in the visible spectrum, or different wavelengths in the infrared spectrum) as well as wavelengths from different spectrums (e.g., a wavelength in the visible spectrum versus a wavelength in the infrared spectrum).
[0050] In some embodiments, one or more of the tracking markers 156 may be or comprise an LED that pulses at a first frequency, and another one or more of the tracking markers 156 may be or comprise an LED that pulses at a second frequency different than the first frequency. In some embodiments, the tracking markers 156 may be or comprise reflective spheres, geometric patterns (such as, for example, QR codes), or other items or features that may be readily distinguished by the tracking marker sensor 132. The tracking markers may be configured to be detectable by a tracking marker sensor 132 even when covered by a drape or other covering that may be arranged on or over the robotic arm 144 to maintain a sterile operating room environment.
[0051] In some embodiments, at least one tracking marker 156 is fixedly secured to or positioned on each segment 152. One or more groups of tracking markers 156 may be secured to one or more of the segments 152. In some embodiments, the number of tracking markers 156 fixedly secured to or positioned on the robotic arm 144 may be at least three. Also in some embodiments, each segment 152 may have fixedly secured thereto, or positioned thereon, a plurality of tracking markers 156, arranged so that at least one tracking marker 156 is visible (e.g., to the tracking sensor 132) from any one of a plurality of possible orientations of the segment 152. In other embodiments, the tracking markers 156 are arranged on the robotic arm 144 so that at least three tracking markers 156 are visible (e.g., to the tracking sensor 132) from any one of a plurality of possible orientations of the robotic arm 144. For example, in some embodiments, the tracking markers 156 may be circumferentially spaced about the robotic arm 144 or about a particular segment of the robotic arm 144. In some embodiments, the tracking markers 156 may also be radially and/or longitudinally spaced about the robotic arm 144 or about a particular segment of the robotic arm 144.
[0052] In some embodiments of the present disclosure, the plurality of tracking markers 156 may be moveably secured to the robotic arm 144, and may further be selectively moveable relative to the robotic arm 144. In such embodiments, one or more of the plurality of tracking markers 156 may be configured to move (or to be moved automatically) from a first position on the robotic arm 144 to a second position on the robotic arm 144 when the robotic arm 144 moves into or out of a certain position or set of positions. The purpose of such movement of the one or more of the plurality of tracking markers 156 may be to facilitate maintenance of a line of sight between each (or at least a subset) of the plurality of tracking markers 156 and the tracking sensor 132. In such embodiments, the robot 136 (and/or another component of the system 100) may be configured to track whether each of the plurality of tracking markers 156 is in its respective first position or second position, and to provide such information to the navigation system 160 (or to any other component of the system 100) to enable correlation of a robotic coordinate system with a navigation coordinate system based on a position of the tracking markers 156 relative to the robotic arm 144 as known by the robot 136 (and/or another component of the system 100), and further based on a position of the tracking markers 156 as detected by the navigation system 160 (e.g., using a tracking sensor 132).
[0053] The number of tracking markers 156 on the robotic arm 144 may be selected based on a minimum number of tracking markers 156 needed to determine a position in space of the robotic arm 144 based on the relative orientation of the detected tracking markers 156, as described in more detail below. For example, if the minimum number of tracking markers 156 needed to determine a position of the robotic arm 144 is 3, then the total number of tracking markers 156 on the robotic arm may be 3 or more. Alternatively, if the minimum number of tracking markers 156 needed to determine a position of the robotic arm 144 is 4, then the total number of tracking markers 156 on the robotic arm 144 may be 4 or more. The greater the multiple, the greater the likelihood that the minimum number of tracking markers 156 will be visible to or otherwise detectable by the tracking marker sensor 132 regardless of the orientation of the robotic arm 144.
[0054] The minimum number of tracking markers 156 needed to determine a position of the robotic arm 144 may be the minimum number needed to ensure that the position of the tracking markers 156 in space (e.g., as detected by the tracking marker sensor 132) is unique for every possible orientation of the robotic arm 144. For example, if only one tracking marker 156 were positioned on a distal end of a robotic arm 144 having a plurality of segments 152, then a plurality of orientations or poses of the robotic arm 144 could likely be assumed without moving the tracking marker 156. Adding additional tracking markers 156 to the robotic arm 144 reduces the number of possible positions of the robotic arm 144 associated with each arrangement of the tracking markers 156, until a threshold number of tracking markers 156 is reached that represents the minimum number of tracking markers 156 needed to determine a position of the robotic arm 144. For a robotic arm 144 with only one segment 152, the minimum number of tracking markers 156 needed to determine a position of the robotic arm 144 may be two. For robotic arms 144 having multiple segments 152, the minimum number of tracking markers 156 needed to determine a position of the robotic arm 144 may be three or more. As discussed herein, additional tracking markers 156 may be provided to ensure redundancy should one or more of the tracking markers 156 have an obstructed line-of-sight to the tracking marker sensor 132.
[0055] Referring again to Fig. 1A, the navigation system 160 may provide navigation for a surgeon and/or for the robot 136 during an operation. The navigation system 160 may be any now-known or future-developed navigation system, including, for example, the Medtronic Stealth Station™ S8 surgical navigation system. The navigation system 160 may include a camera or other sensor(s) for detecting and/or tracking one or more reference markers, navigated trackers, or other objects within an operating room or other room where a surgical procedure takes place. In some embodiments, the navigation system 160 may comprise the tracking marker sensor 132. In various embodiments, the navigation system 160 may be used to track a position of the robotic arm 144 (or, more particularly, of tracking markers 156 attached to the robotic arm 144). The navigation system 160 may be used to track a position of one or more reference markers or arrays or other structures useful for detection by a camera or other sensor of the navigation system 160. The navigation system 160 may include a display for displaying one or more images from an external source (e.g., the computing device 102, tracking marker sensor 132, or other source) or a video stream from the camera or other sensor of the navigation system 160. In some embodiments, the system 100 may operate without the use of the navigation system 160.
[0056] The database 164 may store information that correlates each particular arrangement of tracking markers 156 to a corresponding position and orientation, or pose, of the robotic arm 144. In such embodiments, information from the tracking marker sensor 132 about the position of each of a plurality of detected tracking markers 156 may be used to look up, in the database 164, a corresponding position of the robotic arm 144. The database 164 may additionally or alternatively store, for example, information about or corresponding to one or more characteristics of the tracking markers 156; one or more surgical plans for use by the robot 136, the navigation system 160, and/or a user of the computing device 102 or of the system 100; one or more images useful in connection with a surgery to be completed by or with the assistance of one or more other components of the system 100; and/or any other useful information. The database 164 may be configured to provide any such information to the computing device 102 or to any other device of the system 100 or external to the system 100, whether directly or via the cloud 168. In some embodiments, the database 164 may be or comprise part of a hospital image storage system, such as a picture archiving and communication system (PACS), a health information system (HIS), and/or another system for collecting, storing, managing, and/or transmitting electronic medical records including image data.
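One way to realize the lookup described in this paragraph is to key the database on a pose-invariant signature of the detected arrangement, such as the sorted inter-marker distances, so that the same arm pose produces the same key regardless of where the sensor observes it from. This is a minimal sketch under that assumption; a production system would more likely use a tolerance-aware nearest-neighbor search than exact dictionary keys, and the names here are illustrative.

```python
import numpy as np

def arrangement_signature(points, decimals=1):
    """Pose-invariant key for a marker arrangement: the sorted pairwise
    inter-marker distances, rounded to allow tolerant matching."""
    n = len(points)
    dists = [np.linalg.norm(points[i] - points[j])
             for i in range(n) for j in range(i + 1, n)]
    return tuple(np.round(sorted(dists), decimals))

def lookup_pose(detected_points, pose_database):
    """Return the stored robotic-arm pose whose marker arrangement
    matches the detected one, or None if no entry matches."""
    return pose_database.get(arrangement_signature(detected_points))
```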
[0057] The cloud 168 may be or represent the Internet or any other wide area network. The computing device 102 may be connected to the cloud 168 via the communication interface 108, using a wired connection, a wireless connection, or both. In some embodiments, the computing device 102 may communicate with the database 164 and/or an external device (e.g., a computing device) via the cloud 168.
[0058] Turning now to Fig. 2A, a method 200 for utilizing a robotic reference frame for navigation may be performed, for example, by at least one processor. The at least one processor may be the same as or similar to the processor(s) 104 of the computing device 102 described above. The at least one processor may be part of a robot (such as the robot 136) or part of a navigation system (such as the navigation system 160). A processor other than any processor described herein may also be used to execute the method 200. The at least one processor may perform the method 200 by executing instructions stored in a memory, such as the instructions 124 of the memory 116. The instructions may correspond to one or more steps of the method 200 described below. The instructions may cause the processor to execute one or more algorithms, such as the algorithms 128.
[0059] The method 200 comprises determining, based on a planned pose of a robotic arm, a predicted arrangement of a plurality of tracking markers (step 204). In some embodiments, the planned pose of the robotic arm may be a current pose of the robotic arm. The robotic arm may be the same as or similar to the robotic arm 144 of the robot 136. The plurality of tracking markers may be the same as or similar to the tracking markers 156. The determining may comprise accessing stored information — whether from a memory such as the memory 116, a database such as the database 164, or elsewhere — about a position of each of the plurality of tracking markers relative to the robotic arm or a portion thereof. The determining may further comprise calculating a predicted position of each of the plurality of tracking markers based on the planned position of the robotic arm and information about a position of each of the plurality of tracking markers relative to the robotic arm (or a portion thereof). The determining may comprise calculating a predicted position of every tracking marker fixedly secured to the robotic arm, or only of a subset of every tracking marker fixedly secured to the robotic arm. The determining may further comprise compiling the calculated predicted positions of each of the plurality of tracking markers into a predicted arrangement of the plurality of tracking markers.
[0060] The predicted arrangement of the plurality of tracking markers may be determined relative to a coordinate system. The coordinate system may be a robotic coordinate system, a navigation coordinate system, or another coordinate system.
[0061] The tracking markers may be the same as or similar to the tracking markers 156 described above. For example, the tracking markers may be LEDs, or reflective spheres, or geometric patterns such as QR codes. The tracking markers may all be identical, or each may be distinguishable from the others by a unique characteristic (e.g., a unique wavelength, a unique pulsating frequency). In some embodiments, some tracking markers may be distinguishable from one or more other tracking markers by a shared characteristic. For example, a first set of tracking markers may emit light at a first wavelength, and a second set of tracking markers may emit light at a second wavelength.
[0062] Fig. 2B illustrates a predicted arrangement 250 of a plurality of tracking markers according to at least one embodiment of the present disclosure, with the corresponding position of the robot (including the robotic arm) shown in dotted lines.
[0063] Referring again to Fig. 2A, the method 200 also comprises receiving, from a tracking marker sensor, information about a detected arrangement of the plurality of tracking markers (step 208). The tracking marker sensor may be the same as or similar to the tracking marker sensor 132. The tracking marker sensor may be an optical camera, an infrared camera, or any other sensor configured to detect the tracking markers. The tracking marker sensor, in some embodiments, may be part of a robot such as the robot 136, or part of a navigation system such as the navigation system 160, or part of a computing device such as the computing device 102. In some embodiments, the tracking marker sensor may be independent of any of the foregoing components, but may be in electronic communication with one or more of the foregoing components.
[0064] The information may comprise a position of each of a plurality of tracking markers in a detected arrangement of tracking markers. The position of one or more of the plurality of tracking markers may be defined based on a coordinate system (such as a robotic coordinate system or a navigation coordinate system) or relative to another one or more of the plurality of tracking markers. The information may comprise only information about a position of each of the plurality of tracking markers relative to each other, and may be useful for calculating a position of each of the plurality of tracking markers relative to a coordinate system (such as a robotic coordinate system or a navigation coordinate system).
[0065] The information may be received, for example, via a communication interface such as the communication interface 108.
[0066] Fig. 2C shows a detected arrangement 260 of the plurality of tracking markers, according to at least one embodiment of the present disclosure, as might be shown or reflected in the information about the detected arrangement of the plurality of tracking markers received in step 208.
[0067] The method 200 also comprises comparing the detected arrangement of the plurality of tracking markers to the predicted arrangement of the plurality of tracking markers (step 212). The comparing may comprise translating or otherwise correlating the predicted position of each of the plurality of tracking markers from one coordinate space (e.g., a robotic coordinate space) to another coordinate space (e.g., a navigation coordinate space). Alternatively, the comparing may comprise translating or otherwise correlating the detected position of each of the plurality of tracking markers from one coordinate space (e.g., a navigation coordinate space) to another coordinate space (e.g., a robotic coordinate space). The comparing may comprise simply comparing the relative positions of the plurality of tracking markers in the predicted arrangement to the relative positions of the plurality of tracking markers in the detected arrangement.
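A minimal sketch of such a comparison, assuming marker correspondences are already established (e.g., via unique wavelengths or pulsating frequencies) and assuming an illustrative tolerance value:

```python
import numpy as np

def arrangements_match(predicted, detected, tol_mm=1.0):
    """Compare per-marker Euclidean error between the predicted and
    detected arrangements (both (N, 3) arrays, expressed in the same
    coordinate space, with rows in corresponding order).
    tol_mm is illustrative only."""
    errors = np.linalg.norm(predicted - detected, axis=1)
    return bool(np.all(errors < tol_mm))
```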
[0068] When the comparison yields a conclusion that the detected arrangement matches the predicted arrangement, the precise position of the robotic arm is known. Thus, at that moment, the robotic arm (by virtue of the tracking markers fixedly secured thereto) constitutes a custom, one-time reference frame useful for the same purposes as any known reference frame — including, for example, to register a robotic coordinate system to a navigation coordinate system, and/or to determine or confirm a position of a given object in a particular coordinate system. The robotic arm may, for example, be positioned such that an end thereof is in contact with an anatomical feature, a surgical tool, or another object, such that the robotic arm comprises a reference frame based upon which the position of the anatomical feature, the surgical tool, or the other object may be determined or confirmed.
[0069] Embodiments of the present disclosure thus advantageously eliminate a need for sometimes-bulky reference frames. For example, some navigation systems require certain minimum distances between tracking markers (or require reference frames holding tracking markers to be a minimum size) to provide accurate navigation within a volume applicable to a given application. Larger volumes in particular may require larger reference frames than smaller volumes. In some instances, the reference frames holding tracking markers for surgeries or other medical procedures must extend from three inches to ten inches in multiple dimensions in order to achieve a minimum necessary size. As a result, these reference frames tend to be bulky, are easily bumped (which can, for example, cause undesired movement of the objects to which they are affixed), can obstruct movement of a surgeon or of any movable object in an operating room environment, and can be difficult to use. The use of a robotic reference frame as described in several embodiments herein enables a somewhat bulky object already in the operating room (the robot) to be used, when needed, to create an instantaneous reference frame, thus eliminating the need for dedicated reference frames and the problems associated therewith.
[0070] Also, when the comparison yields a conclusion that the detected arrangement matches the predicted arrangement, the operational integrity of the robot and of the navigation system can be confirmed. This can be useful during surgical procedures as well as for initial calibration operations for the robotic system. On the other hand, when the comparison yields a conclusion that the detected arrangement does not match the predicted arrangement, even though the robotic arm is in the pose (e.g., the position and/or orientation) used to determine the predicted arrangement, a further conclusion can be reached that one or both of the robot and the navigation system lack operational integrity. Thus, when this occurs, a warning may be displayed to an operator of the robot and/or of the navigation system, and/or an audible sound may be played, via a user interface (such as, for example, the user interface 112 of the computing device 102, or a user interface specific to the robot or to the navigation system). Provision of such a warning helps to ensure that the suspect operational integrity of the robot and/or of the navigation system can be investigated, and any errors corrected, before the robot and/or the navigation system are used further.

[0071] In some embodiments, where the detected arrangement of the plurality of tracking markers is only slightly different from the predicted arrangement of the plurality of tracking markers, another arrangement of the plurality of tracking markers (e.g., based on a different pose of the robotic arm) may be predicted, and the camera or other tracking marker sensor may provide additional information about a second detected arrangement of the tracking markers (detected, for example, when the robotic arm is in the different pose). If the second detected arrangement of the plurality of tracking markers is again only slightly different from the second predicted arrangement of the plurality of tracking markers, then an error calculation and/or calibration process may be undertaken to determine an adjustment to be applied to any further predicted arrangement so that it matches the corresponding detected arrangement, or vice versa. In other words, if an offset between a predicted arrangement of the plurality of tracking markers and a corresponding detected arrangement of the plurality of tracking markers can be characterized by a constant or a derived equation, such that the offset can be incorporated into further comparisons of a predicted arrangement and a detected arrangement of the plurality of tracking markers, then the operational integrity of the robot and/or of the navigation system may be confirmed.
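The offset characterization described in paragraph [0071] might, in the simplest case of a constant translational offset, be sketched as follows (the names and tolerance are illustrative; a real calibration could instead derive an equation or a full transform):

```python
import numpy as np

def constant_offset(predicted, detected):
    """Estimate a constant translational offset as the mean per-marker
    difference between detected and predicted positions."""
    return np.mean(detected - predicted, axis=0)

def matches_with_offset(predicted, detected, offset, tol_mm=1.0):
    """Re-run the comparison with the characterized offset applied."""
    errors = np.linalg.norm(predicted + offset - detected, axis=1)
    return bool(np.all(errors < tol_mm))
```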
[0072] The method 200 also comprises determining a pose of the robotic arm based on the predicted and/or detected arrangement (step 216). The determining may comprise receiving, from the robot having the robotic arm, information about the pose of the robotic arm that corresponds to the predicted and/or detected arrangements of the tracking markers. The determining may alternatively comprise calculating — based on information about the position of each tracking marker (and/or about the relative positions of the tracking markers in the detected arrangement of tracking markers) as well as information about a position of each tracking marker relative to the robotic arm or a segment thereof, and information about the robotic arm and/or the segment(s) thereof — a pose of the robotic arm. The determining may comprise determining a pose of the robotic arm relative to a robotic coordinate system, a navigation coordinate system, or some other coordinate system.
[0073] Where the determining comprises calculating a pose of one or more arm segments of the robotic arm, the determining may further comprise combining the calculated poses of the one or more arm segments into a calculated pose of the robotic arm as a whole. Also, information needed to calculate a pose of the robotic arm based on the predicted and/or detected arrangement of the tracking markers (including any of the information described above) may be accessed from a memory such as the memory 116 or a database such as the database 164. Such information may include, for example, information about the dimensions, arrangement, range of motion, and/or other characteristics of the segments of the robotic arm and/or of the robotic arm as a whole. One or more algorithms (such as the algorithms 128) stored in a memory (such as the memory 116) may be used when calculating the pose of the one or more segments of the robotic arm and/or of the robotic arm as a whole.
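One conventional way to calculate a segment pose from detected marker positions is a least-squares rigid fit (the Kabsch algorithm) between the stored marker positions in the segment's frame and the detected positions. The disclosure does not prescribe a particular algorithm; the following is a sketch only:

```python
import numpy as np

def segment_pose(markers_in_segment, markers_detected):
    """Return (R, t) mapping the segment frame to the frame in which
    the markers were detected, via a least-squares rigid fit.
    Both inputs are (N, 3) arrays with rows in corresponding order."""
    a = markers_in_segment - markers_in_segment.mean(axis=0)
    b = markers_detected - markers_detected.mean(axis=0)
    U, _, Vt = np.linalg.svd(a.T @ b)          # cross-covariance SVD
    d = np.sign(np.linalg.det(Vt.T @ U.T))     # guard against reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = markers_detected.mean(axis=0) - R @ markers_in_segment.mean(axis=0)
    return R, t
```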
[0074] The present disclosure encompasses embodiments of the method 200 that comprise more or fewer steps than those described above.
[0075] Turning now to Fig. 3, a method 300 of utilizing a robotic reference frame for registration comprises receiving, from a sensor, first information about a plurality of tracking markers fixedly secured to a robotic arm (step 304). The sensor may be, for example, a tracking marker sensor as described elsewhere herein, including, for example, a tracking marker sensor 132, or any other sensor suitable for detecting the plurality of tracking markers. The tracking markers may be any tracking markers described herein, including for example, tracking markers 156. The robotic arm may be any robotic arm described herein, including, for example, a robotic arm 144 of a robot 136, or any other robotic arm.
[0076] The first information may be or comprise information about a position of the plurality of tracking markers in a navigation coordinate system. The first information may be or comprise information about a position of the plurality of tracking markers relative to each other. In some embodiments, the first information may be or comprise information about one or more characteristics of one or more specific tracking markers, such as information about a wavelength of one or more specific tracking markers, a pulsating frequency of one or more tracking markers, and/or a geometric pattern of one or more specific tracking markers. The first information may be received directly from the sensor or via one or more communication interfaces such as the communication interface 108, and/or via a cloud such as the cloud 168, or via any other network, device, or component.
[0077] The method 300 also comprises receiving, from the robotic arm or a robot that comprises the robotic arm, second information corresponding to a position of the robotic arm in a robotic coordinate system (step 308). The second information may be based, for example, on data obtained from one or more sensors in the robotic arm, such as the sensors 148. For example, the second information may comprise sensor data about a detected position of one or more segments of the robotic arm and/or of the robotic arm as a whole. The second information may be based on one or more settings of one or more components of the robotic arm. For example, the second information may comprise data describing a position (whether an actual position or a commanded position) of one or more motors, servos, gears, or other devices or components used to control a position of the robotic arm and/or one or more segments thereof. The second information may be obtained independently of the first information, and vice versa.
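As an illustration of how such second information can be turned into segment poses, the following sketch applies forward kinematics to a deliberately simplified planar revolute chain; the joint conventions are assumed, not taken from the disclosure. Poses produced this way could, for example, feed the predict_arrangement sketch above:

```python
import numpy as np

def planar_segment_poses(joint_angles, link_lengths):
    """Accumulate a 4x4 pose for each segment of a planar revolute
    chain from joint readings (radians) and link lengths."""
    poses, x, y, theta = [], 0.0, 0.0, 0.0
    for q, length in zip(joint_angles, link_lengths):
        theta += q
        c, s = np.cos(theta), np.sin(theta)
        poses.append(np.array([
            [c, -s, 0.0, x],
            [s,  c, 0.0, y],
            [0.0, 0.0, 1.0, 0.0],
            [0.0, 0.0, 0.0, 1.0],
        ]))
        x += length * c  # advance to the next segment's origin
        y += length * s
    return poses
```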
[0078] The method 300 also comprises correlating a position of the robotic arm in the robotic coordinate system to a position of the plurality of tracking markers in the robotic coordinate system (step 312). The correlating may comprise accessing information from a memory such as the memory 116, a database such as the database 164, a robot such as the robot 136, or another storage location. Such information may include, for example, information regarding a precise position of each of the plurality of tracking markers on the robotic arm, and/or information about the dimensions, arrangement, range of motion, and/or other characteristics of the segments of the robotic arm and/or of the robotic arm as a whole. Based on such information as well as information about a position of the robotic arm in the robotic coordinate system (e.g., the second information received in step 308), a position of the plurality of tracking markers in the robotic coordinate system may be calculated. Such calculations may utilize, for example, one or more algorithms such as the algorithms 128.
[0079] The result of the correlating may be a calculated or otherwise determined position, in the robotic coordinate system, of the plurality of tracking markers fixedly secured to the robotic arm.

[0080] The method 300 also comprises registering the robotic coordinate system to the navigation coordinate system (step 316). The registering may be accomplished based on the first information and the second information. In some embodiments, the registering comprises the correlating described above as step 312 (e.g., step 316 may, in some embodiments, comprise step 312). The registering may comprise determining a relationship between the robotic coordinate system and the navigation coordinate system based on a known position of the plurality of tracking markers in the navigation coordinate system (as included in or determined from the first information) and a known position of the plurality of tracking markers in the robotic coordinate system (as determined from the second information). The registering may utilize one or more algorithms such as the algorithms 128 stored in the memory 116.

[0081] The method 300 also comprises registering a patient coordinate system to the navigation coordinate system (step 320). Where the robot is connected to the patient (as is sometimes the case during robotic or robotic-assisted surgery), or where the robotic coordinate system has already been registered to a patient coordinate system, registration of the robotic coordinate system to the navigation coordinate system (as described with respect to step 316 above) also enables registration of the patient coordinate system to the navigation coordinate system. In some embodiments, the registering of this step 320 comprises correlating the robotic coordinate system to a patient coordinate system (or vice versa), and, based on that registration, correlating the patient coordinate system to the navigation coordinate system. In other words, the registering may comprise determining a relationship between the patient coordinate system and the navigation coordinate system based on a relationship between the patient coordinate system and the robotic coordinate system and a relationship between the robotic coordinate system and the navigation coordinate system. The registering of this step 320 may utilize one or more algorithms such as the algorithms 128 stored in the memory 116.
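A sketch of steps 316 and 320, reusing the segment_pose least-squares fit shown earlier and assuming paired marker positions are available in both systems: the markers expressed in the robotic coordinate system (from the second information) and in the navigation coordinate system (from the first information) yield the registration transform, and the patient-to-navigation relationship then follows by composing transforms:

```python
def register_robotic_to_navigation(markers_robotic, markers_navigation):
    """Return (R, t) mapping robotic coordinates to navigation
    coordinates from the same markers expressed in both systems."""
    return segment_pose(markers_robotic, markers_navigation)

def compose(R_ab, t_ab, R_bc, t_bc):
    """Compose rigid transforms: applying a->b and then b->c gives
    a->c, e.g., patient->robotic followed by robotic->navigation."""
    return R_bc @ R_ab, R_bc @ t_ab + t_bc
```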
[0082] The present disclosure encompasses a number of variations on the method 300. For example, as noted above, in some embodiments the registering step 316 may comprise correlating a position of the robotic arm in a robotic coordinate system (as indicated in or determined from the second information received in the step 308) to a position of each of the plurality of tracking markers, as currently described with respect to the step 312. Also in some embodiments, the second information may comprise position information for each of the plurality of tracking markers in the robotic coordinate system, such that the correlating step 312 or any similar step is unnecessary. Additionally, while in the method 300 described above a position of each of the plurality of tracking markers in the robotic coordinate system is determined based on second information about a position of the robotic arm in the robotic coordinate system, in other embodiments a position of the robotic arm may be determined based on the first information, and the registering step 316 may comprise registering the robotic coordinate system to the navigation coordinate system (or vice versa) based on the position of the robotic arm as determined from the first information and the position of the robotic arm as indicated by or determined from the second information.
[0083] The method 300 beneficially enables registration of a navigation coordinate system to a robotic coordinate system or vice versa without the use of a reference frame other than a reference frame formed by the robotic arm (including the tracking markers fixedly secured to the robotic arm) itself. The method 300 thus avoids the cost of a separate reference frame, the expenditure of the time needed to secure the separate reference frame to the robotic arm, and, in instances where the reference frame would be secured directly to the patient (e.g., to the patient’s vertebra or pelvis), the incision that would otherwise be needed to secure the separate reference frame to the patient. Moreover, the method 300 removes the need for a snapshot frame that would otherwise be required during registration of a navigation coordinate system to a robotic coordinate system or vice versa.
[0084] The present disclosure encompasses embodiments of the method 300 with more or fewer steps than those described above.
[0085] With reference now to Fig. 4, a method 400 of utilizing a robotic reference frame comprises receiving information corresponding to a predicted arrangement of a plurality of tracking markers, which arrangement defines a custom, one-time reference frame (step 404). The predicted arrangement may be based on an expected position of a robotic arm to which the plurality of tracking markers are fixedly secured, or a current position of a robotic arm to which the plurality of tracking markers are fixedly secured. The information may comprise the results of one or more calculations carried out (e.g., by a robot such as the robot 136, or by a processor such as the processor 104) to determine the predicted arrangement of the plurality of tracking markers based on an expected or current position of the robotic arm to which the tracking markers are fixedly secured. The expected or current position of the robotic arm may be a position that enables an end of the robotic arm (whether an end effector or otherwise) to be in contact with an object, the position of which needs to be determined or confirmed (whether in a navigation coordinate space or otherwise). The information may comprise information about a position of each individual tracking marker relative to a robotic coordinate system or another coordinate system. The information may comprise information about a characteristic of each individual tracking marker, such as a wavelength, pulsating frequency, or geometric pattern of each individual tracking marker.
[0086] Each of the plurality of tracking markers may be any tracking marker described herein, including, for example, a tracking marker 156. Each of the tracking markers is fixedly secured to a robotic arm. The plurality of tracking markers may comprise at least three tracking markers, at least four tracking markers, at least five tracking markers, or more than five tracking markers.
[0087] The method 400 also comprises receiving data corresponding to a detected arrangement of the plurality of tracking markers (step 408). The data may be received, for example, from a tracking marker sensor such as the tracking marker sensor 132. The data may comprise position information of the plurality of tracking markers in a navigation coordinate space or another coordinate space. Alternatively, the data may enable calculation of a position of the plurality of tracking markers in a navigation coordinate space or another coordinate space. The data may comprise information about a characteristic of each individual tracking marker, such as a wavelength, pulsating frequency, or geometric pattern of each individual tracking marker.
[0088] Notably, the data corresponding to the detected arrangement of the plurality of tracking markers is independent of the information corresponding to the predicted arrangement of the plurality of tracking markers. In other words, the data is generated without reference to the information, and the information is generated without reference to the data.
[0089] The method 400 also comprises comparing the predicted arrangement of the plurality of tracking markers to the detected arrangement of the plurality of tracking markers to determine whether the custom, one-time reference frame has been created (step 412). When the detected arrangement matches the predicted arrangement, the exact pose of the robotic arm is known and therefore enables the robotic arm (and/or the tracking markers fixedly secured to the robotic arm) to be used as a custom, one-time reference frame usable instead of a separate reference frame that must be secured to or held by the robotic arm or another object or person. The comparing may comprise using one or more algorithms such as the algorithms 128 to translate a position of the tracking markers in one coordinate system (e.g., in a robotic coordinate system) to a position of the tracking markers in another coordinate system (e.g., in a navigation coordinate system). The comparing may also comprise overlaying an image contained in or generated using the data corresponding to the detected arrangement of the plurality of tracking markers on a virtual image generated based on the information corresponding to the predicted arrangement of the plurality of tracking markers to determine whether the tracking markers in both images line up with each other. Other comparison methods may also be used to determine whether the predicted arrangement of the plurality of tracking markers matches the detected arrangement of the plurality of tracking markers.

[0090] The method 400 also comprises confirming a position of an object based on the creation of the custom, one-time reference frame (step 416). As noted above, when the predicted arrangement matches the detected arrangement, the precise position of the robotic arm is known, such that the robotic arm (and/or the tracking markers fixedly secured to the robotic arm) can be used as a reference frame. If the position (and, in some embodiments, the orientation) of an object can be detected simultaneously with the detection of the plurality of tracking markers, and/or if the robotic arm is in contact with a known surface or feature of the object at the moment the custom, one-time reference frame is created, then the position (and, in some embodiments, the orientation) of the object may be determined using the custom, one-time reference frame, just as the position (and, in some embodiments, the orientation) of an object may be determined using a reference frame separate from the robotic arm.
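A sketch of the position confirmation of step 416, where tip_offset is an assumed, illustrative name for the stored position, in the end segment's frame, of the arm end that touches the object:

```python
import numpy as np

def contact_point_in_navigation(R_end, t_end, tip_offset):
    """Once the custom, one-time reference frame is confirmed, return
    the navigation-frame position of the point the robotic arm's end
    touches, given the end segment's pose (R_end, t_end) in the
    navigation frame."""
    return R_end @ np.asarray(tip_offset) + t_end
```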
[0091] The present disclosure encompasses embodiments of the method 400 with more or fewer steps than those described above.
[0092] As may be appreciated based on the foregoing disclosure, the present disclosure encompasses methods with fewer than all of the steps identified in Figs. 2A, 3, and 4 (and the corresponding description of the methods 200, 300, and 400), as well as methods that include additional steps beyond those identified in Figs. 2A, 3, and 4 (and the corresponding description of the methods 200, 300, and 400).
[0093] The foregoing is not intended to limit the disclosure to the form or forms disclosed herein. In the foregoing Detailed Description, for example, various features of the disclosure are grouped together in one or more aspects, embodiments, and/or configurations for the purpose of streamlining the disclosure. The features of the aspects, embodiments, and/or configurations of the disclosure may be combined in alternate aspects, embodiments, and/or configurations other than those discussed above. This method of disclosure is not to be interpreted as reflecting an intention that the claims require more features than are expressly recited in each claim. Rather, as the following claims reflect, inventive aspects lie in less than all features of a single foregoing disclosed aspect, embodiment, and/or configuration. Thus, the following claims are hereby incorporated into this Detailed Description, with each claim standing on its own as a separate preferred embodiment of the disclosure.
[0094] Moreover, though the description has included description of one or more aspects, embodiments, and/or configurations and certain variations and modifications, other variations, combinations, and modifications are within the scope of the disclosure, e.g., as may be within the skill and knowledge of those in the art, after understanding the present disclosure. It is intended to obtain rights which include alternative aspects, embodiments, and/or configurations to the extent permitted, including alternate, interchangeable and/or equivalent structures, functions, ranges or steps to those claimed, whether or not such alternate, interchangeable and/or equivalent structures, functions, ranges or steps are disclosed herein, and without intending to publicly dedicate any patentable subject matter.

Claims

What is claimed is:
1. A robotic navigation system comprising: a robot base; a robotic arm comprising: a proximal end secured to the robot base; a distal end movable relative to the proximal end; and one or more arm segments between the proximal end and the distal end; and a plurality of tracking markers secured to the robotic arm.
2. The robotic navigation system of claim 1, wherein each of the one or more arm segments supports at least one of the plurality of tracking markers.
3. The robotic navigation system of claim 1, wherein the plurality of tracking markers comprises at least three tracking markers.
4. The robotic navigation system of claim 1, wherein each of the plurality of tracking markers is an LED.
5. The robotic navigation system of claim 1, wherein each of the plurality of tracking markers is configured to emit or reflect light at a different wavelength than at least another one of the plurality of tracking markers.
6. The robotic navigation system of claim 1, wherein each of the plurality of tracking markers is configured to emit light in pulses at a first frequency different than a second frequency at which at least another one of the plurality of tracking markers is configured to emit light in pulses.
7. The robotic navigation system of claim 1, wherein each of the plurality of tracking markers is a geometric pattern.
8. The robotic navigation system of claim 1, wherein at least one of the plurality of tracking markers is moveably secured to the robotic arm.
9. The robotic navigation system of claim 8, wherein the at least one of the plurality of tracking markers is selectively moveable between a first position and a second position.
10. The robotic navigation system of claim 1, wherein at least two of the plurality of tracking markers are circumferentially spaced about one of the one or more arm segments of the robotic arm.
11. The robotic navigation system of claim 1, further comprising: at least one processor; and a memory storing instructions for execution by the at least one processor that, when executed, cause the at least one processor to: determine, based on a pose of the robotic arm and a known location of each of the plurality of tracking markers on the robotic arm, a predicted arrangement of the plurality of tracking markers.
12. The robotic navigation system of claim 11, wherein the memory stores additional instructions for execution by the at least one processor that, when executed, further cause the at least one processor to: receive, from a camera configured to detect the plurality of tracking markers, information about a detected arrangement of the plurality of tracking markers; and compare the detected arrangement to the predicted arrangement.
13. A method of utilizing a robotic reference frame for navigation, comprising: receiving, from a tracking marker sensor, first information about a plurality of tracking markers fixedly secured to a robotic arm of a robot; receiving, from the robot, second information corresponding to a position of the robotic arm in a robotic coordinate system; and registering the robotic coordinate system to a navigation coordinate system based on the first information and the second information.
14. The method of claim 13, wherein each of the plurality of tracking markers comprises an LED.
15. The method of claim 13, wherein each of the plurality of tracking markers comprises a geometric pattern.
16. The method of claim 13, wherein the first information comprises information about a position of the plurality of tracking markers in the navigation coordinate system.
17. The method of claim 13, wherein the second information comprises information about a position at which each of the plurality of tracking markers is fixedly secured to the robotic arm.
18. A device for surgical navigation utilizing a robotic reference frame, comprising: at least one communication interface for receiving information from a robot; at least one tracking marker sensor for detecting a plurality of tracking markers on a robotic arm of the robot; at least one processor; and at least one memory storing instructions for execution by the at least one processor that, when executed, cause the at least one processor to: receive, from the robot, information corresponding to a predicted arrangement of the plurality of tracking markers, the predicted arrangement defining a custom, one-time reference frame; receive, from the at least one tracking marker sensor, data corresponding to a detected arrangement of the plurality of tracking markers; and compare the predicted arrangement to the detected arrangement to determine whether the custom, one-time reference frame has been created.
19. The device of claim 18, wherein the memory stores additional instructions for execution by the processor that, when executed, further cause the processor to: confirm a position of an object in a predetermined coordinate space based on creation of the custom, one-time reference frame.
20. The device of claim 18, wherein each of the plurality of tracking markers is a geometric pattern.
21. The device of claim 18, wherein the at least one tracking marker sensor is a camera, and each of the plurality of tracking markers is an LED.
22. The device of claim 21, wherein at least two of the plurality of tracking markers have different wavelengths or are configured to pulse at different frequencies.