CN115989002A - Robotic reference frame for navigation - Google Patents

Robotic reference frame for navigation

Info

Publication number: CN115989002A
Application number: CN202180041092.6A
Authority: CN (China)
Prior art keywords: tracking, robot, robotic, robotic arm, information
Legal status: Pending (the legal status is an assumption and is not a legal conclusion)
Other languages: Chinese (zh)
Inventors: A·埃尔曼, D·朱尼奥, K·M·普克特
Current assignee: Mazor Robotics Ltd
Original assignee: Mazor Robotics Ltd
Application filed by: Mazor Robotics Ltd

Classifications

    All of the classifications below fall under A (HUMAN NECESSITIES), A61 (MEDICAL OR VETERINARY SCIENCE; HYGIENE), A61B (DIAGNOSIS; SURGERY; IDENTIFICATION):
    • A61B 34/20: Surgical navigation systems; devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis (under A61B 34/00: Computer-aided surgery; manipulators or robots specially adapted for use in surgery)
    • A61B 90/39: Markers, e.g. radio-opaque or breast lesions markers (under A61B 90/00: Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B 1/00-A61B 50/00, e.g. for luxation treatment or for protecting wound edges)
    • A61B 90/50: Supports for surgical instruments, e.g. articulated arms (under A61B 90/00)
    • A61B 2034/2046: Tracking techniques (under A61B 34/20)
    • A61B 2034/2051: Electromagnetic tracking systems (under A61B 2034/2046)
    • A61B 2034/2055: Optical tracking systems (under A61B 2034/2046)
    • A61B 2090/3937: Visible markers (under A61B 90/39)
    • A61B 2090/3945: Active visible markers, e.g. light emitting diodes (under A61B 2090/3937)
    • A61B 2090/3983: Reference marker arrangements for use with image guided surgery (under A61B 90/39)

Abstract

A robot navigation system comprising: a robot base; a robotic arm having a proximal end fixed to the robot base, a distal end movable relative to the proximal end, and one or more arm segments located between the proximal end and the distal end; and a plurality of tracking markers fixed to the robotic arm.

Description

Robotic reference frame for navigation
Technical Field
The present technology relates generally to robotic surgery and, more particularly, to navigation during robotic surgery.
Background
Surgical navigation systems are used to track the position of one or more objects during a procedure. Surgical robots are adapted to hold one or more tools or devices during surgery, and may operate autonomously (e.g., without any manual input during operation), semi-autonomously (e.g., with some manual input during operation), or non-autonomously (e.g., only as directed by manual input).
Disclosure of Invention
A robot navigation system according to an embodiment of the present disclosure includes: a robot base; a robotic arm comprising a proximal end fixed to the robot base, a distal end movable relative to the proximal end, and one or more arm segments located between the proximal end and the distal end; and a plurality of tracking markers fixed to the robotic arm.
Each of the one or more arm segments may support at least one of the plurality of tracking markers. The plurality of tracking markers may include at least three tracking markers. Each of the plurality of tracking markers may be an LED. Each of the plurality of tracking markers may be configured to emit or reflect light of a different wavelength than at least another of the plurality of tracking markers. Each of the plurality of tracking markers may be configured to emit pulsed light at a first frequency different from a second frequency at which at least another one of the plurality of tracking markers is configured to emit pulsed light. Each of the plurality of tracking markers may be a geometric pattern. At least one of the plurality of tracking markers may be movably secured to the robotic arm, and may be selectively movable between a first position and a second position. At least two of the plurality of tracking markers may be circumferentially spaced about one of the one or more arm segments of the robotic arm.
The robotic navigation system may also include at least one processor and a memory. The memory may store instructions for execution by the at least one processor that, when executed, cause the at least one processor to determine a predicted arrangement of the plurality of tracking markers based on a pose of the robotic arm and the known position of each of the plurality of tracking markers on the robotic arm. The memory may store additional instructions for execution by the at least one processor that, when executed, further cause the at least one processor to: receive information about a detected arrangement of the plurality of tracking markers from a camera configured to detect the plurality of tracking markers; and compare the detected arrangement with the predicted arrangement.
A method of navigating with a robotic reference frame according to another embodiment of the present disclosure includes: receiving, from a tracking marker sensor, first information about a plurality of tracking markers fixedly secured to a robotic arm of a robot; receiving, from the robot, second information corresponding to a position of the robotic arm in a robot coordinate system; and registering the robot coordinate system with a navigation coordinate system based on the first information and the second information.
Each of the plurality of tracking markers may include an LED. Each of the plurality of tracking markers may include a geometric pattern. The first information may include information regarding the positions of the plurality of tracking markers in the navigation coordinate system. The second information may include information regarding a position at which each of the plurality of tracking markers is fixedly secured to the robotic arm.
An apparatus for surgical navigation with a robotic reference frame according to yet another embodiment of the present disclosure includes: at least one communication interface for receiving information from a robot; at least one tracking marker sensor for detecting a plurality of tracking markers on a robotic arm of the robot; at least one processor; and at least one memory. The memory stores instructions for execution by the at least one processor that, when executed, cause the at least one processor to: receive information from the robot corresponding to a predicted arrangement of the plurality of tracking markers, the predicted arrangement defining a customized one-time reference frame; receive data corresponding to a detected arrangement of the plurality of tracking markers from the at least one tracking marker sensor; and compare the predicted arrangement to the detected arrangement to determine whether the customized one-time reference frame has been created.
The memory may store additional instructions for execution by the processor that, when executed, further cause the processor to confirm the position of an object in a predetermined coordinate space based on the creation of the customized one-time reference frame. Each of the plurality of tracking markers may be a geometric pattern. The at least one tracking marker sensor may be a camera, and each of the plurality of tracking markers may be an LED. At least two of the plurality of tracking markers may have different wavelengths or be configured to be pulsed at different frequencies.
The details of one or more aspects of the disclosure are set forth in the accompanying drawings and the description below. Other features, objects, and advantages of the techniques described in this disclosure will be apparent from the description and drawings, and from the claims.
The phrases "at least one", "one or more", and "and/or" are open-ended expressions that are both conjunctive and disjunctive in operation. For example, each of the expressions "at least one of A, B and C", "at least one of A, B or C", "one or more of A, B and C", "one or more of A, B or C", and "A, B and/or C" means A alone, B alone, C alone, A and B together, A and C together, B and C together, or A, B and C together. When each of A, B and C in the above expressions refers to an element, such as X, Y and Z, or to a class of elements, such as X1-Xn, Y1-Ym and Z1-Zo, the phrase is intended to refer to a single element selected from X, Y and Z, a combination of elements selected from the same class (e.g., X1 and X2), or a combination of elements selected from two or more classes (e.g., Y1 and Zo).
The term "a" or "an" entity refers to one or more of that entity. As such, the terms "a" (or "an"), "one or more", and "at least one" may be used interchangeably herein. It should also be noted that the terms "comprising" and "having" may be used interchangeably.
The foregoing is a simplified summary of the disclosure to provide an understanding of some aspects of the disclosure. This summary is neither an extensive nor an exhaustive overview of the disclosure and its various aspects, embodiments, and configurations. It is intended neither to identify key or critical elements of the disclosure nor to delineate the scope of the disclosure, but to present selected concepts of the disclosure in a simplified form as an introduction to the more detailed description presented below. As will be appreciated, other aspects, embodiments, and configurations of the present disclosure may utilize, alone or in combination, one or more of the features set forth above or described in detail below.
Many additional features and advantages of the present invention will become apparent to those skilled in the art upon consideration of the description of the embodiments provided below.
Drawings
The accompanying drawings are incorporated in and form a part of this specification to illustrate several examples of the present disclosure. Together with the description, these drawings explain the principles of the disclosure. The drawings illustrate only preferred and alternative examples of how the disclosure may be made and used, and should not be construed to limit the disclosure only to the examples shown and described. Additional features and advantages will be made apparent from the following more detailed description of the various aspects, embodiments, and configurations of the disclosure, as illustrated by the drawings to which reference is made below.
Fig. 1A is a block diagram of a system according to at least one embodiment of the present disclosure;
fig. 1B depicts a robot in accordance with at least one embodiment of the present disclosure;
fig. 2A is a flow diagram of a method according to at least one embodiment of the present disclosure;
fig. 2B depicts a predicted arrangement of tracking markers according to at least one embodiment of the present disclosure;
fig. 2C depicts a detected arrangement of tracking markers according to at least one embodiment of the present disclosure;
fig. 3 is another flow diagram of a method according to at least one embodiment of the present disclosure; and
Fig. 4 is another flow diagram of a method according to at least one embodiment of the present disclosure.
Detailed Description
It should be understood that the various aspects disclosed herein may be combined in different combinations than those specifically set forth in the description and drawings. It will also be understood that certain acts or events of any of the processes or methods described herein can be performed in a different sequence, and/or can be added, merged, or omitted entirely, depending on the example or embodiment (e.g., not all described acts or events may be required for performing the disclosed techniques, in accordance with different embodiments of the disclosure). Additionally, for clarity, while certain aspects of the disclosure are described as being performed by a single module or unit, it should be understood that the techniques of the disclosure may be performed by a combination of units or modules associated with, for example, a computing device and/or a medical device.
In one or more examples, the described methods, processes, and techniques may be implemented in hardware, software, firmware, or any combination thereof. If implemented in software, the functions may be stored as one or more instructions or code on a computer-readable medium and executed by a hardware-based processing unit. The computer-readable medium may include a non-transitory computer-readable medium corresponding to a tangible medium such as a data storage medium (e.g., RAM, ROM, EEPROM, flash memory, or any other medium that can be used to store desired program code in the form of instructions or data structures and that can be accessed by a computer).
The instructions may be executed by one or more processors, such as one or more digital signal processors (DSPs), general purpose microprocessors (e.g., Intel Core i3, i5, i7, or i9 processors; Intel Celeron processors; Intel Xeon processors; Intel Pentium processors; AMD Ryzen processors; AMD Athlon processors; AMD Phenom processors; Apple A10 or A10X Fusion processors; Apple A11, A12, A12X, A12Z, or A13 Bionic processors; or any other general purpose microprocessor), application-specific integrated circuits (ASICs), field-programmable gate arrays (FPGAs), or other equivalent integrated or discrete logic circuitry. Thus, the term "processor" as used herein may refer to any of the foregoing structures or any other physical structure suitable for implementing the described techniques. In addition, the present techniques may be fully implemented in one or more circuits or logic elements.
Before any embodiments of the disclosure are explained in detail, it is to be understood that the disclosure is not limited in its application to the details of construction and the arrangement of components set forth in the following description or illustrated in the following drawings. The disclosure is capable of other embodiments and of being practiced or of being carried out in various ways. Also, it is to be understood that the phraseology and terminology used herein is for the purpose of description and should not be regarded as limiting. The use of "including", "comprising", or "having" and variations thereof herein is meant to encompass the items listed thereafter and equivalents thereof as well as additional items. Furthermore, the present disclosure may use examples to illustrate one or more aspects thereof. Unless expressly stated otherwise, the use or listing of one or more examples (which may be indicated by "for example", "by way of example", "e.g.", "such as", or similar language) is not intended to and does not limit the scope of the present disclosure.
Navigated robotic procedures typically involve a reference frame and trackers whose positions are detected by a tracking marker sensor. For example, a navigation system may use a camera as a tracking marker sensor to detect optical tracking markers attached to a reference frame on a robotic arm. With this information, the coordinate system of the robotic system can be correlated with the coordinate system of the navigation system. In other cases, information from the tracking marker sensor, from which the position and orientation of the robotic arm can be accurately determined, may be used to calibrate the robotic system.
However, these optical systems require a clear line of sight between the camera and the tracking markers, which can limit robotic arm movement during surgery. To overcome these problems (and others), tracking markers (e.g., LEDs or reference frame markers) may be incorporated onto or into the robotic arm at various locations and in various patterns. By associating LED positions with the robotic system, the camera or other tracking marker sensor can more easily detect the position and orientation of the robotic arm despite potential line-of-sight issues or other difficulties.
For example, unique geometries may be created by LEDs on different parts of the robotic arm. The actual tracked geometry varies according to the relative motion of the robotic arm joints. Further, if the robot is connected to a patient, the position of the robot arm may be used to associate the navigation space, the robot space, and the patient space.
Embodiments of the present disclosure eliminate the need for a physical reference frame mounted to the robotic arm and the use of a snapshot reference frame, and more generally, the need for a snapshot process that matches the navigation and robot coordinate spaces. Embodiments of the present disclosure further enable real-time verification of navigation system integrity by comparing sensed positions of the robotic arm tracking markers (e.g., sensed by the navigation system) with predicted positions of the tracking markers (e.g., predicted based on information from the robot regarding the position and/or orientation (e.g., pose) of the robotic arm). Similarly, embodiments of the present disclosure enable verification of robot integrity in real-time by comparing robot arm position and/or orientation information (e.g., pose information) based on actual encoder readings with robot arm position and/or orientation information determined based on sensed positions of robot arm tracking markers. Thus, embodiments of the present disclosure increase ease of use and reduce the number of operations compared to known navigation systems, registration procedures, and calibration operations.
Turning first to fig. 1A, a block diagram of a system 100 according to at least one embodiment of the present disclosure is shown. The system 100 may be used, for example: to carry out one or more aspects of one or more of the methods disclosed herein; for navigation purposes; for registration purposes; for calibration operations; for verifying the operational integrity of a navigation system (such as the navigation system 160) or a robot (such as the robot 136); or for any other useful purpose. The system 100 includes a computing device 102, a tracking marker sensor 132, a robot 136, a navigation system 160, a database 164, and a cloud 168. Notwithstanding the foregoing, systems according to other embodiments of the present disclosure may omit any one or more of the computing device 102, the tracking marker sensor 132, the robot 136, the navigation system 160, the database 164, and/or the cloud 168. Additionally, systems according to other embodiments of the present disclosure may arrange one or more components of the system 100 differently (e.g., one or more of the tracking marker sensor 132, the robot 136, and the navigation system 160 may comprise one or more of the components shown in fig. 1A as part of the computing device 102).
The computing device 102 includes at least one processor 104, at least one communication interface 108, at least one user interface 112, and at least one memory 116. Computing devices according to other embodiments of the present disclosure may omit one or both of the communication interface 108 and the user interface 112.
The at least one processor 104 of the computing device 102 may be any processor identified or described herein or any similar processor. The at least one processor 104 may be configured to execute instructions stored in the at least one memory 116, which instructions may cause the at least one processor 104 to carry out one or more computing steps using or based on data received from, for example, the tracking marker sensor 132, the robot 136, the navigation system 160, the database 164, and/or the cloud 168.
Computing device 102 may also include at least one communication interface 108. The at least one communication interface 108 may be used to receive image data or other information from an external source (such as the tracking marker sensor 132, the robot 136, the navigation system 160, the database 164, the cloud 168, and/or a portable storage medium (e.g., USB drive, DVD, CD)), and/or more generally to transmit instructions, images, or other information from the at least one processor 104 and/or the computing device 102 to an external system or device (e.g., another computing device 102, the tracking marker sensor 132, the robot 136, the navigation system 160, the database 164, the cloud 168, and/or a portable storage medium (e.g., USB drive, DVD, CD)). The at least one communication interface 108 may include one or more wired interfaces (e.g., USB ports, Ethernet ports, FireWire ports) and/or one or more wireless interfaces (e.g., configured to communicate information via one or more wireless communication protocols, such as 802.11a/b/g/n, Bluetooth Low Energy, NFC, ZigBee, and the like). In some embodiments, the at least one communication interface 108 may be used to enable the computing device 102 to communicate with one or more other processors 104 or computing devices 102, whether to reduce the time needed to complete a computing-intensive task or for any other reason.
The at least one user interface 112 may be or include a keyboard, mouse, trackball, monitor, television, touch screen, buttons, joystick, switches, levers, and/or any other device for receiving information from a user and/or for providing information to a user of the computing device 102. The at least one user interface 112 may be used, for example, to receive user selections or other user inputs in connection with any of the steps of any of the methods described herein; receiving a user selection or other user input regarding one or more configurable settings of the computing device 102 and/or another component of the system 100; receiving a user selection or other user input as to how and/or where to store and/or transmit data received, modified, and/or generated by the computing device 102; and/or display information (e.g., text, images) and/or play sound to a user based on data received, modified, and/or generated by the computing device 102. Although the at least one user interface 112 is included in the system 100, the system 100 may automatically (e.g., without any input through the at least one user interface 112 or otherwise) perform one or more or all of the steps of any of the methods described herein.
Although the at least one user interface 112 is shown as part of the computing device 102, in some embodiments, the computing device 102 may utilize a user interface 112 that is housed separately from one or more remaining components of the computing device 102. In some embodiments, the user interface 112 may be located proximate to one or more other components of the computing device 102, while in other embodiments, the user interface 112 may be located remotely from one or more other components of the computing device 102.
The at least one memory 116 may be or include RAM, DRAM, SDRAM, other solid state memory, any memory described herein, or any other tangible, non-transitory memory for storing computer-readable data and/or instructions. The at least one memory 116 may store information or data suitable for use in performing any of the steps of the methods 200 or 300, such as described herein. The at least one memory 116 may store, for example, information about one or more predetermined coordinate systems 120 (e.g., information about a robot coordinate system or space, information about a navigation coordinate system or space, information about a patient coordinate system or space); instructions 124 for execution by the at least one processor 104, e.g., to cause the at least one processor 104 to perform one or more of the steps of method 200 and/or method 300; and/or one or more algorithms 128 for use by a processor to perform any calculations or for any other calculations necessary to perform one or more of the steps of the method 200 and/or the method 300. In some embodiments, such predetermined coordinate system 120, instructions 124, and/or algorithms 128 may be organized into one or more applications, modules, packages, layers, or engines, and may cause at least one processor 104 to manipulate data stored in at least one memory 116 and/or received from or through another component of the system 100.
The tracking marker sensor 132 is operable to detect one or more tracking markers 156 (described below). The tracking marker sensor 132 may be, for example, an optical camera; an infrared camera; a 3D camera system; a stereoscopic vision system; another imaging device; or any other sensor that can detect one or more tracking markers 156. The tracking marker sensor 132 may comprise a dedicated processor for executing instructions stored in a dedicated memory of the tracking marker sensor 132, or the tracking marker sensor 132 may simply be configured to transmit the data it collects to the computing device 102 or another component of the system 100. Although shown in fig. 1A as communicating only with the computing device 102, the tracking marker sensor 132 may be in communication with any one or more of the computing device 102, the robot 136, the navigation system 160, the database 164, and/or the cloud 168. Also, in some embodiments, the computing device 102 may comprise the tracking marker sensor 132, while in other embodiments, the navigation system 160 may comprise the tracking marker sensor 132. In still other embodiments, the robot 136 may comprise the tracking marker sensor 132.
The tracking marker sensor 132 may be positioned directly above an operating table or a portion thereof, or above and to one side of an operating table or a portion thereof, or in another convenient position within an operating room or other room housing the robot 136. The tracking marker sensor 132 may be positioned at a location selected to provide the tracking marker sensor 132 with a clear and/or unobstructed view of the robotic arm 144 of the robot 136 during operation (and thus of the one or more tracking markers 156 fixedly secured to the robotic arm 144). In some embodiments, the tracking marker sensor 132 is stationary, while in other embodiments, the tracking marker sensor 132 may be precisely movable (whether manually or automatically) in one or more directions.
The tracking marker sensor 132 may be configured to capture data about sensed tracking markers 156 only at a given moment in time. For example, where the tracking marker sensor 132 is a camera, the tracking marker sensor 132 may be configured to capture a still image comprising one or more tracking markers 156. The tracking marker sensor 132 may be configured to capture such data at periodic intervals, upon command by a user (e.g., via the user interface 112), or according to a signal (whether generated autonomously or in response to user input) from the computing device 102, the robot 136, and/or the navigation system 160.
The tracking marker sensor 132 may additionally or alternatively be operable to capture data corresponding to one or more tracking markers 156 in real time. In such embodiments, the tracking marker sensor 132 may provide a real-time stream of sensor data to the computing device 102, which may continuously process the sensor data to detect the one or more tracking markers 156 therein. In some embodiments, the tracking marker sensor 132 may comprise more than one tracking marker sensor 132.
Still referring to fig. 1A, and also to fig. 1B, the robot 136 may be any surgical robot or surgical robotic system. The robot 136 may be or comprise, for example, a Mazor X™ Stealth Edition robotic guidance system. The robot 136 may comprise a base 140 that supports a robotic arm 144. The robot 136 may comprise one or more robotic arms 144. In some embodiments, the robotic arm 144 may comprise a first robotic arm and a second robotic arm, and in other embodiments the robot 136 may comprise more than two robotic arms 144. In some embodiments, the robotic arm 144 may assist with a surgical procedure (e.g., by holding a tool in a desired trajectory or pose, and/or by supporting the weight of a tool while a surgeon or other user operates the tool, or otherwise) and/or automatically carry out a surgical procedure.
Still referring to fig. 1A-1B, the robotic arm 144 may have three, four, five, six, or more degrees of freedom. The robotic arm 144 may include one or more segments 152. Each segment 152 may include a member 176 and a joint 172 to which member 176 is attached and/or from which member 176 extends. The joint 172 may be secured to, for example, the base 140 or a member 176 of the other segment 152. Joint 172 may be any type of joint that enables member 176 to be selectively moved relative to a structure to which joint 172 is attached. For example, joint 172 may be a pivot joint, a hinge joint, a saddle joint, or a ball and socket joint. Joint 172 may allow member 176 to move in one dimension or multiple dimensions and/or along one axis or along multiple axes.
In embodiments of the robot 136 that include a robotic arm 144 having only one segment 152, the joint 172 of the segment 152 may be secured to the base 140, and the member 176 of the segment 152 may include a proximal end secured to the joint 172 and a distal end supporting an end effector. The end effector may be, for example, a tool (e.g., drill, saw, imaging device) or a tool guide (e.g., for guiding a biopsy needle, ablation probe, or other tool along a desired trajectory).
In embodiments of a robot 136 including a robotic arm 144 having a plurality of segments 152, such as shown in fig. 1B, a first segment 152 can include a joint 172 secured to the base 140, and a member 176 of the first segment 152 can include a proximal end secured to the joint 172 and a distal end supporting the joint of the second segment 152. The member 176 of the second segment 152 can include a proximal end secured to the joint 172 of the second segment 152 and a distal end supporting the joint 172 of the third segment 152, and so on. The member 176 of the last segment 152 may include a distal end supporting an end effector 180, which may be the same as or similar to the end effector described above. In such embodiments, the joints 172 of each segment 152 may or may not be of the same type, and the members 176 of each segment 152 may or may not be the same.
All or some of the joints 172 of the segments 152 of the robotic arm 144 may be powered (so as to be selectively controllable without physical manipulation by a human). Any one or more of electrical, pneumatic, hydraulic, and/or other means may be used to selectively control movement of member 176 about joint 172. For example, each segment 152 may include a servo system for selectively moving the member 176 of the segment 152 relative to the joint 172 of the segment 152.
The robotic arm 144 also includes one or more sensors 148. Each sensor 148 may be positioned to detect the position of a member 176 of a given segment 152 relative to a joint 172 of the segment 152. For example, where joint 172 of a given segment 152 is or includes a hinge joint, sensor 148 may detect an angular position of member 176 relative to an axis of the hinge joint. Where joint 172 of a given segment 152 is or includes a rotational joint (e.g., configured to allow member 176 to rotate about an axis extending through member 176 and joint 172), sensor 148 may detect an angular position of member 176 relative to an axis extending through member 176 and joint 172. Each sensor 148 may be, for example, a rotary encoder, a linear encoder, or an incremental encoder.
Data from the sensors 148 may be provided to a processor of the robot 136, to the processor 104 of the computing device 102, and/or to the navigation system 160. The data may be used to calculate a spatial position of the robotic arm 144 relative to a predetermined coordinate system. For example, the robot 136 may calculate a spatial position of the robotic arm 144 relative to a coordinate system having an origin at the position where the joint 172 of the first segment 152 of the robotic arm 144 is secured to the base 140. The calculation may be based not just on data received from the sensors 148, but also on data or information (such as physical dimensions) corresponding to each segment 152 and/or corresponding to an end effector secured to the final segment 152. By way of example only, the known position of the proximal end of the robotic arm 144 (e.g., where the joint 172 of the first segment 152 is fixed to the base 140), the known dimensions of each segment 152, and data from the sensors 148 regarding the orientation of the member 176 of each segment 152 relative to the joint 172 of each segment 152 may be used to calculate the position and orientation of the entire robotic arm 144 in space.
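By way of illustration only, the sketch below shows how joint encoder readings and segment dimensions might be combined in such a calculation. It is a minimal planar example under assumed geometry (revolute joints, rigid links of known length, a base at a known origin), not the patent's implementation, and every name in it is illustrative.

```python
# Minimal sketch (planar, assumed geometry): computing the position of each joint
# of a robotic arm such as robotic arm 144 from joint encoder readings (sensors 148)
# and known segment lengths.
import math

def arm_joint_positions(joint_angles_rad, segment_lengths, base_xy=(0.0, 0.0)):
    """Return (x, y) positions from the base joint to the distal end."""
    x, y = base_xy
    heading = 0.0
    positions = [(x, y)]
    for angle, length in zip(joint_angles_rad, segment_lengths):
        heading += angle                     # encoder reading at this joint 172
        x += length * math.cos(heading)      # member 176 extends from the joint
        y += length * math.sin(heading)
        positions.append((x, y))
    return positions

# Example: a three-segment arm with encoder readings in radians
print(arm_joint_positions([0.3, -0.5, 0.2], [0.40, 0.35, 0.25]))
```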
Still referring to fig. 1A-1B, a plurality of tracking markers 156 are fixedly secured to or positioned on the robotic arm 144. As used herein, "fixedly secured" does not mean "permanently secured", and the tracking markers 156 may in fact be detachable from the robotic arm 144. The tracking markers 156 may be light-emitting diodes (LEDs). The tracking markers 156 may all be the same, or one or more of the tracking markers 156 may be different from another one or more of the tracking markers 156. In some embodiments, one or more of the tracking markers 156 may be configured to emit light at a first wavelength, and another one or more of the tracking markers 156 may be configured to emit light at a second wavelength different from the first wavelength. Likewise, in some embodiments, one or more of the tracking markers 156 may be configured to reflect light at a first wavelength, while another one or more of the tracking markers 156 may be configured to reflect light at a second wavelength different from the first wavelength. The emitted and/or reflected wavelengths of the foregoing embodiments may be wavelengths within a particular spectrum (e.g., a wavelength corresponding to red light and a wavelength corresponding to blue light in the visible spectrum, or different wavelengths in the infrared spectrum), as well as wavelengths from different spectra (e.g., a wavelength in the visible spectrum and a wavelength in the infrared spectrum).
In some embodiments, one or more of the tracking markers 156 may be or comprise an LED that pulses at a first frequency, and another one or more of the tracking markers 156 may be or comprise an LED that pulses at a second frequency different from the first frequency. In some embodiments, the tracking markers 156 may be or comprise reflective spheres, geometric patterns (e.g., QR codes), or other items or features that can be readily distinguished by the tracking marker sensor 132. The tracking markers 156 may be configured to be detectable by the tracking marker sensor 132 even when covered by a drape or other covering that may be placed on or over the robotic arm 144 to maintain a sterile operating room environment.
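As one hedged illustration of the pulse-frequency variant, the sketch below matches a blinking LED to a known marker identity from an intensity time series recorded by a camera. The sampling rate, tolerance, and marker table are assumptions introduced for illustration, not values from the patent.

```python
# Illustrative sketch (assumed values, not the patent's implementation): identifying
# a tracking marker 156 by the frequency at which its LED pulses, using the dominant
# frequency of a per-marker intensity time series from the camera.
import numpy as np

def dominant_pulse_frequency(intensity_samples, sample_rate_hz):
    x = np.asarray(intensity_samples, dtype=float)
    spectrum = np.abs(np.fft.rfft(x - x.mean()))    # remove the DC component first
    freqs = np.fft.rfftfreq(x.size, d=1.0 / sample_rate_hz)
    return freqs[np.argmax(spectrum)]

def identify_marker(intensity_samples, sample_rate_hz, marker_table_hz, tol_hz=0.5):
    """marker_table_hz: mapping of marker ID -> known pulse frequency (Hz)."""
    f = dominant_pulse_frequency(intensity_samples, sample_rate_hz)
    for marker_id, f_known in marker_table_hz.items():
        if abs(f - f_known) <= tol_hz:
            return marker_id
    return None  # no known marker pulses at this frequency
```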
In some embodiments, at least one tracking marker 156 is fixedly secured to or positioned on each segment 152, and one or more sets of tracking markers 156 may be affixed to one or more of the segments 152. In some embodiments, the number of tracking markers 156 fixedly secured to or positioned on the robotic arm 144 may be at least three. Also in some embodiments, each segment 152 may have a plurality of tracking markers 156 fixedly secured thereto or positioned thereon, arranged such that at least one tracking marker 156 is visible (e.g., to the tracking marker sensor 132) from any one of a plurality of possible orientations of the segment 152. In other embodiments, the tracking markers 156 are arranged on the robotic arm 144 such that at least three tracking markers 156 are visible (e.g., to the tracking marker sensor 132) from any one of a plurality of possible orientations of the robotic arm 144. For example, in some embodiments, the tracking markers 156 may be spaced circumferentially around the robotic arm 144 or around a particular segment thereof. In some embodiments, the tracking markers 156 may also be spaced radially and/or longitudinally along the robotic arm 144 or along a particular segment thereof.
In some embodiments of the present disclosure, one or more of the plurality of tracking markers 156 may be movably secured to the robotic arm 144, so as to be selectively movable relative to the robotic arm 144. In such embodiments, one or more of the plurality of tracking markers 156 may be configured to be moved (or to move automatically) from a first position on the robotic arm 144 to a second position on the robotic arm 144 as the robotic arm 144 moves into or out of a given position or set of positions. Such movement of one or more of the plurality of tracking markers 156 may facilitate maintaining a line of sight between each of the plurality of tracking markers 156 (or at least a subset thereof) and the tracking marker sensor 132. In such embodiments, the robot 136 (and/or another component of the system 100) may be configured to track whether each of the plurality of tracking markers 156 is in its respective first position or second position, and to provide that information to the navigation system 160 (or any other component of the system 100), so as to enable correlation of the robot coordinate system with the navigation coordinate system based on the positions of the tracking markers 156 relative to the robotic arm 144 as known to the robot 136 (and/or another component of the system 100), and further based on the positions of the tracking markers 156 as detected by the navigation system 160 (e.g., using the tracking marker sensor 132).
The number of tracking markers 156 on the robotic arm 144 may be selected based on the minimum number of tracking markers 156 needed to determine the spatial position of the robotic arm 144 from the detected relative orientation of the tracking markers 156, as described in greater detail below. For example, if the minimum number of tracking markers 156 needed to determine the position of the robotic arm 144 is three, then the total number of tracking markers 156 on the robotic arm 144 may be three or a multiple thereof. Alternatively, if the minimum number of tracking markers 156 needed to determine the position of the robotic arm 144 is four, then the total number of tracking markers 156 on the robotic arm 144 may be four or a multiple thereof. The greater the multiple, the greater the likelihood that the minimum number of tracking markers 156 will be visible to or otherwise detectable by the tracking marker sensor 132 regardless of the orientation of the robotic arm 144.
The minimum number of tracking markers 156 needed to determine the position of the robotic arm 144 may be the minimum number needed to ensure that the positions of the tracking markers 156 in space (e.g., as detected by the tracking marker sensor 132) are unique for every possible orientation of the robotic arm 144. For example, if only one tracking marker 156 were positioned on the distal end of a robotic arm 144 having a plurality of segments 152, the robotic arm 144 could assume a plurality of orientations or poses without moving that tracking marker 156. Adding additional tracking markers 156 to the robotic arm 144 reduces the number of possible positions of the robotic arm 144 associated with each arrangement of the tracking markers 156, until a threshold number of tracking markers 156 is reached, which represents the minimum number of tracking markers 156 needed to determine the position of the robotic arm 144. For a robotic arm 144 having only one segment 152, the minimum number of tracking markers 156 needed to determine the position of the robotic arm 144 may be two. For a robotic arm 144 having a plurality of segments 152, the minimum number of tracking markers 156 needed to determine the position of the robotic arm 144 may be three or more. As discussed herein, additional tracking markers 156 beyond the minimum may be provided to ensure redundancy in the event that the line of sight between one or more of the tracking markers 156 and the tracking marker sensor 132 is blocked.
Referring again to fig. 1A, the navigation system 160 may provide navigation for a surgeon and/or for the robot 136 during an operation. The navigation system 160 may be any now-known or future-developed navigation system, including, for example, the Medtronic StealthStation™ S8 surgical navigation system. The navigation system 160 may include a camera or other sensor(s) for detecting and/or tracking one or more reference markers, navigated trackers, or other objects within an operating room or other room in which a surgical procedure takes place. In some embodiments, the navigation system 160 may comprise the tracking marker sensor 132. In various embodiments, the navigation system 160 may be used to track a position of the robotic arm 144 (or, more particularly, of the tracking markers 156 attached to the robotic arm 144). The navigation system 160 may be used to track a position of one or more reference markers or arrays or other structures useful for detection by the camera or other sensor of the navigation system 160. The navigation system 160 may include a display for displaying one or more images from an external source (e.g., the computing device 102, the tracking marker sensor 132, or another source) or a video stream from the camera or other sensor of the navigation system 160. In some embodiments, the system 100 may operate without the use of the navigation system 160.
The database 164 may store information that associates each particular arrangement of the tracking markers 156 with a corresponding position and orientation, or pose, of the robotic arm 144. In such embodiments, information from the tracking marker sensor 132 about the position of each of a plurality of detected tracking markers 156 may be used to look up a corresponding position of the robotic arm 144 in the database 164. The database 164 may additionally or alternatively store, for example: one or more characteristics of or corresponding to the tracking markers 156; one or more surgical plans for use by the robot 136, the navigation system 160, and/or a user of the computing device 102 or of the system 100; one or more images usable in connection with a surgery to be completed by or with the assistance of one or more other components of the system 100; and/or any other useful information. The database 164 may be configured to provide any such information to the computing device 102 or to any other device of the system 100, or to any device external to the system 100, whether directly or via the cloud 168. In some embodiments, the database 164 may be or comprise part of a hospital image storage system, such as a picture archiving and communication system (PACS), a health information system (HIS), and/or another system for collecting, storing, managing, and/or transmitting electronic medical records including image data.
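A hedged sketch of such a lookup appears below. The quantization step, key format, and table contents are assumptions introduced for illustration; the patent does not specify how arrangements are stored in the database 164.

```python
# Illustrative sketch (hypothetical schema): looking up the arm pose that a database
# such as database 164 associates with a detected marker arrangement. Positions are
# quantized so that small sensor noise still maps to the same stored arrangement.
def arrangement_key(marker_positions_mm, grid_mm=2.0):
    """Quantize marker positions, taken relative to the first marker, into a key."""
    x0, y0, z0 = marker_positions_mm[0]
    return tuple(
        (round((x - x0) / grid_mm), round((y - y0) / grid_mm), round((z - z0) / grid_mm))
        for x, y, z in marker_positions_mm
    )

pose_by_arrangement = {}  # populated from the database: key -> joint angles (radians)

def look_up_pose(detected_positions_mm):
    return pose_by_arrangement.get(arrangement_key(detected_positions_mm))
```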
The cloud 168 may be or represent the internet or any other wide area network. Computing device 102 may connect to cloud 168 through communication interface 108 using a wired connection, a wireless connection, or both. In some implementations, the computing device 102 may communicate with the database 164 and/or an external device (e.g., a computing device) through the cloud 168.
Turning now to fig. 2A, a method 200 for navigating with a robotic reference frame may be performed, for example, by at least one processor. The at least one processor may be the same as or similar to the processor 104 of the computing device 102 described above. The at least one processor may be part of a robot, such as the robot 136, or part of a navigation system, such as the navigation system 160. A processor other than any processor described herein may also be used to execute the method 200. The at least one processor may perform the method 200 by executing instructions stored in a memory, such as the instructions 124 of the memory 116. The instructions may correspond to one or more steps of the method 200 described below, and may cause the processor to execute one or more algorithms, such as the algorithms 128.
The method 200 includes determining a predicted arrangement of a plurality of tracking markers based on a planned pose of a robotic arm (step 204). In some embodiments, the planned pose of the robotic arm may be a current pose of the robotic arm. The robotic arm may be the same as or similar to the robotic arm 144 of the robot 136, and the plurality of tracking markers may be the same as or similar to the tracking markers 156. The determining may include accessing stored information about the position of each of the plurality of tracking markers relative to the robotic arm or a portion thereof, whether from a memory (such as the memory 116), from a database (such as the database 164), or from elsewhere. The determining may also include calculating a predicted position of each of the plurality of tracking markers based on the planned pose of the robotic arm and on the information about the position of each of the plurality of tracking markers relative to the robotic arm (or a portion thereof). The determining may include calculating predicted positions of all of the tracking markers fixedly secured to the robotic arm, or of only a subset thereof. The determining may further include compiling the calculated predicted position of each of the plurality of tracking markers into a predicted arrangement of the plurality of tracking markers.
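The sketch below illustrates one way step 204 might be computed, combining a planned set of joint angles with each marker's known offset along its segment. It is a planar simplification under assumed geometry (a real system would use full 3D rigid transforms), and every name in it is illustrative.

```python
# Illustrative sketch (planar, assumed geometry): predicting where each tracking
# marker should appear for a planned arm pose, given each marker's known offset
# along its segment (step 204).
import math

def predicted_arrangement(joint_angles_rad, segment_lengths, marker_offsets):
    """marker_offsets: list of (segment_index, distance_along_segment) pairs."""
    x, y, heading = 0.0, 0.0, 0.0
    starts, headings = [], []
    for angle, length in zip(joint_angles_rad, segment_lengths):
        heading += angle
        starts.append((x, y))                # where this segment begins
        headings.append(heading)             # direction this segment points
        x += length * math.cos(heading)
        y += length * math.sin(heading)
    predicted = []
    for seg, dist in marker_offsets:
        sx, sy = starts[seg]
        h = headings[seg]
        predicted.append((sx + dist * math.cos(h), sy + dist * math.sin(h)))
    return predicted
```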
The predicted arrangement of the plurality of tracking markers may be determined relative to a coordinate system. The coordinate system may be a robot coordinate system, a navigation coordinate system, or another coordinate system.
The tracking markers may be the same as or similar to the tracking markers 156 described above. For example, the tracking markers may be LEDs, reflective spheres, or geometric patterns (such as QR codes). The tracking markers may all be the same, or each tracking marker may be distinguishable from the other tracking markers by a unique characteristic (e.g., a unique wavelength or a unique pulse frequency). In some embodiments, some tracking markers may be distinguishable from one or more other tracking markers by a shared characteristic. For example, a first set of tracking markers may emit light at a first wavelength, and a second set of tracking markers may emit light at a second wavelength.
Fig. 2B illustrates a predicted arrangement 250 of a plurality of tracking markers in accordance with at least one embodiment of the present disclosure, with the corresponding position of a robot (including a robotic arm) shown in dashed lines.
Referring again to fig. 2A, the method 200 further includes receiving, from a tracking marker sensor, information about a detected arrangement of the plurality of tracking markers (step 208). The tracking marker sensor may be the same as or similar to the tracking marker sensor 132, and may be an optical camera, an infrared camera, or any other sensor configured to detect the tracking markers. In some embodiments, the tracking marker sensor may be part of a robot (such as the robot 136), part of a navigation system (such as the navigation system 160), or part of a computing device (such as the computing device 102). In other embodiments, the tracking marker sensor may be independent of any of the foregoing components, yet in electronic communication with one or more of them.
The information may include a position of each of the plurality of tracking markers in the detected arrangement. The position of one or more of the plurality of tracking markers may be defined relative to a coordinate system (such as a robot coordinate system or a navigation coordinate system) or relative to another one or more of the plurality of tracking markers. Alternatively, the information may include only the positions of the plurality of tracking markers relative to each other, which positions may then be used to calculate the position of each of the plurality of tracking markers relative to a coordinate system, such as a robot coordinate system or a navigation coordinate system.
Information may be received, for example, via a communication interface, such as communication interface 108.
Fig. 2C illustrates a detected arrangement 260 of a plurality of tracking markers in accordance with at least one embodiment of the present disclosure, as may be shown or reflected in the information about the detected arrangement received in step 208.
The method 200 further includes comparing the detected arrangement of the plurality of tracking markers to the predicted arrangement of the plurality of tracking markers (step 212). The comparing may include translating or otherwise correlating the predicted position of each of the plurality of tracking markers from one coordinate space (e.g., a robot coordinate space) to another coordinate space (e.g., a navigation coordinate space). Alternatively, the comparing may include translating or otherwise correlating the detected position of each of the plurality of tracking markers from one coordinate space (e.g., a navigation coordinate space) to another coordinate space (e.g., a robot coordinate space). The comparing may instead include simply comparing the relative positions of the plurality of tracking markers in the predicted arrangement with the relative positions of the plurality of tracking markers in the detected arrangement.
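A minimal sketch of such a comparison follows. It assumes the markers have already been put in corresponding order (e.g., identified by wavelength or pulse frequency) and expressed in a common coordinate space; the 1 mm tolerance is an assumption, not a value from the patent.

```python
# Illustrative sketch (assumed tolerance): comparing a detected arrangement with a
# predicted arrangement, as in step 212, via the RMS distance between markers.
import math

def arrangements_match(predicted, detected, tol_mm=1.0):
    """Both lists hold (x, y, z) positions, in corresponding order, in one space."""
    if len(predicted) != len(detected):
        return False
    sq_dists = [
        sum((p - d) ** 2 for p, d in zip(pt_pred, pt_det))
        for pt_pred, pt_det in zip(predicted, detected)
    ]
    rms = math.sqrt(sum(sq_dists) / len(sq_dists))
    return rms <= tol_mm
```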
When the comparison yields the conclusion that the detected arrangement matches the predicted arrangement, the precise position of the robotic arm is known. At that point in time, the robotic arm (by virtue of the tracking markers fixedly secured thereto) constitutes a customized one-time reference frame that can be used for the same purposes as any known reference frame, including, for example, for registering the robot coordinate system with the navigation coordinate system and/or for determining or confirming the position of a given object in a particular coordinate system. For example, the robotic arm may be positioned so that an end thereof is in contact with an anatomical feature, a surgical tool, or another object, such that the robotic arm constitutes a reference frame from which the position of the anatomical feature, surgical tool, or other object may be determined or confirmed.
Embodiments of the present disclosure thus advantageously eliminate the need for a sometimes-bulky reference frame. For example, some navigation systems require a minimum distance between tracking markers (and thus a reference frame of a minimum size to hold the tracking markers) in order to provide accurate navigation within a volume suitable for a given application; larger volumes may require larger reference frames than smaller volumes. In some instances, a reference frame holding tracking markers for a surgical or other medical procedure may need to extend from three inches to ten inches in multiple dimensions to reach the minimum necessary size. As a result, such reference frames tend to be bulky; are prone to being bumped (which may, for example, cause unwanted movement of the object to which they are attached); can obstruct the movements of the surgeon or of any movable object in the operating room environment; and can be cumbersome to use. Use of a robotic reference frame as described in several embodiments herein enables an admittedly bulky object that is already in the operating room (the robot) to be used to create a transient reference frame when needed, thus eliminating the need for, and the problems associated with, a dedicated reference frame.
Likewise, when the comparison yields the conclusion that the detected arrangement matches the predicted arrangement, the operational integrity of the robot and of the navigation system may be confirmed. This may be useful during a surgical procedure, and may also be used for an initial calibration operation of the robotic system. On the other hand, when the comparison yields the conclusion that the detected arrangement does not match the predicted arrangement even though the robotic arm is in the pose (e.g., position and/or orientation) used to determine the predicted arrangement, the further conclusion may be reached that one or both of the robot and the navigation system lacks operational integrity. Accordingly, when this occurs, a warning may be displayed, and/or an audible sound may be played, to an operator of the robot and/or of the navigation system via a user interface (e.g., the user interface 112 of the computing device 102, or a user interface specific to the robot or the navigation system). Providing such a warning helps to ensure that the suspect operational integrity of the robot and/or of the navigation system can be investigated, and any errors corrected, before the robot and/or the navigation system is used further.
In some embodiments, where the detected arrangement of the plurality of tracking markers differs only slightly from the predicted arrangement of the plurality of tracking markers, another arrangement of the plurality of tracking markers may be predicted (e.g., based on a different pose of the robotic arm), and the camera or other tracking marker sensor may provide additional information about a second detected arrangement of the tracking markers (e.g., detected while the robotic arm is in the different pose). If the second detected arrangement of the plurality of tracking markers again differs slightly from the second predicted arrangement of the plurality of tracking markers, an error-calculation and/or calibration process may be conducted to determine an adjustment to be applied to any further predicted arrangement so that it matches the corresponding detected arrangement, or vice versa. In other words, the operational integrity of the robot and/or of the navigation system may be confirmed if the offset between the predicted arrangements of the plurality of tracking markers and the corresponding detected arrangements can be characterized by a constant or by a derivable equation, such that the offset can be incorporated into further comparisons of predicted and detected arrangements of the plurality of tracking markers.
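In the simplest case contemplated above, the offset is a constant translation. The sketch below estimates such an offset across several poses and reports it only when it is consistent; the consistency bound is an assumption for illustration.

```python
# Illustrative sketch (simplest case: a constant translational offset): estimating a
# consistent offset between predicted and detected arrangements gathered at several
# poses, so that it can be folded into later comparisons.
import numpy as np

def estimate_constant_offset(predicted_sets, detected_sets, max_spread_mm=1.0):
    """Each argument is a list of per-pose arrays of (x, y, z) marker positions."""
    diffs = np.concatenate([
        np.asarray(det, dtype=float) - np.asarray(pred, dtype=float)
        for pred, det in zip(predicted_sets, detected_sets)
    ])
    offset = diffs.mean(axis=0)
    spread = np.linalg.norm(diffs - offset, axis=1).max()
    return offset if spread <= max_spread_mm else None  # None: offset not constant
```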
The method 200 further includes determining a pose of the robotic arm based on the predicted arrangement and/or the detected arrangement (step 216). The determining may include receiving, from a robot comprising the robotic arm, information about the pose of the robotic arm corresponding to the predicted and/or detected arrangement of the tracking markers. The determining may alternatively include calculating the pose of the robotic arm based on: information about the position of each tracking marker (and/or about the relative positions of the tracking markers in the detected arrangement); information about the position of each tracking marker relative to the robotic arm or a segment thereof; and information about the robotic arm and/or the segments thereof. The determining may include determining the pose of the robotic arm relative to a robot coordinate system, a navigation coordinate system, or some other coordinate system.
Where the determining comprises calculating a pose of one or more arm segments of the robotic arm, the determining may further comprise combining the calculated poses of the one or more arm segments into a calculated pose of the entire robotic arm. Also, information (including any of the information described above) needed to calculate the pose of the robotic arm based on the predicted arrangement and/or detected arrangement of the tracking markers may be accessed from a memory (such as the memory 116) or a database (such as the database 164). Such information may include, for example, information regarding the size, arrangement, range of motion, and/or other characteristics of the segments of the robotic arm and/or of the entire robotic arm. One or more algorithms, such as the algorithm 128, stored in a memory, such as the memory 116, may be used when calculating the pose of one or more segments of the robotic arm and/or of the entire robotic arm.
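As a minimal sketch of the combining step, assuming (for illustration only, and not as a requirement of the disclosure) that each segment's pose is expressed as a 4x4 homogeneous transform relative to the preceding segment, the per-segment poses can be chained into a pose of the entire arm:

```python
import numpy as np

def compose_arm_pose(segment_transforms):
    """Chain per-segment transforms (base -> segment 1 -> ... -> distal end).

    segment_transforms: iterable of 4x4 homogeneous matrices, each giving a
    segment's pose relative to the previous segment.
    """
    pose = np.eye(4)
    for t in segment_transforms:
        pose = pose @ t
    return pose  # pose of the distal end in the base (robot) frame
```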
The present disclosure encompasses embodiments of method 200 that include more or fewer steps than the embodiments described above.
Turning now to fig. 3, a method 300 of registration with a robotic reference frame includes receiving first information from a sensor regarding a plurality of tracking markers fixedly secured to a robotic arm (step 304). The sensor may be, for example, a tracking marker sensor as described elsewhere herein, including, for example, the tracking marker sensor 132, or any other sensor suitable for detecting a plurality of tracking markers. The tracking markers may be any of the tracking markers described herein, including, for example, the tracking marker 156. The robotic arm may be any of the robotic arms described herein, including, for example, the robotic arm 144 of the robot 136, or any other robotic arm.
The first information may be or include information about the positions of the plurality of tracking markers in the navigation coordinate system. The first information may be or include information about the positions of the plurality of tracking markers relative to each other. In some embodiments, the first information may be or include information about one or more characteristics of one or more particular tracking markers, such as the wavelength of light emitted or reflected by the one or more particular tracking markers, the frequency at which the one or more particular tracking markers emit pulses of light, and/or the geometric pattern of the one or more particular tracking markers. The first information may be received directly from the sensor, through one or more communication interfaces (such as the communication interface 108), through a cloud (such as the cloud 168), or through any other network, device, or component.
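By way of illustration only, the first information could be organized along the following lines; the field names, units, and types below are assumptions made for the sketch, and the disclosure does not require any particular data structure:

```python
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class MarkerObservation:
    """One tracking marker as reported by the sensor (illustrative only)."""
    position_nav: Tuple[float, float, float]  # position in navigation frame
    wavelength_nm: Optional[float] = None     # for wavelength-coded markers
    pulse_hz: Optional[float] = None          # for pulse-coded markers
    pattern_id: Optional[str] = None          # for geometric-pattern markers
```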
The method 300 further includes receiving, from the robotic arm or from a robot comprising the robotic arm, second information corresponding to a position of the robotic arm in a robot coordinate system (step 308). The second information may be based on data obtained, for example, from one or more sensors (such as the sensor 148) in the robotic arm. For example, the second information may include sensor data regarding detected positions of one or more segments of the robotic arm and/or of the entire robotic arm. The second information may also be based on one or more settings of one or more components of the robotic arm. For example, the second information may include data describing the positions (whether actual or commanded) of one or more motors, servos, gears, or other devices or components used to control the position of the robotic arm and/or one or more segments thereof. The second information may be obtained independently of the first information, and vice versa.
The method 300 further includes associating the position of the robotic arm in the robot coordinate system with the positions of the plurality of tracking markers in the robot coordinate system (step 312). The association may include accessing information from a memory (such as the memory 116), a database (such as the database 164), a robot (such as the robot 136), or another storage location. Such information may include, for example, information regarding the precise location of each of the plurality of tracking markers on the robotic arm, and/or information regarding the size, arrangement, range of motion, and/or other characteristics of the segments of the robotic arm and/or of the entire robotic arm. Based on such information, together with the information about the position of the robotic arm in the robot coordinate system (e.g., the second information received in step 308), the positions of the plurality of tracking markers in the robot coordinate system may be calculated. Such calculations may utilize, for example, one or more algorithms (such as the algorithm 128).
The result of this association may be a calculated or otherwise determined position, in the robot coordinate system, of each of the plurality of tracking markers fixedly secured to the robotic arm.
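The following sketch illustrates one way the calculation of step 312 could proceed, assuming (for illustration only) a kinematic model expressed as per-segment homogeneous transforms and known, fixed marker positions in each segment's local frame; a real arm would use its own kinematic model:

```python
import numpy as np

def markers_in_robot_frame(segment_transforms, markers_local):
    """Place each marker in the robot coordinate system.

    segment_transforms: list of 4x4 transforms, each segment relative to
        the previous one (segment 0 relative to the robot base).
    markers_local: list of (segment_index, point) pairs, where point is a
        homogeneous (4,) array giving the marker's fixed position in that
        segment's local frame.
    Returns an (N, 3) array of marker positions in the robot frame.
    """
    cumulative, t = [], np.eye(4)
    for seg_t in segment_transforms:
        t = t @ seg_t                 # transform from base to this segment
        cumulative.append(t)
    return np.array([(cumulative[i] @ p)[:3] for i, p in markers_local])
```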
The method 300 further includes registering the robot coordinate system with the navigation coordinate system (step 316). The registration may be achieved based on the first information and the second information. In some embodiments, this registration includes the association described above as step 312 (e.g., in some embodiments, step 316 may include step 312). The registration may include determining a relationship between the robot coordinate system and the navigation coordinate system based on the known positions of the plurality of tracking markers in the navigation coordinate system (as included in or determined from the first information) and the known positions of the plurality of tracking markers in the robot coordinate system (as determined from the second information). The registration may utilize one or more algorithms, such as the algorithm 128 stored in the memory 116.
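One well-known way to determine such a relationship from corresponding point sets is rigid point-set registration, for example the Kabsch method. The sketch below is offered as an illustrative possibility only; the disclosure does not prescribe this or any particular algorithm.

```python
import numpy as np

def register_rigid(points_robot, points_nav):
    """Find rotation R and translation t with points_nav ~= R @ p_robot + t.

    Both inputs are (N, 3) arrays whose rows correspond to the same markers
    expressed in the robot and navigation coordinate systems, respectively.
    """
    c_r = points_robot.mean(axis=0)
    c_n = points_nav.mean(axis=0)
    h = (points_robot - c_r).T @ (points_nav - c_n)   # cross-covariance
    u, _, vt = np.linalg.svd(h)
    d = np.sign(np.linalg.det(vt.T @ u.T))            # guard vs. reflection
    r = vt.T @ np.diag([1.0, 1.0, d]) @ u.T
    t = c_n - r @ c_r
    return r, t
```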
The method 300 further includes registering the patient coordinate system with the navigation coordinate system (step 320). Registration of the robot coordinate system with the navigation coordinate system (as described above with respect to step 316) also enables registration of the patient coordinate system with the navigation coordinate system, whether in the case where the robot is connected to the patient (as is sometimes the case during robotic surgery or robot-assisted surgery) or in the case where the robot coordinate system is already registered with the patient coordinate system. In some embodiments, the registration of this step 320 includes associating the robot coordinate system with the patient coordinate system (or vice versa) and, based on the registration of step 316, associating the patient coordinate system with the navigation coordinate system. In other words, the registration may include determining a relationship between the patient coordinate system and the navigation coordinate system based on the relationship between the patient coordinate system and the robot coordinate system and the relationship between the robot coordinate system and the navigation coordinate system. The registration of this step 320 may utilize one or more algorithms, such as the algorithm 128 stored in the memory 116.
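Expressed with 4x4 homogeneous transforms (an assumed representation used here only for illustration), the chaining described above reduces to a single matrix product:

```python
import numpy as np

def register_patient_to_nav(t_nav_robot, t_robot_patient):
    """Compose two known relationships into the patient-to-navigation one.

    t_nav_robot:     4x4 transform mapping robot coords into nav coords.
    t_robot_patient: 4x4 transform mapping patient coords into robot coords.
    """
    # T_nav<-patient = T_nav<-robot @ T_robot<-patient
    return t_nav_robot @ t_robot_patient
```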
The present disclosure encompasses many variations on the method 300. For example, as described above, in some embodiments the registration step 316 may include associating the position of the robotic arm in the robot coordinate system (as indicated in or determined from the second information received in step 308) with the position of each of the plurality of tracking markers, as described above with respect to step 312. Also, in some embodiments, the second information may itself include position information for each of the plurality of tracking markers in the robot coordinate system, such that the associating step 312, or any similar step, is unnecessary. Additionally, although in the method 300 as described above the position of each of the plurality of tracking markers in the robot coordinate system is determined based on the second information regarding the position of the robotic arm in the robot coordinate system, in other embodiments the position of the robotic arm may instead be determined based on the first information. In such embodiments, the registering step 316 may include registering the robot coordinate system with the navigation coordinate system (or vice versa) based on the position of the robotic arm as determined from the first information and the position of the robotic arm as indicated by, or determined from, the second information.
The method 300 advantageously enables registration of the navigation coordinate system with the robot coordinate system, and vice versa, without using any reference frame other than the reference frame formed by the robotic arm itself (including the tracking markers fixedly secured to the robotic arm). The method 300 thus avoids the cost of a separate reference frame, the time required to secure a separate reference frame to the robotic arm, and the incision that would otherwise be required if a separate reference frame were to be secured directly to the patient (e.g., to the patient's vertebrae or pelvis). Further, the method 300 eliminates the need for a snapshot frame that would otherwise be required during registration of the navigation coordinate system with the robot coordinate system (and vice versa).
The present disclosure encompasses embodiments of method 300 having more or fewer steps than the embodiments described above.
Referring now to fig. 4, a method 400 of utilizing a robotic reference frame includes receiving information corresponding to a predicted arrangement of a plurality of tracking markers, the arrangement defining a customized one-time reference frame (step 404). The predicted arrangement may be based on an expected position of the robotic arm to which the plurality of tracking markers are fixedly secured, or on a current position of that robotic arm. The information may include results of one or more calculations (e.g., performed by a robot, such as the robot 136, or by a processor, such as the processor 104) to determine the predicted arrangement of the plurality of tracking markers based on the expected or current position of the robotic arm. The expected or current position of the robotic arm may be a position that enables the end of the robotic arm (whether an end effector or another feature) to come into contact with an object whose position needs to be determined or confirmed (whether in a navigation coordinate space or another coordinate space). The information may comprise information about the position of each individual tracking marker relative to the robot coordinate system or another coordinate system. The information may also include information about the characteristics of each individual tracking marker, such as the wavelength, pulse frequency, or geometric pattern of each individual tracking marker.
Each of the plurality of tracking markers may be any of the tracking markers described herein, including, for example, the tracking marker 156. Each of the tracking markers is fixedly secured to the robotic arm. The plurality of tracking markers may comprise at least three tracking markers, at least four tracking markers, at least five tracking markers, or more than five tracking markers.
The method 400 also includes receiving data corresponding to a detected arrangement of the plurality of tracking markers (step 408). The data may be received, for example, from a tracking marker sensor, such as the tracking marker sensor 132. The data may include position information for the plurality of tracking markers in a navigation coordinate space or another coordinate space. Alternatively, the data may enable calculation of the positions of the plurality of tracking markers in the navigation coordinate space or another coordinate space. The data may also include information about the characteristics of each individual tracking marker, such as the wavelength, pulse frequency, or geometric pattern of each individual tracking marker.
Notably, the data corresponding to the detected arrangement of the plurality of tracking markers is independent of the information corresponding to the predicted arrangement of the plurality of tracking markers. In other words, the data is generated without reference to the information, and the information is generated without reference to the data.
The method 400 further includes comparing the predicted arrangement of the plurality of tracking markers to the detected arrangement of the plurality of tracking markers to determine whether the customized one-time reference frame has been created (step 412). When the detected arrangement matches the predicted arrangement, the exact pose of the robotic arm is known, which enables the robotic arm (and/or the tracking markers fixedly secured to the robotic arm) to be used as a customized one-time reference frame in place of a separate reference frame that must be secured to, or held by, the robotic arm or another object or person. The comparison may include using one or more algorithms, such as the algorithm 128, to convert the positions of the tracking markers in one coordinate system (e.g., the robot coordinate system) to positions in another coordinate system (e.g., the navigation coordinate system). The comparison may further include superimposing an image included in, or generated using, the data corresponding to the detected arrangement on a virtual image generated based on the information corresponding to the predicted arrangement, to determine whether the tracking markers in the two images align with each other. Other comparison methods may also be used to determine whether the predicted arrangement of the plurality of tracking markers matches the detected arrangement.
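As one illustrative possibility (not prescribed by the disclosure), and because the predicted and detected arrangements may initially be expressed in different coordinate systems, the comparison could use a rigid-motion-invariant signature, such as the sorted set of pairwise inter-marker distances; the tolerance value below is an assumption:

```python
import numpy as np

def arrangement_signature(points):
    """Sorted pairwise distances: a shape summary invariant to rigid motion."""
    points = np.asarray(points, dtype=float)
    n = len(points)
    dists = [np.linalg.norm(points[i] - points[j])
             for i in range(n) for j in range(i + 1, n)]
    return np.sort(np.array(dists))

def arrangements_match(predicted, detected, tol_mm=1.0):
    """True if the two marker arrangements have the same shape within tol."""
    sig_p = arrangement_signature(predicted)
    sig_d = arrangement_signature(detected)
    return (sig_p.shape == sig_d.shape
            and bool(np.all(np.abs(sig_p - sig_d) <= tol_mm)))
```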
The method 400 further includes confirming the position of an object based on the creation of the customized one-time reference frame (step 416). As described above, when the predicted arrangement matches the detected arrangement, the precise position of the robotic arm is known, so that the robotic arm (and/or the tracking markers fixedly secured thereto) can be used as a reference frame. If the position (and, in some embodiments, the orientation) of the object can be detected while the plurality of tracking markers are detected, and/or if the robotic arm is in contact with a known surface or feature of the object at the time the customized one-time reference frame is created, then the position (and, in some embodiments, the orientation) of the object can be determined using the customized one-time reference frame, just as it could be determined using a reference frame separate from the robotic arm.
The present disclosure encompasses embodiments of method 400 having more or fewer steps than the embodiments described above.
As can be understood based on the foregoing disclosure, the present disclosure encompasses methods (and corresponding descriptions of methods 200, 300, and 400) having fewer than all of the steps identified in fig. 2A, 3, and 4, as well as methods (and corresponding descriptions of methods 200, 300, and 400) that include additional steps in addition to those identified in fig. 2A, 3, and 4.
The foregoing is not intended to limit the disclosure to the form or forms disclosed herein. In the foregoing detailed description, for example, various features of the disclosure are grouped together in one or more aspects, embodiments, and/or configurations for the purpose of streamlining the disclosure. Features of the aspects, embodiments, and/or configurations of the disclosure may be combined in alternative aspects, embodiments, and/or configurations other than those discussed above. This manner of disclosure is not to be interpreted as reflecting an intention that the claims require more features than are expressly recited in each claim. Rather, as the following claims reflect, inventive aspects lie in less than all features of a single foregoing disclosed aspect, embodiment, and/or configuration. Thus, the following claims are hereby incorporated into this detailed description, with each claim standing on its own as a separate preferred embodiment of the disclosure.
Moreover, although the description has included description of one or more aspects, embodiments, and/or configurations and certain variations and modifications, other variations, combinations, and modifications are within the scope of the disclosure, e.g., as may be within the skill and knowledge of those in the art after understanding the present disclosure. It is intended to claim rights that include alternative aspects, embodiments, and/or configurations, including alternate, interchangeable, and/or equivalent structures, functions, ranges, or steps to those claimed, whether or not such alternate, interchangeable, and/or equivalent structures, functions, ranges, or steps are disclosed herein, and without intending to publicly dedicate any patentable subject matter.

Claims (22)

1. A robotic navigation system, the robotic navigation system comprising:
a robot base;
a robotic arm, the robotic arm comprising:
a proximal end secured to the robot base;
a distal end movable relative to the proximal end; and
one or more arm segments located between the proximal end and the distal end; and
a plurality of tracking markers fixed to the robotic arm.
2. The robotic navigation system of claim 1, wherein each of the one or more arm segments supports at least one of the plurality of tracking markers.
3. The robotic navigation system of claim 1, wherein the plurality of tracking markers includes at least three tracking markers.
4. The robotic navigation system of claim 1, wherein each of the plurality of tracking markers is an LED.
5. The robotic navigation system of claim 1, wherein each tracking marker of the plurality of tracking markers is configured to emit or reflect light of a different wavelength than at least one other tracking marker of the plurality of tracking markers.
6. The robotic navigation system of claim 1, wherein each of the plurality of tracking markers is configured to emit light in pulses at a first frequency different from a second frequency at which at least one other of the plurality of tracking markers is configured to emit light in pulses.
7. The robotic navigation system of claim 1, wherein each of the plurality of tracking markers is a geometric pattern.
8. The robotic navigation system of claim 1, wherein at least one of the plurality of tracking markers is movably secured to the robotic arm.
9. The robotic navigation system of claim 8, wherein the at least one of the plurality of tracking markers is selectively movable between a first position and a second position.
10. The robotic navigation system of claim 1, wherein at least two of the plurality of tracking markers are spaced circumferentially around one of the one or more arm segments of the robotic arm.
11. The robotic navigation system of claim 1, further comprising:
at least one processor; and
a memory storing instructions for execution by the at least one processor, the instructions when executed causing the at least one processor to:
determining a predicted arrangement of the plurality of tracking markers based on the pose of the robotic arm and the known position of each of the plurality of tracking markers on the robotic arm.
12. The robotic navigation system of claim 11, wherein the memory stores additional instructions for execution by the at least one processor, the additional instructions when executed further cause the at least one processor to:
receiving information about the detected arrangement of the plurality of tracking markers from a camera configured to detect the plurality of tracking markers; and
the detected arrangement is compared to the predicted arrangement.
13. A method of navigating with a robotic reference frame, the method comprising:
receiving first information about a plurality of tracking markers fixedly secured to a robotic arm of a robot from a tracking marker sensor;
receiving second information from the robot corresponding to a position of the robotic arm in a robot coordinate system; and
registering the robot coordinate system with a navigation coordinate system based on the first information and the second information.
14. The method of claim 13, wherein each of the plurality of tracking markers comprises an LED.
15. The method of claim 13, wherein each of the plurality of tracking markers comprises a geometric pattern.
16. The method of claim 13, wherein the first information comprises information about the locations of the plurality of tracking markers in the navigation coordinate system.
17. The method of claim 13, wherein the second information includes information regarding a position at which each of the plurality of tracking markers is fixedly secured to the robotic arm.
18. An apparatus for surgical navigation with a robotic reference frame, the apparatus comprising:
at least one communication interface for receiving information from a robot;
at least one tracking marker sensor for detecting a plurality of tracking markers on a robotic arm of the robot;
at least one processor; and
at least one memory storing instructions for execution by the at least one processor, the instructions when executed causing the at least one processor to:
receiving information from the robot corresponding to a predicted arrangement of the plurality of tracking markers, the predicted arrangement defining a customized one-time reference frame;
receiving data corresponding to the detected arrangement of the plurality of tracking markers from the at least one tracking marker sensor; and
comparing the predicted arrangement to the detected arrangement to determine whether the customized one-time reference frame has been created.
19. The apparatus of claim 18, wherein the memory stores additional instructions for execution by the processor, the additional instructions when executed further cause the processor to:
confirm a position of an object in a predetermined coordinate space based on the creation of the customized one-time reference frame.
20. The apparatus of claim 18, wherein each of the plurality of tracking markers is a geometric pattern.
21. The apparatus of claim 18, wherein the at least one tracking marker sensor is a camera and each tracking marker of the plurality of tracking markers is an LED.
22. The apparatus of claim 21, wherein at least two tracking markers of the plurality of tracking markers have different wavelengths or are configured to be pulsed at different frequencies.
CN202180041092.6A 2020-06-08 2021-06-03 Robotic reference frame for navigation Pending CN115989002A (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
US202063036130P 2020-06-08 2020-06-08
US63/036,130 2020-06-08
PCT/US2021/035667 WO2021252263A1 (en) 2020-06-08 2021-06-03 Robotic reference frames for navigation

Publications (1)

Publication Number Publication Date
CN115989002A true CN115989002A (en) 2023-04-18

Family

ID=76797085

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202180041092.6A Pending CN115989002A (en) 2020-06-08 2021-06-03 Robotic reference frame for navigation

Country Status (3)

Country Link
EP (1) EP4161427A1 (en)
CN (1) CN115989002A (en)
WO (1) WO2021252263A1 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117979917A (en) * 2021-09-30 2024-05-03 柯惠Lp公司 Setting remote center of motion in surgical robotic systems

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9526587B2 (en) * 2008-12-31 2016-12-27 Intuitive Surgical Operations, Inc. Fiducial marker design and detection for locating surgical instrument in images
US20190000569A1 (en) * 2012-06-21 2019-01-03 Globus Medical, Inc. Controlling a surgical robot to avoid robotic arm collision
CN110678141A (en) * 2017-03-31 2020-01-10 皇家飞利浦有限公司 Markless robot tracking system, controller and method
EP3691558A4 (en) * 2017-10-05 2021-07-21 Mobius Imaging LLC Methods and systems for performing computer assisted surgery

Also Published As

Publication number Publication date
WO2021252263A1 (en) 2021-12-16
EP4161427A1 (en) 2023-04-12

Similar Documents

Publication Publication Date Title
US11653983B2 (en) Methods for locating and tracking a tool axis
US11510740B2 (en) Systems and methods for tracking objects
US20230255699A1 (en) Time-spaced robotic reference frames
CN115989002A (en) Robotic reference frame for navigation
US20230270511A1 (en) Registration of multiple robotic arms using single reference frame
US20220192701A1 (en) Systems and methods for surgical port positioning
US20220241033A1 (en) Split robotic reference frame for navigation
WO2022162669A1 (en) Split robotic reference frame for navigation
CN116801829A (en) Split robot reference frame for navigation
EP4125674A1 (en) Systems for using surgical robots with navigation arrays
US20230240761A1 (en) Methods for locating and tracking a tool axis
US20220346882A1 (en) Devices, methods, and systems for robot-assisted surgery
US20240024028A1 (en) Systems and methods for verifying a pose of a target
US20230133689A1 (en) Arm movement safety layer
EP4333756A1 (en) Devices, methods, and systems for robot-assisted surgery
KR20230034296A (en) Navigation and/or robot tracking methods and systems
CN117320655A (en) Apparatus, methods, and systems for robotic-assisted surgery
CN116782849A (en) Multi-arm robotic system for identifying targets
CN116761572A (en) System and method for defining a working volume

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination