WO2009092164A1 - Surgical guidance using tissue feedback - Google Patents

Surgical guidance using tissue feedback

Info

Publication number
WO2009092164A1
WO2009092164A1 (PCT/CA2009/000076)
Authority
WO
WIPO (PCT)
Prior art keywords
surgical
tool
robot
tissue characteristic
characteristic
Prior art date
Application number
PCT/CA2009/000076
Other languages
English (en)
Inventor
Mehran Anvari
John D. Lymer
Timothy S. Fielding
Hon Bun Yeung
Original Assignee
Mcmaster University
Priority date
Filing date
Publication date
Application filed by Mcmaster University filed Critical Mcmaster University
Priority to CA 2712607A (CA2712607A1)
Publication of WO2009092164A1
Priority to US 12/842,462 (US20110015649A1)

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/20 Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B34/30 Surgical robots
    • A61B17/00 Surgical instruments, devices or methods, e.g. tourniquets
    • A61B17/16 Bone cutting, breaking or removal means other than saws, e.g. osteoclasts; Drills or chisels for bones; Trepans
    • A61B17/1613 Component parts
    • A61B17/1626 Control means; Display units
    • A61B17/1662 Bone cutting, breaking or removal means for particular parts of the body
    • A61B17/1671 Bone cutting, breaking or removal means for particular parts of the body, for the spine
    • A61B17/1695 Trepans or craniotomes, i.e. specially adapted for drilling thin bones such as the skull
    • A61B17/17 Guides or aligning means for drills, mills, pins or wires
    • A61B17/1739 Guides or aligning means specially adapted for particular parts of the body
    • A61B17/1757 Guides or aligning means specially adapted for particular parts of the body, for the spine
    • A61B2017/00017 Electrical control of surgical instruments
    • A61B2017/00022 Sensing or detecting at the treatment site
    • A61B2034/2046 Tracking techniques
    • A61B2034/2055 Optical tracking systems
    • A61B2034/2072 Reference field transducer attached to an instrument or patient
    • A61B2034/305 Details of wrist mechanisms at distal ends of robotic arms
    • A61B90/00 Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/06 Measuring instruments not otherwise provided for
    • A61B2090/064 Measuring instruments for measuring force, pressure or mechanical tension
    • A61B2090/065 Measuring instruments for measuring contact or contact pressure
    • A61B90/08 Accessories or related features not otherwise provided for
    • A61B2090/0818 Redundant systems, e.g. using two independent measuring systems and comparing the signals
    • A61B90/36 Image-producing devices or illumination devices not otherwise provided for
    • A61B90/37 Surgical systems with images on a monitor during operation
    • A61B2090/376 Surgical systems using X-rays, e.g. fluoroscopy
    • A61B90/39 Markers, e.g. radio-opaque or breast lesion markers
    • A61B2090/3983 Reference marker arrangements for use with image guided surgery
    • A61B2090/3995 Multi-modality markers

Definitions

  • The present application relates to guidance of surgical tools and to systems therefor. It also relates to automated robot performance of surgery and systems therefor.
  • preoperative images are taken, the surgery is planned using the preoperative images, and the surgeon is provided with guidance information during surgery based on the estimated location of the tools in the images.
  • Intraoperative images can be taken to update the image information.
  • the invention provides a surgical system for use with a surgical tool and a tissue characteristic sensor associated with the surgical tool.
  • the system includes an expected tissue characteristic for tissue on a predefined trajectory of the tool in a patient, and a controller to receive a sensed tissue characteristic from the tissue characteristic sensor, such sensed tissue characteristic associated with an actual trajectory of the tool, wherein the controller compares the expected tissue characteristic for the expected location with the sensed tissue characteristic for the actual trajectory.
  • the system may further include a display displaying information to an operator of the tool based on the compared expected tissue characteristic and sensed tissue characteristic.
  • the tool may be operated by an operator through manual operation of the tool.
  • the system may further include a robot for manipulating the tool, wherein the tool is operated by the operator through the operator manually operating the robot.
  • the system may include the tissue characteristic sensor.
  • The system may include the surgical tool.
  • the system may include a robot for manipulating the tool under control of the controller, which control is based on the compared expected tissue characteristic and sensed tissue characteristic.
  • The tissue characteristic sensor may be a force sensor, the expected tissue characteristic may be a force characteristic of expected tissue on the predefined trajectory, and the sensed tissue characteristic may be a sensed force characteristic on the actual trajectory of the tool.
  • the system may include means for an operator to monitor robot performance while under the control of the controller.
  • the system may include means for an operator to assume control away from the controller of the manipulation of the tool.
  • the invention provides a method of using a surgical system.
  • the method includes receiving at a controller within the surgical system from a tissue characteristic sensor a sensed tissue characteristic associated with an actual trajectory of a surgical tool, and comparing within the controller the expected tissue characteristic for the expected location with the sensed tissue characteristic for the actual trajectory.
  • the method may include displaying information on a display to an operator of the tool based on the compared expected tissue characteristic and sensed tissue characteristic.
  • the tool may be operated by an operator through manual operation of the tool.
  • the tool may be operated by the operator through the operator manually operating the robot for manipulating the tool.
  • the method may include sensing the tissue characteristic through the tissue characteristic sensor.
  • the method may include controlling a robot under control of the controller to manipulate the tool, which control is based on the compared expected tissue characteristic and sensed tissue characteristic.
  • The tissue characteristic sensor may be a force sensor, the expected tissue characteristic may be a force characteristic of expected tissue on the predefined trajectory, and the sensed tissue characteristic may be a sensed force characteristic on the actual trajectory of the tool.
  • FIG. 1 is a block diagram of an example surgical system according to an embodiment of an aspect of the present invention.
  • FIG. 2 is a block diagram of a further example surgical system according to an embodiment of an aspect of the present invention;
  • FIG. 3 is a block diagram of another example surgical system according to an embodiment of an aspect of the present invention, for pedicle screw installation;
  • FIGS. 4-7 are perspective views of an example embodiment of an aspect of the present invention in use to drill a pedicle screw hole in a vertebra;
  • FIG. 8 is a diagrammatic illustration of various example forces sensed in some example embodiments of aspects of the present invention;
  • FIGS. 9A-9B show fluoroscope images showing the target region and tool (FIG. 9A, lateral; FIG. 9B, A/P image);
  • FIGS. 10A-10B show a patient mounted localizer array (PLA) from a distance and close-up;
  • FIG. 11 shows an imager in use from above
  • FIG. 12 shows an imager in use from one side
  • FIG. 13 is a perspective view of a robotic system at a surgical site
  • FIG. 14 shows example start and end points identified on the fluoroscope images of FIGS. 9A-9B;
  • FIG. 15 is a system interface diagram
  • FIG. 16 illustrates a perspective view of an example manipulator arm of an example robot
  • FIGS. 17A and 17B illustrate a back view and a side view of the example manipulator arm of FIG. 16;
  • FIG. 18 is a diagram of registration and tool tracking
  • FIG. 19 is a diagram of an example system set up
  • FIG. 20 is a diagram of an example robotic system at a surgical site
  • FIG. 21 is a diagram of the robotic system of FIG. 20 at the surgical site with user input of trajectory points;
  • FIG. 22 is a diagram of localization of points using two fluoroscopic images
  • FIG. 23 is an example operation functional flow for a pedicle screw insertion
  • FIG. 24 is a block diagram of example system interfaces.
  • a surgical system 1 is for use with a surgical tool 3 and a tissue characteristic sensor 5 associated with the surgical tool 3.
  • the tool 3 and sensor 5 can be associated in many different ways.
  • the sensor 5 may be on or a part of the tool 3.
  • the sensor 5 and the tool 3 may be associated by tracking so that the relationship between the tool 3 and sensor 5 is known.
  • The sensor 5 may be part of a robot that manipulates the tool 3 such that the relationship between the tool 3 and the sensor 5 is known through the robot.
  • the system 1 stores in memory 6 an expected tissue characteristic 7 for tissue on a predefined trajectory of the tool 3.
  • Expected tissue characteristics for surgical tasks can be stored as models in the memory 6 for use by the surgical system.
  • a controller 11 receives a sensed tissue characteristic 13 from the tissue characteristic sensor 5.
  • the sensed tissue characteristic 13 is associated with an actual trajectory of the tool 3.
  • the controller 11 compares the expected tissue characteristic 7 for the expected location with the sensed tissue characteristic for the actual trajectory.
  • the predefined trajectory may be based on images as will be later discussed. Alternatively a surgeon may select a predefined trajectory through external viewing of a patient based on accumulated knowledge.
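The comparison performed by the controller 11 can be sketched as follows. This is a minimal illustration, not the patented control law: the depth-indexed force model, the tolerance value, and the function names are all assumptions made for the example.

```python
# Sketch of the controller's comparison of an expected tissue characteristic
# (stored per position along the predefined trajectory) against the sensed
# characteristic on the actual trajectory. All values are illustrative.

EXPECTED_FORCE = {  # expected resistive force (N) vs. depth (mm) on the trajectory
    0: 0.0,   # before tissue contact
    5: 8.0,   # cortical (hard) bone entry
    10: 3.0,  # cancellous (soft) inner bone
}

TOLERANCE = 2.0  # N; deviation beyond this suggests the actual trajectory differs


def nearest_expected(depth_mm):
    """Return the expected force for the model point closest to depth_mm."""
    key = min(EXPECTED_FORCE, key=lambda d: abs(d - depth_mm))
    return EXPECTED_FORCE[key]


def compare(depth_mm, sensed_force):
    """Compare sensed vs. expected force; return a status for display or robot control."""
    expected = nearest_expected(depth_mm)
    if abs(sensed_force - expected) > TOLERANCE:
        return "DEVIATION"      # tool likely off the predefined trajectory
    return "ON_TRAJECTORY"


print(compare(10, 3.5))  # soft bone at expected depth -> ON_TRAJECTORY
print(compare(10, 9.0))  # unexpectedly hard tissue -> DEVIATION
```

The same comparison result could drive the display 15 in the manually operated embodiments or the robot controller in the automated ones.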
  • a display 15 displays information to an operator 17 of the tool 3 based on the compared expected tissue characteristic and sensed tissue characteristic.
  • the tool 3 may be operated by an operator 17, for example a surgeon, through manual operation of the tool 3 with the operator of the tool 3 viewing the displayed information and manually operating the tool 3 accordingly.
  • Other interfaces such as audible or tactile interfaces can be used to feed back information to the operator about the compared expected tissue characteristic and the sensed tissue characteristic.
  • For example, a tactile increase in pressure to magnify the force on a handheld or robot-operated tool may be used to provide information to an operator.
  • a surgical system 20 is similar to system 1 and includes a robot 22 for manipulating the tool 3.
  • the tool 3 may take different forms on the different embodiments depending on how it is to be held and used.
  • A hand-held scalpel (as tool 3) may be different from a robot-held scalpel (as tool 3), as will be known to those skilled in the art.
  • the tool 3 is operated by the operator through the operator manually operating the robot.
  • the tissue characteristic sensor 5 may be supplied as part of the system 1 or may be provided separately.
  • the surgical tool 3 may be provided as part of the system 1 or may be provided separately.
  • a surgical system 30 is similar to the system 20.
  • the robot 22 is under control of the controller, which control is based on the compared expected tissue characteristic and sensed tissue characteristic.
  • the surgical system 30 can incorporate the following safety features:
  • Redundant sensors to facilitate signal cross checking.
  • Robot 22 joints in the examples described herein have two position sensors that are checked against one another to ensure the sensors are sending valid data. If these signals do not agree, an error is flagged and motion is halted.
  • No-go zones defined within the surgical system, for example by software executed thereon, to limit the available workspace for the surgical task; these can include a combination of user-defined and system-defined zones (such as avoiding known objects or targets).
  • Robot position feedback sensors are monitored against the commanded trajectory to ensure the robot is following the command within acceptable boundaries.
  • Tissue characteristic feedback that is typically tool specific, but can include sensors on the tools to detect specific types of tissue.
  • A deadman switch: an external stop switch for manual cancellation of the task at the operator's discretion.
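The redundant-sensor cross check in the list above can be sketched as a simple agreement test per joint. The joint readings, agreement limit, and function name are illustrative assumptions, not values from the patent.

```python
# Sketch of the redundant position-sensor cross check: each joint reports
# two independent angle readings; disagreement beyond a bound flags an
# error so motion can be halted. Threshold is an assumed value.

SENSOR_AGREEMENT_LIMIT = 0.5  # degrees; max allowed disagreement per joint


def check_joints(readings):
    """readings: list of (sensor_a_deg, sensor_b_deg) tuples, one per joint.
    Returns (ok, list of indices of joints whose sensors disagree)."""
    faults = [i for i, (a, b) in enumerate(readings)
              if abs(a - b) > SENSOR_AGREEMENT_LIMIT]
    return (not faults, faults)


ok, faults = check_joints([(10.0, 10.1), (45.0, 44.9), (90.0, 92.0)])
if not ok:
    print("HALT: sensor disagreement at joints", faults)  # joint 2 disagrees
```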
  • Referring to FIGS. 4-7, an example will be described utilizing an expected tissue characteristic and a sensed tissue characteristic for control of a surgical robot, such as for example robot 22 of system 30, in a pedicle screw hole drilling surgical task. It is to be understood that FIGS. 4-7 show the volume of the vertebra about the pedicle channel 44 in perspective view without cross-section; however, the trajectory for the surgical task is through the interior of the pedicle channel 44 in the interior of the vertebra. Accordingly, the drill bit 42 in the figures is proceeding through the interior of the vertebra and not above the surface.
  • the tissue characteristic sensor 5 utilized in this example is a force sensor 5, the expected tissue characteristic is a force characteristic of expected tissue on the predefined trajectory, and the sensed tissue characteristic is a sensed force characteristic on the actual trajectory of the tool 3.
  • Tissue characteristics capable of being sensed other than force characteristics are also suitable for use with the surgical system.
  • the system can utilize photonics and lasers to drill fine tracks in the bone or soft tissue, for example to implant strengthening rods or radioactive seeds.
  • Sensors can be included to sense tissue distortion, for example, measured radiologically or by use of photonics.
  • a difference in the compared expected tissue characteristic 7 and the sensed tissue characteristic 13 can be used by the surgical system 30 to control the motion of the robot 22.
  • a drill bit 42 is used to drill a pedicle screw hole (surrounding drill bit) through a pedicle channel 44 of a vertebra 45.
  • As the drill bit 42 proceeds along its pre-planned trajectory 46 to a destination 47, it encounters hard bone 48; once through the hard bone 48 it encounters softer inner bone 50, which the force sensor 5 senses as a resistive (anti-rotational) force F1 on the drill bit 42.
  • The sensor 5 can be a six-axis force sensor 5 utilizing strain gauges mounted on a flexure to measure the strains, and thus the forces, applied to the sensor 5.
  • The sensor 5 is placed in the mechanical load path, so the loads are transferred through the flexure, where the strains are measured. Examples of the six axes sensed are described below. Such sensors are commercially available.
  • the sensor 5 can be a current sensor for a drill tool 3 of the robot 22.
  • Current drawn by a drill tool 3 will be related to the resistive force on the drill bit 42. As the resistive force increases the current drawn will increase.
  • Other sensors 5 can be used, as an example can include pressure sensors 5. The type and location of the sensor 5 will depend upon the applicable tool, surgical task, and force to be sensed. Multiple sensors 5 may be used to derive a tissue characteristic from multiple tissue characteristics. Tissue characteristics may be sensed over time to derive a tissue characteristic.
  • The pedicle channel 44 narrows, and it is possible that the actual trajectory of the drill bit 42 will result in the drill bit 42 encountering hard bone 48 at a wall 52 of the pedicle channel 44. This results in the force sensor 5 sensing a resistive force F2 greater than F1.
  • The sensed forces F1, F2 are transmitted back to the surgical system controller 11 on an ongoing basis in real time, and the controller 11 continuously compares the sensed forces against the expected forces.
  • the controller 11 can stop the robot 22, or in more sophisticated applications the controller 11 can adjust the planned trajectory 46 to an adjusted trajectory 54 with a destination 55 to move away from the hard bone 48 and toward the soft bone 50 in the pedicle channel 44.
  • The six-axis sensor 5 mentioned previously can provide some direction information as to where the force is being exerted.
  • the surgical system can then adjust from the trajectory 46 to an adjusted trajectory 54 away from the force.
  • If the surgical system 30 does not know how to adjust the trajectory 46, it may have to pull back the drill bit 42 slightly and take an initial correction. If less force is encountered, then the surgical system 30 may continue on the adjusted trajectory 54 until further correction is required. If the same force is encountered, or a greater force is encountered at an earlier position, on the adjusted trajectory 54, then a readjusted trajectory can be attempted. Thus a desired adjusted trajectory can be iteratively obtained. Alternatively, a planned trajectory that favors a particular side of the pedicle channel 44 may be chosen; if the wall 52 of the pedicle channel 44 is encountered, then an initial correction can be made in a direction away from the side that was favored.
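The iterative pull-back-and-retry correction described above can be sketched as one decision per cycle. The force limit, step size, and action labels are assumptions for illustration, not the patented control law.

```python
# Sketch of iterative trajectory adjustment: on excessive force, correct
# laterally; keep correcting in the same direction while the force drops,
# and flip direction if it does not. All thresholds are assumed values.

FORCE_LIMIT = 6.0  # N; assumed threshold indicating hard-bone (wall) contact
STEP = 0.5         # mm of lateral correction per retry (assumed)


def adjust_trajectory(offset_mm, sensed_force, last_force):
    """One correction cycle: returns (new_offset_mm, action)."""
    if sensed_force <= FORCE_LIMIT:
        return offset_mm, "advance"  # force acceptable: continue on this trajectory
    if sensed_force < last_force:
        # still too hard but improving: keep correcting in the same direction
        return offset_mm + STEP, "retry-same-direction"
    # same or greater force than before: pull back and flip the correction
    return offset_mm - 2 * STEP, "retry-opposite"


print(adjust_trajectory(0.0, 8.0, 7.0))  # force grew: (-1.0, 'retry-opposite')
```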
  • Adjustment of a planned trajectory 46 based on sensed forces can be applied to many other surgical tasks, and tools.
  • Forces may be sensed in multiple degrees of freedom, for example along x, y and z axes.
  • The x and z axes may be considered orthogonal lateral forces 60, 62, while the y axis may be a longitudinal force 64 along the drill bit axis.
  • Three rotational forces 66, 68, 70 can include rotation about each of the x, y and z axes.
  • other coordinate systems may be used to define the forces being sensed.
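A six-axis reading of the kind described above can be represented as a wrench of three forces and three torques; the lateral components indicate the direction in which the bit is being pushed off-axis. The class and field names below are assumptions for the sketch.

```python
# Sketch of a six-axis force/torque reading: two orthogonal lateral forces
# (x, z), a longitudinal force along the drill axis (y), and a rotation
# component about each axis. Field names are illustrative assumptions.
import math
from dataclasses import dataclass


@dataclass
class Wrench:
    fx: float  # lateral force (N)
    fy: float  # longitudinal force along the drill bit axis (N)
    fz: float  # lateral force (N), orthogonal to fx
    mx: float  # torque about x (N*m)
    my: float  # torque about y, i.e. drilling resistance (N*m)
    mz: float  # torque about z (N*m)

    def lateral_magnitude(self):
        """Magnitude of the lateral force pushing the bit off-axis."""
        return math.hypot(self.fx, self.fz)

    def lateral_direction_deg(self):
        """Direction of the lateral force in the x-z plane (degrees)."""
        return math.degrees(math.atan2(self.fz, self.fx))


w = Wrench(fx=3.0, fy=12.0, fz=4.0, mx=0.0, my=0.8, mz=0.0)
print(w.lateral_magnitude())      # 5.0
print(w.lateral_direction_deg())  # about 53.13 degrees
```

The lateral direction is the kind of information the system could use to pick the adjusted trajectory 54 away from the sensed force.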
  • Encountered forces may be sensed as indications of tissues other than soft bone 50 and hard bone 48.
  • skin can present a different force characteristic from internal organs.
  • Membranes may present different force characteristics from the contents of the membranes.
  • Anticipated force characteristics that match sensed force characteristics can be used by the surgical system for automated control of the robot. For example, if a desired location is behind skin and two membranes, the sensed force can be used to count the punctures of the skin and the two membranes before an action is taken by the robot, such as acquiring a sample.
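The puncture-counting idea above can be sketched from the force signature: each layer shows as force building up and then abruptly dropping when the layer gives way. The thresholds and the sample trace are illustrative assumptions.

```python
# Sketch of counting skin/membrane punctures in a sampled force trace:
# a puncture is a build-up past a threshold followed by a sharp drop.
# Threshold values and the example trace are assumed for illustration.

PUNCTURE_FORCE = 4.0  # N; force that must build before a layer gives way
DROP_FRACTION = 0.5   # force must fall below half its peak to count a puncture


def count_punctures(force_trace):
    """Count build-and-release events (punctures) in a force trace."""
    punctures, peak = 0, 0.0
    for f in force_trace:
        peak = max(peak, f)
        if peak >= PUNCTURE_FORCE and f < DROP_FRACTION * peak:
            punctures += 1
            peak = 0.0  # re-arm for the next layer
    return punctures


# skin, then two membranes: three build-and-release events
trace = [1, 3, 5, 1, 2, 4.5, 1, 2, 5.5, 1]
print(count_punctures(trace))  # 3
```

When the count reaches the anticipated number of layers, the robot could take its action, such as acquiring the sample.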
  • Example interfaces of surgical systems with the OR and staff will be described.
  • The surgical systems can be implemented utilizing robots 22, such as a master-slave device modified to provide automated surgical procedures, using robotic capabilities for following a predefined series of surgical steps/sequences to produce a desired surgical outcome. It is recognized that specific embodiments of the robots 22 described herein are referenced only as examples upon which to implement the guidance and other functionality described herein. Other robots 22 and tools 3 may be used to carry out the functionality described herein. To enhance understanding of the principles described herein, example surgical tasks will be outlined and an example description provided for robots 22 functioning as a single (or multiple) armed, image-guided system in the OR.
  • Example tasks that can take advantage of tool guidance utilizing the principles described herein include pedicle screw hole drilling, needle insertion for the precision placement of medication such as spinal pain management, and biopsy, for example.
  • Other example tasks that can be performed by an automated robot can include direct surgeon-in-the-loop (for directing a robot to perform a sequence of predefined surgical steps) and multiple armed applications for microsurgical and laparoscopic tasks, for example. It is recognized that the predefined surgical steps can be planned outside of the OR, inside of the OR, or a combination thereof.
  • image guided capabilities can be added to a robot to accomplish automatic, image guided, drive-to-target applications.
  • Pedicle screw insertion is an example of such applications and the majority of the remainder of this description will describe example embodiments with respect to pedicle screw insertion.
  • Performance of defined surgical steps can be guided for example by images.
  • Images can be acquired using many well known techniques for surgical applications, such as fluoroscopic images, machine vision cameras, and other imaging techniques that produce images of a patient and surgical tools.
  • the system can have the capability of accepting suitable images in DICOM format so that the system can be used with a fluoroscope when available.
  • a CT/Fluoro imaging system may be used to provide 3D images.
  • USS (focused ultrasound scan) information in some procedures may reduce the radiation exposure levels experienced from CT/fluoro.
  • The location of interest is internal, and fluoroscopic, CT or MR images or other techniques are typically used for guidance information.
  • surgeons may be required to interpret the guidance information and use anatomical cues and navigational tricks.
  • Surgeons perform the procedure 'blind', i.e. relying on the hands-on surgical abilities of the surgeon, subject to surgical precision or other constraints.
  • some embodiments of the surgical system can reduce time spent verifying the initial position and orientation of the tool to gain confidence that a straight, forward trajectory will reach the desired destination.
  • Some embodiments of the surgical system can save precious time to verify anatomical tissue response and surgical precision issues during surgery.
  • some embodiments of the surgical system are particularly suitable to precise tool positioning at locations within the patient (as directed by image interpretation in view of the patient anatomy that is not directly visible to the surgeon).
  • Other applicable surgical tasks can include surgical instrumentation or intervention including biopsy, excision or tissue destruction using a variety of chemical or electro-mechanical or temperature sources.
  • Such tasks can be well suited to embodiments of the surgical system so that outcomes can be improved and surgical capabilities can be extended where they might otherwise be limited due to for example timing constraints, precision constraints, expertise/experience constraints.
  • Some embodiments of the surgical system can be used to perform certain surgical tasks within a larger surgical procedure.
  • Embodiments of the system can take a form to allow the robot to function like a fluoroscope, where the robot is rolled into the sterile field when it is needed for a particular task, and rolled out when it is finished.
  • the surgical system is directly linked to an imaging system, for example a CT/fluoro machine which is used as needed, or based on predetermined timings (as part of the predefined surgical tasks) to acquire data to allow the system to control the robot to carry out specific precise surgical tasks based on a pre-planned set of actions.
  • the surgical system uses trajectory-following and destination- selection capabilities of a robot to address discrepancies, 'close the loop', between the destination seen in the image and the actual destination within the patient, as well as to deal with any encountered (e.g. not predefined) obstacles/hindrances/considerations during performance of the predefined surgical task.
  • the surgeon is no longer performing a blind task, but rather is an intelligent connection between the information supplied by the image and the intended tool position defined in the physical world of the robot.
  • the surgeon is an intelligent connection in that the surgeon establishes the desired placement of the pedicle screws using the supplied image data. As surgical planning systems become more sophisticated it will be possible to interpret the image and determine from the image characteristics where the appropriate trajectory and destination. In current embodiments the surgeon performs this function.
  • Destination is the desired end point and trajectory is the direction to follow in reaching the end point.
  • a combination of the destination and trajectory provides a surgical path.
  • the trajectory may be implied from a beginning point and an end point.
  • a destination may be implied from a beginning point and a direction and a distance from the beginning point in the specified direction.
  • Other ways in which a trajectory and destination may be specified will be evident to those skilled in the art. It is to be understood that a requirement for a trajectory and a destination does not require the actual trajectory and destination to be supplied, but rather information from which the trajectory and destination could be derived.
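The two ways of implying a surgical path listed above can be sketched directly: a trajectory derived from a begin and end point, and a destination derived from a begin point plus a direction and distance. Plain 3-vectors are used; all values are illustrative.

```python
# Sketch of deriving trajectory/destination information for a surgical path.
import math


def trajectory_from_points(begin, end):
    """Unit direction and distance implied by a begin point and an end point."""
    d = [e - b for b, e in zip(begin, end)]
    dist = math.sqrt(sum(c * c for c in d))
    return [c / dist for c in d], dist


def destination_from_direction(begin, direction, distance):
    """End point implied by a begin point, a unit direction, and a distance."""
    return [b + distance * c for b, c in zip(begin, direction)]


direction, dist = trajectory_from_points([0, 0, 0], [0, 3, 4])
print(direction, dist)  # [0.0, 0.6, 0.8] 5.0
print(destination_from_direction([0, 0, 0], direction, dist))
```

Either representation carries the same information, which is the point made above: the system needs data from which the trajectory and destination can be derived, not the trajectory and destination themselves.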
  • The steps of a surgical task to be performed by a surgical system utilizing an automated robot can, in one example, be: 1. Take one or more images (imaging system)
  • a patient mounted localizer array within the images is registered with the system.
  • the robot is brought to the surgical field, and the patient localizer array is registered to the robot with the system.
  • Registration is a process by which coordinates and distances in the image are matched to coordinates of the robot. As is known in the art this can be done in many different ways.
  • A tool of the robot is displayed graphically on a monitor together with the image, so that a surgeon can select an initial position, trajectory and final destination of the tool using the fused image (it is recognized that this definition of the predefined task(s) - e.g.
  • the surgical system transforms the starting point, trajectory and destination defined in the image to the robot coordinates and is able to automatically control the robot to move the tool to the destination.
  • the precision of the movement is then dependent on the surgical system, including for example the mechanical design of the robot and the control precision, including any control software.
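The transformation from image coordinates to robot coordinates described above amounts to applying the rigid transform established during registration to the planned points. The sketch below uses a fixed in-plane rotation plus translation as an assumed example; in practice the transform would be solved from corresponding PLA fiducial positions, and all names here are hypothetical.

```python
# Sketch of applying a registration transform (rotation about z plus a
# translation) to map image-frame points into robot-frame points.
import math


def make_transform(yaw_deg, translation):
    """Build a simple rigid transform: rotate about z by yaw_deg, then translate."""
    t = math.radians(yaw_deg)
    c, s = math.cos(t), math.sin(t)

    def apply(p):
        x, y, z = p
        return (c * x - s * y + translation[0],
                s * x + c * y + translation[1],
                z + translation[2])

    return apply


# Assumed example transform from registration; not real calibration data.
image_to_robot = make_transform(90.0, (100.0, 50.0, 0.0))

start_img, dest_img = (10.0, 0.0, 5.0), (10.0, 0.0, 25.0)
print(image_to_robot(start_img))  # approximately (100.0, 60.0, 5.0)
print(image_to_robot(dest_img))
```

Once the start point, trajectory and destination are expressed in robot coordinates this way, the controller can command the motion directly.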
  • the task may be virtually rehearsed if desired to confirm that the performed motion is what the surgeon intended (e.g. follows the surgical path predefined by the surgeon in a manner that is suitable to the surgical task).
  • the surgical system provides interfaces to the surgeon to select the robotic motion, continually monitor the progress via the fused image, and have the ability to halt or modify motion of the robot at any time during performance of the surgical task(s).
  • Embodiments of the surgical system also provide an interface to allow the surgeon to input safety parameters which allow the surgical system to function within specified safety zones, such as for example anatomical barriers, force tension barriers (an example of force feedback based on encountered tissue characteristics), and/or electromechanical recordings.
  • the surgical system 30 is configured to be stowed in the OR away from the sterile field until it is needed to effectively perform a given surgical task. Any patient preparation, dissection or exposure may be performed first by the surgeon in a traditional fashion.
  • the robot 22 is bagged and rolled into the sterile field when it is time to perform the surgical task.
  • the robot is configured for quick deployment by a nurse. For example, in the case of the pedicle screw drilling task, the robot is deployed after the spine is exposed and it is time to drill the holes.
  • Mount the patient localizer array to the patient in a location where it will be visible in the two images (FIGS. 9A, 9B) to be taken in the subsequent steps. (FIGS. 10A, 10B)
  • the robotic workstation will receive the data from the imager, register the patient localizer array (PLA) visible in the images.
  • the surgical system can be brought into the surgical field at this point, if it is not there already. Not all aspects of the surgical system are required to be in the surgical field, only those to be accessed by the surgeon or other OR personnel in the surgical field and those that are necessarily required to be in the surgical field to perform their allotted tasks. The following example steps are anticipated for the preparation of the surgical system:
  • a tracking system will localize the patient mounted localizer array (PLA) and a robot end effector.
  • Another aspect of the robot, or a device localized to the robot, can be used to localize the patient and to track the robot, as will be evident to those skilled in the art.
  • a representation of an Aurora Tracking System from NDI is shown in FIG. 13, along with the volume over which tools can be tracked (transparent volume in FIG. 13). From knowledge of these positions, the tool position can now be overlaid onto the images. Desired motions of the robotic system can now be programmed. To do this, the operator will:
  • System moves robot to defined start position. Now, the tool can be automatically translated along the programmed trajectory via a hand controller deflection or a single automove command.
  • the tracking system will monitor the positions of the array and the robot end effector to update the tool overlay and verify the trajectory in real time.
  • the PLA which is visible in the fluoroscope images and also tracked by the tracking system in 3D space, provides the link between the patient location and the images used to guide the surgery.
  • the PLA also allows for tracking of patient motion during the surgical task. Employing image guidance alone assumes that the anatomical target within the patient has a fixed relationship to the PLA from the point where it is attached to the patient and images are taken, until the conclusion of the image guided portion of the operation.
  • the selected points are transformed by the surgical system under control of robotic control software into tool positional coordinates, which, when commanded to start by the surgeon, will be automatically followed by the robot. Possible limitations to the accuracy include for example the robot mechanical design and the quality of the image.
  • the surgical system provides a display for the surgeon to monitor progress as tool motion is updated in real time and provides the surgeon with the ability to stop motion at any time. Until the surgeon intervenes the tool and surgical task are operating under the control of the surgical system.
  • a second set of images can be taken for tool position verification. If the task is successful, the tool is removed from the surgical site with an automatic reverse motion. If the destination is not correct, a second trajectory and destination can be selected in the same way as the first trajectory was selected to adjust the tool position. Further holes can be drilled, or tissue samples obtained in the same manner.
  • the robotic task is complete, the tool is removed from the robot. The robot is disconnected, rolled out and de-bagged.
  • the surgical system must be compatible with standard procedures and processes of typical Operating Rooms (OR). The most important would be to maintain the sterility of the surgical field.
  • Example methods include:
  • Example interfaces between elements of the surgical system, external systems and an operator are:
  • Example surgical system states and modes are summarized in Table 1.
  • Table 1 System States
  • Example system functional capabilities include:
  • telesurgery potential such that the surgical system is teleoperable: it can be controlled and/or planned at the patient side or by a surgeon/interventionalist from a remote networked location; for example, network communications with the surgical system can be based on Web-based control software
  • a surgical system in accordance with one or more of the embodiments described herein can utilize incisions sufficient for entry of the tool only, thus resulting in a reduction of incision size over procedures that require the interaction of the surgeon during operation of the predefined task. This can include a reduced need to accommodate the ergonomic considerations of having direct interaction between the surgeon and the patient during operation of the surgical system.
  • Software for example operating on a programmed controller, such as a computer, of the surgical system can facilitate implementation of the above-described capabilities, as performed by the surgical hardware of the surgical system.
  • the surgical system can be packaged to quickly roll-in and roll-out, anchor and fasten to the operating table, and connect with utilities such as power, data, and video.
  • a further example configuration may incorporate a base of the robot into an operating table such that the robotic hardware components (e.g. shoulders) are attached directly to the table periphery, at locations based on the surgery/procedure to be performed.
  • the robotic hardware components of the robot can move along the length of bed to be positioned in selected position(s) for imaging and/or performance of a predefined surgical task.
  • the robot can also be designed to seamlessly connect to a selected standard imager or group of imagers.
  • the surgical system is broken into three major physical components, the arm(s) (e.g. surgical hardware) and associated base, the control electronics cabinet, and the workstation (e.g. containing the surgical system controller) and displays.
  • Each component can be mounted on a base with wheels that can be moved by a nurse, however, the electronics cabinet may be shared among OR's and therefore may be mounted at a single central location, for example with cables routed to each OR.
  • the robotic arm is mounted to a base that contains features that permit attachment to the operating table and anchoring, or stabilizing to the floor.
  • the volume of this component can be minimized at the operating table to allow access and space for attending surgeons and nurses.
  • Tools can be manually or automatically attached to and detached from the robotic arm by a nurse, as well as automatically recognized by the robotic controller as to the configuration of the coupled tools and arms. This is sometimes referred to in computer applications as plug and play capability.
  • the workstation component contains a large display, a hand controller for arm motion commands under direct surgeon control, a trajectory and destination selection device (such as a mouse and keyboard) and additional monitors for video and data displays.
  • IGAR can have three displays: one to display the CT/fluoro or USS imaging obtained; one to show the superimposed imaging and the surgical anatomy obtained from an outside camera (fused image), showing the tracking markers to visually assure the surgeon that the system is operating correctly; and a third to show the preplanned action steps and the next action the robot is going to take. Further displays can show other parameters such as robotic operational parameters (e.g. force sensing at the tip) and patient parameters (e.g. temperature or pulse, etc.).
  • the surgical system under automated control is not a master-slave type setup (where all movements of the surgical hardware are under direct manipulation control of the surgeon); rather, the surgical system allows for issuance of a command that causes the predefined surgical task to be automated as it is performed under the control of the surgical system and under supervision (rather than direct manipulation) of the surgeon. It is also recognized that in appropriate situations (e.g. under emergency conditions or at preplanned surgeon hands-on interaction points) the surgeon can take control of the surgical system and perform the predefined surgical task and/or other surgical tasks manually, as desired.
  • the surgical robot can have a stop button or other interface which allows the surgeon to halt the performance of the predefined surgical task, and a clutch system for the surgeon to enable and disable the robotic arm for manual use with the aid of the hand controller.
  • a similar setup can be used in the planning mode to allow the surgeon to plan the set of movements and correct trajectory for robotic action.
  • the robot could be trained to learn the surgical task through interpreting the actual movements of the robotic hardware via the surgeon, when the surgical system is in the master-slave mode.
  • the controller of the surgical system can be used to create the definitions for the predefined surgical task through monitoring and processing of the movements recorded in the master-slave mode.
  • the master-slave mode could be used in the planning stage to help with the programming of the surgical system controller to create the definitions for the predefined surgical task.
  • the robotic arm for this system can be especially suited to automated microsurgical robotic tasks.
  • the system as shown has a single arm which may be controlled telerobotically by a master hand controller for issuing commands to the robot to start the predefined surgical task(s).
  • A robot manipulator having a configuration other than the robotic arm illustrated herein may be used within the surgical system.
  • the image guided capability (as coordinated by the programmable surgical system controller) enables the surgical robotic hardware to perform precise automated surgical tasks, according to implementation of a sequence of pre-defined steps for a desired surgical result, for example automated movement from a defined start point to a defined end point.
  • the imaging of the target site can be done with machine vision cameras. These can provide the images for the operator to register the tool in the image and select/predefine the trajectory for the robot to follow.
  • the target sample is shown as a spine model representing a patient.
  • a surgical system controller 1501 is a robot control obtaining input 1503 from sensors at robot 1505, the surgeon 1507 via a hand controller 1509 or other computer interface suitable for initiating or halting the performance of the predefined surgical tasks, and position feedback determined from interpretation of digital images by a combination of tracking system information 1511 with imager data 1513 as performed by an image processing task space command module 1515.
  • the image processing task space command module 1515 could also be part of the robot control, as desired.
  • different functions of the robot control could be distributed throughout surgical system 1517. For example, additional intelligence could be built directly into the robot 1505 itself.
  • a surgical system controller 1501 can be integrated (for example on a single computer) alone or together with other components, such as the robot or image processor, and the functions of the surgical system controller 1501 can be distributed within the surgical system 1517.
  • an operating bed or table 1519 can be associated with a robot with up to, for example, eight flexible robotic arms or manipulators in an operating room (OR) under control of the surgical system.
  • Each of the arms can be releasably secured to a respective base station which can travel along a track system positioned on the perimeter of the table.
  • the base can be securely mounted to the track system, such that the base can be remotely controlled by the surgical system controllers to reposition the surgical hardware at various locations with respect to the anatomy of the patient on the table.
  • the relative position and orientation of the surgical system hardware is monitored by the surgical system controller with respect to a common reference coordinate system, such as for example a room coordinate system, table coordinate system, patient coordinate system where patient position trackers are used.
  • the arms can have six degrees of freedom and can enable robotic surgery (as supervised by the surgeon) in cooperation with real time radiological evaluations by either, for example, CT, MRI or fluoroscopy imaging apparatus.
  • the selectable position capability of the base stations with respect to the table can add another motion degree-of-freedom to each arm that can be used by the surgical system controller to increase the workspace of the arm and/or maintain the distal arm position/orientation while moving the arm out of the way of other arms or another OR device, such as for example a fluoroscopic imager.
  • Sensors of the surgical system hardware provide position/orientation information of the base, arms, tool-tips as feedback to the surgical system controller, so as to help guide the surgical system hardware in view of the interpreted images during performance of the surgical task(s).
  • the position tracking devices enable the surgical system to adjust to slight, for example micro, movements of the patient during the performance of the surgical task. Micro movements may be, for example, small patient motions (breathing, for instance), as opposed to gross motions, like standing up or rolling over.
  • the surgeon can determine the range of patient movement acceptable beyond which the system has to re-register its tool position in relation to predetermined landmark using a combination of tracking markers and CT/fluoro or USS imaging of internal organ landmarks, for example.
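As a sketch of this check, the controller might compare the tracked PLA position against its position at registration time and flag when the surgeon-set limit is exceeded. The 2 mm default and the function interface below are illustrative assumptions, not values from the described system.

```python
import numpy as np

def check_patient_motion(pla_pose_ref, pla_pose_now, max_drift_mm=2.0):
    """Compare the current PLA position against the position recorded at
    registration time; signal a re-registration when drift exceeds the
    surgeon-set limit (max_drift_mm is an assumed default)."""
    drift = float(np.linalg.norm(np.asarray(pla_pose_now, dtype=float)
                                 - np.asarray(pla_pose_ref, dtype=float)))
    if drift > max_drift_mm:
        return ("re-register", drift)   # patient moved too far: re-register landmarks
    return ("ok", drift)                # within the acceptable range

status, drift = check_patient_motion([0, 0, 0], [0.5, 0.3, 0.1])
assert status == "ok"
status, drift = check_patient_motion([0, 0, 0], [3.0, 0.0, 0.0])
assert status == "re-register"
```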
  • Position sensors can also provide data to the controller to facilitate automatic potential collision detection and avoidance between arms/tools, as well as to help in avoiding predefined no-go zones with respect to patient anatomy.
  • the surgical system controller includes a data signal module for receiving/transmitting data to and from the arm, such as for example camera signals or position sensor signals, and a control signal module for transmitting control signals to actuated components of the arms, such as motors and camera operation, in performance of the predefined task.
  • the control signal module also receives feedback signals from the actuated components of the arm, such as from force sensors.
  • Such force sensors can for example sense resistive force, such as anti-rotational resistance, being encountered by a drill bit as it moves through tissue in the body. Encountered forces can be compared against anticipated forces by the surgical system controller. Where there is a difference between the anticipated force and the encountered force then the surgical system controller can control the robot accordingly. For example, the robot can be stopped and an indication provided to the surgeon of the unexpected condition.
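A minimal sketch of the comparison described above might look like the following; the tolerance value, units and return convention are illustrative assumptions rather than parameters of the described system.

```python
# Assumed acceptable deviation between anticipated and sensed force, in newtons.
FORCE_TOLERANCE_N = 2.0

def check_drill_force(anticipated_n, encountered_n, tolerance_n=FORCE_TOLERANCE_N):
    """Compare the force the controller anticipates for the current tissue
    against the force actually sensed at the tool, and return an action
    for the robot controller."""
    deviation = abs(encountered_n - anticipated_n)
    if deviation > tolerance_n:
        # Unexpected resistance (or lack of it): stop and alert the surgeon.
        return ("stop", f"force deviation {deviation:.1f} N exceeds {tolerance_n} N")
    return ("continue", None)

action, note = check_drill_force(anticipated_n=5.0, encountered_n=9.5)
assert action == "stop"
action, note = check_drill_force(anticipated_n=5.0, encountered_n=5.8)
assert action == "continue"
```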
  • the surgical system controller is also coupled to a command module for receiving/confirming commands issued by the surgeon to initiate/halt the performance of the predefined surgical task.
  • the command module can also be used to provide feedback to the surgeon in terms of the progress of the surgical task, as well as to request for direction when parameters are encountered that are outside of the definitions of the predefined task, for example the occurrence or predicted occurrence of a bone fracture that was not anticipated in performance of the surgical task.
  • the types of arms that are part of the surgical system hardware can be changed to suit the type of surgical procedure such as but not limited to laparoscopic, orthopaedic, trauma, and microsurgery including neurosurgery and minimal access cardiac.
  • the physical form/abilities and/or communications capability (with the controller) for each arm can be different as suits the intended surgical procedure for each specific arm/tool combination.
  • the surgical system can be configured with a common base for each category of procedures and the forearm hardware can be changed depending on the specific task to be performed. It is possible that in a single operation (e.g. including one or more surgical tasks), a number of different forearms may be needed to complete the whole operation.
  • a base and forearm capable of holding a drill and exerting the right amount of force may be used, whereas for a pain delivery or biopsy task a much smaller, thinner and more radio-opaque forearm may be used.
  • the arms and corresponding base stations preferably provide access to all parts of the patient in a single surgical procedure (i.e. predefined surgical task) as monitored by the surgeon, depending upon the particular selection of combined arms, instruments, base stations and their location with respect to the table.
  • This combination can be used to provide a dynamically configurable surgical system suited to the planned surgical procedure on the patient. Configuration of the surgical system (either automatic, semi-automatic, and/or manual) can be facilitated by a configuration manager of the controller. Further, it is recognised that each arm has a proximal end that is coupled to the base station and a distal end for holding the surgical instruments.
  • the arms can be articulated multi-segmented manipulators and that the base stations can be positioned independently of one another with respect to the table (e.g. one or more arms can be attached to one or more base stations). Further, articulation of each of the arms can be done independently through assigned control modules of the surgical system controllers. Various portions of the arms and the base stations are tracked for position and/or orientation in the coordinate system, as reported to the surgical system controller.
  • an example robot has a base 1600 and a manipulator arm.
  • the manipulator arm as shown has a plurality of segments: shoulder made up of a shoulder roll 1601 and shoulder pitch 1603, upper arm 1605, forearm 1609, wrist 1611 and an end-effector 1613.
  • the segments are connected to form joints. Some joints have limited degrees of freedom to rotate about a single axis or multiple axes depending on the function of the segments as implied by the names used above.
  • the end effector 1613 provides an interface between the arm and any tools with a suitable corresponding interface.
  • the end effector 1613 allows for manipulation of the tool, such as rotation or actuation of a tool function. It may also contain an electrical interface to connect to any sensors on the tool, actuate any electrical devices on the tool or identify the tool.
  • example dimensions (length by width by height, in millimetres) for the robot illustrated in FIGS. 16, 17A and 17B are: base 1600: 133 x (variable) x 106; shoulder 1603: 62 x 108 x 113; upper arm 1605: 60 x 60 x 210; forearm 1609: 46 x 46 x 171; wrist 1611: 73 x 73 x 47; and end effector 1613: 45 x 45 x 118.
  • the surgical system can recognize what forearm is attached, thus adapting its maneuverability and functionality to the series of tasks which can be achieved with the specific forearm.
  • the system can be adapted to automated tool change by disconnection and connection of tools with the end effector to complete a set of surgical tasks in sequence that require different tools.
  • the position of the patient is monitored during the operation so that motion of the patient can be identified.
  • a patient mounted localizer array (PLA) is used as mentioned previously.
  • This provides a reference frame to locate features in the fluoroscope images.
  • the same feature is located in two different (non co-planar) images to locate these points in 3D space relative to the PLA.
  • the robot is located relative to the PLA via an external Tracking System. This locates the PLA and the robot end effector (via embedded targets). The relative position of these features allows the robot held tool to be overlaid on the fluoroscope images, and the robot position to be guided by operator inputs on the images.
  • An example registration process can involve:
  • the PLA is used to link the positions of features in the images relative to the robot end effector.
  • the robotic system does not need to be in place when the images are taken.
  • the PLA needs to be in place and cannot move, in order to guide the robotic system via the acquired images.
  • a hybrid system could be employed, where the patient mounted localizer array is also visible in the imager. This target provides the link between image space and real world space. This direct registration can eliminate the imager specific calibration required (Tec) by the 'world tracker' approach.
  • Calibration is performed of the patient target in the image (Tti).
  • the image of the target and knowledge of the target geometry is used to calculate the imager position, which is used for 3D navigation.
  • Position of the patient mounted localizer array is monitored by the tracking system during surgery to warn against patient motion and update the tool overlay. This information can also be used to move the robot to cancel relative motion between the patient and end effector.
  • the robot need not be present during imaging.
  • the patient mounted target is mounted and is kept stable relative to the patient once imaging has occurred.
  • the patient localizer array is kept in imager field of view for both images.
  • Tpla Calibration of imager specific targets and tracking system targets.
  • Tet Robot localizer array to tip position.
  • Tbe Robot kinematics. Used to determine joint motions from desired end effector position and user commanded delta.
  • Tpr Relative position of patient mounted frame and robot end effector. Used in combination with Tpc to overlay tool position.
  • Tpil, Tpi2 Transformation of coordinates from image space to patient localizer target frame.
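Transformations such as those listed above can be represented as 4x4 homogeneous matrices and composed by matrix multiplication. The sketch below borrows two of the names from the table but uses made-up identity rotations and offsets purely for illustration; the actual calibration values would come from the tracking system.

```python
import numpy as np

def make_T(R, t):
    """Build a 4x4 homogeneous transform from rotation R (3x3) and translation t (3,)."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

# Illustrative chain: tool tip in PLA frame =
#   Tpr (PLA <- end effector) composed with Tet (end effector <- tip).
Tet = make_T(np.eye(3), np.array([0.0, 0.0, 120.0]))   # assumed: tip 120 mm along tool axis
Tpr = make_T(np.eye(3), np.array([50.0, 0.0, 0.0]))    # assumed: end effector offset in PLA frame
tip_in_pla = Tpr @ Tet @ np.array([0.0, 0.0, 0.0, 1.0])
assert np.allclose(tip_in_pla[:3], [50.0, 0.0, 120.0])
```

Chaining transforms this way is what allows the robot-held tool to be overlaid on the images: each link in the chain (imager to PLA, PLA to end effector, end effector to tip) contributes one matrix.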
  • the patient mounted localizer array (PLA) is attached to the patient.
  • the imager (fluoroscope or demo camera)
  • the PLA position is located in each image by the system.
  • the camera positions are determined, allowing for localization of the PLA in space.
  • the imager can be removed from the area.
  • the robotic system is brought to the surgical site, along with a tracking system that will localize the PLA and the robotic system.
  • the robotic system can then be guided, relative to the PLA, to sites identified by the operator in the images.
  • since the fluoroscopic imager produces an image that is a projection of objects that are between the head and the imager sensor, a point that is selected in one image represents a line of possible points in 3D space.
  • the purpose of the second image is to locate the position of the point of interest along the line.
  • the point selected in Image 1 represents a locus of possible points represented by the red line.
  • Selecting a point in Image 2 also represents a locus of possible points represented by the green line. The intersection of these lines represents the desired point.
  • the range of possible points in the second image can be limited to possible valid points (along the diagonal line extending from the centre of image 1 to imager position 1).
  • the relative positions of the imager need to be known when image 1 and image 2 are taken. This can be calculated based on the registration of the PLA in these images.
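Once the relative imager positions are known, locating the 3D point from the two selected image points reduces to intersecting (or nearly intersecting) the two back-projected rays. A standard closest-point-between-lines computation such as the following could be used; this is a sketch of the geometry, not the system's actual algorithm.

```python
import numpy as np

def triangulate(p1, d1, p2, d2):
    """Return the midpoint of the shortest segment between two 3D lines,
    each given by a point p and a direction d. With ideal geometry the
    rays intersect and the midpoint is the intersection itself."""
    d1 = d1 / np.linalg.norm(d1)
    d2 = d2 / np.linalg.norm(d2)
    # Solve for parameters s, t minimizing |(p1 + s*d1) - (p2 + t*d2)|.
    w0 = p1 - p2
    a, b, c = d1 @ d1, d1 @ d2, d2 @ d2
    d, e = d1 @ w0, d2 @ w0
    denom = a * c - b * b           # zero only if the lines are parallel
    s = (b * e - c * d) / denom
    t = (a * e - b * d) / denom
    return ((p1 + s * d1) + (p2 + t * d2)) / 2.0

# Two rays that intersect at (1, 1, 0):
pt = triangulate(np.array([0.0, 1.0, 0.0]), np.array([1.0, 0.0, 0.0]),
                 np.array([1.0, 0.0, 0.0]), np.array([0.0, 1.0, 0.0]))
assert np.allclose(pt, [1.0, 1.0, 0.0])
```

In practice the two rays never intersect exactly (image noise, calibration error), which is why the midpoint of the shortest connecting segment is used rather than an exact intersection.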
  • FIG. 23 an example functional flow of the system is illustrated in block form. Additional detail of selected example steps is given in the following sections.
  • each joint is required to find a home position in order for the system to understand its pose. This operation is done away from the surgical field as part of the preparation procedures.
  • the system can be draped at the same time.
  • Absolute position encoders could be utilized, if desired.
  • Trajectory planning is performed in the example described by the operator via the workstation interface.
  • a start and end point are defined, along with any desired way points via an input device, such as a mouse or keyboard, on the acquired images.
  • the motion of the robotic system can be simulated on the screen before the system is commanded to move so that the user can verify the intended motion of the system.
  • the robotic system can advance the tool along the planned trajectory in two different modes: Master/Slave - the operator can control the position of the tool along the defined trajectory; or Automove - the operator can select a speed at which the tool will be moved automatically along the defined trajectory from a start position to a defined destination. This may include a limited number of way points, if desired.
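An Automove-style advance along the planned trajectory could be sketched as constant-speed piecewise-linear interpolation through the start point, any way points and the destination. The function name, units and tick interval below are illustrative assumptions.

```python
import numpy as np

def automove_waypoints(waypoints, speed_mm_s, dt_s=0.01):
    """Yield successive tool positions stepping through a piecewise-linear
    trajectory (start point, optional way points, destination) at an
    approximately constant speed; dt_s is the assumed control tick."""
    step = speed_mm_s * dt_s                     # distance covered per tick
    for a, b in zip(waypoints[:-1], waypoints[1:]):
        a, b = np.asarray(a, dtype=float), np.asarray(b, dtype=float)
        length = np.linalg.norm(b - a)
        n = max(1, int(np.ceil(length / step)))  # ticks needed for this segment
        for i in range(1, n + 1):
            yield a + (b - a) * (i / n)          # always lands exactly on b

path = list(automove_waypoints([[0, 0, 0], [0, 0, 10], [5, 0, 10]], speed_mm_s=100.0))
assert np.allclose(path[-1], [5, 0, 10])         # trajectory ends at the destination
```

In the Master/Slave mode described above, the operator's hand controller deflection would instead select the position along this same precomputed path, rather than time driving it.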
  • In FIG. 24, shown is a further example embodiment of a surgical system utilizing a computer 314 that has a control module, which computer and control module act together as controller 300 for controlling a robotic system 112.
  • the computer 314 includes a network connection interface 301, such as a wireless transceiver or a wired network interface card or a modem, coupled via connection 318 to a device infrastructure 304.
  • the connection interface 300 is connectable during operation of the surgical system.
  • the interface 300 supports the transmission of data/signaling in messages between the computer 314 and the robotic system 112.
  • the computer 314 also has a user interface 302, coupled to the device infrastructure 304 by connection 322, to interact with an operator (e.g. surgeon).
  • the user interface 302 includes one or more user input devices such as but not limited to a QWERTY keyboard, a keypad, a track wheel, a stylus, a mouse, a microphone and the user output device such as an LCD screen display and/or a speaker. If the screen is touch sensitive, the display can also be used as the user input device as controlled by the device infrastructure 304.
  • the user interface 302 is employed by the operator of the computer 314 (e.g. work station) to coordinate messages for control of the robotic system 112.
  • the device infrastructure 304 includes a computer processor 308 and the associated memory module 316.
  • the computer processor 308 manipulates the operation of the network interface 300 and the user interface 302 by executing related instructions, which are provided by an operating system and a control module embodied in software located, for example, in the memory module 316.
  • the network interface 300 could simply be a direct interface 300 to the robotic system 112 such that commands could be issued directly to the robotic system 112 without requiring the commands to go through a network.
  • the device infrastructure 304 can include a computer readable storage medium 312 coupled to the processor 308 for providing instructions to the processor and/or to load/update the control module in the memory module 316.
  • the computer readable medium 312 can include hardware and/or software such as, by way of example only, magnetic disks, magnetic tape, optically readable medium such as CD/DVD ROMS, and memory cards.
  • the computer readable medium 312 may take the form of a small disk, floppy diskette, cassette, hard disk drive, solid-state memory card, or RAM provided in the memory module 310. It should be noted that the above listed example computer readable mediums 312 can be used either alone or in combination.
  • the control module could be installed and executed on computer 314, which could have various managers 202,204,208,210,212 installed and in communication with one another, the robotic system 112 and/or the surgeon.
  • the control module uses the user interface 302 for providing operator input to the robotic system 112 via the performance of the surgical tasks as facilitated by associated managers/modules 202,204,208,210,212,216 which could be for example configuration, communication, command, image interpretation, and other modules, as desired, to facilitate the performance of the predefined surgical task.
  • a communication manager provides for communication of data signals to/from the data manager and communication of control signals to/from a control manager.
  • the database manager provides for such as but not limited to persistence and access of image data to/from an image database, data related to the functioning/set-up of various elements of the robotic system 112, for example arms, base station, actuators, and various position/orientation sensor data, and for providing data as needed to a position and orientation manager.
  • a control manager in cooperation with the control module and position/orientation information, provides for monitoring the operation of the arms, base stations, actuators, imaging equipment (for example a camera), and tools.
  • the position/orientation manager is responsible for such as but not limited to receiving sensor data from the data manager for calculating the position and orientation of the respective arm components, tools, base stations, patient, and tabletop.
  • the calculated position/orientation information is made available to such as but not limited to the performance progress of the predefined surgical task(s), the display manager, and the control manager.
  • the configuration manager provides for such as but not limited to dynamic configuration of selected arms, base stations, the controller 300 (for example programming of parameters used to define the predefined task), and a tabletop comprising the desired robotic system 112 setup for a particular surgical procedure.
  • the dynamic configuration can be automatic, semi-automatic, and/or via manual operator intervention.
  • the display manager of the computer 314 coordinates/renders the calculated position/orientation information and the patient/tool images on the display of the user interface 302, for monitoring by the operator. For automated operation of the robotic system 112, surgical information displayed on the display (e.g. by controller 200) can include: pre-programmed activity of the planned surgery (i.e. surgical steps and required arms and instruments combinations); preprogrammed safety protocols for controlling the arms in the surgical environment; and the necessary instruments for the surgery as well as instruments suitable for selected arm types, as facilitated by the configuration manager.
  • the controller 300 can be programmed (using the predefined surgical task) to inhibit movement of the arms and associated instruments into predefined no-go zones with respect to internal regions of the patient and external regions of the OR.
  • the controller 300 can facilitate control of the arms and base stations to perform a variety of robotic surgeries, for example in neurology, orthopaedic surgery, general surgery, urology, cardiovascular surgery, and plastic surgery.
  • the controller 300 can also facilitate tele-robotic surgery performed by the surgeon from a remote location.
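The no-go-zone behaviour described in the bullets above can be sketched as a simple containment check before a commanded move is executed. This is an illustrative sketch only: the zone shape (axis-aligned box), and the names `NoGoZone` and `is_move_allowed`, are assumptions for illustration and are not prescribed by the patent.

```python
# Illustrative sketch of no-go-zone enforcement: inhibit any commanded
# move whose target lies inside a predefined zone. Zone geometry and all
# names are assumptions, not the patented implementation.
from dataclasses import dataclass

@dataclass
class NoGoZone:
    """Axis-aligned box (in some fixed frame) the tool tip must not enter."""
    min_corner: tuple
    max_corner: tuple

    def contains(self, point):
        # True when every coordinate of the point lies within the box bounds.
        return all(lo <= p <= hi
                   for p, lo, hi in zip(point, self.min_corner, self.max_corner))

def is_move_allowed(target, zones):
    """Return True only if the target position lies outside every no-go zone."""
    return not any(zone.contains(target) for zone in zones)

zones = [NoGoZone((0.0, 0.0, 0.0), (1.0, 1.0, 1.0))]
print(is_move_allowed((0.5, 0.5, 0.5), zones))  # inside the zone -> False
print(is_move_allowed((2.0, 0.0, 0.0), zones))  # outside all zones -> True
```

In a real controller the same check would run against the continuously updated position/orientation information rather than a single commanded target.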

Landscapes

  • Health & Medical Sciences (AREA)
  • Surgery (AREA)
  • Engineering & Computer Science (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Biomedical Technology (AREA)
  • Robotics (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Medical Informatics (AREA)
  • Molecular Biology (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Manipulator (AREA)

Abstract

A surgical system is provided for use with a surgical tool and a tissue characteristic sensor associated with the tool. The system has an expected tissue characteristic for tissue on a predefined trajectory of the tool within a patient, and a controller for receiving a sensed tissue characteristic from the tissue characteristic sensor, such sensed tissue characteristic being associated with an actual trajectory of the tool. The controller compares the expected tissue characteristic for the planned location with the sensed tissue characteristic for the actual trajectory. A robot may be used to perform automated surgical tasks and to make adjustments based on differences between the expected and sensed characteristics.
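The feedback loop in the abstract — compare an expected tissue characteristic along the planned trajectory against the sensed characteristic along the actual trajectory, and adjust when they diverge — can be sketched as follows. The characteristic values (e.g. a scalar such as tissue density), the tolerance, and all function names are illustrative assumptions, not the claimed implementation.

```python
# Illustrative sketch of expected-vs-sensed tissue characteristic
# comparison. The scalar characteristic, the tolerance value, and the
# function names are assumptions for illustration only.

def compare_tissue_characteristic(expected, sensed, tolerance=0.1):
    """Return the absolute deviation and whether it is within tolerance."""
    deviation = abs(expected - sensed)
    return deviation, deviation <= tolerance

def guidance_action(expected, sensed, tolerance=0.1):
    """Continue the automated task, or flag an adjustment, per the deviation."""
    _, within = compare_tissue_characteristic(expected, sensed, tolerance)
    return "continue" if within else "adjust"

print(guidance_action(expected=1.00, sensed=1.05))  # within tolerance -> continue
print(guidance_action(expected=1.00, sensed=1.60))  # large deviation -> adjust
```

A robot executing an automated task would evaluate such a check at each step, so that a mismatch between planned and actual tissue triggers a trajectory adjustment rather than blind continuation.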
PCT/CA2009/000076 2008-01-25 2009-01-23 Surgical guidance utilizing tissue feedback WO2009092164A1 (fr)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CA2712607A CA2712607A1 (fr) 2008-01-25 2009-01-23 Surgical guidance utilizing tissue feedback
US12/842,462 US20110015649A1 (en) 2008-01-25 2010-07-23 Surgical Guidance Utilizing Tissue Feedback

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US665508P 2008-01-25 2008-01-25
US61/006,655 2008-01-25

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US12/842,462 Continuation US20110015649A1 (en) 2008-01-25 2010-07-23 Surgical Guidance Utilizing Tissue Feedback

Publications (1)

Publication Number Publication Date
WO2009092164A1 true WO2009092164A1 (fr) 2009-07-30

Family

ID=40900748

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CA2009/000076 WO2009092164A1 (fr) 2008-01-25 2009-01-23 Surgical guidance utilizing tissue feedback

Country Status (3)

Country Link
US (1) US20110015649A1 (fr)
CA (1) CA2712607A1 (fr)
WO (1) WO2009092164A1 (fr)

Cited By (39)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8043229B2 (en) 2007-06-29 2011-10-25 Actuated Medical, Inc. Medical tool for reduced penetration force
US8328738B2 (en) 2007-06-29 2012-12-11 Actuated Medical, Inc. Medical tool for reduced penetration force with feedback means
EP2666428A1 (fr) 2012-05-21 2013-11-27 Universität Bern System and method for estimating the spatial position of a tool within an object
WO2014084408A1 (fr) * 2012-11-30 2014-06-05 Olympus Corporation Operation support system and control method of operation support system
WO2016164590A1 (fr) * 2015-04-10 2016-10-13 Mako Surgical Corp. System and method of controlling a surgical tool during autonomous movement of the surgical tool
WO2017147596A1 (fr) * 2016-02-26 2017-08-31 Think Surgical, Inc. Method and system for guiding user positioning of a robot
CN107690317A (zh) * 2015-06-04 2018-02-13 克瑞肖株式会社 Surgical path setting device, surgical robot system comprising same, and surgical path setting method for surgical robot
US9987468B2 (en) 2007-06-29 2018-06-05 Actuated Medical, Inc. Reduced force device for intravascular access and guidewire placement
US10117713B2 (en) 2015-07-01 2018-11-06 Mako Surgical Corp. Robotic systems and methods for controlling a tool removing material from a workpiece
US10219832B2 (en) 2007-06-29 2019-03-05 Actuated Medical, Inc. Device and method for less forceful tissue puncture
EP2619867B1 (fr) * 2010-12-16 2019-04-24 St. Jude Medical, Atrial Fibrillation Division, Inc. System for automatic detection and prevention of motor runaway
EP3666212A1 (fr) * 2018-12-14 2020-06-17 Globus Medical, Inc. Surgical robotic automation with tracking markers
WO2020190637A1 (fr) * 2019-03-15 2020-09-24 Mako Surgical Corp. Robotic surgical system and methods utilizing a cutting bur for bone penetration and cannulation
US10940292B2 (en) 2015-07-08 2021-03-09 Actuated Medical, Inc. Reduced force device for intravascular access and guidewire placement
US11033341B2 (en) 2017-05-10 2021-06-15 Mako Surgical Corp. Robotic spine surgery system and methods
US11045267B2 (en) 2012-06-21 2021-06-29 Globus Medical, Inc. Surgical robotic automation with tracking markers
US11065069B2 (en) 2017-05-10 2021-07-20 Mako Surgical Corp. Robotic spine surgery system and methods
WO2021150810A1 (fr) * 2020-01-22 2021-07-29 Smith & Nephew, Inc. Methods and systems for multi-stage robotically assisted bone preparation for cementless implants
US11253327B2 (en) 2012-06-21 2022-02-22 Globus Medical, Inc. Systems and methods for automatically changing an end-effector on a surgical robot
US11298196B2 (en) 2012-06-21 2022-04-12 Globus Medical Inc. Surgical robotic automation with tracking markers and controlled tool advancement
US11317971B2 (en) 2012-06-21 2022-05-03 Globus Medical, Inc. Systems and methods related to robotic guidance in surgery
US11399900B2 (en) 2012-06-21 2022-08-02 Globus Medical, Inc. Robotic systems providing co-registration using natural fiducials and related methods
WO2022175939A1 (fr) * 2021-02-18 2022-08-25 Mazor Robotics Ltd. Systems, devices, and methods for tool skive avoidance
EP2889015B1 (fr) * 2013-12-30 2022-08-31 National Taiwan University Handheld robot for orthopedic surgery
US11589771B2 (en) 2012-06-21 2023-02-28 Globus Medical Inc. Method for recording probe movement and determining an extent of matter removed
US11786324B2 (en) 2012-06-21 2023-10-17 Globus Medical, Inc. Surgical robotic automation with tracking markers
US11793570B2 (en) 2012-06-21 2023-10-24 Globus Medical Inc. Surgical robotic automation with tracking markers
US11793543B2 (en) 2015-09-18 2023-10-24 Obvius Robotics, Inc. Device and method for automated insertion of penetrating member
US11801097B2 (en) 2012-06-21 2023-10-31 Globus Medical, Inc. Robotic fluoroscopic navigation
US11819365B2 (en) 2012-06-21 2023-11-21 Globus Medical, Inc. System and method for measuring depth of instrumentation
US11857149B2 (en) 2012-06-21 2024-01-02 Globus Medical, Inc. Surgical robotic systems with target trajectory deviation monitoring and related methods
US11857266B2 (en) 2012-06-21 2024-01-02 Globus Medical, Inc. System for a surveillance marker in robotic-assisted surgery
US11864745B2 (en) 2012-06-21 2024-01-09 Globus Medical, Inc. Surgical robotic system with retractor
US11864839B2 (en) 2012-06-21 2024-01-09 Globus Medical Inc. Methods of adjusting a virtual implant and related surgical navigation systems
US11883217B2 (en) 2016-02-03 2024-01-30 Globus Medical, Inc. Portable medical imaging system and method
US11896446B2 (en) 2012-06-21 2024-02-13 Globus Medical, Inc Surgical robotic automation with tracking markers
US11963755B2 (en) 2012-06-21 2024-04-23 Globus Medical Inc. Apparatus for recording probe movement
US11974822B2 (en) 2012-06-21 2024-05-07 Globus Medical Inc. Method for a surveillance marker in robotic-assisted surgery
US12004905B2 (en) 2012-06-21 2024-06-11 Globus Medical, Inc. Medical imaging systems using robotic actuators and related methods

Families Citing this family (77)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9107684B2 (en) 2010-03-05 2015-08-18 Covidien Lp System and method for transferring power to intrabody instruments
JP5380348B2 (ja) * 2010-03-31 2014-01-08 富士フイルム株式会社 System and method, and apparatus and program, for supporting endoscopic observation
US9119655B2 (en) 2012-08-03 2015-09-01 Stryker Corporation Surgical manipulator capable of controlling a surgical instrument in multiple modes
EP2787910B1 (fr) 2011-12-05 2022-08-03 Mazor Robotics Ltd. Active bed mount for surgical robot
US9956042B2 (en) 2012-01-13 2018-05-01 Vanderbilt University Systems and methods for robot-assisted transurethral exploration and intervention
US9539726B2 (en) 2012-04-20 2017-01-10 Vanderbilt University Systems and methods for safe compliant insertion and hybrid force/motion telemanipulation of continuum robots
WO2013158983A1 (fr) 2012-04-20 2013-10-24 Vanderbilt University Robotic device for establishing an access channel
WO2013158974A1 (fr) 2012-04-20 2013-10-24 Vanderbilt University Dexterous wrists for surgical intervention
US11135026B2 (en) 2012-05-11 2021-10-05 Peter L. Bono Robotic surgical system
US9333650B2 (en) * 2012-05-11 2016-05-10 Vanderbilt University Method and system for contact detection and contact localization along continuum robots
US9220570B2 (en) * 2012-06-29 2015-12-29 Children's National Medical Center Automated surgical and interventional procedures
KR102235965B1 (ko) 2012-08-03 2021-04-06 스트리커 코포레이션 Systems and methods for robotic surgery
US9226796B2 (en) 2012-08-03 2016-01-05 Stryker Corporation Method for detecting a disturbance as an energy applicator of a surgical instrument traverses a cutting path
US20140135790A1 (en) * 2012-10-01 2014-05-15 Aaron Fenster System and method for guiding a medical device to a target region
KR102079945B1 (ko) * 2012-11-22 2020-02-21 삼성전자주식회사 Surgical robot and method for controlling the surgical robot
EP2967522A4 (fr) * 2013-03-13 2016-11-02 Think Surgical Inc Systems and methods for pre-operative planning and precise bone tunnel placement for ligament reconstruction
KR102380980B1 (ko) 2013-03-15 2022-04-01 스트리커 코포레이션 End effector of a surgical robotic manipulator
CN105899146B (zh) 2013-12-15 2018-11-02 马佐尔机器人有限公司 Semi-rigid bone attachment robotic surgical system
WO2015154069A1 (fr) * 2014-04-04 2015-10-08 Surgical Theater LLC Dynamic and interactive navigation in a surgical environment
US9272417B2 (en) 2014-07-16 2016-03-01 Google Inc. Real-time determination of object metrics for trajectory planning
EP3182922A4 (fr) * 2014-08-23 2018-03-21 Intuitive Surgical Operations, Inc. Systems and methods for dynamic trajectory control
US10272573B2 (en) * 2015-12-18 2019-04-30 Ge Global Sourcing Llc Control system and method for applying force to grasp a brake lever
EP3112965A1 (fr) * 2015-07-02 2017-01-04 Accenture Global Services Limited Robotic process automation
CA2976516C (fr) * 2015-07-27 2022-11-22 Synaptive Medical (Barbados) Inc. Navigation feedback for intraoperative waypoint
DE102015118918B3 (de) * 2015-11-04 2017-05-04 Haddadin Beteiligungs UG (haftungsbeschränkt) Robot with a controller for discretized manual input of positions and/or poses
EP3414686A4 (fr) * 2015-12-07 2019-11-20 M.S.T. Medical Surgery Technologies Ltd. Autonomic detection of malfunctioning in surgical tools
IL245339A (en) 2016-04-21 2017-10-31 Rani Ben Yishai Method and system for verification of registration
US10695134B2 (en) * 2016-08-25 2020-06-30 Verily Life Sciences Llc Motion execution of a robotic system
US10631933B2 (en) * 2016-08-31 2020-04-28 Covidien Lp Pathway planning for use with a navigation planning and procedure system
US9805306B1 (en) 2016-11-23 2017-10-31 Accenture Global Solutions Limited Cognitive robotics analyzer
US11793394B2 (en) 2016-12-02 2023-10-24 Vanderbilt University Steerable endoscope with continuum manipulator
EP3554414A1 (fr) 2016-12-16 2019-10-23 MAKO Surgical Corp. Techniques for modifying tool operation in a surgical robotic system based on comparing actual and commanded states of the tool relative to a surgical site
CA3028792C (fr) * 2017-02-15 2024-03-12 Synaptive Medical (Barbados) Inc. Sensored surgical tool and surgical intraoperative tracking and imaging system incorporating same
US10251709B2 (en) * 2017-03-05 2019-04-09 Samuel Cho Architecture, system, and method for developing and robotically performing a medical procedure activity
US10235192B2 (en) 2017-06-23 2019-03-19 Accenture Global Solutions Limited Self-learning robotic process automation
EP3678572A4 (fr) 2017-09-05 2021-09-29 Covidien LP Collision handling algorithms for robotic surgical systems
WO2019055701A1 (fr) 2017-09-13 2019-03-21 Vanderbilt University Continuum robots with multi-scale motion through equilibrium modulation
CA3080151A1 (fr) 2017-10-23 2019-05-02 Peter L. BONO Rotary oscillating/reciprocating surgical instrument
US11344372B2 (en) 2017-10-24 2022-05-31 SpineGuard Vincennes Robotic surgical system
FR3072559B1 (fr) 2017-10-24 2023-03-24 Spineguard Medical system comprising a robotic arm and a medical device intended to penetrate an anatomical structure
US10959744B2 (en) 2017-10-30 2021-03-30 Ethicon Llc Surgical dissectors and manufacturing techniques
US11510741B2 (en) 2017-10-30 2022-11-29 Cilag Gmbh International Method for producing a surgical instrument comprising a smart electrical system
US11998193B2 (en) 2017-12-28 2024-06-04 Cilag Gmbh International Method for usage of the shroud as an aspect of sensing or controlling a powered surgical device, and a control algorithm to adjust its default operation
US11612408B2 (en) 2017-12-28 2023-03-28 Cilag Gmbh International Determining tissue composition via an ultrasonic system
US11903601B2 (en) 2017-12-28 2024-02-20 Cilag Gmbh International Surgical instrument comprising a plurality of drive systems
US11672605B2 (en) 2017-12-28 2023-06-13 Cilag Gmbh International Sterile field interactive control displays
US11076921B2 (en) 2017-12-28 2021-08-03 Cilag Gmbh International Adaptive control program updates for surgical hubs
US20190206569A1 (en) 2017-12-28 2019-07-04 Ethicon Llc Method of cloud based data analytics for use with the hub
US12096916B2 (en) 2017-12-28 2024-09-24 Cilag Gmbh International Method of sensing particulate from smoke evacuated from a patient, adjusting the pump speed based on the sensed information, and communicating the functional parameters of the system to the hub
US20190201139A1 (en) 2017-12-28 2019-07-04 Ethicon Llc Communication arrangements for robot-assisted surgical platforms
US11896322B2 (en) 2017-12-28 2024-02-13 Cilag Gmbh International Sensing the patient position and contact utilizing the mono-polar return pad electrode to provide situational awareness to the hub
US12062442B2 (en) 2017-12-28 2024-08-13 Cilag Gmbh International Method for operating surgical instrument systems
US11969216B2 (en) 2017-12-28 2024-04-30 Cilag Gmbh International Surgical network recommendations from real time analysis of procedure variables against a baseline highlighting differences from the optimal solution
US20190201140A1 (en) * 2017-12-28 2019-07-04 Ethicon Llc Surgical hub situational awareness
US11969142B2 (en) 2017-12-28 2024-04-30 Cilag Gmbh International Method of compressing tissue within a stapling device and simultaneously displaying the location of the tissue within the jaws
US11896443B2 (en) 2017-12-28 2024-02-13 Cilag Gmbh International Control of a surgical system through a surgical barrier
US11257589B2 (en) 2017-12-28 2022-02-22 Cilag Gmbh International Real-time analysis of comprehensive cost of all instrumentation used in surgery utilizing data fluidity to track instruments through stocking and in-house processes
US11864728B2 (en) 2017-12-28 2024-01-09 Cilag Gmbh International Characterization of tissue irregularities through the use of mono-chromatic light refractivity
US11857152B2 (en) 2017-12-28 2024-01-02 Cilag Gmbh International Surgical hub spatial awareness to determine devices in operating theater
US11464559B2 (en) 2017-12-28 2022-10-11 Cilag Gmbh International Estimating state of ultrasonic end effector and control system therefor
AU2019207913A1 (en) 2018-01-12 2020-09-03 Capstone Surgical Technologies, Llc Robotic surgical control system
US11337746B2 (en) 2018-03-08 2022-05-24 Cilag Gmbh International Smart blade and power pulsing
US11090047B2 (en) 2018-03-28 2021-08-17 Cilag Gmbh International Surgical instrument comprising an adaptive control system
US11523839B2 (en) * 2018-04-03 2022-12-13 Intuitive Surgical Operations, Inc. Systems and methods for grasp adjustment based on grasp properties
US11344374B2 (en) 2018-08-13 2022-05-31 Verily Life Sciences Llc Detection of unintentional movement of a user interface device
WO2020047713A1 (fr) * 2018-09-03 2020-03-12 Abb Schweiz Ag Method and apparatus for managing a robot system
US11857351B2 (en) 2018-11-06 2024-01-02 Globus Medical, Inc. Robotic surgical system and method
CN111012499B (zh) * 2018-12-29 2021-07-30 华科精准(北京)医疗科技有限公司 Medical auxiliary robot
US11298129B2 (en) 2019-02-19 2022-04-12 Cilag Gmbh International Method for providing an authentication lockout in a surgical stapler with a replaceable cartridge
US20200281675A1 (en) * 2019-03-04 2020-09-10 Covidien Lp Low cost dual console training system for robotic surgical system or robotic surgical simulator
US11701181B2 (en) * 2019-04-24 2023-07-18 Warsaw Orthopedic, Inc. Systems, instruments and methods for surgical navigation with verification feedback
CN115279294A (zh) 2020-01-13 2022-11-01 史赛克公司 System for monitoring offset during navigation-assisted surgery
EP4142610A4 (fr) * 2020-04-29 2023-10-25 Seva Robotics LLC Collaborative surgical robotic platform for autonomous task execution
US12048497B2 (en) * 2021-01-11 2024-07-30 Mazor Robotics Ltd. Safety mechanism for robotic bone cutting
US12082896B2 (en) * 2021-08-04 2024-09-10 Pixee Medical Surgical navigation system on wearable computer combining augmented reality and robotics
US20240277415A1 (en) * 2023-02-21 2024-08-22 Mazor Robotics Ltd. System and method for moving a guide system
CN117393107B (zh) * 2023-12-12 2024-03-15 北京唯迈医疗设备有限公司 Iterative learning method and system for an automatic surgical interventional robot, and storage medium

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5460182A (en) * 1992-09-14 1995-10-24 Sextant Medical Corporation Tissue penetrating apparatus and methods
WO1998033451A1 (fr) * 1997-02-04 1998-08-06 National Aeronautics And Space Administration Multimodality instrument for tissue characterization
EP1791070A2 (fr) * 2005-11-23 2007-05-30 General Electric Company Systems for facilitating surgical procedures
EP1857059A1 (fr) * 2006-05-16 2007-11-21 Manfred Dr. Ottow Device for introducing and guiding an instrument into a body
WO2008045827A2 (fr) * 2006-10-12 2008-04-17 St. Jude Medical, Atrial Fibrillation Division, Inc. Robotic surgical system with contact sensing feature

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6602185B1 (en) * 1999-02-18 2003-08-05 Olympus Optical Co., Ltd. Remote surgery support system

Cited By (67)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8870865B2 (en) 2007-06-29 2014-10-28 Actuated Medical, Inc. Medical tool for reduced penetration force with feedback means
US8328738B2 (en) 2007-06-29 2012-12-11 Actuated Medical, Inc. Medical tool for reduced penetration force with feedback means
US10219832B2 (en) 2007-06-29 2019-03-05 Actuated Medical, Inc. Device and method for less forceful tissue puncture
US9987468B2 (en) 2007-06-29 2018-06-05 Actuated Medical, Inc. Reduced force device for intravascular access and guidewire placement
US8043229B2 (en) 2007-06-29 2011-10-25 Actuated Medical, Inc. Medical tool for reduced penetration force
US8777944B2 (en) 2007-06-29 2014-07-15 Actuated Medical, Inc. Medical tool for reduced penetration force with feedback means
US8777871B2 (en) 2007-06-29 2014-07-15 Actuated Medical, Inc. Medical tool for reduced penetration force with feedback means
EP2619867B1 (fr) * 2010-12-16 2019-04-24 St. Jude Medical, Atrial Fibrillation Division, Inc. System for automatic detection and prevention of motor runaway
CN104540467A (zh) * 2012-05-21 2015-04-22 伯尔尼大学 System and method for estimating the spatial position of a tool within an object
WO2013174801A3 (fr) * 2012-05-21 2014-01-16 Universität Bern System and method for estimating the spatial position of a tool within an object
EP2666428A1 (fr) 2012-05-21 2013-11-27 Universität Bern System and method for estimating the spatial position of a tool within an object
AU2013265396B2 (en) * 2012-05-21 2017-05-25 Universitat Bern System and method for estimating the spatial position of a tool within an object
WO2013174801A2 (fr) 2012-05-21 2013-11-28 Universität Bern System and method for estimating the spatial position of a tool within an object
US9814532B2 (en) 2012-05-21 2017-11-14 Universitat Bern System and method for estimating the spatial position of a tool within an object
US11963755B2 (en) 2012-06-21 2024-04-23 Globus Medical Inc. Apparatus for recording probe movement
US11819365B2 (en) 2012-06-21 2023-11-21 Globus Medical, Inc. System and method for measuring depth of instrumentation
US11896446B2 (en) 2012-06-21 2024-02-13 Globus Medical, Inc Surgical robotic automation with tracking markers
US11253327B2 (en) 2012-06-21 2022-02-22 Globus Medical, Inc. Systems and methods for automatically changing an end-effector on a surgical robot
US11974822B2 (en) 2012-06-21 2024-05-07 Globus Medical Inc. Method for a surveillance marker in robotic-assisted surgery
US11864839B2 (en) 2012-06-21 2024-01-09 Globus Medical Inc. Methods of adjusting a virtual implant and related surgical navigation systems
US11864745B2 (en) 2012-06-21 2024-01-09 Globus Medical, Inc. Surgical robotic system with retractor
US12004905B2 (en) 2012-06-21 2024-06-11 Globus Medical, Inc. Medical imaging systems using robotic actuators and related methods
US12070285B2 (en) 2012-06-21 2024-08-27 Globus Medical, Inc. Systems and methods for automatically changing an end-effector on a surgical robot
US11857266B2 (en) 2012-06-21 2024-01-02 Globus Medical, Inc. System for a surveillance marker in robotic-assisted surgery
US11857149B2 (en) 2012-06-21 2024-01-02 Globus Medical, Inc. Surgical robotic systems with target trajectory deviation monitoring and related methods
US11298196B2 (en) 2012-06-21 2022-04-12 Globus Medical Inc. Surgical robotic automation with tracking markers and controlled tool advancement
US11819283B2 (en) 2012-06-21 2023-11-21 Globus Medical Inc. Systems and methods related to robotic guidance in surgery
US11801097B2 (en) 2012-06-21 2023-10-31 Globus Medical, Inc. Robotic fluoroscopic navigation
US11793570B2 (en) 2012-06-21 2023-10-24 Globus Medical Inc. Surgical robotic automation with tracking markers
US11786324B2 (en) 2012-06-21 2023-10-17 Globus Medical, Inc. Surgical robotic automation with tracking markers
US11589771B2 (en) 2012-06-21 2023-02-28 Globus Medical Inc. Method for recording probe movement and determining an extent of matter removed
US11317971B2 (en) 2012-06-21 2022-05-03 Globus Medical, Inc. Systems and methods related to robotic guidance in surgery
US11045267B2 (en) 2012-06-21 2021-06-29 Globus Medical, Inc. Surgical robotic automation with tracking markers
US11399900B2 (en) 2012-06-21 2022-08-02 Globus Medical, Inc. Robotic systems providing co-registration using natural fiducials and related methods
WO2014084408A1 (fr) * 2012-11-30 2014-06-05 Olympus Corporation Operation support system and control method of operation support system
EP2889015B1 (fr) * 2013-12-30 2022-08-31 National Taiwan University Handheld robot for orthopedic surgery
US9937014B2 (en) 2015-04-10 2018-04-10 Mako Surgical Corp. System and method of controlling a surgical tool during autonomous movement of the surgical tool
CN107427330B (zh) * 2015-04-10 2020-10-16 马科外科公司 System and method of controlling a surgical tool during autonomous movement of the surgical tool
WO2016164590A1 (fr) * 2015-04-10 2016-10-13 Mako Surgical Corp. System and method of controlling a surgical tool during autonomous movement of the surgical tool
CN107427330A (zh) * 2015-04-10 2017-12-01 马科外科公司 System and method of controlling a surgical tool during autonomous movement of the surgical tool
EP3280345A1 (fr) * 2015-04-10 2018-02-14 Mako Surgical Corp. System and method of controlling a surgical tool during autonomous movement of the surgical tool
AU2016246745B2 (en) * 2015-04-10 2020-11-26 Mako Surgical Corp. System and method of controlling a surgical tool during autonomous movement of the surgical tool
CN107690317A (zh) * 2015-06-04 2018-02-13 克瑞肖株式会社 Surgical path setting device, surgical robot system comprising same, and surgical path setting method for surgical robot
EP3305233A4 (fr) * 2015-06-04 2019-01-16 Curexo, Inc. Surgical path setting device, surgical robot system comprising same, and surgical path setting method for surgical robot
US10631940B2 (en) 2015-06-04 2020-04-28 Curexo, Inc. Surgical path setting device, surgical robot system comprising same and surgical path setting method for surgical robot
CN107690317B (zh) * 2015-06-04 2020-07-10 克瑞肖株式会社 Surgical path setting device and surgical robot system comprising same
US10117713B2 (en) 2015-07-01 2018-11-06 Mako Surgical Corp. Robotic systems and methods for controlling a tool removing material from a workpiece
US11864852B2 (en) 2015-07-01 2024-01-09 Mako Surgical Corp. Robotic systems and methods for tool path generation and control based on bone density
US11291511B2 (en) 2015-07-01 2022-04-05 Mako Surgical Corp. Robotic systems and methods for controlling a tool removing material from a workpiece
US10940292B2 (en) 2015-07-08 2021-03-09 Actuated Medical, Inc. Reduced force device for intravascular access and guidewire placement
US11793543B2 (en) 2015-09-18 2023-10-24 Obvius Robotics, Inc. Device and method for automated insertion of penetrating member
US11883217B2 (en) 2016-02-03 2024-01-30 Globus Medical, Inc. Portable medical imaging system and method
US11872005B2 (en) 2016-02-26 2024-01-16 Think Surgical Inc. Method and system for guiding user positioning of a robot
WO2017147596A1 (fr) * 2016-02-26 2017-08-31 Think Surgical, Inc. Method and system for guiding user positioning of a robot
US10864050B2 (en) 2016-02-26 2020-12-15 Think Surgical, Inc. Method and system for guiding user positioning of a robot
US11701188B2 (en) 2017-05-10 2023-07-18 Mako Surgical Corp. Robotic spine surgery system and methods
US11033341B2 (en) 2017-05-10 2021-06-15 Mako Surgical Corp. Robotic spine surgery system and methods
US11937889B2 (en) 2017-05-10 2024-03-26 Mako Surgical Corp. Robotic spine surgery system and methods
US11065069B2 (en) 2017-05-10 2021-07-20 Mako Surgical Corp. Robotic spine surgery system and methods
US12035985B2 (en) 2017-05-10 2024-07-16 Mako Surgical Corp. Robotic spine surgery system and methods
EP3666212A1 (fr) * 2018-12-14 2020-06-17 Globus Medical, Inc. Surgical robotic automation with tracking markers
WO2020190637A1 (fr) * 2019-03-15 2020-09-24 Mako Surgical Corp. Robotic surgical system and methods utilizing a cutting bur for bone penetration and cannulation
US20220241036A1 (en) * 2019-03-15 2022-08-04 Mako Surgical Corp. Robotic surgical system and methods utilizing virtual boundaries with variable constraint parameters
US11337766B2 (en) 2019-03-15 2022-05-24 Mako Surgical Corp. Robotic surgical system and methods utilizing a cutting bur for bone penetration and cannulation
US11986260B2 (en) 2019-03-15 2024-05-21 Mako Surgical Corp. Robotic surgical system and methods utilizing virtual boundaries with variable constraint parameters
WO2021150810A1 (fr) * 2020-01-22 2021-07-29 Smith & Nephew, Inc. Methods and systems for multi-stage robotically assisted bone preparation for cementless implants
WO2022175939A1 (fr) * 2021-02-18 2022-08-25 Mazor Robotics Ltd. Systems, devices, and methods for tool skive avoidance

Also Published As

Publication number Publication date
CA2712607A1 (fr) 2009-07-30
US20110015649A1 (en) 2011-01-20

Similar Documents

Publication Publication Date Title
US20110015649A1 (en) Surgical Guidance Utilizing Tissue Feedback
JP7361751B2 (ja) Medical device with active brake release control
US10874467B2 (en) Methods and devices for tele-surgical table registration
JP6543742B2 (ja) Collision avoidance during controlled movement of an image capturing device and manipulatable device movable arms
KR102574095B1 (ko) System and method for instrument disturbance compensation
EP3119341B1 (fr) Alignment and engagement for teleoperated actuated surgical instruments
KR102479015B1 (ko) System and method for integrated surgical table motion
EP3115159B1 (fr) Medical robotic system providing computer generated auxiliary views of a camera instrument for controlling the positioning and orienting of its tip
EP2884933B1 (fr) User-initiated break-away clutching of a surgical mounting platform
CN110769773A (zh) Master/slave registration and control for teleoperation
US20210353369A1 (en) Systems and methods for magnetic sensing and docking with a trocar
US20240325099A1 (en) Remote center of motion control for a surgical robot
KR20220065092A (ko) System and method for maintaining a tool pose
JP2018500055A (ja) System and method for integrated surgical table icons
WO2008134017A1 (fr) Manipulateur chirurgical
CN113873961A (zh) Interlock mechanisms to disengage and engage a teleoperation mode
CN114401691A (zh) Handheld user interface device for a surgical robot
CN116600732A (zh) Augmented reality headset for a surgical robot
US20210030502A1 (en) System and method for repositioning input control devices
CN113795215B (zh) Systems and methods for magnetic sensing and docking with a trocar
WO2024157113A1 (fr) Surgical robotic system and method for assisted access port placement

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 09704124

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 2712607

Country of ref document: CA

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 09704124

Country of ref document: EP

Kind code of ref document: A1