EP1786349A1 - Method and arrangement for positioning a tool - Google Patents

Method and arrangement for positioning a tool

Info

Publication number
EP1786349A1
Authority
EP
European Patent Office
Prior art keywords
computer
tool part
robot
data entity
tool
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
EP05763282A
Other languages
German (de)
French (fr)
Inventor
Stig Lindequist
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Individual
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Individual
Publication of EP1786349A1

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J 9/00 Programme-controlled manipulators
    • B25J 9/16 Programme controls
    • B25J 9/1679 Programme controls characterised by the tasks executed
    • B25J 9/1689 Teleoperation
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B 34/20 Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B 34/70 Manipulators specially adapted for use in surgery
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 90/00 Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B 90/36 Image-producing devices or illumination devices not otherwise provided for
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B 34/10 Computer-aided planning, simulation or modelling of surgical operations
    • A61B 2034/107 Visualisation of planned trajectories or target regions
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B 34/20 Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B 2034/2046 Tracking techniques
    • A61B 2034/2059 Mechanical position encoders
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 90/00 Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B 90/36 Image-producing devices or illumination devices not otherwise provided for
    • A61B 90/37 Surgical systems with images on a monitor during operation
    • A61B 2090/372 Details of monitor hardware
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 90/00 Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B 90/50 Supports for surgical instruments, e.g. articulated arms
    • A61B 2090/502 Headgear, e.g. helmet, spectacles
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B 34/10 Computer-aided planning, simulation or modelling of surgical operations
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B 34/30 Surgical robots
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 90/00 Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B 90/36 Image-producing devices or illumination devices not otherwise provided for
    • A61B 90/37 Surgical systems with images on a monitor during operation
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05B CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B 2219/00 Program-control systems
    • G05B 2219/30 Nc systems
    • G05B 2219/37 Measurements
    • G05B 2219/37097 Marker on workpiece to detect reference position
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05B CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B 2219/00 Program-control systems
    • G05B 2219/30 Nc systems
    • G05B 2219/45 Nc applications
    • G05B 2219/45117 Medical, radio surgery manipulator

Definitions

  • a second fastener 205 is provided on the first member at a proximal end thereof, on a side opposite the first fastener 201.
  • the second fastener 205 is provided with a rod 206 on which a third fastener 207 is slidably arranged.
  • the third support member 207 is provided with a drill docking station 208.
  • a drilling machine 210 connected to a power source (not shown) via a cable 211 is operatively configured in the drill docking station 208.
  • the third support member 207 can be detachably arranged on the rod 206.
  • a spacing member 230 is provided at the distal end of the second member 203, between the second member 203 and a first sleeve 231 holding a drill sleeve 232.
  • a drilling pin 209 runs through the drill docking station 208 and the drill sleeve 232, essentially parallel to the length axes of the first and second members 202 and 203. The drilling pin 209 is connected to the drilling machine 210.
  • FIG 3a illustrates a side view of a docking device 290.
  • the docking device 290 is provided on a configuration which comprises a tool on a robot arm according to an embodiment of the invention.
  • the first fastener 230 is connected to the first member 231 having the second member 232 slidably arranged therein.
  • a spacing arrangement 278 comprising a first spacing member 277, a ball joint member 276 and a second spacing member 275 is provided between the first member 231 and the docking device 290.
  • Two markers 298 and 299 are provided on the second member in an aligned configuration.
  • the docking device 290 comprises a first docking member 271 slidably arranged in a second docking member 273.
  • a pin 270 is slidably arranged in the first docking member 271.
  • the pin 270 can be fixed at a certain position by means of a first locating screw 272.
  • the first docking member 271 can be fixed at a certain position by means of a second locating screw 274.
  • One purpose of the docking device is to reduce movement of the tool 120 relative to the object 90 (not shown) by fixing the pin 270 to the object.
  • FIG. 3b illustrates a cross-sectional view of the docking device 290.
  • the docking device 290 is provided on a configuration which comprises a tool on a robot arm according to an embodiment of the invention.
  • the first member 231 comprises a first T-groove 283 in which the first spacing member 277 is introduced.
  • the first member 231 further comprises a second T- groove 281 in which the first spacing member 277 can be introduced.
  • the first member 231 comprises a pin slot 282.
  • the pin 209 can be released from the robot tool by retracting the second member 232 and then pushing the pin 209 through the pin slot 282.
  • Figure 4a schematically illustrates the monitoring means 140 comprising two adjacently arranged first and second screens 410 and 415.
  • the first screen 410 displays a first processed radiograph taken from a first angle (from above the object as indicated in Figure 1a) of the device (not shown).
  • the second screen 415 displays a second processed radiograph taken from a second angle (from a side of the object as indicated in Figure 1a) of the device.
  • the first and second angles are perpendicular to each other.
  • alternatively, the first and second angles have a mutual rotational displacement of 70-130 degrees.
  • Software provided in a memory of the control unit is arranged to move an assembly comprising the drill sleeve 232 and drilling pin 209 on the monitoring means 410 in response to a user input fed via the joystick or a touch-pad (not shown). It should be noted that the tool is actually moving in response to the mentioned user input and these movements are visualized on the monitoring means 410. After the tool has been moved to a desired location and orientation, illustrated on the monitoring means 410 by rotational and translational movements, the tool is locked in that particular position in the plane represented by the radiograph displayed on the monitoring means 410.
  • the user actively switches positioning plane by means of the control member.
  • the user positions the tool part in a plane corresponding to the plane of the first radiograph displayed on the monitoring means 410.
  • the user positions the tool part in a plane corresponding to the plane of the second radiograph displayed on the monitoring means 415.
  • the planes are perpendicular to each other.
  • the tool then has a desired position and orientation in 3-D at a distance from the object. It is, however, still possible to manually move the second member 203 relative to the first member 202 in one dimension (along the length axes of the first and second members).
  • Figure 4b schematically illustrates two micro-screens provided on a spectacle mount, according to an aspect of the invention.
  • Goggles 450 comprising a frame 452, a first lens 453 and a second lens 454 are provided with a micro-screen assembly 462 attached to the frame 452 by means of attachment means 460.
  • the micro-screen assembly 462 is held by a support member 461 which is attached to the attachment means 460.
  • the micro-screen assembly 462 comprises a first micro-screen 463 and a second micro-screen 464 each capable of displaying a processed radiograph image.
  • the processed radiograph images can be based upon radiographs depicted with reference to Figure 4a.
  • the micro-screen assembly 462 has a power supply line 480 and an information supply line 481 connected thereto.
  • the information supply line 481 provides the data for the radiographs to the micro-screen assembly 462 from the control unit.
  • the monitoring means may be superfluous or alternatively be used as a complement to the micro-screen assembly 462.
  • Figure 4c schematically illustrates two representations of the drill sleeve 232, drilling pin 209 and the marker line 440 in a first position (a) and a second position (b), rotated by an angle α with respect to each other.
  • the rotation can be performed either clockwise or counter-clockwise. Movements are performed by the robot and then shown on the first processed radiograph as a corresponding movement of a colored line. Then movements are performed by the robot and shown on the second processed radiograph as a corresponding movement of a colored line 440, also referred to as the marker configuration.
  • Figure 4d schematically illustrates two representations of the drill sleeve 232, drilling pin 209 and the marker line 440 in a first position (c) and a second position (d), translated a distance D in one direction.
  • Translation movement can be performed in all directions. For example, a first linear movement in a first direction can be followed by a second linear movement in a second direction, which is perpendicular to the first direction.
  • Figure 5 illustrates a search path of a contrast identification algorithm according to an aspect of the invention.
  • the monitoring means 410 displays processed radiograph images generated by the computer 60.
  • the control unit 175 is arranged to process radiographs and display them on the monitoring means 410.
  • the processed radiograph images illustrate a representation of the drill sleeve 232, drilling pin 209, and the object 90.
  • the representation of the object 90 is referred to as 405.
  • the representation can either be a displayed radiograph or an image generated by processing data corresponding to the radiograph.
  • the contrast identification algorithm is adapted to start at a starting point S, following a specific predetermined path on the processed radiograph images so as to detect the drill sleeve 232 and/or the drilling pin 209. This is performed by a comparative process wherein contrast identification of adjacent pixels of the processed radiograph located on the predetermined path is performed. In this way a position and orientation of the drill sleeve 232 and/or the drilling pin 209 is identified (an illustrative sketch of such a search is given after this list).
  • the specific predetermined path can be a plurality of circular paths, as illustrated in the figure.
  • the predetermined path can also be a helically configured path starting from the starting point S and moving along a spiral line towards the centre of the radiograph.
  • Another alternative is a path comprising a plurality of segments of a circle.
  • the starting point S is set by the user of the arrangement according to the present invention.
  • the monitoring means may be a touch screen.
  • the user may set the starting point by touching the screen close to the drill sleeve 232 and actively choose the path comprising a plurality of segments of a circle. This is referred to as setting the starting point S semi-automatically.
  • alternatively, the user indicates the marker configuration manually by means of a touch screen coupled to the computer 60.
  • a marker line 440 having a specific length L is generated. Calibration of the marker line is performed by measuring two markings 298 and 299 with a known distance between them on the drill sleeve 232. The marker is provided in a direction parallel to a length axis x of the drill sleeve 232 and drilling pin 209, starting at a distal end of the drilling pin 209 as shown in the figure. The marker line is an extrapolation of the drill sleeve 232 and the drilling pin 209.
  • Figure 6 illustrates a flowchart of a method for accurately positioning a tool part provided on a robot arm in a spaced relation to an object according to an embodiment of the invention.
  • a first method step s601 comprises the sub-steps of: -processing a first data entity comprising first position information of the object in a first plane based upon a fluoroscopic radiograph;
  • the method further comprises the steps of:
  • the method further comprises the steps of: -activating the robot to move the tool part in a plane corresponding to the first plane; and
  • the method further comprises the step of:
  • the method further comprises the step of:
  • Figure 7 illustrates a flowchart of a method for positioning a tool provided on a ro ⁇ bot arm according to an embodiment of the invention.
  • In a method step s710 the robot (not shown) is placed adjacent to the object 90. This is performed manually or by means of a remote control. After the method step s710 a method step s713 follows.
  • a tool, for example a drill or saw blade, is mounted on the tool part.
  • a method step s716 follows.
  • a marking device comprising two steel pins, each having a steel ring provided in one end thereof, is used for determining the position of the tool.
  • a method step s719 follows.
  • the positions of the representations of the tool or the marking device are identified in processed radiographs by means of the edge detection algorithm.
  • a method step s722 follows.
  • the tool or the marking line is visually outlined on the monitoring means 140 as a colored line.
  • a method step s725 follows.
  • the robot is activated and start position coordinates of the tool mounted on the tool part are communicated to the computer.
  • the outlined tool on the monitor is registered together with the start position coordinates of the tool.
  • the robot tool arm is then retracted and the robot is moved by activation of the control device 150, which is controlled manually by the user.
  • a method step s731 follows.
  • an adjustment, if necessary, is performed.
  • Two additional processed radiographs are visualized and a non-optimal position of the tool, for example encountered due to distortion of the previously visualized processed fluoroscopic images, is adjusted by controlling the robot arm using the joystick, thereby moving the robot tool within a defined limited distance and a defined limited angle, for example +/- 5 mm or +/- 5 degrees.
  • When the representation of the actual position of the tool is optimal, the robot is halted and can be made totally inert by switching off the power.
  • Figure 8a illustrates a flowchart of a method for navigating the robot arm according to an embodiment of the invention.
  • a user navigates the robot arm by controlling movements of the tool part depending upon the processed displayed position information of the object in one plane and corresponding movements of the marker configuration; and also by controlling movements of the tool part depending upon the processed displayed position information of the object in another plane and corresponding movements of the marker. Thereafter the method ends.
  • Figure 8b illustrates a flowchart of a method for visualizing robot arm movements on the monitoring means according to an embodiment of the invention.
  • corresponding movements of the tool part are indicated by the marker configuration on the screens 410 and 415.
  • the user first positions the marker configuration in a desired position corresponding to a first plane by using the joystick controlling movements of the robot. Thereafter the user actively switches the working mode of the robot, i.e. other constraints are placed upon the robot arm movements.
  • the user then positions the marker configuration in a desired position corresponding to a second plane by using the joystick controlling movements of the robot. Thereafter the method ends.
  • the above-mentioned control unit 175 and computer 60 may include the apparatus 700.
  • the apparatus 700 comprises a non-volatile memory 720, a data processing device 730 and a read/write memory 740.
  • the memory 720 has a first memory portion 750 wherein a computer program, such as an operating system, is stored for controlling the function of the apparatus 700.
  • the apparatus 700 comprises a bus controller, a serial communication port, I/O means, an A/D converter, a time and date entry and transmission unit, an event counter and an interrupt controller responsive to the dead man's grip/emergency stop (not shown).
  • the data processing device 730 may be embodied by, for example, a microprocessor.
  • the memory 720 also has a second memory portion 760.
  • a program may be stored in the second memory portion 760 in an executable manner or in a compressed state.
  • When it is described that the data processing device 730 performs a certain function, it should be understood that the data processing device 730 performs a certain part of the program which is stored in the memory 760 or a certain part of the program which is stored in the recording medium 762.
  • the data processing device 730 may communicate with a data port 799 by means of a data bus 783.
  • the non-volatile memory 720 is adapted for communication with the data bus 783 via non-volatile memory data bus 784.
  • the separate non-volatile recording medium 762 is adapted to communicate with the data processing device 730 via a recording medium data bus 789.
  • the read/write memory 740 is adapted to communicate with the data bus 783 via a read/write memory data bus 785.
  • the apparatus 700 can perform parts of the methods described with reference to Figures 5, 6, 7 and 8a-b by means of the data processing device 730 running the program stored in the memory portion 760. When the apparatus 700 runs the program, parts of the methods described with reference to Figures 5, 6, 7 and 8a-b are executed.
  • data received on the data port 799 comprises information about updated coordinates of the robot arm and/or the tool part.
  • a computer programme comprising a programme code for performing the method steps depicted with reference to Fig. 7 when said computer programme is run on a computer.
  • a computer programme product directly storable in an internal memory of a computer, comprising a computer programme for performing the method steps depicted with reference to Fig. 7 when said computer programme is run on the computer.
  • a computer programme comprising a programme code for identifying representations of the tool by performing a contrast identification procedure when said computer programme is run on a computer.
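As an illustration of the contrast identification search referred to above, the following minimal Python sketch walks concentric circular paths around a user-chosen starting point S, compares the intensities of adjacent samples along each path, and collects candidate edge pixels of the radio-opaque drill sleeve or drilling pin. The function name, contrast threshold and sampling density are assumptions made only to make the idea concrete; the patent does not specify them.

```python
import numpy as np

def contrast_search(image, start, radii, threshold=60.0, samples_per_circle=360):
    """Walk concentric circular paths around a starting point S and flag
    sample pairs whose intensity difference exceeds `threshold`.

    image  -- 2-D numpy array holding a processed radiograph (grey values)
    start  -- (row, col) starting point S, e.g. chosen on a touch screen
    radii  -- iterable of circle radii in pixels
    Returns a list of (row, col) candidate edge pixels belonging to the
    drill sleeve and/or drilling pin.
    """
    h, w = image.shape
    r0, c0 = start
    candidates = []
    for radius in radii:
        angles = np.linspace(0.0, 2.0 * np.pi, samples_per_circle, endpoint=False)
        rows = np.clip(np.round(r0 + radius * np.sin(angles)).astype(int), 0, h - 1)
        cols = np.clip(np.round(c0 + radius * np.cos(angles)).astype(int), 0, w - 1)
        values = image[rows, cols].astype(float)
        # compare each sample with its neighbour along the circular path
        steps = np.abs(np.diff(values, append=values[0]))
        for i in np.where(steps > threshold)[0]:
            candidates.append((int(rows[i]), int(cols[i])))
    return candidates
```

A helical or segmented path, as mentioned above, would differ only in how the sample coordinates are generated.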

Abstract

An arrangement providing superior flexibility of use and time- and cost-effectiveness includes a robot which holds a tool and a display. A user views the display and manipulates the tool through communication with the robot. Methods of using the arrangement involve user reliance on visual data on the display to determine which way to instruct the robot to manipulate the tool.

Description

Method and arrangement for positioning a tool
Field of the invention
The invention relates to an arrangement and method for positioning a tool provided on a robot.
More specifically the invention relates to software adapted to perform a method for positioning a tool provided on a robot when executed on a computer.
Even more specifically, the invention relates to the use of such a system.
Background of the invention
Present methods for bone fixation and bone remodeling enlisting means such as drills, pins and saws are dependent on the manual skills of a surgeon. A fractured bone, for example, is reduced manually to align the fractured bone ends and then maintained in the reduced state either by applying clamps over the fracture or by applying traction on the fracture. In order to minimize soft tissue damage, skin incisions are made as small as possible. With the aid of a mobile X-ray apparatus, a "C-arm fluoroscope," the surgeon can direct instruments without seeing the actual bone, which is covered by muscles and skin. To permanently fix the fracture, one or more screws are inserted in the fractured bone parts. The screws are introduced by a hand-held drill. The surgeon must first determine the drill position using C-arm fluoroscopy and then adjust the drill in three dimensions. Positioning the drill in this manner is a difficult task, often resulting in repeated insertions of screws and consequent bone and soft tissue damage.
To overcome problems with radiograph-guided manual positioning of drills and saws within the human body, navigation systems utilizing C-arm fluoroscopy in combination with LED emitters for navigation have been developed, for example those described in US 0030060703. Other systems have combined the navigation system with a free-arm robot. Some of these systems utilize data from CT or MRI scanners to direct the robot to a desired position. With these systems, the patient must go through preoperative procedures before the fracture can be repaired. Another system utilizes a small robot fixed directly to the patient's bone and navigated by C-arm fluoroscopy and fiducial markers.
Another approach, exemplified by the Pintrace System (Medical Robotics in Stockholm AB), utilizes data from intra-operative C-arm fluoroscopy and directs the robot by manual markings of several lines and points on a computer screen. One commonality of these systems is that they are non-intuitive to the surgeon and demand extensive preoperative image handling or time-consuming marking on a computer screen in the operating theatre.
In US 6,285,902 B1 an image-guided system for positioning of a surgical tool is described. The system comprises a fluoroscopic X-ray device, which is arranged to generate two-dimensional body part images, on which representations of the surgical tool are superimposed. An optical localizing device is used to determine the poses of surgical tools and the X-ray device. A surgeon views the continuously updated 2D tool representations (tool cursors) representing the drill guide trajectory superimposed on the AP and lateral images, while aligning a hand-held drill guide to the proper position.
The optical localizing device adds to surgical set-up time. Furthermore, the optical localizing device is costly and occupies space in a crowded operating theatre. The system according to US 6,285,902 B1 does not achieve a defined, repeatable and precise position of the surgical tool, because of the use of hand-held devices. In US 5,784,767 a device for positioning a surgical instrument without using a fluoroscopic X-ray device is depicted. An electrogoniometer is linked to a reference bloc, and both are then linked to a bony part of the patient by insertion of a screw. Movements of the surgical tool in relation to the reference bloc are displayed on a video display to aid the medical practitioner in manipulating the instruments relative to that portion.
This positioning method does not give the surgeon a detailed and anatomic view of the operation field. The need for inserting an extra screw, or other fixation device, into the bone is a drawback that causes an increased risk of infection and also weakens the bone.
Summary of the invention
According to the present invention, the problem of insufficient positioning accuracy of tools on robot arms is solved by providing an arrangement and method which improve the positioning accuracy of a tool provided on a robot arm.
According to one aspect of the invention, the present invention solves problems of inefficiency by providing arrangements and methods which improve the positioning accuracy of a tool provided on a robot arm in a less time-consuming way.
This problem is solved by an arrangement for positioning a tool part in spaced relation to an object, the arrangement comprising: a robot having a robot arm with at least six degrees of freedom, wherein the tool part is provided on a distal end of the robot arm; said robot is controlled by user utilization of a control member of a control unit; a computer is arranged to receive a first data entity comprising first position information of the object in a first plane and a second data entity comprising second position information of the object in a second plane; the computer is arranged to process said first data entity and create a first data entity display; the computer is arranged to process said second data entity and create a second data entity display, said second data entity display being distinct from said first data entity display; said control unit is arranged to send position coordinates of the tool part relative to the object to the computer; said computer is arranged to establish a relationship between said position coordinates of the tool part and said first and second object position information; said computer is arranged to display a marker configuration on at least one of said first and second data entity displays based on said established relationship; the control unit is arranged to accurately position the tool part depending upon user utilization of the control member; and the user utilizes the control member in response to a position of the marker configuration.
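One way to picture the relationship between the reported tool-part coordinates and the two object position information planes is to project the tool axis into each data entity display. The following minimal Python sketch does this under the assumption of a prior registration between robot coordinates and each radiograph plane; the projection matrices, point values and function names are illustrative and are not the arrangement's specified method.

```python
import numpy as np

# Illustrative projection of the robot-reported tool axis into the two
# data entity displays (e.g. a 0-degree "top" view and a 90-degree "side" view).
# The 3x4 projection matrices are assumed to come from a prior registration
# between robot/world coordinates and each radiograph plane.

def project(P, point_3d):
    """Project a 3-D point (robot/world coordinates, mm) to 2-D display coordinates."""
    x = P @ np.append(point_3d, 1.0)          # homogeneous projection
    return x[:2] / x[2]

def marker_for_display(P, tip, tail):
    """Return the 2-D marker line (two endpoints) for one data entity display."""
    return project(P, tip), project(P, tail)

# Example: orthographic-like projections for a top view (drop z) and a side view (drop y).
P_top  = np.array([[1.0, 0, 0, 0], [0, 1.0, 0, 0], [0, 0, 0, 1.0]])
P_side = np.array([[1.0, 0, 0, 0], [0, 0, 1.0, 0], [0, 0, 0, 1.0]])

tool_tip  = np.array([120.0, 40.0, 15.0])     # distal end of the tool part, mm
tool_tail = np.array([120.0, 40.0, 95.0])     # a point further up the tool axis

print(marker_for_display(P_top, tool_tip, tool_tail))
print(marker_for_display(P_side, tool_tip, tool_tail))
```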
This solution effectively reduces the amount of undesirable radiation needed for visualising an area of interest for the user of the arrangement according to the present invention, compared to prior art wherein a rather large amount of radiation is needed. By virtue of the fact that the user of the tool positioning arrangement relies on an image processing procedure, instead of manual operations depending upon several fluoroscopic images, the amount of radiation needed for visualising an area of interest is reduced. This is favourable for the user, for other persons in the environment, and for the object.
Furthermore, because of the expense associated with fluoroscopy, by requiring fewer fluoroscopic images the method of positioning the tool improves the cost- and time-effectiveness of the procedure. Advantageously, the method according to the invention provides repeatability concerning tool positioning. Since coordinates of the tool are accessibly stored in a memory, a user of the arrangement can interrupt the procedure of positioning the tool, in case of emergency or other circumstances, whereby the tool may be removed from the object. When the user wants to recommence the positioning procedure, they can easily instruct the robot to position the tool as it was before the interruption.
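One way such accessibly stored coordinates could be kept, sketched below purely for illustration, is a small persistent pose store that is updated after each confirmed move and read back after an interruption. The file-based storage, the pose fields and the class name are assumptions, not details given in the patent.

```python
import json
from pathlib import Path

class PoseMemory:
    """Minimal sketch of accessibly stored tool coordinates that allow an
    interrupted positioning procedure to be resumed (names are illustrative)."""

    def __init__(self, path="tool_pose.json"):
        self.path = Path(path)

    def store(self, pose):
        # pose: e.g. {"xyz_mm": [...], "rpy_deg": [...]} for the tool part
        self.path.write_text(json.dumps(pose))

    def recall(self):
        # returns the pose recorded before the interruption, or None if absent
        return json.loads(self.path.read_text()) if self.path.exists() else None

# usage: store() after every confirmed move; after an emergency stop the robot
# can be commanded back to the pose returned by recall()
```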
The robot can be programmed not to pass a specific plane in space. In practice, a virtual protective surface is provided around the object, by giving the robot movement restrictions, so as to avoid accidental damage to the object by the user or the tool. The risk of accidentally damaging the object is also reduced by controlling the robot with a joystick or a touch-pad. This makes it possible for the user to be in close proximity to the robot arm and the tool, so as to closely observe the positioning procedure. The joystick or touch-pad is provided with a "dead man's grip" according to safety regulations for robots, which allows inactivation of the robot when the user does not comply with predetermined conditions.
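Such a movement restriction can be pictured as a signed-distance test against the protective plane; the sketch below illustrates this under that assumption, and the function name, margin and parameters are not taken from the patent.

```python
import numpy as np

def violates_protective_plane(target_point, plane_point, plane_normal, margin_mm=5.0):
    """Return True if a commanded tool position would pass the virtual
    protective plane placed around the object.

    The plane is given by a point on it and its outward normal (pointing away
    from the object).  A commanded target whose signed distance to the plane is
    below `margin_mm` would be rejected; names are illustrative only.
    """
    n = np.asarray(plane_normal, dtype=float)
    n /= np.linalg.norm(n)
    diff = np.asarray(target_point, dtype=float) - np.asarray(plane_point, dtype=float)
    signed_distance = float(np.dot(diff, n))
    return signed_distance < margin_mm

# a control unit implementing this idea would refuse (or clamp) any
# joystick-commanded move for which violates_protective_plane(...) is True
```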
Advantageously, a higher level of safety may be achieved by mounting display means on, for example, goggles, so that the user can see representations of the object simultaneously with the real object. A better overview of the positioning procedure is thus achieved, resulting in fewer accidents. This includes reductions of accidents resulting in damage to the object by the tool.
Image processing according to the invention may advantageously be limited to 2-D, which is relatively fast. This further reduces the time required for tool positioning.
Preferably the position of the marker configuration is updated continuously.
Preferably said first and second data entities are generated by a fluoroscopic device. Preferably said first and second data entities are two-dimensional fluoroscopic images of the object, said images having a mutual angle displacement of 20-160 degrees.
Preferably said mutual angle displacement is 90 degrees.
Preferably the tool part comprises a drill, saw, pin introducer, nail introducer or cannula being detachably arranged on said tool part.
Preferably the computer is arranged to generate the marker configuration by automatically extrapolating an axis of the tool part; and wherein a tool part position is identified by means of a contrast identification process.
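For illustration, extrapolating the tool axis into a marker line can be done in image coordinates from the two aligned markings 298 and 299 on the drill sleeve, whose physical spacing is known. The sketch below assumes 2-D pixel inputs; the function name and argument names are invented for the example.

```python
import numpy as np

def marker_line(m1_px, m2_px, tip_px, marking_spacing_mm, marker_length_mm):
    """Extrapolate the tool axis beyond the drilling pin tip to form the marker line.

    m1_px, m2_px        -- pixel positions of the two aligned markings on the drill sleeve
    tip_px              -- pixel position of the distal end of the drilling pin
    marking_spacing_mm  -- known physical distance between the two markings
    marker_length_mm    -- desired physical length L of the marker line
    Returns the two pixel endpoints of the marker line.
    """
    m1 = np.asarray(m1_px, dtype=float)
    m2 = np.asarray(m2_px, dtype=float)
    tip = np.asarray(tip_px, dtype=float)

    axis = m2 - m1
    px_per_mm = np.linalg.norm(axis) / marking_spacing_mm   # image scale from the markings
    direction = axis / np.linalg.norm(axis)                 # unit vector along the tool axis

    start = tip                                             # marker starts at the pin's distal end
    end = tip + direction * marker_length_mm * px_per_mm    # extrapolated end point
    return start, end
```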
Preferably said computer is arranged to generate the marker configuration in response to a manual action, or in a semi-automatic or automatic way.
Preferably a position of the marker configuration is updated in a sequential way.
Additional objects, advantages and novel features of the present invention will become apparent to those skilled in the art from the following details, as well as by practice of the invention. While the invention is described below, it should be understood that the invention is not limited to the specific details disclosed. The above-mentioned skilled persons having access to the teachings herein will recognise additional applications, modifications and embodiments in other fields, which are within the scope of the invention.
Brief description of the drawings
For a more complete understanding of the present invention and further objects and advantages thereof, reference is now made to the following description of examples, as shown in the accompanying drawings, in which:
Figure 1a schematically illustrates a side view of a system comprising a robot provided with at least one instrument, a C-arm fluoroscope, monitoring means and processing means according to an embodiment of the invention.
Figures 1b, 1c and 1d illustrate side, top, and front views, respectively, of a part of an arrangement for positioning a robot arm provided with at least one instrument according to an aspect of the invention.
Figure 2 schematically illustrates a configuration comprising an instrument provided on a robot arm according to an embodiment of the invention.
Figures 3a and 3b illustrate side and front views, respectively, of a docking device provided on a configuration comprising an instrument provided on a robot arm according to an aspect of the invention.
Figure 4a schematically illustrates a monitor displaying two images according to an aspect of the invention.
Figure 4b schematically illustrates two micro-screens provided on a spectacle mount according to an aspect of the invention.
Figure 4c schematically illustrates two representations of a drill sleeve, drilling pin and a marker line.
Figure 4d schematically illustrates two representations of a drill sleeve, drilling pin and a marker line.
Figure 5 illustrates an identification path of a contrast identification algorithm according to an aspect of the invention.
Figure 6 illustrates a flowchart of a method for positioning a tool provided on a robot arm according to an embodiment of the invention.
Figure 7 illustrates a flowchart of a method for positioning a tool provided on a robot arm according to an embodiment of the invention.
Figure 8a illustrates a flowchart of a method for navigating the robot arm according to an embodiment of the invention.
Figure 8b illustrates a flowchart of a method for visualizing robot arm movements on a displaying device according to an embodiment of the invention.
Figure 9 schematically illustrates an apparatus according to an embodiment of the invention.
Detailed description of the drawings
Figure 1a schematically illustrates a side view of a system comprising a robot provided with at least one tool, a C-arm and a display unit according to an embodiment of the invention.
An object 90 is placed on a movable table 91 with adjustable height. A C-arm 10 is provided at one side of the table 91 and is arranged for generating radiographs of the object from different angles. According to one embodiment, an angle from the top as shown in the figure corresponds to 0 degrees and consequently a radiograph from the side corresponds to 90 degrees. The C-arm comprises a device 11 for transmitting X-rays toward one side of the object and a collector 12 adapted to collect the transmitted X-rays on the opposite side of the object. A display unit 1 is arranged for communication with the C-arm via a cable 5 and adapted for displaying the C-arm radiographs on two separate screens 2 and 3. Alternatively the display unit 1 is arranged for communication with the C-arm via the cable 5 and a computer 60. The C-arm is also referred to as a fluoroscopic radiographic device. One purpose of displaying radiographs on the display unit 1 is to aid initial positioning of the C-arm.
The computer 60 is arranged for communication with the C-arm. The computer is adapted for data processing of the radiographs and is further arranged to perform an edge detecting analysis of generated images based on the radiographs. The computer 60 is arranged for communication with a robot 100 via a cable 6. The computer 60 is arranged to display processed radiographs on a monitoring means 140 provided on the robot 100.
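As a rough illustration of such an edge-detection pass over a processed radiograph, the following sketch computes a Sobel-style gradient map with plain NumPy. The kernel, threshold and function name are assumptions; a production system would typically use an optimized image-processing library routine instead.

```python
import numpy as np

def sobel_edges(image, threshold=80.0):
    """Return a boolean map of pixels whose gradient magnitude exceeds `threshold`.

    image -- 2-D numpy array holding a processed radiograph (grey values).
    This naive loop is only a sketch of an edge-detection analysis.
    """
    kx = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=float)  # horizontal gradient kernel
    ky = kx.T                                                         # vertical gradient kernel
    img = image.astype(float)
    h, w = img.shape
    gx = np.zeros_like(img)
    gy = np.zeros_like(img)
    for r in range(1, h - 1):
        for c in range(1, w - 1):
            patch = img[r - 1:r + 2, c - 1:c + 2]
            gx[r, c] = np.sum(patch * kx)
            gy[r, c] = np.sum(patch * ky)
    return np.hypot(gx, gy) > threshold
```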
The robot 100 (illustrated in greater detail with reference to Figs. 1b-d) is positioned on one side of the table. Preferably, the robot is placed on a side of the table opposite to the C-arm. The robot is described in further detail with reference to Figs. 1b-d.
As should be clear to a skilled worker, the object can be any one of a variety of materials and items. It can be, for example, the remains of a deceased animal which can be utilized for research, training, or teaching purposes. Such uses of the inventive apparatus and method can provide advances in fields such as bone repair research. The present invention may also be useful for archaeologists and curators, where an object could be, for example, mummified remains, and where minimal disturbance to the outer layers is desired despite a need or interest to investigate or manipulate the internal contents. The object can also be a living organism, more specifically a mammal, even more specifically a human being. According to one embodiment the object is not a live mammal.
Figure 1b illustrates a side view of a part of an arrangement for accurately positioning a tool provided on a robot arm according to an aspect of the invention.
The tool part 120 can comprise a drill, saw, pin introducer, nail introducer or other similar item. The tool part can have an internal or external power supply. The tool part is preferably arranged to carry a variety of instruments such as pins, nails or saw blades.
Furthermore the tool part may comprise a cannula for injections or biopsies, for ex¬ ample. A particular tool which can be used according to an embodiment of the in¬ vention is an injection needle.
With reference to Figure 1b the robot has one robot arm having 6 degrees of freedom. The robot is thus a 6-axis robot.
The robot comprises a robot base 110 having wheels 111, preferably four wheels. Two handle holds 130 are fixedly secured to the robot base 110 in this embodiment; alternatively, one, three or more handle holds could be used. A base unit 170 is mounted on the robot base 110. The base unit comprises a control unit 175 of the robot.
A first support member 171 is attached to a top side of the base unit 170. A first arm 151 having a control member 150 at a distal end thereof is coupled to the support member 171. The control member 150 is adapted for communication with the control unit 175. The control member comprises a joystick 155. The user of the robot 100 can control movements of the robot arm by means of the joystick 155. Alternatively the control member 150 comprises a key pad, touch screen, touch pad, key board, or another input device. Alternatively the control member 150 comprises means for voice control, i.e. the user may control the robot with their voice. The control member 150 communicates movement control signals to the control unit 175.
A second arm 141 having a display unit 140 at a distal end thereof is coupled to the support member 171. The control unit is arranged to display at least one image on the display unit 140. In a preferred embodiment the display unit is arranged to display two different images, each built up based upon a corresponding radiograph. An image corresponding to a particular radiograph essentially looks the same as the radiographs shown on the display unit 1. However, the user of the robot may change colors or size of a processed image by means of the control member 150.
A second support member 180 is attached to the base unit 170 and a third support member 181 is coupled to the second support member 180, forming an underframe for a base mount 115. A fourth support member 182 is arranged to unload the coupling between the second and the third support members 180 and 181. The second, third and fourth support members 180, 181 and 182 are movable up and down relative to the base unit 170.
A first robot arm portion 116 is mounted on the base mount 115. A second robot arm portion 117 is rotatably mounted on the first robot arm portion 116 and a third robot arm portion 118 is rotatably mounted on the second robot arm portion 117 forming a robot arm having 6 degrees of freedom. Preferably the robot has at least 6 degrees of freedom. The robot may have 7 or 8 degrees of freedom. A tool part 120 is provided on the third robot arm portion 118. The tool part will be described in greater detail with reference to Figure 2.
According to one embodiment the computer 60 and the control unit 175 are integrated and provided in the robot 100. According to an embodiment the display device 1 is not present, whereby radiographs from the C-arm are processed directly by the computer 60 and not shown by the display device 1.

Figure 1c illustrates a top view of the arrangement for positioning of a tool provided on a robot arm according to an aspect of the invention.
Figure Id illustrates a rear end view of the arrangement for positioning of a tool provided on a robot arm according to an aspect of the invention.
Figure 2 schematically illustrates a configuration comprising a tool 120 provided on the robot arm according to an embodiment of the invention.
Shown in the figure is the second robot arm portion 117 connected to the third robot arm portion 118, which is provided with a first fastener 201. A first member 202 is fixedly secured to the first fastener. Optionally, the first member could be fixed directly to the robot arm. The first member 202 has an elongated shape. According to this embodiment the first member 202 is a cylinder hollow at a distal end thereof. A second member 203 is slidably arranged in the first member 202. The second member can be moved to a particular position and fixed there by means of, for example, a locking screw (not shown). A sensor 198 (not shown) is arranged to detect a position of the second member and communicate this value to the control unit 175. The control unit 175 then communicates that value to the computer 60. The position of the second member correlates to the actual position of a distal end of the tool, which is closest to the object. Thus, the position of the second member relative to the first member can be used when generating restrictions of movements of the robot arm. The dimensions of the different tools are of course taken into account in the tool-positioning procedure, in particular for safety reasons.
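A minimal sketch of how the sensor reading and tool dimensions could feed into such a movement restriction is given below. The frame conventions, the name of the safety rule and all numerical values are assumptions made for illustration only; they are not taken from the described arrangement.

```python
# Illustrative sketch: derive the tool-tip position from the sliding-member
# sensor reading and use it to limit robot-arm motion near the object.
import numpy as np

def tool_tip_position(flange_pose: np.ndarray,
                      sensor_extension_mm: float,
                      tool_length_mm: float) -> np.ndarray:
    """flange_pose is a 4x4 homogeneous transform of the third arm portion;
    the tool axis is assumed to lie along the local z-axis of the flange."""
    axis = flange_pose[:3, 2]        # tool axis expressed in the base frame
    origin = flange_pose[:3, 3]      # flange origin in the base frame
    # Distal end = flange origin + sliding extension + fixed tool length
    return origin + (sensor_extension_mm + tool_length_mm) * axis

def motion_allowed(tip_mm: np.ndarray,
                   object_point_mm: np.ndarray,
                   safety_margin_mm: float = 10.0) -> bool:
    """Very coarse restriction: refuse motions that would bring the tool tip
    closer to a reference point on the object than the safety margin."""
    return np.linalg.norm(tip_mm - object_point_mm) >= safety_margin_mm
```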
A second fastener 205 is provided on the first member at a proximal end thereof, on a side opposite from the first fastener 201. The second fastener 205 is provided with a rod 206 on which a third fastener 207 is slidably arranged. The third fastener 207 is provided with a drill docking station 208. A drilling machine 210 connected to a power source (not shown) via a cable 211 is operatively configured in the drill docking station 208. The third fastener 207 can be detachably arranged on the rod 206.
A spacing member 230 is provided at the distal end of the second member 203, between the second member 203 and a first sleeve 231 holding a drill sleeve 232. A drilling pin 209 runs through the drill docking station 208 and the drill sleeve 232, essentially parallel to the length axes of the first and second members 202 and 203. The drilling pin 209 is connected to the drilling machine 210.
The above-described arrangement is optimal in cases where a drill is to be used. Clearly, other configurations of the robot and tool are required when using items such as a saw, pin introducer or nail introducer. Thus, a feature of the invention is its applicability to a wide variety of different tools.
Figure 3a illustrates a side view of a docking device 290. The docking device 290 is provided on a configuration which comprises a tool on a robot arm according to an embodiment of the invention.
The first fastener 230 is connected to the first member 231 having the second member 232 slidably arranged therein. A spacing arrangement 278 comprising a first spacing member 277, a ball joint member 276 and a second spacing member 275 is provided between the first member 231 and the docking device 290. Two markers 298 and 299 are provided on the second member in an aligned configuration.
The docking device 290 comprises a first docking member 271 slidably arranged in a second docking member 273. A pin 270 is slidably arranged in the first docking member 271. The pin 270 can be fixed at a certain position by means of a first locating screw 272. The first docking member 271 can be fixed at a certain position by means of a second locating screw 274.
One purpose of the docking device is to reduce movement of the tool 120 relative to the object 90 (not shown) by fixing the pin 270 to the object.
Figure 3b illustrates a cross-sectional view of the docking device 290. In this embodiment, the docking device 290 is provided on a configuration which comprises a tool on a robot arm according to an embodiment of the invention.
The first member 231 comprises a first T-groove 283 in which the first spacing member 277 is introduced. The first member 231 further comprises a second T-groove 281 in which the first spacing member 277 can be introduced. The first member 231 comprises a pin slot 282. The pin 209 can be released from the robot tool by retracting the second member 232 and then pushing the pin 209 through the pin slot 282.
Figure 4a schematically illustrates the monitoring means 140 comprising two adjacently arranged first and second screens 410 and 415. The first screen 410 displays a first processed radiograph taken at a first angle θ1 of the device (not shown), i.e. from above the object as indicated in Figure 1. The second screen 415 displays a second processed radiograph taken at a second angle θ2 of the device, i.e. from a side of the object as indicated in Figure 1. In this example, θ1 and θ2 are perpendicular to each other. According to another embodiment, θ1 and θ2 have an internal rotational displacement of 70-130 degrees.
Software provided in a memory of the control unit (not shown) is arranged to move an assembly comprising the drill sleeve 232 and drilling pin 209 on the monitoring means 410 in response to a user input fed via the joystick or a touch-pad (not shown). It should be noted that the tool is actually moving in response to the mentioned user input and these movements are visualized on the monitoring means 410. After the tool has been moved to a desired location and orientation, illustrated on the monitoring means 410 by rotational and translational movements, the tool is locked in that particular position in the plane represented by the radiograph displayed on the monitoring means 410.
It should be noted that the user actively switches positioning plane by means of the control member. First, the user positions the tool part in a plane corresponding to the plane of the first radiograph displayed on the monitoring means 410. Second, the user positions the tool part in a plane corresponding to the plane of the second radiograph displayed on the monitoring means 415. Preferably the planes are perpendicular to each other.
Thus, the tool has a desired position and orientation in 3-D at a distance from the object. It is, however, still possible to manually move the second member 203 relative to the first member 202 in one dimension (along the length axes of the first and second members).
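The two-plane positioning idea can be made concrete with a small sketch: joystick input is applied only within the currently active radiograph plane, so a 3-D pose is reached by two successive in-plane adjustments. The plane definitions and the projection rule below are assumptions for illustration, not the control unit's actual constraint scheme.

```python
# Hedged sketch: constrain a commanded displacement to the active plane by
# removing its component along the plane normal.
import numpy as np

def project_onto_plane(delta_xyz: np.ndarray, plane_normal: np.ndarray) -> np.ndarray:
    """Keep only the part of a commanded displacement that lies in the plane."""
    n = plane_normal / np.linalg.norm(plane_normal)
    return delta_xyz - np.dot(delta_xyz, n) * n

# Example: first plane viewed from above (normal along z), second from the
# side (normal along x); perpendicular planes as in the preferred embodiment.
plane_normals = {"first": np.array([0.0, 0.0, 1.0]),
                 "second": np.array([1.0, 0.0, 0.0])}
active = "first"
joystick_step = np.array([1.0, 2.0, 0.5])      # raw user command (mm)
constrained_step = project_onto_plane(joystick_step, plane_normals[active])
```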
Figure 4b schematically illustrates two micro-screens provided on a spectacle mount, according to an aspect of the invention.
Goggles 450 comprising a frame 452, a first lens 453 and a second lens 454 are provided with a micro-screen assembly 462 attached to the frame 452 by means of attachment means 460. The micro-screen assembly 462 is held by a support member 461 which is attached to the attachment means 460. The micro-screen assembly 462 comprises a first micro-screen 463 and a second micro-screen 464, each capable of displaying a processed radiograph image. The processed radiograph images can be based upon radiographs depicted with reference to Figure 4a. The micro-screen assembly 462 has a power supply line 480 and an information supply line 481 connected thereto. The information supply line 481 provides the data for the radiographs to the micro-screen assembly 462 from the control unit.
This is an alternative way of displaying the radiographs for the user of the arrangement. Herein the monitoring means may be superfluous or alternatively be used as a complement to the micro-screen assembly 462.
Figure 4c schematically illustrates two representations of the drill sleeve 232, drilling pin 209 and the marker line 440 in a first position (a) and a second position (b) rotated by an angle α with respect to each other. The rotation can be performed either clockwise or counter-clockwise. Movements are performed by the robot and then shown on the first processed radiograph as a corresponding movement of a colored line. Then movements are performed by the robot and shown on the second processed radiograph as a corresponding movement of a colored line 440, also referred to as the marker configuration.
Figure 4d schematically illustrates two representations of the drill sleeve 232, drilling pin 209 and the marker line 440 in a first position (c) and a second position (d) translated a distance D in one direction. Translation movement can be performed in all directions. For example, a first linear movement in a first direction can be followed by a second linear movement in a second direction, which is perpendicular to the first direction.
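The rotation and translation of the displayed marker line amount to simple 2-D transformations of a line segment. The sketch below illustrates this; pivoting about the distal end of the segment is an assumption made only for the example.

```python
# Illustrative sketch of rotating and translating the 2-D marker line that
# represents the drill sleeve and drilling pin on a processed radiograph.
import numpy as np

def rotate_segment(p_distal, p_proximal, alpha_rad):
    """Rotate the segment about its distal end by alpha (counter-clockwise)."""
    c, s = np.cos(alpha_rad), np.sin(alpha_rad)
    R = np.array([[c, -s], [s, c]])
    return p_distal, p_distal + R @ (p_proximal - p_distal)

def translate_segment(p_distal, p_proximal, d_xy):
    """Translate the whole segment by a 2-D displacement d_xy (pixels)."""
    return p_distal + d_xy, p_proximal + d_xy

distal, proximal = np.array([300.0, 400.0]), np.array([200.0, 400.0])
rotated = rotate_segment(distal, proximal, np.deg2rad(15.0))   # angle α
shifted = translate_segment(distal, proximal, np.array([0.0, 25.0]))  # distance D
```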
Figure 5 illustrates a search path of a contrast identification algorithm according to an aspect of the invention.
The monitoring means 410 displays processed radiograph images generated by the computer 60. Alternatively the control unit 175 is arranged to process radiographs and display them on the monitoring means 410. The processed radiograph images illustrate a representation of the drill sleeve 232, drilling pin 209, and the object 90. The representation of the object 90 is referred to as 405. The representation can either be a displayed radiograph or an image generated by processing data corresponding to the radiograph.
The contrast identification algorithm is adapted to start at a starting point S, following a specific predetermined path on the processed radiograph images so as to detect the drill sleeve 232 and/or the drilling pin 209. This is done by a comparative process in which the contrast of adjacent pixels of the processed radiograph located on the predetermined path is evaluated. In this way a position and orientation of the drill sleeve 232 and/or the drilling pin 209 is identified.
The specific predetermined path can be a plurality of circular paths, as illustrated in the figure. Alternatively, the predetermined path can be a helically configured path starting from the starting point S and moving along a spiral line towards the centre of the radiograph. Another alternative is a path comprising a plurality of segments of a circle.
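A minimal sketch of this kind of search is given below, assuming the spiral-path variant and a simple threshold on the intensity difference between consecutive samples. Step sizes, thresholds and the synthetic test image are illustrative choices, not values from the described algorithm.

```python
# Hedged sketch of the contrast-identification search along a spiral path.
import numpy as np

def spiral_path(start_xy, centre_xy, turns=8, samples_per_turn=360):
    """Yield integer pixel coordinates along a spiral from the starting point S
    towards the centre of the radiograph."""
    sx, sy = start_xy
    cx, cy = centre_xy
    r0 = np.hypot(sx - cx, sy - cy)
    theta0 = np.arctan2(sy - cy, sx - cx)
    n = turns * samples_per_turn
    for i in range(n):
        t = i / n
        r = r0 * (1.0 - t)                       # radius shrinks towards the centre
        theta = theta0 + 2 * np.pi * turns * t
        yield int(round(cx + r * np.cos(theta))), int(round(cy + r * np.sin(theta)))

def find_high_contrast(image, start_xy, threshold=40):
    """Return pixels on the search path where the contrast to the previous
    sample exceeds the threshold (candidate borders of sleeve or pin)."""
    h, w = image.shape
    centre = (w / 2.0, h / 2.0)
    hits, prev = [], None
    for x, y in spiral_path(start_xy, centre):
        if 0 <= x < w and 0 <= y < h:
            val = int(image[y, x])
            if prev is not None and abs(val - prev) > threshold:
                hits.append((x, y))
            prev = val
    return hits

# Usage on a synthetic radiograph with a bright pin-like stripe near the centre.
test_image = np.zeros((512, 512), dtype=np.uint8)
test_image[200:260, 240:250] = 255
borders = find_high_contrast(test_image, start_xy=(30, 30))
```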
According to one embodiment of the invention the starting point S is set by the user of the arrangement according to the present invention. For example, in the case where the monitoring means is a touch screen, the user may set the starting point by touching the screen close to the drill sleeve 232 and actively choose the path comprising a plurality of segments of a circle. This is referred to as setting the starting point S semi-automatically. Alternatively the user indicates the marker configuration manually by means of a touch screen coupled to the computer 60.
The border of the drill sleeve 501 and the border of the drilling pin 502 are identified by the procedure described above. A marker line 440 having a specific length L is generated. Calibration of the marker line is performed by measuring two markings 298 and 299 with a known distance between them on the drill sleeve 232. The marker is provided in a direction parallel to a length axis x of the drill sleeve 232 and drilling pin 209, starting at a distal end of the drilling pin 209 as shown in the figure. The marker line is an extrapolation of the drill sleeve 232 and the drilling pin 209.
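A hedged sketch of the calibration and extrapolation step follows: the pixel scale is derived from the two markings with a known physical separation, and the marker line is drawn as an extrapolation of the sleeve/pin axis beyond the pin's distal end. All coordinates, the 20 mm marking separation and the straight-line model are assumptions for illustration.

```python
# Illustrative sketch of marker-line calibration and extrapolation.
import numpy as np

def pixel_scale_mm(marker_a_px, marker_b_px, known_distance_mm):
    """Millimetres represented by one pixel, from two detected markings."""
    return known_distance_mm / np.linalg.norm(np.subtract(marker_b_px, marker_a_px))

def marker_line(pin_distal_px, axis_point_px, length_mm, mm_per_px):
    """Extrapolate the tool axis beyond the pin's distal end by length L."""
    direction = np.subtract(pin_distal_px, axis_point_px).astype(float)
    direction /= np.linalg.norm(direction)
    end_px = np.asarray(pin_distal_px, dtype=float) + direction * (length_mm / mm_per_px)
    return np.asarray(pin_distal_px, dtype=float), end_px

# Usage: scale from markings 298/299 (assumed 20 mm apart), then a 100 mm line.
scale = pixel_scale_mm((210, 340), (250, 340), known_distance_mm=20.0)
p0, p1 = marker_line(pin_distal_px=(300, 400), axis_point_px=(200, 400),
                     length_mm=100.0, mm_per_px=scale)
```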
Figure 6 illustrates a flowchart of a method for accurately positioning a tool part provided on a robot arm in a spaced relation to an object according to an embodiment of the invention; a schematic sketch of the listed steps in code follows the step listing below.
A first method step s601 comprises the sub-steps of: -processing a first data entity comprising a first position information of the object in a first plane based upon a fluoroscopic radiograph;
-processing a second data entity comprising a second position information of the object in a second plane based upon a fluoroscopic radiograph;
-displaying said first and second processed data entities separately; -generating a marker configuration based upon an actual position of the tool part;
-displaying the marker configuration together with said displayed first and second processed data entities;
-positioning the tool part in a first plane based on the displayed first processed data entity and a position of the marker configuration; and -positioning the tool part in a second plane based on the displayed second processed data entity and a position of the marker configuration.
Preferably the method further comprises the steps of:
-identifying a first position of the robot tool based on said first position information; and
- identifying a second position of the robot tool based on said second position information.
Preferably the method further comprises the steps of: -activating the robot to move the tool part in a plane corresponding to the first plane; and
-activating the robot to move the tool part in a plane corresponding to the second plane.
Preferably the method further comprises the step of:
-adjusting a position of the tool part at least once based on updated position information of the marker configuration in the first and second planes.
Preferably the method further comprises the step of:
-positioning a fluoroscopic device so as to obtain said first and second data entities.
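The schematic sketch referred to above makes the ordering of the Figure 6 steps concrete. Every function in it is a trivial stand-in written for this illustration; nothing here is an API of the described arrangement.

```python
# Schematic orchestration of the Figure 6 steps with stand-in functions so the
# sequence runs end to end; an illustration of ordering only.
def process(radiograph):                 # stand-in for radiograph processing
    return {"image": radiograph}

def generate_marker(tool_pose):          # stand-in for marker generation
    return {"line": tool_pose}

def position_in_plane(plane, entity, marker):
    print(f"positioning in {plane} plane using {entity} and marker {marker}")

def run(radiograph_1, radiograph_2, tool_pose):
    first = process(radiograph_1)        # first data entity, first plane
    second = process(radiograph_2)       # second data entity, second plane
    marker = generate_marker(tool_pose)  # based on the actual tool-part position
    position_in_plane("first", first, marker)
    position_in_plane("second", second, marker)

run("radiograph A", "radiograph B", tool_pose=(0, 0, 0))
```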
Figure 7 illustrates a flowchart of a method for positioning a tool provided on a robot arm according to an embodiment of the invention.
According to a first method step s710 the robot (not shown) is placed adjacent to the object 90. This is performed manually or by means of a remote control. After the method step s710 a method step s713 follows.
According to the method step s713 a tool, for example a drill or saw blade, is mounted on the tool part. After the method step s713 a method step s716 follows.
According to the method step s716 two perpendicular C-arm radiographs are visualized on the screens 2 and 3. According to one embodiment, a marking device comprising two steel pins, each having a steel ring provided at one end thereof, is used for determining the position of the tool. After the method step s716 a method step s719 follows.

According to the method step s719 the positions of the representations of the tool or the marking device are identified in processed radiographs by means of the edge detection algorithm. After the method step s719 a method step s722 follows.
According to the method step s722 the tool or the marking line is visually outlined on the monitoring means 140 as a colored line. After the method step s722 a method step s725 follows.
According to the method step s725 the robot is activated and start position coordinates of the tool mounted on the tool part are communicated to the computer. In this way the outlined tool on the monitor is registered together with the start position coordinates of the tool. After the method step s725 a method step s728 follows.
According to the method step s728 the robot tool arm is then retracted and the robot is moved by activation of the control device 150, which is controlled manually by the user. After the method step s728 a method step s731 follows.
When the robot tool is moved, updated coordinates of the position of the robot tool are continuously transmitted to the computer 60. The position of the marking line 440 outlining an extrapolation of the tool on the screens 410 and 415 is updated according to the new, actual, coordinates of the tool. In this way the user can see how a representation of the actual position of the robot tool changes location. It should be noted that the robot tool is moved accordingly.
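A minimal sketch of keeping the marker line in step with the tool follows: each new 3-D tool coordinate is projected into the two radiograph planes and the overlay endpoints are recomputed. The orthographic projections, scales and origins below are assumptions, not a calibrated imaging model.

```python
# Hedged sketch: project updated tool coordinates onto the two screens.
import numpy as np

def project_top(p_mm, mm_per_px=0.5, origin_px=(256, 256)):
    """Top-view radiograph (angle 0 deg): keep x and y, drop the vertical axis."""
    return (origin_px[0] + p_mm[0] / mm_per_px,
            origin_px[1] + p_mm[1] / mm_per_px)

def project_side(p_mm, mm_per_px=0.5, origin_px=(256, 256)):
    """Side-view radiograph (angle 90 deg): keep y and z, drop the x axis."""
    return (origin_px[0] + p_mm[1] / mm_per_px,
            origin_px[1] + p_mm[2] / mm_per_px)

def updated_marker_endpoints(tip_mm, tail_mm):
    """Recompute the marker-line endpoints on both screens for a new tool pose."""
    return {"screen_410": (project_top(tip_mm), project_top(tail_mm)),
            "screen_415": (project_side(tip_mm), project_side(tail_mm))}

new_overlay = updated_marker_endpoints(np.array([10.0, 5.0, -30.0]),
                                       np.array([10.0, 5.0, 70.0]))
```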
According to the method step s731 an adjustment, if necessary, is performed. Two additional processed radiographs are visualized and a non-optimal position of the tool, for example encountered due to distortion of the previously visualized processed fluoroscopic images, is adjusted by controlling the robot arm using the joystick, thereby moving the robot tool within a defined limited distance and a defined limited angle, for example +/- 5 mm or +/- 5 degrees. After the method step s731 the method ends.
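The adjustment limit mentioned above can be illustrated as a simple clamp of the requested correction to the permitted envelope (here using the +/- 5 mm and +/- 5 degree example); this is only an illustration of the constraint, not the control unit's code.

```python
# Sketch: clamp a translational/rotational correction to a defined envelope.
def clamp(value, limit):
    return max(-limit, min(limit, value))

def limited_adjustment(delta_mm, delta_deg, max_mm=5.0, max_deg=5.0):
    """Clamp a requested correction to the permitted distance and angle limits."""
    return ([clamp(d, max_mm) for d in delta_mm],
            [clamp(a, max_deg) for a in delta_deg])

adj_translation, adj_rotation = limited_adjustment([7.5, -2.0, 0.0], [1.0, -9.0, 0.5])
```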
When the representation of the actual position of the tool is optimal the robot is halted and can be made totally inert by switching off the power.
Figure 8a illustrates a flowchart of a method for navigating the robot arm according to an embodiment of the invention.
According to a method step s850 a user navigates the robot arm by controlling movements of the tool part depending upon the processed displayed position information of the object in one plane and corresponding movements of the marker configuration; and also by controlling movements of the tool part depending upon the processed displayed position information of the object in another plane and corresponding movements of the marker. Thereafter the method ends.
Figure 8b illustrates a flowchart of a method for visualizing robot arm movements on the monitoring means according to an embodiment of the invention.
According to a method step s860 corresponding movements of the tool part are indicated by the marker configuration on the screens 410 and 415. The user first positions the marker configuration in a desired position corresponding to a first plane by using the joystick controlling movements of the robot. Thereafter the user actively switches working mode of the robot, i.e. other constraints are placed upon the robot arm movements. Secondly the user positions the marker configuration in a desired position corresponding to a second plane by using the joystick controlling movements of the robot. Thereafter the method ends.
With reference to Figure 9, a diagram of one way of embodying an apparatus 700 is shown. The above-mentioned control unit 175 and computer 60 may include the apparatus 700. The apparatus 700 comprises a non-volatile memory 720, a data processing device 730 and a read/write memory 740. The memory 720 has a first memory portion 750 wherein a computer program, such as an operating system, is stored for controlling the function of the apparatus 700. Further, the apparatus 700 comprises a bus controller, a serial communication port, I/O means, an A/D converter, a time and date entry and transmission unit, an event counter and an interrupt controller responsive to a dead man's grip/emergency stop (not shown).
The data processing device 730 may be embodied by, for example, a microprocessor.
The memory 720 also has a second memory portion 760. A program may be stored in the second memory portion 760 in an executable manner or in a compressed state.
When it is described that the data processing device 730 performs a certain function it should be understood that the data processing device 730 performs a certain part of the program which is stored in the memory 760 or a certain part of the program which is stored in the recording medium 762.
The data processing device 730 may communicate with a data port 799 by means of a data bus 783. The non-volatile memory 720 is adapted for communication with the data bus 783 via a non-volatile memory data bus 784. The separate non-volatile recording medium 762 is adapted to communicate with the data processing device 730 via a recording medium data bus 789. The read/write memory 740 is adapted to communicate with the data bus 783 via a read/write memory data bus 785.
The apparatus 700 can perform parts of the methods described with reference to Figures 5, 6, 7 and 8a-b by means of the data processing device 730 running the program stored in the memory portion 760. When the apparatus 700 runs the program, parts of the methods described with reference to Figures 5, 6, 7 and 8a-b are executed.
When data is received on the data port 799 said input data is temporarily stored in the read/write memory 740. When the received input data has been temporarily stored, the data processing device 730 is set up to perform execution of code in a manner described above. According to one embodiment, data received on the data port 799 comprises information about updated coordinates of the robot arm and/or the tool part.
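The receive-then-process pattern described above can be sketched as buffering incoming coordinate updates before consuming them. The message framing (JSON over a byte string) and field names below are purely illustrative assumptions; only the store-then-process ordering comes from the description.

```python
# Hedged sketch of buffering coordinate updates received on a data port.
import json
from collections import deque

class CoordinatePort:
    def __init__(self):
        self.buffer = deque()              # stands in for the read/write memory

    def receive(self, raw: bytes) -> None:
        """Temporarily store an incoming update before it is processed."""
        self.buffer.append(json.loads(raw.decode("utf-8")))

    def process_pending(self):
        """Consume buffered updates, e.g. to refresh the marker configuration."""
        while self.buffer:
            update = self.buffer.popleft()
            yield update["tool_xyz"], update.get("arm_joints")

port = CoordinatePort()
port.receive(b'{"tool_xyz": [10.0, 5.0, -30.0], "arm_joints": [0, 0, 0, 0, 0, 0]}')
for tool_xyz, joints in port.process_pending():
    pass  # here the computer would update the displayed marker configuration
```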
According to an embodiment of the invention there is provided a computer programme comprising a programme code for performing the method steps depicted with reference to Fig. 7 when said computer programme is run on a computer.
According to an embodiment of the invention there is provided a computer programme product directly storable in an internal memory of a computer, comprising a computer programme for performing the method steps depicted with reference to Fig. 7, when said computer programme is run on the computer.
According to an embodiment of the invention there is provided a computer programme comprising a programme code for identifying representations of the tool by performing a contrast identification procedure when said computer programme is run on a computer.
The foregoing description of embodiments of the present invention has been provided for the purposes of illustration and description. It is not intended to be exhaustive or to limit the invention to the precise forms disclosed. Obviously, many modifications and variations will be apparent to practitioners skilled in the art. The embodiments were chosen and described in order to best explain the principles of the invention and its practical applications, thereby enabling others skilled in the art to understand the invention for various embodiments and with the various modifications as are suited to the particular use contemplated. The references disclosed herein, including patents, are each specifically incorporated by reference in their entirety. However, the citation of such references shall not be construed as an admission that the references are prior art to the present invention.

Claims

1. Arrangement for positioning a tool part (120) in spaced relation to an object (90), the arrangement comprising: a robot having a robot arm with at least six degrees of freedom, wherein the tool part (120) is provided on a distal end of the robot arm; said robot is controlled by user utilization of a control member (150) of a control unit (175); a computer (60) is arranged to receive a first data entity comprising a first position information of the object in a first plane and a second data entity comprising a second position information of the object in a second plane; the computer (60) is arranged to process said first data entity and create a first data entity display; the computer (60) is arranged to process said second data entity and create a second data entity display, said second data entity display being distinct from said first data entity display; said control unit (175) is arranged to send position coordinates of the tool part relative to the object to the computer (60); said computer (60) is arranged to establish a relationship between said position coordinates of the tool part and said first and second object position information; said computer (60) is arranged to display a marker configuration (440) on at least one of said first and second data entity displays based on said established relationship; the control unit (175) is arranged to accurately position the tool part depending upon user utilization of the control member (150); and the user utilizes the control member in response to a position of the marker configuration.
2. Arrangement according to claim 1 wherein the position of the marker configuration is updated continuously.
3. Arrangement according to claim 1 or 2 wherein said first and second data entities are generated by a fluoroscopic device (10).
4. Arrangement according to any of claims 1-3 wherein said first and second data entities are two-dimensional fluoroscopic images of the object (90), said images having a mutual angle displacement of 20-160 degrees.
5. Arrangement according to claim 4 wherein said mutual angle displacement is 90 degrees.
6. Arrangement according to any of claims 1-5 wherein the tool part (120) comprises a drill, saw, pin introducer, nail introducer or cannula being detachably arranged on said tool part (120).
7. Arrangement according to any of claims 1-6, wherein the computer (60) is arranged to generate the marker configuration (440) by automatically extrapolating an axis of the tool part (120); and wherein a tool part position is identified by means of a contrast identification process.
8. Arrangement according to any of claims 1-7, wherein said computer (60) is arranged to generate the marker configuration (440) in response to a manual action or in a semi-automatic way.
9. Arrangement according to any of claims 1-8, wherein the position of the marker configuration (440) is updated in a sequential way.
10. Method for accurately positioning a tool part (120) provided on a robot arm in a spaced relation to an object (90), comprising the steps of: -processing a first data entity comprising a first position information of the object (90) in a first plane based upon a fluoroscopic radiograph;
-processing a second data entity comprising a second position information of the object (90) in a second plane based upon a fluoroscopic radiograph; -displaying said first and second processed data entities separately;
-generating a marker configuration based upon an actual position of the tool part; -displaying the marker configuration (440) together with said displayed first and second processed data entities;
-positioning the tool part in a first plane based on the displayed first processed data entity and a position of the marker configuration (440); and
-positioning the tool part in a second plane based on the displayed second processed data entity and a position of the marker configuration (440).
11. Method according to claim 10 wherein the method further comprises the steps of: -identifying a first position of the robot tool based on said first position information; and
- identifying a second position of the robot tool based on said second position information.
12. Method according to claim 10 or 11 wherein the method further comprises the steps of:
-activating the robot to move the tool part in a plane corresponding to the first plane; and
-activating the robot to move the tool part in a plane corresponding to the second plane.
13. Method according to any of claims 10-12 wherein the method further comprises the step of:
-adjusting a position of the tool part at least once based on updated position information of the marker configuration in the first and second planes.
14. Method according to any one of claims 10-13 wherein the method further comprises the step of:
-positioning a fluoroscopic device so as to obtain said first and second data entities.
15. Use of an arrangement for positioning a tool part (120) in spaced relation to an object (90), the arrangement comprising: a robot having a robot arm with at least six degrees of freedom, wherein the tool part (120) is provided on a distal end of the robot arm; said robot is controlled by user utilization of a control member (150) of a control unit (175); a computer (60) is arranged to receive a first data entity comprising a first position information of the object in a first plane and a second data entity comprising a second position information of the object in a second plane; the computer (60) is arranged to process said first data entity and create a first data entity display; the computer (60) is arranged to process said second data entity and create a second data entity display, said second data entity display being distinct from said first data entity display; said control unit (175) is arranged to send position coordinates of the tool part relative to the object to the computer (60); said computer (60) is arranged to establish a relationship between said position coordinates of the tool part and said first and second object position information; said computer (60) is arranged to display a marker configuration (440) on at least one of said first and second data entity displays based on said established relationship; the control unit (175) is arranged to accurately position the tool part depending upon user utilization of the control member (150); and the user utilizes the control member in response to a position of the marker configuration.
16. Computer programme comprising a programme code for performing the method steps of any of claims 10-14 when said computer programme is run on a computer.
17. Computer programme product directly storable in an internal memory of a computer, comprising a computer programme for performing the method steps according to any of claims 10-14, when said computer programme is run on the computer.
18. Computer programme comprising a programme code for identifying the first and second position by performing a contrast identification procedure when said computer programme is run on a computer.
19. A method according to claim 10, wherein the fluoroscopic radiographs are prepared using a C-arm.
EP05763282A 2004-07-26 2005-07-26 Method and arrangement for positioning a tool Withdrawn EP1786349A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
SE0401928A SE0401928D0 (en) 2004-07-26 2004-07-26 Method and arrangement for positioning a tool
PCT/SE2005/001182 WO2006011848A1 (en) 2004-07-26 2005-07-26 Method and arrangement for positioning a tool

Publications (1)

Publication Number Publication Date
EP1786349A1 true EP1786349A1 (en) 2007-05-23

Family

ID=32867291

Family Applications (1)

Application Number Title Priority Date Filing Date
EP05763282A Withdrawn EP1786349A1 (en) 2004-07-26 2005-07-26 Method and arrangement for positioning a tool

Country Status (3)

Country Link
EP (1) EP1786349A1 (en)
SE (1) SE0401928D0 (en)
WO (1) WO2006011848A1 (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB2428110A (en) * 2005-07-06 2007-01-17 Armstrong Healthcare Ltd A robot and method of registering a robot.
JP2022025892A (en) * 2020-07-30 2022-02-10 セイコーエプソン株式会社 Teaching method and robot system

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5251127A (en) * 1988-02-01 1993-10-05 Faro Medical Technologies Inc. Computer-aided surgery apparatus
US5417210A (en) * 1992-05-27 1995-05-23 International Business Machines Corporation System and method for augmentation of endoscopic surgery
US5799055A (en) * 1996-05-15 1998-08-25 Northwestern University Apparatus and method for planning a stereotactic surgical procedure using coordinated fluoroscopy
US6285902B1 (en) * 1999-02-10 2001-09-04 Surgical Insights, Inc. Computer assisted targeting device for use in orthopaedic surgery

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See references of WO2006011848A1 *

Also Published As

Publication number Publication date
SE0401928D0 (en) 2004-07-26
WO2006011848A1 (en) 2006-02-02

Similar Documents

Publication Publication Date Title
Lueth et al. A surgical robot system for maxillofacial surgery
TWI693923B (en) Navigation method for medical operation and robotic system
US6434416B1 (en) Surgical microscope
US6198794B1 (en) Apparatus and method for planning a stereotactic surgical procedure using coordinated fluoroscopy
EP3595568A1 (en) System for guiding a surgical tool relative to a target axis in spine surgery
CN112603538A (en) Orthopedic navigation positioning system and method
US11154369B2 (en) Environmental mapping for robotic assisted surgery
WO2010044852A2 (en) Imaging platform to provide integrated navigation capabilities for surgical guidance
US20190314097A1 (en) Secondary instrument control in a computer-assisted teleoperated system
US11185373B2 (en) Method for recovering a registration of a bone
US20210228282A1 (en) Methods of guiding manual movement of medical systems
US20210315647A1 (en) Surgical Robot and Method for Displaying Image of Patient Placed on Surgical Table
US20090069945A1 (en) Adjusting and Guiding System for Tools
CN109996510B (en) Systems and methods for controlling tools having articulatable distal portions
CN113017834B (en) Joint replacement operation navigation device and method
CN112043382A (en) Surgical navigation system and use method thereof
JP7029932B2 (en) Systems and methods for measuring the depth of instruments
USRE40176E1 (en) Apparatus and method for planning a stereotactic surgical procedure using coordinated fluoroscopy
KR20220024055A (en) Tracking System Field of View Positioning System and Method
WO2006011848A1 (en) Method and arrangement for positioning a tool
WO2022138495A1 (en) Surgery assistance robot, surgery assistance system, and method for controlling surgery assistance robot
CN115211962A (en) Surgical system for computer-assisted navigation during surgical procedures
CN113907886A (en) Operation execution arm, system and control system of spine operation robot
Finlay et al. PathFinder image guided robot for neurosurgery
EP4302718A1 (en) Surgical navigation system

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

17P Request for examination filed

Effective date: 20070222

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IS IT LI LT LU LV MC NL PL PT RO SE SI SK TR

DAX Request for extension of the european patent (deleted)
STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION IS DEEMED TO BE WITHDRAWN

18D Application deemed to be withdrawn

Effective date: 20090203