WO2017055990A1 - Optical registration of a remote center of motion robot - Google Patents


Info

Publication number
WO2017055990A1
Authority
WO
WIPO (PCT)
Prior art keywords
patient
robot
effector
rcm
optical end
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/IB2016/055743
Other languages
English (en)
French (fr)
Inventor
David Paul Noonan
Aleksandra Popovic
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Koninklijke Philips NV
Original Assignee
Koninklijke Philips NV
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Koninklijke Philips NV filed Critical Koninklijke Philips NV
Priority to CN201680066685.7A: CN108348299B (zh)
Priority to JP2018515500A: JP6865739B2 (ja)
Priority to EP16781202.3A: EP3355822A1 (en)
Priority to US15/762,757: US20200246085A1 (en)
Publication of WO2017055990A1 (en)
Current legal status: Ceased


Classifications

    • A61B34/30 Surgical robots
    • A61B34/10 Computer-aided planning, simulation or modelling of surgical operations
    • A61B90/11 Instruments for stereotaxic surgery with guides for needles or instruments, e.g. arcuate slides or ball joints
    • A61B90/13 Instruments for stereotaxic surgery with guides for needles or instruments guided by light, e.g. laser pointers
    • A61B10/0233 Pointed or sharp biopsy instruments
    • A61B2034/107 Visualisation of planned trajectories or target regions
    • A61B2034/2059 Tracking techniques: mechanical position encoders
    • A61B2034/2065 Tracking using image or pattern recognition
    • A61B2034/2068 Surgical navigation using pointers, e.g. pointers having reference marks for determining coordinates of body points
    • A61B2034/301 Surgical robots for introducing or steering flexible instruments inserted into the body, e.g. catheters or endoscopes
    • A61B2090/363 Use of fiducial points
    • A61B2090/3966 Radiopaque markers visible in an X-ray image
    • A61B2090/3983 Reference marker arrangements for use with image guided surgery

Definitions

  • the present disclosure generally relates to minimally invasive procedures requiring an interventional tool to be inserted into a patient along a defined tool trajectory through a specific incision point, particularly minimally invasive neurosurgical procedures (e.g., a biopsy).
  • the present disclosure more particularly relates to a registration of a remote center-of-motion ("RCM") by a remote center-of-motion robot to a planned incision point into the patient for accurately positioning and orienting the interventional tool along a planned tool trajectory.
  • Image guided brain biopsy allows surgeons to accurately target deep seated brain lesions in a minimally invasive manner.
  • the patient's head is immobilized and registered to a pre-operative imaging scan (e.g., CT, MRI, US, etc.) using a tracking and localization system (e.g., optical, electromagnetic, mechanical or combination thereof).
  • such registration is performed using markers placed on the skull of the patient and/or tracked manual pointers.
  • an appropriate location of an incision point into the patient is identified by the surgeon for the biopsy.
  • the surgeon then manually aligns the insertion angles of the tracked biopsy needle based on feedback from the image guided tracking system.
  • the image guidance tracking system confirms the needle trajectory and identifies when the correct insertion depth has been reached.
  • image guidance systems as known in the art may provide a mechanical needle guide which the surgeon aligns to the planned tool trajectory prior to needle insertion.
  • the present disclosure provides inventions using a robot-mounted optical end-effector (e.g., a laser pointer or an endoscope) for registering a remote center-of-motion ("RCM") robot to an image of the patient during a minimally invasive procedure (e.g., a minimally invasive neurosurgical procedure).
  • an exact position and orientation of a planned tool trajectory for performing the procedure is automatically and accurately defined. This in turn allows the surgeon to deploy the interventional tool in a precise, controlled manner with minimal risk of human error.
  • One form of the inventions of the present disclosure is a robotic surgical system for a minimally invasive procedure involving a planned tool trajectory through a planned incision point into a patient.
  • the robotic surgical system employs an optical end-effector (e.g., a laser pointer or an endoscope) and a RCM robot for rotating the optical end-effector about a remote center-of-motion defined by a structural configuration of the RCM robot.
  • the robotic surgical system further employs a robot controller for controlling an optical pointing by the RCM robot of the optical end-effector to one or more markers attached to the patient, and for controlling an axial alignment by the RCM robot of the optical end-effector to the planned tool trajectory as illustrated within a volume image of the patient, based on a registration of the remote center-of-motion to the planned incision point as illustrated within the volume image of the patient, derived from the optical pointing by the RCM robot of the optical end-effector to the registration marker(s) attached to the patient.
  • a second form of the inventions of the present disclosure is a robotic surgical method for a minimally invasive procedure involving a planned tool trajectory through a planned incision point into a patient.
  • the robotic surgical method involves a RCM robot optically pointing an optical end-effector to one or more markers attached to the patient, and a registration module deriving a registration of a remote center-of-motion to the planned incision point as illustrated within a volume image of the patient from the optical pointing by the RCM robot of the optical end-effector to the marker(s) attached to the patient.
  • the remote center-of-motion is defined by a structural configuration of the RCM robot.
  • the robotic surgical method further involves the RCM robot axially aligning the optical end-effector to the planned tool trajectory as illustrated within the volume image of the patient based on the registration by the registration module of the remote center-of-motion to the planned incision point as illustrated within the volume image of the patient.
  • optical end-effector broadly encompasses any device serving as an end-effector of a robot and having optical capabilities for emitting and/or receiving any form of radiation.
  • RCM robot broadly encompasses any robot having a structural configuration defining a remote center-of-motion whereby the robot or a portion thereof is rotatable about a point spatially fixed from the robot.
  • examples of an optical end-effector include, but are not limited to, any type of laser pointer and endoscope as known in the art and exemplarily described herein.
  • examples of a RCM robot include, but are not limited to, any type of concentric arc robot as known in the art and exemplarily described herein.
  • controller broadly encompasses all structural configurations of an application specific main board or an application specific integrated circuit housed within or linked to a workstation for controlling an application of various inventive principles of the present disclosure as subsequently described herein.
  • the structural configuration of the controller may include, but is not limited to, processor(s), computer-usable/computer readable storage medium(s), an operating system, application module(s), peripheral device controller(s), slot(s) and port(s).
  • Examples of the workstation include, but are not limited to, an assembly of one or more computing devices (e.g., a client computer, a desktop and a tablet), a display/monitor, and one or more input devices (e.g., a keyboard, joysticks and mouse).
  • the term "application module" broadly encompasses a component of the controller consisting of an electronic circuit and/or an executable program (e.g., executable software and/or firmware) for executing a specific application.
  • For purposes of the inventions of the present disclosure, descriptive labeling of a controller herein as a "robot" controller and an "imaging" controller serves to identify a particular controller as described and claimed herein without specifying or implying any additional limitation to the term "controller".
  • Similarly, descriptive labeling of an application module herein as a "registration module" serves to identify a particular application module as described and claimed herein without specifying or implying any additional limitation to the term "application module".
  • FIG. 1 illustrates a first exemplary embodiment of a minimally invasive neurosurgical procedure in accordance with the inventive principles of the present disclosure.
  • FIG. 2 illustrates flowcharts representative of an exemplary embodiment of a robotic image guidance method in accordance with the inventive principles of the present disclosure.
  • FIGS. 3A-3D illustrate exemplary embodiments of registration markers in accordance with the inventive principles of the present disclosure.
  • FIGS. 4A-4G illustrate an exemplary registration of a RCM robot to a patient in accordance with the inventive principles of the present disclosure.
  • FIG. 5 illustrates a second exemplary embodiment of a minimally invasive neurosurgical procedure in accordance with the inventive principles of the present disclosure.
  • FIG. 6 illustrates a third exemplary embodiment of a minimally invasive neurosurgical procedure in accordance with the inventive principles of the present disclosure.
  • FIGS. 7A and 7B illustrate exemplary embodiments of workstations in accordance with the inventive principles of the present disclosure.
  • FIG. 1 teaches basic inventive principles of using a robot mounted laser pointer for registering a remote center-of-motion (“RCM”) robot to an image of a patient during a minimally invasive neurosurgical procedure. From this description, those having ordinary skill in the art will appreciate how to apply the inventive principles of the present disclosure to various optical end-effectors for registering a RCM robot to an image of a patient during any type of minimally invasive procedure.
  • an imaging phase of a minimally invasive biopsy utilizes an imaging controller 20, an imaging modality 22 (e.g., a CT, MRI or US imaging modality), and a communication path 23 (e.g., wired/wireless connection(s)) between imaging controller 20 and imaging modality 22.
  • the imaging phase of the minimally invasive biopsy involves imaging controller 20 controlling a display of a volume image 21, generated by imaging modality 22 as known in the art, illustrative of markers (symbolized as black dots) attached to a head of a patient 10 immobilized by a clamp 11.
  • imaging controller 20 further controls a user planning within volume image 21 as known in the art of a location of an incision point on the head of patient 10 (symbolized by a circle between the markers of volume image 21) and a tool trajectory through the incision point for purposes of reaching a target lesion within the head of patient 10 (symbolized by an X within the marker circle of volume image 21 representing a tool trajectory perpendicular to the incision point).
  • Still referring to FIG. 1, a registration phase of the minimally invasive biopsy utilizes a robot platform 30, a RCM robot in the form of a concentric arc robot 40, an optical end-effector in the form of a laser pointer 50, a robot controller 60a, and a communication path 63a (e.g., wired/wireless connection(s)) between robot controller 60a and concentric arc robot 40, and between robot controller 60a and active embodiments of robot platform 30.
  • robot platform broadly encompasses any platform structurally configured for moving a RCM robot of the present disclosure within a Cartesian coordinate system whereby a remote center-of-motion may be moved to a desired point within the Cartesian coordinate system.
  • robot platform 30 employs a base 31 and a post 32 extending from base 31.
  • Robot platform 30 further employs a robot holding arm 34, and a joint 33 interconnecting robot holding arm 34 to post 32 whereby robot holding arm 34 is translatable, pivotable and/or extendable relative to post 32 within the Cartesian coordinate system.
  • robot platform 30 may be passive in terms of a manual manipulation of post 32 and/or robot holding arm 34, or active in terms of a motorized post 32 and/or a motorized joint 33 controlled by robot controller 60a for commanding a translation, a pivot and/or an extension of base 31 and/or robot holding arm 34 via communication path 63a.
  • post 32 and/or joint 33 may include encoders (not shown) for generating encoded signals informative of a pose of post 32 relative to base 31 and/or of a pose of robot holding arm 34 within the Cartesian coordinate system whereby robot controller 60a may track post 32 and/or robot holding arm 34.
  • Concentric arc robot 40 employs a pitch arc 42 mechanically coupled to a pitch actuator 41, and mechanically coupled to or physically integrated with a yaw actuator 43.
  • concentric arc robot 40 further includes a yaw arc 44 mechanically coupled to yaw actuator 43, and mechanically coupled to or physically integrated with an end-effector holder 45.
  • Pitch actuator 41 includes an encoded motor (not shown) controllable by robot controller 60a via communication path 63a for selectively actuating pitch actuator 41 to simultaneously revolve pitch arc 42, yaw actuator 43, yaw arc 44, end-effector holder 45 and laser pointer 50 about a pitch axis PA of pitch actuator 41 as symbolized by the bidirectional arrow encircling pitch axis PA.
  • Yaw actuator 43 includes an encoded motor (not shown) controllable by robot controller 60a via communication path 63a for selectively actuating yaw actuator 43 to thereby revolve yaw actuator 43, yaw arc 44, end-effector holder 45 and laser pointer 50 about a yaw axis YA of yaw actuator 43 as symbolized by the directional arrow encircling yaw axis YA.
  • End-effector holder 45 is structurally configured as known in the art to hold laser pointer 50 whereby a laser beam LB emitted by laser pointer 50 is aligned with a longitudinal axis of end-effector holder 45.
  • a relative orientation of pitch actuator 41, yaw actuator 43 and end-effector holder 45 defines a remote center-of-motion RCM as an intersection of pitch axis PA, yaw axis YA and the end-effector axis (represented by laser beam LB).
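The geometry above admits a compact forward model. The sketch below is an illustrative assumption, not the patent's calibrated kinematics: it takes the pitch axis PA as x, the yaw axis YA as y, both orthogonal and passing through the RCM, and the beam at rest along z; a real system would substitute the robot's calibrated axes.

```python
import numpy as np

def rot_x(a):
    """Rotation about the x-axis, taken here as pitch axis PA."""
    c, s = np.cos(a), np.sin(a)
    return np.array([[1.0, 0.0, 0.0],
                     [0.0,   c,  -s],
                     [0.0,   s,   c]])

def rot_y(a):
    """Rotation about the y-axis, taken here as yaw axis YA."""
    c, s = np.cos(a), np.sin(a)
    return np.array([[  c, 0.0,   s],
                     [0.0, 1.0, 0.0],
                     [ -s, 0.0,   c]])

def beam_direction(pitch, yaw, b0=(0.0, 0.0, 1.0)):
    """Unit direction of laser beam LB in the robot frame for a given
    pair of pitch/yaw encoder angles (radians)."""
    return rot_x(pitch) @ rot_y(yaw) @ np.asarray(b0)
```

Because both rotations pass through the RCM, any pitch/yaw command changes only the beam's direction, never the point it pivots about, which is exactly the property the registration phase exploits.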
  • robot controller 60a executes a servo module 61a as known in the art for strategically positioning remote center-of-motion RCM relative to the head of patient 10.
  • robot controller 60a executes servo module 61a as known in the art for tactically orienting laser pointer 50 relative to the markers attached to the head of the patient 10.
  • the registration phase of the minimally invasive biopsy involves robot controller 60a executing a registration module 62a for registering remote center-of-motion RCM to the location of the incision point within volume image 21 of the head of patient 10 whereby laser beam LB is aligned with the tool trajectory TT planned during the imaging phase.
  • a biopsy phase of the minimally invasive biopsy involves a removal of laser pointer 50 from end effector holder 45 and an insertion of a tool guide 70 within end-effector holder 45. Based on the registered alignment of laser beam LB to the planned tool trajectory TT during the registration phase, a biopsy needle 71 may be deployed in a precise, controlled manner as measured by tool guide 70 to reach the target lesion within the head of patient 10.
  • To facilitate a further understanding of registration module 62a, the following is a description of an imaging phase and registration phase of an exemplary robotic image guidance method of the present disclosure as shown in FIG. 2 in the context of the minimally invasive biopsy shown in FIG. 1. From this description, those having ordinary skill in the art will appreciate how to apply the inventive principles of the present disclosure for embodying a registration module applicable to any particular minimally invasive procedure and any particular structural configuration of a RCM robot and a robot platform of the present disclosure.
  • FIG. 2 illustrates a flowchart 80 representing actions performed by a surgeon of the exemplary robotic image guidance method of the present disclosure, and a flowchart 100 representing actions performed by controllers of the exemplary robotic image guidance method of the present disclosure.
  • a stage S82 of flowchart 80 encompasses a surgeon attaching radio opaque markers to patient 10 as known in the art
  • a stage S84 of flowchart 80 encompasses the surgeon interfacing with imaging controller 20 whereby imaging modality 22 as controlled by imaging controller 20 during a stage S102 of flowchart 100 generates volume image 21 illustrative of the opaque markers attached to the head of patient 10.
  • the markers may have an identical configuration, or each marker may have a unique shape to facilitate individual identification of the markers within volume image 21.
  • FIG. 3A illustrates a radio opaque marker 130 having a star shape
  • FIG. 3B illustrates a radio opaque marker 131 having a cross shape
  • FIG. 3C illustrates a radio opaque marker 132 having a diamond shape
  • FIG. 3D illustrates a radio opaque marker 133 having a hexagonal shape.
  • a stage S86 of flowchart 80 encompasses the surgeon interfacing with image controller 20 during a stage S104 of flowchart 100 for planning the location of the incision point into the head of patient 10 within volume image 21 as known in the art and for planning the tool trajectory through the incision point to reach the lesion in the head of patient 10 within volume image 21 as known in the art.
  • Still referring to FIGS. 1 and 2, at the onset of the registration phase of FIG. 1, a stage S88 of flowchart 80 encompasses the surgeon manually manipulating a passive robot platform 30 or interfacing with servo module 61a for an active robot platform 30 to arbitrarily position remote center-of-motion RCM spatially from the head of patient 10 as exemplarily shown in FIG. 4A.
  • For un-encoded embodiments of robot platform 30, registration module 62a is not informed via encoded signals of the aforementioned manual or servo control of robot platform 30 and proceeds to a stage S108 of flowchart 100.
  • For encoded embodiments of robot platform 30, a stage S106 of flowchart 100 encompasses registration module 62a being informed via encoded signals of the arbitrary positioning of the remote center-of-motion RCM spatially from the head of patient 10 during stage S88 whereby registration module 62a initiates a tracking of robot platform 30.
  • stage S90 of flowchart 80 encompasses the surgeon interfacing with servo module 61a for sequentially centering laser beam LB of laser pointer 50 on each marker whereby a stage S108 of flowchart 100 encompasses servo module 61a or registration module 62a recording an encoded position of pitch actuator 41 and yaw actuator 43 for a centering of laser beam LB on each marker.
  • FIG. 4B illustrates a servo control by servo module 61a of yaw actuator 43 for centering laser beam LB on a first marker whereby servo module 61a or registration module 62a records an encoded position of pitch actuator 41 and yaw actuator 43 corresponding to the centering of laser beam LB on the first marker.
  • FIG. 4C illustrates a servo control by servo module 61a of pitch actuator 41 for centering laser beam LB on a second marker whereby servo module 61a or registration module 62a records an encoded position of pitch actuator 41 and yaw actuator 43 corresponding to the centering of laser beam LB on the second marker.
  • FIG. 4D illustrates a servo control by servo module 61a of pitch actuator 41 and yaw actuator 43 for centering laser beam LB on a third marker whereby servo module 61a or registration module 62a records an encoded position of pitch actuator 41 and yaw actuator 43 corresponding to the centering of laser beam LB on the third marker.
  • FIG. 4E illustrates a servo control by servo module 61a of pitch actuator 41 and yaw actuator 43 for centering laser beam LB on a fourth marker whereby servo module 61a or registration module 62a records an encoded position of pitch actuator 41 and yaw actuator 43 corresponding to the centering of laser beam LB on the fourth marker.
  • stage S110 of flowchart 100 encompasses registration module 62a processing the recorded encoded positions of pitch actuator 41 and yaw actuator 43 as known in the art to register concentric arc robot 40 to the markers and planned incision location as illustrated in volume image 21.
  • Based on the registration of stage S110, for un-encoded passive or active embodiments of robot platform 30 ("URP"), a stage S112 of flowchart 100 encompasses a servo control of concentric arc robot 40 whereby laser beam LB indicates the planned incision location on the head of patient 10.
  • stage S92 of flowchart 80 encompasses the surgeon marking the incision location on the head of patient 10 as indicated by laser beam LB during stage S112, and a stage S94 of flowchart 80 encompasses the surgeon manually manipulating a passive robot platform 30 or interfacing with servo module 61a for an active robot platform 30 to align the remote center-of-motion RCM with the incision marker during a stage S114 of flowchart 100 as exemplarily shown in FIG. 4G.
  • Registration module 62a provides a graphical and/or textual confirmation of the alignment.
  • stage S96 of flowchart 80 encompasses the surgeon interfacing with servo module 61a for sequentially centering laser beam LB of laser pointer 50 on each marker including the incision marker whereby a stage S116 of flowchart 100 encompasses servo module 61a or registration module 62a recording an encoded position of pitch actuator 41 and yaw actuator 43 for a centering of laser beam LB on each marker.
  • FIGS. 4B-4E are exemplary of stage S116 with the remote center-of-motion being aligned with the incision marker as opposed to being spaced from the head of patient 10.
  • a stage S118 of flowchart 100 encompasses registration module 62a processing the recorded encoded positions of pitch actuator 41 and yaw actuator 43 as known in the art to register the remote center-of-motion RCM to the incision marker as illustrated in volume image 21.
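Once the remote center-of-motion RCM sits at the incision marker, the recorded beam directions toward the markers (robot frame) and the directions from the incision point to the corresponding markers in volume image 21 (image frame) differ by a pure rotation. The patent leaves the solver "as known in the art"; one standard choice is the Kabsch/SVD method, sketched here under that assumption (function name and the Nx3 input layout are illustrative):

```python
import numpy as np

def kabsch_rotation(robot_dirs, image_dirs):
    """Least-squares rotation R with R @ robot_dirs[i] ~ image_dirs[i],
    found via SVD of the cross-covariance (Kabsch method). Inputs are
    Nx3 arrays of unit bearing vectors, N >= 3 and not all collinear."""
    H = np.asarray(robot_dirs).T @ np.asarray(image_dirs)
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))  # guard against a reflection
    return Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
```

In practice the robot-frame bearings would come from the robot's forward kinematics evaluated at the recorded encoder positions, and the image-frame directions from normalizing each marker position minus the incision position in volume image 21.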
  • a stage S120 of flowchart 100 encompasses an automatic servo control via servo module 61a of pitch actuator 41 and/or yaw actuator 43 as needed to axially align laser beam LB with the planned tool trajectory TT as shown in FIG. 1.
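With the registration in hand, stage S120 reduces to solving two angles. Under an assumed axis convention (pitch about x, yaw about y, beam at rest along z; an illustration, not the patent's actual kinematics), the pitch/yaw targets that point the beam along the planned tool trajectory TT have a closed form; the forward model is included for a round-trip check:

```python
import numpy as np

def beam_direction(pitch, yaw):
    """Forward model under the assumed convention: pitch about x and
    yaw about y applied to the rest direction [0, 0, 1]."""
    sy, cy = np.sin(yaw), np.cos(yaw)
    sp, cp = np.sin(pitch), np.cos(pitch)
    return np.array([sy, -sp * cy, cp * cy])

def angles_for_trajectory(t):
    """Invert the model: pitch/yaw targets that point the beam along the
    planned tool trajectory given as a 3-vector in the robot frame."""
    t = np.asarray(t, dtype=float)
    t = t / np.linalg.norm(t)
    yaw = np.arctan2(t[0], np.hypot(t[1], t[2]))
    pitch = np.arctan2(-t[1], t[2])
    return pitch, yaw
```

Because both axes pass through the RCM at the incision marker, commanding these two angles reorients the beam onto the trajectory without displacing the pivot point.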
  • stage S92 of flowchart 80 again encompasses the surgeon marking the incision location on the head of patient 10 as indicated by laser beam LB during stage S112
  • stage S94 of flowchart 80 again encompasses the surgeon manually manipulating a passive robot platform 30 to align the remote center-of-motion RCM with the incision marker during stage S114 of flowchart 100 as exemplarily shown in FIG. 4G.
  • Registration module 62a provides a graphical and/or textual confirmation of the alignment.
  • servo module 61a proceeds from stage S114 to stage S120 for an automatic servo control via servo module 61a of pitch actuator 41 and/or yaw actuator 43 as needed to axially align laser beam LB with the planned tool trajectory TT as shown in FIG. 1.
  • a RCM alignment in accordance with stage S114 and a registration of the remote center-of-motion RCM to the incision marker in accordance with stages S116 and S118 are omitted in view of the registration of the remote center-of-motion RCM to the incision point as illustrated in volume image 21 during stage S110.
  • servo module 61a proceeds from stage S110 to stage S120 for an automatic servo control via servo module 61a of pitch actuator 41 and/or yaw actuator 43 as needed to axially align laser beam LB with the planned tool trajectory TT as shown in FIG. 1.
  • the biopsy phase of FIG. 1 may be conducted in a precise, controlled manner with minimal human error.
  • the registration phase may be further automated and/or may utilize a variety of different optical end effectors.
  • FIG. 5 illustrates an exemplary registration phase of the present disclosure incorporating a camera 140 within the operating space (e.g., attached to robot platform 30 or concentric arc robot 40) in a manner that positions the markers attached to the head of patient 10 and laser pointer 50 within a field of view of camera 140.
  • a servo module 61b is structurally configured to communicate with a controller of camera 140 via communication path 63b to automatically perform the servo control of robot platform 30 when centering laser beam LB of laser pointer 50 on the markers as previously described herein.
  • Such control eliminates any need for the surgeon to interface with servo module 61b during stage S108 and stage S118 (if applicable) as shown in FIG. 2.
  • FIG. 6 illustrates an exemplary registration phase of the present disclosure utilizing an endoscope 51 alternative to laser pointer 50.
  • a servo module 61c is structurally configured to perform an automatic servo control that involves a centering of each marker within a field-of-view of endoscope 51 during stage S108 and stage S118 (if applicable) as shown in FIG. 2. More particularly, for markers 130-133 as shown respectively in FIGS. 3A-3D, servo module 61c performs the automatic servo control by sequentially centering markers 130-133 within the field-of-view of endoscope 51 during stage S108 and stage S118 (if applicable) as shown in FIG. 2.
  • the controllers of FIG. 1 may be installed within a single workstation or distributed across separate workstations.
  • FIG. 7A illustrates an imaging workstation 150 having an imaging controller of the present disclosure (e.g., imaging controller 20) installed therein for CT, MRI or US imaging, and
  • a surgical robot workstation 151 having a robot controller of the present disclosure (e.g., robot controller 60) installed therein for executing a servo module and a registration module of the present disclosure.
  • FIG. 7B illustrates a workstation 152 having an imaging controller of the present disclosure (e.g., imaging controller 20) installed therein for CT, MRI or US imaging, and having a robot controller of the present disclosure (e.g., robot controller 60) installed therein for executing a servo module and a registration module of the present disclosure.
  • a registration module of the present disclosure may be an application of an imaging controller of the present disclosure in communication with a robot controller of the present disclosure.
  • FIGS. 1-9 may be implemented in various combinations of electronic components/circuitry, hardware, executable software and executable firmware and provide functions which may be combined in a single element or multiple elements.
  • the functions of the various features, elements, components, etc. shown/illustrated/depicted in the FIGS. 1-7 can be provided through the use of dedicated hardware as well as hardware capable of executing software in association with appropriate software.
  • When provided by a processor, the functions can be provided by a single dedicated processor, by a single shared processor, or by a plurality of individual processors, some of which can be shared and/or multiplexed.
  • explicit use of the term "processor” should not be construed to refer exclusively to hardware capable of executing software, and can implicitly include, without limitation, digital signal processor ("DSP") hardware, memory (e.g., read only memory (“ROM”) for storing software, random access memory (“RAM”), non-volatile storage, etc.) and virtually any means and/or machine (including hardware, software, firmware, circuitry, combinations thereof, etc.) which is capable of (and/or
  • any flow charts, flow diagrams and the like can represent various processes which can be substantially represented in computer readable storage media and so executed by a computer, processor or other device with processing capabilities, whether or not such computer or processor is explicitly shown.
  • exemplary embodiments of the present disclosure can take the form of a computer program product or application module accessible from a computer-usable and/or computer-readable storage medium providing program code and/or instructions for use by or in connection with, e.g., a computer or any instruction execution system.
  • a computer-usable or computer readable storage medium can be any apparatus that can, e.g., include, store, communicate, propagate or transport the program for use by or in connection with the instruction execution system, apparatus or device.
  • Such exemplary medium can be, e.g., an electronic, magnetic, optical, electromagnetic, infrared or semiconductor system (or apparatus or device) or a propagation medium.
  • Examples of a computer-readable medium include, e.g., a semiconductor or solid state memory, magnetic tape, a removable computer diskette, a random access memory (RAM), a read-only memory (ROM), a flash drive, a rigid magnetic disk and an optical disk.
  • Current examples of optical disks include compact disk - read only memory (CD-ROM), compact disk - read/write (CD-R/W) and DVD.
  • corresponding and/or related systems incorporating and/or implementing the device or such as may be used/implemented in a device in accordance with the present disclosure are also contemplated and considered to be within the scope of the present disclosure.
  • corresponding and/or related method for manufacturing and/or using a device and/or system in accordance with the present disclosure are also contemplated and considered to be within the scope of the present disclosure.
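The axial alignment of laser beam LB with the planned tool trajectory TT (stage S120) reduces, in its simplest geometric form, to computing the pitch and yaw angles that point a beam emanating from the remote center-of-motion toward a target point in the image frame. The sketch below is a hedged illustration only: the function name, the frame convention (yaw about z, pitch as elevation), and the use of NumPy are assumptions for exposition, not the disclosed control computation.

```python
import numpy as np

def pitch_yaw_to_align(rcm, target):
    """Compute the pitch and yaw angles (radians) that aim a beam
    from the remote center-of-motion `rcm` toward `target`, both
    given as 3-D points in a common (e.g., volume-image) frame.

    Convention (an assumption): yaw rotates about the z-axis in the
    x-y plane; pitch is the elevation of the beam out of that plane.
    """
    d = np.asarray(target, dtype=float) - np.asarray(rcm, dtype=float)
    d /= np.linalg.norm(d)                      # unit direction RCM -> target
    yaw = np.arctan2(d[1], d[0])                # azimuth in the x-y plane
    pitch = np.arcsin(np.clip(d[2], -1.0, 1.0)) # elevation above the plane
    return pitch, yaw
```

Because only the direction of the beam matters, the result is independent of the distance between the RCM and the target; the actuator increments would then be the difference between these angles and the encoder-reported pose.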
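The camera-based centering of laser beam LB on each marker (camera 140, stages S108 and S118) is in essence an image-based visual-servo loop: the pixel error between the detected laser spot and the marker centroid is mapped to small pitch/yaw commands until the error vanishes. The following is a hypothetical proportional-control sketch; the scalar gain, the toy plant model, and all names are illustrative assumptions rather than the disclosed control law.

```python
import numpy as np

def servo_step(spot_px, marker_px, gain=0.05):
    """One proportional visual-servo step: convert the pixel error
    between the detected laser spot and the marker centroid into
    small yaw/pitch increments, and report convergence."""
    err = np.asarray(marker_px, dtype=float) - np.asarray(spot_px, dtype=float)
    d_yaw, d_pitch = gain * err                  # horizontal error -> yaw, vertical -> pitch
    return d_pitch, d_yaw, np.linalg.norm(err) < 1.0  # done at sub-pixel error

# Toy closed loop: assume ~10 px of spot motion per unit of commanded angle.
spot = np.array([320.0, 240.0])    # detected laser spot (pixels)
marker = np.array([400.0, 200.0])  # marker centroid (pixels)
for _ in range(100):
    d_pitch, d_yaw, done = servo_step(spot, marker)
    if done:
        break
    spot += 10.0 * np.array([d_yaw, d_pitch])   # simulated plant response
```

In practice the mapping from image error to actuator increments depends on the camera pose and robot kinematics (the image Jacobian); a fixed scalar gain serves only to illustrate the loop structure that removes the surgeon from stages S108 and S118.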

Landscapes

  • Health & Medical Sciences (AREA)
  • Surgery (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Medical Informatics (AREA)
  • General Health & Medical Sciences (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Molecular Biology (AREA)
  • Animal Behavior & Ethology (AREA)
  • Veterinary Medicine (AREA)
  • Public Health (AREA)
  • Robotics (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Pathology (AREA)
  • Physics & Mathematics (AREA)
  • Optics & Photonics (AREA)
  • Manipulator (AREA)
PCT/IB2016/055743 2015-09-28 2016-09-26 Optical registration of a remote center of motion robot Ceased WO2017055990A1 (en)

Priority Applications (4)

Application Number Priority Date Filing Date Title
CN201680066685.7A CN108348299B (zh) 2015-09-28 2016-09-26 远程运动中心机器人的光学配准
JP2018515500A JP6865739B2 (ja) 2015-09-28 2016-09-26 遠隔運動中心ロボットの光学的位置合わせ
EP16781202.3A EP3355822A1 (en) 2015-09-28 2016-09-26 Optical registration of a remote center of motion robot
US15/762,757 US20200246085A1 (en) 2015-09-28 2016-09-26 Optical registation of a remote center of motion robot

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201562233664P 2015-09-28 2015-09-28
US62/233,664 2015-09-28

Publications (1)

Publication Number Publication Date
WO2017055990A1 true WO2017055990A1 (en) 2017-04-06

Family

ID=57130415

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IB2016/055743 Ceased WO2017055990A1 (en) 2015-09-28 2016-09-26 Optical registration of a remote center of motion robot

Country Status (5)

Country Link
US (1) US20200246085A1 (en)
EP (1) EP3355822A1 (en)
JP (1) JP6865739B2 (en)
CN (1) CN108348299B (en)
WO (1) WO2017055990A1 (en)


Families Citing this family (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US12402960B2 (en) 2010-10-11 2025-09-02 Ecole Polytechnique Federale De Lausanne (Epfl) Mechanical manipulator for surgical instruments
EP3102139B1 (en) 2014-02-03 2018-06-13 DistalMotion SA Mechanical teleoperated device comprising an interchangeable distal instrument
US11564757B2 (en) * 2017-02-27 2023-01-31 The Regents Of The University Of California Laser-assisted surgical alignment
AU2019218707B2 (en) 2018-02-07 2024-10-24 Distalmotion Sa Surgical robot systems comprising robotic telemanipulators and integrated laparoscopy
US12376927B2 (en) * 2018-02-07 2025-08-05 Distalmotion Sa Surgical robot systems comprising robotic telemanipulators and integrated laparoscopy
CN116096322A (zh) * 2020-09-04 2023-05-09 上海联影医疗科技股份有限公司 用于辅助将手术器械放置到对象中的系统和方法
CN113180828B (zh) * 2021-03-25 2023-05-12 北京航空航天大学 基于旋量理论的手术机器人约束运动控制方法
CN115590618B (zh) * 2021-06-28 2025-11-11 敏捷医疗科技(苏州)有限公司 交互系统、手术机器人导航定位系统及其控制方法
CA3229319A1 (en) * 2021-08-23 2023-03-02 Idan Rotem Positioning systems for robotic-surgery devices
DE102021133060A1 (de) * 2021-12-14 2023-06-15 B. Braun New Ventures GmbH Chirurgisches Robotersystem und Steuerverfahren
CN116269668B (zh) * 2023-01-31 2025-10-14 杭州华匠医学机器人有限公司 定位器、内窥镜系统及远心定位方法
WO2024210161A1 (ja) * 2023-04-06 2024-10-10 国立研究開発法人量子科学技術研究開発機構 マニピュレータの位置決め装置及びマニピュレータの位置決め方法
EP4477181A1 (de) * 2023-06-15 2024-12-18 Siemens Healthineers AG Biopsieeinrichtung, unterstützungsverfahren und biopsiegerät
CN119423982B (zh) * 2023-07-31 2025-11-25 武汉联影智融医疗科技有限公司 光磁一体标记物、定位方法、系统和计算机设备

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20080194945A1 (en) * 2007-02-13 2008-08-14 Siemens Medical Solutions Usa, Inc. Apparatus and Method for Aligning a Light Pointer With a Medical Interventional Device Trajectory
US20080200876A1 (en) * 2007-02-20 2008-08-21 Siemens Medical Solutions Usa, Inc. Needle Guidance With a Dual-Headed Laser
US20090274271A1 (en) * 2008-05-02 2009-11-05 Siemens Medical Solutions Usa, Inc. System and method for selecting a guidance mode for performing a percutaneous procedure
US20100137880A1 (en) * 2007-06-19 2010-06-03 Medtech S.A. Multi-application robotized platform for neurosurgery and resetting method

Family Cites Families (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2551582Y2 (ja) * 1990-02-27 1997-10-22 株式会社島津製作所 医用案内針の刺入指示装置
CA2571057A1 (en) * 2004-06-30 2006-01-12 James V. Sitzmann Medical devices for minimally invasive surgeries and other internal procedures
US8971597B2 (en) * 2005-05-16 2015-03-03 Intuitive Surgical Operations, Inc. Efficient vision and kinematic data fusion for robotic surgical instruments and other applications
US8414469B2 (en) * 2008-06-27 2013-04-09 Intuitive Surgical Operations, Inc. Medical robotic system having entry guide controller with instrument tip velocity limiting
EP2523626B1 (en) * 2010-01-14 2016-09-14 The Regents of The University of California Apparatus and system for robotic microsurgery
US8670017B2 (en) * 2010-03-04 2014-03-11 Intouch Technologies, Inc. Remote presence system including a cart that supports a robot face and an overhead camera
US20140253684A1 (en) * 2010-09-10 2014-09-11 The Johns Hopkins University Visualization of registered subsurface anatomy
US20140287393A1 (en) * 2010-11-04 2014-09-25 The Johns Hopkins University System and method for the evaluation of or improvement of minimally invasive surgery skills
US20140039314A1 (en) * 2010-11-11 2014-02-06 The Johns Hopkins University Remote Center of Motion Robot for Medical Image Scanning and Image-Guided Targeting
US20120226145A1 (en) * 2011-03-03 2012-09-06 National University Of Singapore Transcutaneous robot-assisted ablation-device insertion navigation system
US11013480B2 (en) * 2012-06-28 2021-05-25 Koninklijke Philips N.V. C-arm trajectory planning for optimal image acquisition in endoscopic surgery
RU2015107009A (ru) * 2012-08-02 2016-09-27 Конинклейке Филипс Н.В. Определение устройства управления удаленного центра движения робота
DE102013213727A1 (de) * 2013-07-12 2015-01-15 Siemens Aktiengesellschaft Interventionelles Bildgebungssystem
CN105979902A (zh) * 2014-02-04 2016-09-28 皇家飞利浦有限公司 用于机器人系统的使用光源的远程运动中心定义


Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11789099B2 (en) * 2018-08-20 2023-10-17 Children's Hospital Medical Center System and method for guiding an invasive device
WO2021113227A1 (en) * 2019-12-02 2021-06-10 Think Surgical, Inc. System and method for aligning a tool with an axis to perform a medical procedure
JP2023505164A (ja) * 2019-12-02 2023-02-08 シンク サージカル, インコーポレイテッド 医療処置を施行するために道具を軸と整列させるためのシステム及び方法
US12329461B2 (en) 2019-12-02 2025-06-17 Think Surgical, Inc. System and method for aligning a tool with an axis to perform a medical procedure

Also Published As

Publication number Publication date
EP3355822A1 (en) 2018-08-08
CN108348299A (zh) 2018-07-31
JP2018530383A (ja) 2018-10-18
CN108348299B (zh) 2021-11-02
JP6865739B2 (ja) 2021-04-28
US20200246085A1 (en) 2020-08-06

Similar Documents

Publication Publication Date Title
US20200246085A1 (en) Optical registation of a remote center of motion robot
US11950858B2 (en) Systems for performing computer assisted surgery
JP6840815B2 (ja) 追跡マーカーを備えた手術用ロボットオートメーション
CN112839608B (zh) 多功能多臂机器人手术系统
EP3278758B1 (en) Surgical robotic automation with tracking markers
EP3648674B1 (en) Robotic instrument guide integration with an acoustic probe
US20250032208A1 (en) Mobile system for bilateral robotic tool feeding
JP2020096829A (ja) ドリルガイド固定具、頭蓋挿入固定具、および関連する方法およびロボットシステム
US20190175293A1 (en) Image guidance for a decoupled kinematic control of a remote-center-of-motion
JP6979049B2 (ja) 自然基準を使用した共登録を提供するロボットシステムおよび関連方法
JP6751461B2 (ja) 追跡マーカーを備えた手術用ロボットオートメーション
CN113491578B (zh) 将医学图像配准到圆环-圆弧组件的方法
EP3636394A1 (en) Robotic system for spinal fixation elements registration with tracking markers
EP3586784B1 (en) Methods of adjusting a virtual implant and related surgical navigation systems
EP4225250B1 (en) Systems and methods for determining and maintaining a center of rotation
CN118401191A (zh) 外科手术机器人系统和控制方法
CN112451097A (zh) 手术机器人系统
JP7323489B2 (ja) 誘導された生検針の軌道をロボットによりガイダンスするためのシステムと、関連する方法および装置
CN119403510A (zh) 用于引导手术计划的可视投射的激光引导机器人、投射方法和激光引导机器人系统
Seung et al. Image-guided positioning robot for single-port brain surgery robotic manipulator
HK40019646B (en) Surgical robotic automation with tracking markers
HK40037852A (en) Surgical robotic automation with tracking markers
HK40019645B (en) Methods of adjusting a virtual implant and related surgical navigation systems
HK40019645A (en) Methods of adjusting a virtual implant and related surgical navigation systems
HK40020881A (en) Robotic system for spinal fixation elements registration with tracking markers

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16781202

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 2018515500

Country of ref document: JP

NENP Non-entry into the national phase

Ref country code: DE

WWE Wipo information: entry into national phase

Ref document number: 2016781202

Country of ref document: EP