EP2666428A1 - System and method for estimating the spatial position of a tool within an object - Google Patents
System and method for estimating the spatial position of a tool within an object
- Publication number
- EP2666428A1 (application number EP12168772A)
- Authority
- EP
- European Patent Office
- Prior art keywords
- output signal
- tool
- model
- trajectory
- candidate output
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Links
- method (title, claims, abstract, description; 40)
- machining (claims, abstract, description; 49)
- material (claims, abstract, description; 49)
- bone (claims, abstract, description; 12)
- computer program (claims, abstract, description; 6)
- vectors (claims, description; 28)
- drilling (claims, description; 16)
- bone density (claims, description; 6)
- transformation (claims, description; 6)
- matrix (claims, description; 4)
- corresponding (description; 42)
- controlling (description; 12)
- function (description; 12)
- sample (description; 12)
- tissue (description; 12)
- approach (description; 9)
- coupling (description; 9)
- correlated (description; 7)
- mastoid (description; 7)
- measurement (description; 5)
- process (description; 5)
- computed tomography (description; 4)
- animals (description; 2)
- blood vessel (description; 2)
- insertion (description; 2)
- mechanism (description; 2)
- soft tissue (description; 2)
- surgical procedure (description; 2)
- training (description; 2)
- abnormal (description; 1)
- analysis (description; 1)
- anticipated (description; 1)
- blood circulation (description; 1)
- brain (description; 1)
- algorithm (description; 1)
- calculation (description; 1)
- cortical (description; 1)
- facial (description; 1)
- labelling (description; 1)
- milling (description; 1)
- oxygenation (description; 1)
- postoperative (description; 1)
- processing (description; 1)
- quantitative test (description; 1)
- response (description; 1)
- segmentation (description; 1)
- semiconductor (description; 1)
- skull (description; 1)
- temporal bone (description; 1)
- temporal (description; 1)
- transformations (description; 1)
- ultrasound imaging (description; 1)
- verification (description; 1)
Classifications
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/10—Computer-aided planning, simulation or modelling of surgical operations
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/30—Surgical robots
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/25—User interfaces for surgical systems
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/70—Determining position or orientation of objects or cameras
- G06T7/73—Determining position or orientation of objects or cameras using feature-based methods
- G06T7/74—Determining position or orientation of objects or cameras using feature-based methods involving reference images or patches
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/10—Computer-aided planning, simulation or modelling of surgical operations
- A61B2034/101—Computer-aided simulation of surgical operations
- A61B2034/102—Modelling of surgical devices, implants or prosthesis
- A61B2034/104—Modelling the effect of the tool, e.g. the effect of an implanted prosthesis or for predicting the effect of ablation or burring
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/10—Computer-aided planning, simulation or modelling of surgical operations
- A61B2034/107—Visualisation of planned trajectories or target regions
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/06—Measuring instruments not otherwise provided for
- A61B2090/064—Measuring instruments not otherwise provided for for measuring force, pressure or mechanical tension
- A61B2090/065—Measuring instruments not otherwise provided for for measuring force, pressure or mechanical tension for measuring contact or contact pressure
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10072—Tomographic images
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30108—Industrial image inspection
- G06T2207/30164—Workpiece; Machine component
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30196—Human being; Person
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30241—Trajectory
Definitions
- the invention relates to a system for estimating the spatial position (pose) of a tool within an object, particularly in the form of a human bone (e.g. the mastoid process of the temporal bone of a human skull) that is to be machined with said tool. Further, the invention relates to a method for estimating the spatial position of a tool within an object.
- machining systems such as computer numerically controlled machines, robots and the like utilize position sensors to estimate a tool pose relative to the coordinate system of the object being manipulated.
- position sensing is achieved by means of digital encoders, stereo-optic cameras, and laser range scanners among others.
- a digital model of the object being manipulated is available and used to define actions of the tool to the object. For instance in surgery, a three-dimensional scan of a bone is used to define the optimal position for a cavity to be machined into the bone.
- the model (and machining system) must be referenced with respect to the physical object, a process typically referred to as registration.
- the registration process relies on correlation of identifiable landmarks, in both the object and in the model.
- Landmark positions can be identified in the model by means of image processing and on the object by means of digitization using a measurement device. At least three landmarks are needed to register the object with its model.
- the registration can, however, be incorrect due to improper identification of landmarks on either or both the model and the object. This may result in an incorrect overlay of the coordinate systems of the object and the model, leading to false manipulation of the object. For example in surgery, the cavity planned in the model of the bone will be carried out in an incorrect position and/or orientation.
- WO 2009/092164 describes a surgical system utilizing tissue feedback, which uses estimates of tissue properties along a predetermined trajectory and makes adjustments to an automated surgical task based on the difference between the actual and expected tissue properties as measured by the surgical tool.
- the type of controller utilized in this device is a linear type controller where for example a force sensor provides the input signal and the automated device (robot) can be controlled based on this signal.
- An illustrative example is given in the application of drilling a hole for pedicle screw placement. In this example it is desirable that the drill bit remain in the inner channel of the pedicle, thus if the drill bit contacts the outer wall (cortical bone) of this channel, the controller can adjust the trajectory to move away from this boundary.
- a method of advancing a tool such as a probe into tissue such as the brain under the control of a single degree of freedom actuator attached to a stereotactic frame is described. Also described is a numerical control mechanism trained to identify both abnormal tissues and blood vessels based on tissue properties. Sensors may include modalities such as force, interstitial pressure, oxygenation, blood flow, etc. An initial portion of the tool trajectory is used to gather training data towards training of the control algorithm. The invention enables surgical probe advancement to be stopped at a tissue boundary such as a blood vessel.
- a method of controlling machining feed rate based on the hardness of the material being worked is introduced. Spatial variation of the hardness of the material (bone) is determined from medical image data. Thus, the feed rate of the milling device can be altered according to the local hardness, which is derived from the image data. Force feedback can also be used to adjust the feed rate.
- a steerable electrode and a device used to insert it into the body are disclosed.
- the insertion system comprises a device, which actively controls the tool's position (i.e. insertion depth) based on sensing of applied forces at the tool tip.
- US 2011/0112397 discloses a technique for determining the location of a bony structure surrounded by soft tissue through the use of a three-dimensional position tracking device and a needle probe.
- the device functions similarly to a digitizing coordinate measurement machine in that a probe is inserted through the soft tissue and contacts the bone, registering a peak in force and thereby triggering recording of the needle's spatial position.
- a 3D map of the bony surface can be created.
- DE 102005029002 describes an invention related to the measurement of tissue forces and stiffnesses with a probe that is simultaneously tracked in space.
- a three-dimensional map of mechanical properties can be created.
- DE 10 2008 013 429 A1 describes an invention correlating forces at the distal end of a surgical instrument with the 3D position within the body, whereby the focus of the application is on forces directed laterally to the tool axis.
- WO 2010/147972 relates to an invention to provide feedback to a surgeon when a surgical probe could be nearing the outer boundary of the pedicle and thus prevent a breach. Predictions of force thresholds are based on preoperative medical image data, where image intensity values are correlated with bone properties.
- a system and method for aligning an ultrasound imaging probe relative to a predefined target inside a body is described.
- the system can track and follow moving targets by means of a robotic mechanism, which can automatically adjust the tool's position in response to commands from a control system.
- a force sensor may also be utilized to maintain contact of the probe with the body.
- the underlying objective of the present invention is to provide a simple system and a method that enables self-referencing of the manipulation tool relative to the coordinate system of the model and thus to the object being processed.
- the system comprises a sensor being designed to be coupled to a tool for machining a cavity into the object along an actual trajectory, which sensor is further designed to generate a sensor output signal upon machining of said object by means of said tool, which sensor output signal depends on a material property of said object along said actual trajectory (e.g. a material density), an analyzing means connected to the sensor and being designed to compare said sensor output signal, particularly upon machining said object (i.e. in real-time), with the at least one or a plurality of pre-determined candidate output signals in order to determine a correlation/similarity between said signals that is used for determining the spatial position of the tool within the object.
- Determining a correlation/similarity between the sensor output signal and the candidate output signal(s) may involve or may be performed by determining a scalar quantity that represents a measure for a similarity/dissimilarity or correlation between the measured sensor output signal and the candidate (theoretical or predicted) output signals.
- a scalar quantity may be denoted as dissimilarity measure or correlation index and may be employed as a weight when forming averages.
- the spatial position of the tool within the object may be determined from at least the model trajectory whose associated (expected) output signal shows the best similarity/correlation with the measured sensor output signal according to the determined measure (i.e. shows a certain correlation index or (small) dissimilarity measure).
- model trajectories may be used to determine the (current) spatial position of the tool within the object by computing a weighted average according to the correlation/similarity between the output signal measured by the sensor and the candidate output signals, which are each associated to a model trajectory in a unique manner.
- Each of the candidate output signals is generated beforehand using said material property along an associated model trajectory in a material property model of said object, which can be represented by a suitable (particularly 3D) representation (image) of the object.
- Determining the spatial position of the tool within the object may involve the determination of all six degrees of freedom, or less, of the tool forming a rigid body.
- said spatial position (pose) of the tool within the object may be determined using at least the model trajectory whose associated candidate output signal has a correlation index corresponding to the highest correlation among the correlations between the sensor output signal and the candidate output signals. Since the tool machines a cavity along its (actual) trajectory and has a positive fit with the cavity, the model trajectory whose associated candidate output signal has the highest correlation with the sensor output signal directly yields the most likely position of the tool. For instance, the end point (target point) of said model trajectory can be taken as the spatial position of the (tip of the) tool in conjunction with the orientation of said model trajectory (with respect to the object).
- the analyzing means is configured to take the candidate output signal Y i showing the smallest d i among the candidate output signals as the one that correlates best with the measured sensor output signal.
- the associated model trajectory can then be used for determining the spatial position of the tool (e.g. by considering the end point of said model trajectory and its orientation ( T ) as the full spatial position of the tool as described above).
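The selection of the best-correlating candidate can be sketched numerically as follows. This is a minimal illustration only: the sum of squared differences is assumed as the dissimilarity measure d_i, and all function and variable names are chosen for the example, not taken from the patent.

```python
import numpy as np

def best_trajectory(x, candidates):
    """Pick the model trajectory whose candidate output signal Y_i is most
    similar to the measured sensor output signal X.

    x          -- measured sensor output signal, shape (n,)
    candidates -- list of candidate output signals Y_i, each shape (n,)

    Returns (index of best match, dissimilarity measures d_i); a small d_i
    means a high similarity between X and Y_i.
    """
    x = np.asarray(x, dtype=float)
    d = np.array([np.sum((x - np.asarray(y, dtype=float)) ** 2)
                  for y in candidates])
    return int(np.argmin(d)), d

# Example: the second candidate follows the measured signal most closely.
x = [0.0, 1.0, 2.0, 1.0]
candidates = [[0.0, 0.5, 0.5, 0.5],
              [0.1, 1.0, 1.9, 1.1],
              [2.0, 2.0, 2.0, 2.0]]
i_best, d = best_trajectory(x, candidates)
```

The index of the winning candidate then identifies the model trajectory whose end point and orientation are taken as the tool pose.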
- all model trajectories can be taken into account when determining the (current) spatial position of the tool according to a further aspect of the invention, but are then weighted with individual weights which reflect the similarity/correlation between the sensor output signal and the candidate output signals that are uniquely associated to their respective model trajectory.
- the corresponding weight particularly is a function of the difference (distance) between the sensor output signal and the candidate output signal that is associated to this model trajectory, wherein particularly the respective weight is a function of the squared difference between the sensor output signal and the respective candidate output signal.
- the system is able to scale the temporal (spatial) component of the sensor output signal (i.e. the time or position axis) to correspond to the candidate output signal based on one or more of the following components: Initial contact of the machining tool with a surface of the object, a feed rate of the tool (according to encoder values of the automated machining apparatus/movement generating device retaining the tool), or position measurements, e.g. by way of an external observer (for instance a stereo vision system), which return the 3D pose of the tool in the reference coordinate frame by way of triangulation.
- the analyzing means is designed to compare (correlate) the whole current course of the sensor output signal, i.e., from a beginning of the actual trajectory to a current position of the tool along the actual trajectory and/or from the time where machining of the object started up to the current elapsed time during machining of said object, with a corresponding section (interval) of the course of said pre-determined candidate output signals upon machining the object (e.g. in real-time).
- said analyzing means is designed to estimate a current spatial position (pose) of the tool with respect to the object by identifying the model trajectory whose associated candidate output signal has a current correlation index corresponding to the best (current) correlation among the candidate output signals, or by performing the above weighting procedure repeatedly to perform an ongoing determination of the current spatial position of the tool upon machining of the object.
- the system comprises said tool for machining said object.
- the tool is formed as a drill, wherein particularly the drill comprises a drill bit (at the tip of the tool) that is designed to drill (e.g. machine) a drill hole (cavity) into the object along the actual trajectory, when the drill bit is pressed against the object upon rotating the drill about a longitudinal axis of the drill, along which the drill extends.
- the system further comprises a drive for rotating the drill around said longitudinal axis of the drill.
- said drive is coupled to the drill via a drill chuck.
- the actual trajectory is essentially a linear trajectory corresponding to a cylindrical drill hole or cavity drilled into said object.
- model trajectories are also defined as (essentially) linear trajectories.
- said model trajectories are distributed around a planned trajectory along which the object is to be machined yielding the actual trajectory, wherein particularly the model trajectories are arranged in groups.
- model trajectories in a group share the same entry point into the object.
- each group preferably comprises a trajectory having the orientation of the planned trajectory as well as trajectories having an orientation that differs by a small angle from the orientation of the planned trajectory, preferably in the range of 0.1° to 3.0°, more preferably 1.5°.
- the system comprises a modeling means being designed particularly to arrange (group) the model trajectories and to use said material property along the respective model trajectory as the respective candidate output signal.
- the employed model of the object is represented by a tomographic 3D-image of the material property of the object consisting of a plurality of voxels (volume pixels) each indicating a certain value of the material property (e.g. bone density) of the object, wherein said 3D-image is particularly a CT-scan.
- the modeling means is designed to discretize the respective model trajectory by dividing it into steps (bins) having a pre-defined step size, wherein for each step along the associated model trajectory voxels within a radius corresponding to a radius of the tool in a lateral plane normal to the respective model trajectory are integrated yielding a material property value, i.e. a value of the respective candidate output signal, associated with the respective step (position) along the respective model trajectory.
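The voxel integration described above can be sketched as follows, assuming the material property model is a 3D NumPy array with unit voxel size; the function and parameter names are illustrative, not from the patent.

```python
import numpy as np

def candidate_signal(volume, entry, direction, length, step, tool_radius):
    """Predicted candidate output signal along one linear model trajectory.

    For each step (bin) along the trajectory, the voxels lying within the
    tool radius in the lateral plane normal to the trajectory are summed,
    yielding one material-property value per step.
    """
    entry = np.asarray(entry, dtype=float)
    direction = np.asarray(direction, dtype=float)
    direction = direction / np.linalg.norm(direction)

    # Coordinates of all voxel centres, flattened in C order.
    coords = np.indices(volume.shape).reshape(3, -1).T.astype(float)
    rel = coords - entry
    axial = rel @ direction                    # position along the trajectory
    radial = np.linalg.norm(rel - np.outer(axial, direction), axis=1)

    n_bins = int(round(length / step))
    bins = np.floor(axial / step).astype(int)
    keep = (radial <= tool_radius) & (bins >= 0) & (bins < n_bins)

    signal = np.zeros(n_bins)
    np.add.at(signal, bins[keep], volume.reshape(-1)[keep])
    return signal

# Homogeneous test volume: every step integrates the same number of voxels.
density = np.ones((5, 5, 5))
sig = candidate_signal(density, entry=[2, 2, 0], direction=[0, 0, 1],
                       length=5, step=1, tool_radius=1.1)
```

In practice one would restrict the search to voxels near the trajectory instead of scanning the whole volume, but the binning logic stays the same.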
- the modeling means is further designed to represent the candidate output signals as vectors, wherein particularly the dimension of the respective vector corresponds to the length of the respective model trajectory in units of a certain step size.
- the analyzing means is preferably designed to represent the sensor output signal and the candidate output signals as vectors X , Y i , i labeling the candidate output vectors, wherein particularly the dimensions of these vectors correspond to a current length of the respective (model) trajectory in units of a certain step size.
- the analyzing means and/or modeling means may be formed by a computer and a corresponding software (that may be loaded into the RAM of the computer) being executed by the computer.
- the sensor is preferably connected to the computer via an interface.
- the analyzing means and/or the modeling means may also be formed as a stand-alone device.
- the sensor is a 6 degree of freedom semiconductor strain gauge based force sensor that is designed to sense a force exerted by the object onto the tool upon machining said object, particularly a force exerted by the object that is to be machined onto the drill bit along the longitudinal axis of the drill. Furthermore, forces applied at the tip of the drill bit can be transformed into a suitable coordinate system (coordinate system of the force sensor) such that forces and torques can be correctly distinguished.
- a torque acting on the drill may be sensed by corresponding sensors and compared to corresponding material properties of the object, so that one has on the one hand an output signal that can be measured by the respective sensor corresponding (or correlating) to a material property (or some other characteristic of the object) and on the other hand said material property (characteristic) that can be used as a candidate output signal.
- Such a pairing of actually measured properties is given when the measured quantity significantly correlates with the theoretically constructed (candidate) output signal, i.e., the material property (physical quantity) itself.
- Other material properties that may be considered are electrical or magnetic impedance of the object, rigidity of the object or the like.
- the system not only enables an estimation of the position of the tool within (or with respect to) the machined object, but also to move the tool by means of a movement generating means of the system that may be coupled to the analyzing means, which movement generating means is designed to automatically move the tool along a planned trajectory (resulting in machining the object along the actual trajectory).
- the movement generating means may be or may form part of an automated tool guidance system including but not limited to robot systems, computer numerically controlled machining systems etc. Particularly, the movement generating means is formed as a robot arm.
- the sensor is preferably arranged between the drill chuck and the movement generating means, wherein particularly the drill chuck is coupled to the movement generating means via a coupling means comprising an interface via which the coupling means can be (releasably) coupled to the movement generating means (e.g. robot arm).
- said sensor is provided on the coupling means.
- the system is designed to generate a warning signal that can be perceived by a user operating the system and/or a stop signal that causes the drive and/or the actuators (movement generating means) of the system to stop, when a distance between the current position of the tool (actual trajectory) and a planned trajectory exceeds a pre-defined threshold value.
- the system further comprises a controlling means interacting with said movement generating means and possibly also with the drive in order to control these devices.
- the controlling means can be formed as a stand-alone device or a part thereof, but may also be formed by corresponding software executed on said computer or even on a separate computer.
- the drive/movement generating means is then connected to such a computer via an interface.
- the controlling means is designed to control the movement generating means upon machining of the object such that the current position of the tool (actual trajectory) asymptotically approaches a current planned position of the tool (planned trajectory). Further, the controlling means may be designed to control the movement generating means and drive upon machining of the object such that a current velocity of the tool along the actual trajectory approaches a current planned (reference) velocity of the tool.
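As an illustration of such asymptotic convergence, a plain proportional correction drives the position error toward zero; this is a hypothetical sketch (gain and time step are invented parameters), not the controller specified by the patent.

```python
def control_step(current, planned, gain, dt):
    """One proportional-control update: the error to the planned position
    decays by the factor (1 - gain * dt) per step, so the current position
    asymptotically approaches the planned one."""
    return [c + gain * dt * (p - c) for c, p in zip(current, planned)]

pos = [0.0, 0.0, 0.0]          # current tool position (actual trajectory)
target = [1.0, 2.0, 0.5]       # planned position on the planned trajectory
for _ in range(200):
    pos = control_step(pos, target, gain=5.0, dt=0.01)
# after many steps, pos lies very close to target
```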
- the method according to the invention comprises the steps of: providing at least one, particularly a plurality of candidate output signals, corresponding to a material property of the object along associated model trajectories in a material property model of said object (e.g. in a vicinity of a planned trajectory), providing a sensor output signal depending on a material property along an actual trajectory in said object, automatically determining a correlation/similarity between said sensor output signal and the at least one candidate output signal or the plurality of candidate (model) output signals in order to determine the spatial position of the tool within the object (see also above).
- the object may be any (artificial or natural) material that can be machined, particularly drilled.
- the object may be formed by living cells or tissues of the (human or animal) body, particularly a bone (alive or dead).
- the object may be any (machinable) object that is not formed by living cells or tissues of the living human or animal body.
- the actual machining of the object by means of the tool need not be part of the method according to the invention, i.e., the tool may machine the object along the actual trajectory beforehand.
- the sensor output signal generated thereby is then used in order to determine the spatial position of the tool within the object, which tool machined the object along the actual trajectory beforehand.
- the candidate output signal that correlates best/shows the highest similarity with the measured sensor output signal is determined and used for estimating the spatial position of the tool within the object, which may be determined as an end point of the model trajectory associated to said best-correlated candidate output signal and/or as an (corresponding) orientation of said model trajectory.
- the correlation/similarity may be expressed in form of weights w i for each model trajectory, wherein particularly each weight w i is a function of the difference between the sensor output signal and the respective candidate output signal, particularly the squared difference. Then, for determining the spatial position of the tool within the object, a weighted average over corresponding points, particularly end points p i , of the model trajectories and/or over orientations o i of the model trajectories may be automatically performed, wherein each of said points p i and/or orientations o i is weighted using its associated weight w i .
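The weighted-average pose estimate can be sketched as below. The inverse of the squared difference is used here as one admissible choice of w_i (the patent only requires w_i to be a function of the squared difference), and all names are illustrative.

```python
import numpy as np

def weighted_pose(x, candidates, end_points, orientations, eps=1e-9):
    """Weighted average over model trajectories: each end point p_i and
    orientation o_i is weighted by w_i, here chosen as the inverse of the
    squared difference between sensor signal X and candidate signal Y_i."""
    x = np.asarray(x, dtype=float)
    d = np.array([np.sum((x - np.asarray(y, dtype=float)) ** 2)
                  for y in candidates])
    w = 1.0 / (d + eps)                            # small difference -> large weight
    w = w / w.sum()
    p = w @ np.asarray(end_points, dtype=float)    # weighted end point
    o = w @ np.asarray(orientations, dtype=float)  # weighted orientation
    o = o / np.linalg.norm(o)                      # renormalise direction
    return p, o

# The first candidate matches the measured signal exactly, so the estimate
# is dominated by the first model trajectory.
p, o = weighted_pose(x=[1.0, 2.0, 3.0],
                     candidates=[[1.0, 2.0, 3.0], [5.0, 5.0, 5.0]],
                     end_points=[[0, 0, 0], [10, 10, 10]],
                     orientations=[[0, 0, 1], [1, 0, 0]])
```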
- a correlation between the whole current course of the sensor output signal and a corresponding section of the at least one pre-determined candidate output signal or each of the pre-determined candidate output signals is automatically determined, particularly upon machining the object with said tool.
- the sensor output signal is generated as a function of the position of the tool along the actual trajectory and/or as a function of the time elapsed during machining of the object with the tool along the actual trajectory, wherein for simplifying the comparison between the sensor output signal and the (predicted) candidate output signals, the sensor output signal and/or the at least one candidate output signal or plurality of candidate output signals is preferably scaled along a time or position axis, particularly so as to match the extension of the sensor output signal along the time or position axis with the corresponding extension of the at least one candidate output signal or with the corresponding extensions of the plurality of candidate output signals.
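Such a rescaling along the time or position axis can be realised, for instance, by linear interpolation; the following sketch (with illustrative names) stretches or compresses a signal to a given number of samples.

```python
import numpy as np

def rescale_signal(signal, target_len):
    """Stretch or compress a signal along its time/position axis by linear
    interpolation so that it has target_len samples; sensor and candidate
    output signals then become vectors of equal dimension."""
    signal = np.asarray(signal, dtype=float)
    old_axis = np.linspace(0.0, 1.0, num=len(signal))
    new_axis = np.linspace(0.0, 1.0, num=target_len)
    return np.interp(new_axis, old_axis, signal)

# A 4-sample signal rescaled to 7 samples.
resampled = rescale_signal([0.0, 1.0, 2.0, 3.0], target_len=7)
```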
- when said signals are represented as vectors, these vectors have the same dimension, which simplifies the calculation of the above-described weights and transformations.
- said tool is a drill, so that the model trajectories are (essentially) linear trajectories, wherein particularly said model trajectories are (automatically) distributed around a planned trajectory along which the object is to be machined (yielding the actual trajectory), wherein particularly the model trajectories are distributed in groups around the planned trajectory, wherein the model trajectories in each group have the same entry point into the object.
- each group comprises a trajectory having the orientation of the planned trajectory as well as trajectories having an orientation that differs from the orientation of the planned trajectory by a small angle, preferably in the range of 0.1 to 3.0°, more preferably 1.5°.
- said material property along the respective model trajectory is preferably used as the respective candidate output signal.
- said sensor output signal is a force, wherein the material property preferably is a density, particularly a bone density (see above), correlating therewith.
- the model from which the candidate output signals are (automatically) generated is represented by a tomographic 3D-image of the material property of the object consisting of a plurality of voxels each indicating a certain value of the material property (e.g. bone density).
- said image is created beforehand by a (cone-beam) CT-scan of the object.
- the candidate output signals are preferably discretized along the respective model trajectory by dividing them into steps having a pre-defined step size, wherein for each step along the associated model trajectory voxels within a radius corresponding to a radius of the tool in a lateral plane normal to the respective model trajectory are (automatically) integrated yielding a material property value, i.e. a value of the respective candidate output signal associated to the respective step (position) along the respective model trajectory.
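The per-step voxel integration described above might look like the following sketch. It rests on stated assumptions: voxels are treated as points at regular grid coordinates, the "integration" is implemented as an average over the cylindrical step volume, and all names are hypothetical.

```python
import numpy as np

def candidate_signal(volume, voxel_size, entry, direction, length,
                     tool_radius, step):
    """Sample a candidate output signal along a model trajectory: for each
    step, average the voxel values (material property) lying within the
    tool radius in the lateral plane normal to the trajectory."""
    direction = np.asarray(direction, dtype=float)
    direction /= np.linalg.norm(direction)
    n_steps = int(length / step)
    signal = np.empty(n_steps)
    # coordinates of all voxel centers, flattened to an (N, 3) array
    idx = np.indices(volume.shape).reshape(3, -1).T * voxel_size
    vals = volume.ravel()
    for k in range(n_steps):
        center = np.asarray(entry, dtype=float) + (k + 0.5) * step * direction
        rel = idx - center
        axial = rel @ direction                                  # signed distance along trajectory
        lateral = np.linalg.norm(rel - np.outer(axial, direction), axis=1)
        mask = (np.abs(axial) <= step / 2) & (lateral <= tool_radius)
        signal[k] = vals[mask].mean() if mask.any() else 0.0
    return signal
```

For a homogeneous volume the sampled signal is constant, which is a quick sanity check on the geometry.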
- the sensor output signal corresponds to a force exerted by the object onto the tool upon machining said object by means of said tool, wherein particularly the tool is formed as a drill as described above.
- the material property is preferably a density of the object, particularly a bone density.
- the sensor output signal may be in general represented as a force vector and the candidate output signals as (axial) density vectors.
- a warning signal is generated upon moving the tool within the object along the actual trajectory, when a distance between a current spatial position of the tool that is determined by way of correlation as described above and a planned (theoretical) trajectory in the object exceeds a pre-defined threshold value. Then, according to a further aspect of the invention, the tool may also be automatically stopped.
- machining of the object by means of the tool may be automatically guided by a movement generating means, wherein moving the tool (machining the object) along the actual trajectory is controlled such that the current spatial position of the tool within the object determined by way of correlation approaches a current planned spatial position of the tool along a planned trajectory.
- said movement generating means for moving the tool along the actual trajectory upon machining the object and a drive for rotating the tool (drill) around its longitudinal axis are preferably controlled upon machining of the object such that a current velocity of the tool approaches a current planned velocity of the tool.
- a computer program product having the features of claim 15, which is particularly stored on a computer readable medium or downloadable (e.g. from a computer), and which is particularly designed to be loaded into the memory of a computer (or the present system), wherein the computer program product is designed to conduct - upon being executed on a computer (or the present system) - the following steps: generating at least one candidate output signal based on a material property of an object along a model trajectory in a model of said object, the model trajectory being associated with the at least one candidate output signal, reading a sensor output signal depending on the material property along an actual trajectory in said object as an input to the computer program, and determining a correlation between the sensor output signal and the at least one candidate output signal for determining the spatial position of the tool within the object.
- the computer program product according to the invention may particularly be designed to conduct, when being executed on a computer (or on the present system), also at least one of the steps stated in one of the claims 10 to 14 (as well as at the end of the present specification relating to claims 10 to 14).
- Figure 1 shows a model of an object 1 (CT-image) stating the density (material property) of the object in space, which object 1 is to be machined (drilled) by means of a tool 2 that is formed as a drill. While drilling along a planned trajectory 10 into the object 1, resulting in a cavity 4 extending along a (linear) actual trajectory 30, the force at the tip (drill bit) 20 of the tool 2, which reflects the varying density along the actual trajectory 30, is measured by means of a sensor 21 coupled to the tool, resulting in a sensor output signal 31 over the position along the actual trajectory 30 and/or the time of the drilling process.
- the sensor output signal 31 is preferably represented as a vector where the position/time axis of the sensor output signal 31 is discretized into bins (intervals) and the components of the vector correspond to the values of the sensor output signal 31 for the respective position/time bin (step).
- a plurality of candidate output signals 42 are generated according to Figure 2 by considering a corresponding plurality of model trajectories 40 and taking the density along the respective model trajectory as the respective "candidate" output signal 42, wherein the density may be integrated (averaged) in lateral planes across the respective model trajectory 40 over an area corresponding to the associated cross section of tool 2.
- candidate output signals 42 are discretized according to the bin (step) size of the measured sensor output signal 31 and represented as vectors. This allows one to easily correlate the sensor output signal 31 with candidate output signals 42 (entry by entry).
- the original output (sensor) signal 31 during object manipulation is correlated with corresponding portions of all available instances of the candidate (anticipated) output signals 42.
- the system selects the candidate output signal 42 that correlates best with the measured sensor output signal 31.
- the corresponding model trajectory 40 then allows for estimating the most likely position of tool 2, since this position then corresponds to said (current) model trajectory 40 within the object 1 being manipulated.
- averages may be computed as described below.
- Figure 3 shows a comparison of the tool sensor data (sensor output signal 31 representing the drilling force) with the corresponding model information (here: density (candidate output signal) 42 of the object material from computed tomography along a corresponding model trajectory 40) as well as an estimation of the position of the tool 2 within the object 1 itself.
- Figures 4 and 5 illustrate a quantitative test of the present system and method.
- a pre-operative cone-beam CT scan was taken; four trajectories planned through the mastoid bone of a human are analyzed below.
- Each of these trajectories was drilled using a robotic system (movement generating means) while recording force data by means of a sensor as discussed above.
- the drilling feed-rate was 0.3 mm/s, the drill speed was 5000 RPM.
- a square region of interest (3x3 mm) was defined around the planned trajectory p 0 .
- Five candidate trajectories were calculated at each entry position (e i ), the entry positions being spaced at 0.3 mm, with orientation o i and length l according to equation (1).
- p i = e i + l · o i (1)
- Density values (corresponding to the candidate output signal) were calculated along each p i for a total of 605 candidate trajectories as seen in Fig. 4 .
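The candidate-trajectory construction above (a 3×3 mm region sampled at 0.3 mm, five orientations per entry point, 605 trajectories in total) can be reproduced with the following sketch. The grid layout and the way the four tilted directions are built are a plausible reading of the text, not the patent's exact construction; all names are hypothetical.

```python
import numpy as np

def make_candidates(p0_entry, planned_dir, half_width=1.5, spacing=0.3,
                    tilt_deg=1.5, length=10.0):
    """Generate candidate trajectories: entry points on a square grid around
    the planned entry, each combined with the planned orientation plus four
    directions tilted by a small angle (here 1.5 deg)."""
    d = np.asarray(planned_dir, dtype=float)
    d /= np.linalg.norm(d)
    # two unit vectors spanning the plane normal to the planned direction
    u = np.cross(d, [1.0, 0.0, 0.0])
    if np.linalg.norm(u) < 1e-8:
        u = np.cross(d, [0.0, 1.0, 0.0])
    u /= np.linalg.norm(u)
    v = np.cross(d, u)
    t = np.tan(np.radians(tilt_deg))
    tilts = [d] + [(d + s * t * a) / np.linalg.norm(d + s * t * a)
                   for a in (u, v) for s in (+1, -1)]
    offsets = np.arange(-half_width, half_width + spacing / 2, spacing)
    cands = []
    for du in offsets:
        for dv in offsets:
            e = np.asarray(p0_entry, dtype=float) + du * u + dv * v
            for o in tilts:
                cands.append((e, o, length))
    return cands

cands = make_candidates([0.0, 0.0, 0.0], [0.0, 0.0, 1.0])
# 11 x 11 entry points x 5 orientations = 605 candidates
```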
- the resulting force vector was normalized (F̃) about its mean, and a squared difference was computed between the normalized force F̃ and each ( i ) of the candidate density vectors D̃ i as in equation (2).
- r i = ‖F̃ − D̃ i ‖² (2)
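Equation (2) and the selection of the best-matching candidate can be sketched as follows. Note one added assumption: the candidate density vectors are normalized the same way as the force vector, whereas the text explicitly normalizes only the force; the function name is hypothetical.

```python
import numpy as np

def best_candidate(force, densities):
    """Normalize the measured force vector about its mean and return the
    index of the candidate density vector minimizing the squared
    difference r_i = ||F_norm - D_i||^2, together with all residuals."""
    f = np.asarray(force, dtype=float)
    f = (f - f.mean()) / (f.std() + 1e-12)       # zero-mean, unit-spread force
    residuals = []
    for d in densities:
        d = np.asarray(d, dtype=float)
        d = (d - d.mean()) / (d.std() + 1e-12)   # same normalization (assumption)
        residuals.append(np.sum((f - d) ** 2))   # squared difference, eq. (2)
    residuals = np.asarray(residuals)
    return int(np.argmin(residuals)), residuals
```

A force profile is matched to the density profile with the same shape, not the reversed one, since only the shape survives the normalization.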
- a confidence value was also defined (equation (5)). This represents a comparison between the current result and the best possible result, defined as the case with a single well-matched outcome and no other matching trajectories.
- c is the confidence and d is a further weighting factor (equation (6)), which is a function of the spread of the trajectories.
- s is a distance weighting exponent
- l max is the maximum distance of any trajectory from the planned trajectory and all other variables are as previously defined.
- the orientation of each of the trajectories undergoes the same process as described above; the direction which is closest to the true tool orientation should be correlated most closely and the weighted mean vector direction should reflect this.
- An example of the actual density 43 along the drilled path 30, the best correlated density vector 42, and the force 31 along the drilled actual trajectory 30 are shown in Figure 5. Also shown (top part of Figure 5) is a slice of the object 1 (model) displaying the actual density in the drilled cavity (upper part) as well as the density that correlated best with the measured force (sensor output signal).
- Fig. 6 shows, in conjunction with Fig. 7, a movement generating means 100 of the system according to the invention, by means of which the tool (drill) 2 can be automatically moved (e.g. controlled by the afore-described controlling means).
- the movement generating means 100 may be controlled by a controlling means.
- the movement generating means 100 preferably uses serial kinematics, i.e. is formed as a robot arm that preferably comprises at least six axes (preferably rotational axes, though a linear axis may also be present), represented by corresponding (rotary or, if present, translational) joints, respectively. Each axis (joint) may be actuated by a corresponding actuator.
- the drill 2 attached to the robot arm 100 via a drill chuck 102 can approach any target point in a workspace of the robot arm 100 with an arbitrary orientation.
- a coupling means 110 is provided to which the drill chuck 102 is connected and which provides an interface 112 via which the coupling means is coupled to the free end of the robot arm 100.
- the sensor 120 for sensing the forces acting on the drill upon machining an object is preferably arranged in the coupling means 110, i.e. between the drill chuck 102 and the robot arm 100.
- a drive for driving a drill fastened to the (thinner) free end of the drill chuck 102 can be connected to the other (broader) free end of the drill chuck 102 shown in Fig. 7 .
- the analyzing means is preferably designed to determine a correlation between the whole current course of the sensor output signal (31) and a corresponding section of the at least one pre-determined candidate output signal (42) or corresponding sections of each of the pre-determined candidate output signals (42) upon machining the object (1) for determining the current spatial position of the tool within the object (1).
- the system is designed to generate said sensor output signal (31) by means of said sensor (21) as a function of the position of the tool (2) along the actual trajectory (30) and/or as a function of the time elapsed during machining of the object (1).
- the system is designed to scale the sensor output signal (31) along a time or position axis, particularly so as to match the extension of the sensor output signal (31) along the time or position axis with the corresponding extension of the at least one candidate output signal (42) or with the corresponding extensions of the plurality of candidate output signals (42).
- the actual trajectory (30) is a linear trajectory.
- the at least one model trajectory (40) or the model trajectories (40) are linear trajectories, wherein particularly said model trajectories (40) are distributed around a planned trajectory (10) along which the object (1) is to be machined yielding the actual trajectory (30), wherein particularly the model trajectories (40) form groups, wherein model trajectories in a group have the same entry point (E) into the object (1), and wherein each group comprises a model trajectory (40) having the orientation of the planned trajectory (10) as well as model trajectories (40) having an orientation that is different from the orientation of the planned trajectory (10).
- the system comprises a modeling means being designed to generate the at least one candidate output signal or the plurality of candidate output signals (42) as said material property along the respective model trajectory (40), wherein particularly the model of the object (1) is represented by a 3D-image of the material property of the object (1) consisting of a plurality of voxels each indicating a certain value of the material property, and wherein particularly the modeling means is designed to integrate voxels within a radius corresponding to a radius of the tool (2) in a lateral plane normal to the respective model trajectory (40).
- the sensor output signal corresponds to a force exerted by the object (1) onto the tool (2) upon machining said object (1), particularly a force exerted by the object (1) onto the drill bit (20) along the longitudinal axis (L) of the drill (2).
- the drive is coupled to the drill (2) via a drill chuck (102).
- the system comprises a movement generating means (100) for moving the tool (2) along the actual trajectory (30) for machining the object (1), wherein particularly the movement generating means (100) is formed as a robot arm (100).
- the drill chuck (102) is coupled to the movement generating means (100), particularly via a coupling means (110) comprising an interface (112) that is designed to be coupled to said robot arm (100).
- the sensor (21) is arranged between the movement generating means (100) and the drill chuck (102), wherein particularly the sensor (21) is arranged in said coupling means (110).
- the system is designed to generate a warning signal when a distance between a spatial position of the tool (2) within the object (1) and a planned trajectory (10) exceeds a pre-defined threshold value.
- system is designed to stop the drive and/or the movement generating means (100) when a distance between a spatial position of the tool (2) and a planned trajectory (10) exceeds a pre-defined threshold value.
- the system comprises a controlling means for controlling the tool (2), wherein particularly the controlling means is designed to control the movement generating means (100) upon machining of the object (1) such that a determined current spatial position of the tool (2) within the object (1) approaches a current planned spatial position of the tool (2) along a planned trajectory (10), and wherein particularly the controlling means is designed to control the movement generating means (100) and drive upon machining of the object (1) such that a current velocity of the tool (2) approaches a current planned velocity of the tool (2).
- a correlation between the whole current course of the sensor output signal (31) and a corresponding section of the at least one pre-determined candidate output signal (42) or corresponding sections of each of the pre-determined candidate output signals (42) is preferably determined, particularly upon machining the object (1) with said tool (2).
- the sensor output signal (31) is generated as a function of the position of the tool (2) along the actual trajectory (30) and/or as a function of the time elapsed during machining of the object (1) with the tool (2) along the actual trajectory (30).
- the sensor output signal (31) is scaled along a time or position axis, particularly so as to match the extension of the sensor output signal (31) along the time or position axis with the corresponding extension of the at least one candidate output signal (42) or with the corresponding extensions of the plurality of candidate output signals (42).
- the at least one model trajectory or the model trajectories (40) are linear trajectories, wherein particularly said model trajectories (40) are distributed around a planned trajectory (10) along which the object (1) is to be machined yielding the actual trajectory (30), wherein particularly the model trajectories (40) are arranged to form groups, wherein model trajectories (40) in a group have the same entry point (E) into the object (1), and wherein each group comprises a model trajectory (40) having the orientation of the planned trajectory (10) as well as model trajectories (40) having an orientation that is different from the orientation of the planned trajectory (10).
- the at least one candidate output signal (42) or a plurality of candidate output signals (42) are generated as a function of said material property along the respective model trajectory (40), wherein particularly the model of the object (1) is represented by a 3D-image of the material property of the object consisting of a plurality of voxels each indicating a certain value of the material property, and wherein particularly voxels within a radius corresponding to a radius of the tool in a lateral plane normal to the respective model trajectory (40) are integrated in order to generate the respective candidate output signal (42).
- the sensor output signal (31) corresponds to a force exerted by the object (1) onto the tool (2) upon machining said object (1)
- the tool (2) is formed as a drill
- particularly the drill comprises a drilling bit (20) that is designed to drill a hole (4) into the object (1), when the drilling bit (20) is pressed against the object (1) upon rotating the drill (2) about a longitudinal axis (L) of the drill (2), along which the drill (2) extends
- the sensor output signal corresponds to a force exerted by the object (1) onto said drill bit (20) along the longitudinal axis (L) of the drill.
- the material property is a density of the object (1), particularly a bone density.
- a warning signal is generated when a distance between a determined current spatial position of the tool (2) within the object (1) and a planned trajectory (10) exceeds a pre-defined threshold value.
- a movement of the tool (2) along the actual trajectory is stopped when a distance between a determined current spatial position of the tool (2) within the object (1) and a planned trajectory (10) exceeds a pre-defined threshold value.
- a movement of the tool (2) along the actual trajectory is controlled such that the determined current spatial position of the tool (2) within the object (1) approaches a current planned spatial position of the tool (2) along a planned trajectory (10), and wherein particularly a movement generating means (100), particularly a robot arm, for moving the tool (2) along the actual trajectory (30) upon machining the object (1) and a drive for rotating the tool (2) around its longitudinal axis (L) is controlled upon machining of the object (1) such that a current velocity of the tool (2) approaches a current planned velocity of the tool (2).
- a movement generating means particularly a robot arm
Landscapes
- Engineering & Computer Science (AREA)
- Health & Medical Sciences (AREA)
- Surgery (AREA)
- Life Sciences & Earth Sciences (AREA)
- Animal Behavior & Ethology (AREA)
- Veterinary Medicine (AREA)
- Biomedical Technology (AREA)
- Heart & Thoracic Surgery (AREA)
- Medical Informatics (AREA)
- Molecular Biology (AREA)
- Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
- General Health & Medical Sciences (AREA)
- Public Health (AREA)
- Robotics (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Human Computer Interaction (AREA)
- Numerical Control (AREA)
- Surgical Instruments (AREA)
Priority Applications (6)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP12168772.7A EP2666428B1 (de) | 2012-05-21 | 2012-05-21 | System und Verfahren zur Schätzung der räumlichen Position eines Werkzeugs in einem Objekt |
AU2013265396A AU2013265396B2 (en) | 2012-05-21 | 2013-05-21 | System and method for estimating the spatial position of a tool within an object |
US14/402,323 US9814532B2 (en) | 2012-05-21 | 2013-05-21 | System and method for estimating the spatial position of a tool within an object |
CN201380026826.9A CN104540467B (zh) | 2012-05-21 | 2013-05-21 | 用于估计物体中工具空间位置的系统与方法 |
PCT/EP2013/060389 WO2013174801A2 (en) | 2012-05-21 | 2013-05-21 | System and method for estimating the spatial position of a tool within an object |
US15/811,674 US10342622B2 (en) | 2012-05-21 | 2017-11-13 | System and method for estimating the spatial position of a tool within an object |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP12168772.7A EP2666428B1 (de) | 2012-05-21 | 2012-05-21 | System und Verfahren zur Schätzung der räumlichen Position eines Werkzeugs in einem Objekt |
Publications (2)
Publication Number | Publication Date |
---|---|
EP2666428A1 true EP2666428A1 (de) | 2013-11-27 |
EP2666428B1 EP2666428B1 (de) | 2015-10-28 |
Family
ID=48672563
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
EP12168772.7A Active EP2666428B1 (de) | 2012-05-21 | 2012-05-21 | System und Verfahren zur Schätzung der räumlichen Position eines Werkzeugs in einem Objekt |
Country Status (5)
Country | Link |
---|---|
US (2) | US9814532B2 (de) |
EP (1) | EP2666428B1 (de) |
CN (1) | CN104540467B (de) |
AU (1) | AU2013265396B2 (de) |
WO (1) | WO2013174801A2 (de) |
Cited By (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9119655B2 (en) | 2012-08-03 | 2015-09-01 | Stryker Corporation | Surgical manipulator capable of controlling a surgical instrument in multiple modes |
US9226796B2 (en) | 2012-08-03 | 2016-01-05 | Stryker Corporation | Method for detecting a disturbance as an energy applicator of a surgical instrument traverses a cutting path |
WO2016164590A1 (en) * | 2015-04-10 | 2016-10-13 | Mako Surgical Corp. | System and method of controlling a surgical tool during autonomous movement of the surgical tool |
US9480534B2 (en) | 2012-08-03 | 2016-11-01 | Stryker Corporation | Navigation system and method for removing a volume of tissue from a patient |
US9820818B2 (en) | 2012-08-03 | 2017-11-21 | Stryker Corporation | System and method for controlling a surgical manipulator based on implant parameters |
WO2018042400A1 (en) * | 2016-09-04 | 2018-03-08 | Universitat Bern | System for determining proximity of a surgical tool to key anatomical features |
US9921712B2 (en) | 2010-12-29 | 2018-03-20 | Mako Surgical Corp. | System and method for providing substantially stable control of a surgical tool |
US11202682B2 (en) | 2016-12-16 | 2021-12-21 | Mako Surgical Corp. | Techniques for modifying tool operation in a surgical robotic system based on comparing actual and commanded states of the tool relative to a surgical site |
CN113855249A (zh) * | 2021-12-02 | 2021-12-31 | 极限人工智能有限公司 | 一种机器控制方法、装置、手术机器人及可读存储介质 |
Families Citing this family (15)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP2666428B1 (de) * | 2012-05-21 | 2015-10-28 | Universität Bern | System und Verfahren zur Schätzung der räumlichen Position eines Werkzeugs in einem Objekt |
WO2016065458A1 (en) * | 2014-10-29 | 2016-05-06 | Intellijoint Surgical Inc. | Systems and devices including a surgical navigation camera with a kinematic mount and a surgical drape with a kinematic mount adapter |
US10631934B2 (en) | 2015-01-15 | 2020-04-28 | Think Surgical Inc. | Image and laser guided control of cutting using a robotic surgical system |
US9881477B2 (en) * | 2015-02-27 | 2018-01-30 | Elwha Llc | Device having a sensor for sensing an object and a communicator for coupling the sensor to a determiner for determining whether a subject may collide with the object |
US10117713B2 (en) * | 2015-07-01 | 2018-11-06 | Mako Surgical Corp. | Robotic systems and methods for controlling a tool removing material from a workpiece |
JP6496338B2 (ja) | 2017-03-14 | 2019-04-03 | ファナック株式会社 | 工作機械の制御システム |
JP6474450B2 (ja) | 2017-04-17 | 2019-02-27 | ファナック株式会社 | 工作機械の制御システム |
JP6514264B2 (ja) | 2017-04-20 | 2019-05-15 | ファナック株式会社 | 工作機械の制御システム |
JP6730363B2 (ja) * | 2018-04-13 | 2020-07-29 | ファナック株式会社 | 操作訓練システム |
US11419604B2 (en) * | 2018-07-16 | 2022-08-23 | Cilag Gmbh International | Robotic systems with separate photoacoustic receivers |
CN115279294A (zh) | 2020-01-13 | 2022-11-01 | 史赛克公司 | 在导航辅助手术期间监控偏移的系统 |
US20220133331A1 (en) | 2020-10-30 | 2022-05-05 | Mako Surgical Corp. | Robotic surgical system with cut selection logic |
US12048497B2 (en) * | 2021-01-11 | 2024-07-30 | Mazor Robotics Ltd. | Safety mechanism for robotic bone cutting |
US20220257320A1 (en) * | 2021-02-18 | 2022-08-18 | Mazor Robotics Ltd. | Systems, devices, and methods for tool skive avoidance |
WO2024126718A1 (en) * | 2022-12-16 | 2024-06-20 | Nobel Biocare Services Ag | Drilling apparatus and method of assessing bone quality using the drilling apparatus |
Citations (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20040010190A1 (en) | 2000-02-25 | 2004-01-15 | The Board Of Trustees Of The Leland Stanford Junior University | Methods and apparatuses for maintaining a trajectory in sterotaxi for tracking a target inside a body |
US6718196B1 (en) | 1997-02-04 | 2004-04-06 | The United States Of America As Represented By The National Aeronautics And Space Administration | Multimodality instrument for tissue characterization |
DE102005029002A1 (de) | 2005-06-21 | 2006-12-28 | Universität Oldenburg | Verfahren und Vorrichtung zur berührenden Messung einer Kraft |
US20070181139A1 (en) | 2004-05-28 | 2007-08-09 | Hauck John A | Robotic surgical system with contact sensing feature |
US20070225787A1 (en) | 2005-10-14 | 2007-09-27 | Nabil Simaan | Electrode arrays and systems for inserting same |
WO2009092164A1 (en) | 2008-01-25 | 2009-07-30 | Mcmaster University | Surgical guidance utilizing tissue feedback |
DE102008013429A1 (de) | 2008-03-10 | 2009-10-01 | Siemens Aktiengesellschaft | Vorrichtung und Verfahren für einen medizinischen Eingriff |
WO2010147972A1 (en) | 2009-06-16 | 2010-12-23 | Regents Of The University Of Minnesota | Spinal probe with tactile force feedback and pedicle breach prediction |
US20110112397A1 (en) | 2009-07-15 | 2011-05-12 | Feimo Shen | In vivo sensor for detecting bone surface |
US20110306985A1 (en) | 2010-06-09 | 2011-12-15 | National University Corporation Tokyo University | Surgical Assistance System |
Family Cites Families (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6887247B1 (en) * | 2002-04-17 | 2005-05-03 | Orthosoft Inc. | CAS drill guide and drill tracking system |
US7782709B2 (en) * | 2003-08-22 | 2010-08-24 | Schlumberger Technology Corporation | Multi-physics inversion processing to predict pore pressure ahead of the drill bit |
US8175677B2 (en) * | 2007-06-07 | 2012-05-08 | MRI Interventions, Inc. | MRI-guided medical interventional systems and methods |
EP2666428B1 (de) * | 2012-05-21 | 2015-10-28 | Universität Bern | System und Verfahren zur Schätzung der räumlichen Position eines Werkzeugs in einem Objekt |
- 2012
- 2012-05-21 EP EP12168772.7A patent/EP2666428B1/de active Active
- 2013
- 2013-05-21 US US14/402,323 patent/US9814532B2/en active Active
- 2013-05-21 AU AU2013265396A patent/AU2013265396B2/en active Active
- 2013-05-21 WO PCT/EP2013/060389 patent/WO2013174801A2/en active Application Filing
- 2013-05-21 CN CN201380026826.9A patent/CN104540467B/zh active Active
- 2017
- 2017-11-13 US US15/811,674 patent/US10342622B2/en active Active
Patent Citations (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6718196B1 (en) | 1997-02-04 | 2004-04-06 | The United States Of America As Represented By The National Aeronautics And Space Administration | Multimodality instrument for tissue characterization |
US20040010190A1 (en) | 2000-02-25 | 2004-01-15 | The Board Of Trustees Of The Leland Stanford Junior University | Methods and apparatuses for maintaining a trajectory in sterotaxi for tracking a target inside a body |
US20070181139A1 (en) | 2004-05-28 | 2007-08-09 | Hauck John A | Robotic surgical system with contact sensing feature |
DE102005029002A1 (de) | 2005-06-21 | 2006-12-28 | Universität Oldenburg | Verfahren und Vorrichtung zur berührenden Messung einer Kraft |
US20070225787A1 (en) | 2005-10-14 | 2007-09-27 | Nabil Simaan | Electrode arrays and systems for inserting same |
WO2009092164A1 (en) | 2008-01-25 | 2009-07-30 | Mcmaster University | Surgical guidance utilizing tissue feedback |
DE102008013429A1 (de) | 2008-03-10 | 2009-10-01 | Siemens Aktiengesellschaft | Vorrichtung und Verfahren für einen medizinischen Eingriff |
WO2010147972A1 (en) | 2009-06-16 | 2010-12-23 | Regents Of The University Of Minnesota | Spinal probe with tactile force feedback and pedicle breach prediction |
US20110112397A1 (en) | 2009-07-15 | 2011-05-12 | Feimo Shen | In vivo sensor for detecting bone surface |
US20110306985A1 (en) | 2010-06-09 | 2011-12-15 | National University Corporation Tokyo University | Surgical Assistance System |
Cited By (35)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9921712B2 (en) | 2010-12-29 | 2018-03-20 | Mako Surgical Corp. | System and method for providing substantially stable control of a surgical tool |
US10350017B2 (en) | 2012-08-03 | 2019-07-16 | Stryker Corporation | Manipulator and method for controlling the manipulator based on joint limits |
US9820818B2 (en) | 2012-08-03 | 2017-11-21 | Stryker Corporation | System and method for controlling a surgical manipulator based on implant parameters |
US9480534B2 (en) | 2012-08-03 | 2016-11-01 | Stryker Corporation | Navigation system and method for removing a volume of tissue from a patient |
US9566125B2 (en) | 2012-08-03 | 2017-02-14 | Stryker Corporation | Surgical manipulator having a feed rate calculator |
US10420619B2 (en) | 2012-08-03 | 2019-09-24 | Stryker Corporation | Surgical manipulator and method for transitioning between operating modes |
US9681920B2 (en) | 2012-08-03 | 2017-06-20 | Stryker Corporation | Robotic system and method for reorienting a surgical instrument moving along a tool path |
US9795445B2 (en) | 2012-08-03 | 2017-10-24 | Stryker Corporation | System and method for controlling a manipulator in response to backdrive forces |
US10426560B2 (en) | 2012-08-03 | 2019-10-01 | Stryker Corporation | Robotic system and method for reorienting a surgical instrument moving along a tool path |
US12070288B2 (en) | 2012-08-03 | 2024-08-27 | Stryker Corporation | Robotic system and method for removing a volume of material from a patient |
US10463440B2 (en) | 2012-08-03 | 2019-11-05 | Stryker Corporation | Surgical manipulator and method for resuming semi-autonomous tool path position |
US12004836B2 (en) | 2012-08-03 | 2024-06-11 | Stryker Corporation | Surgical manipulator and method of operating the same using virtual rigid body modeling preliminary |
US9226796B2 (en) | 2012-08-03 | 2016-01-05 | Stryker Corporation | Method for detecting a disturbance as an energy applicator of a surgical instrument traverses a cutting path |
US11672620B2 (en) | 2012-08-03 | 2023-06-13 | Stryker Corporation | Robotic system and method for removing a volume of material from a patient |
US11639001B2 (en) | 2012-08-03 | 2023-05-02 | Stryker Corporation | Robotic system and method for reorienting a surgical instrument |
US10314661B2 (en) | 2012-08-03 | 2019-06-11 | Stryker Corporation | Surgical robotic system and method for controlling an instrument feed rate |
US9119655B2 (en) | 2012-08-03 | 2015-09-01 | Stryker Corporation | Surgical manipulator capable of controlling a surgical instrument in multiple modes |
US9566122B2 (en) | 2012-08-03 | 2017-02-14 | Stryker Corporation | Robotic system and method for transitioning between operating modes |
US11471232B2 (en) | 2012-08-03 | 2022-10-18 | Stryker Corporation | Surgical system and method utilizing impulse modeling for controlling an instrument |
US11179210B2 (en) | 2012-08-03 | 2021-11-23 | Stryker Corporation | Surgical manipulator and method for controlling pose of an instrument based on virtual rigid body modelling |
US11045958B2 (en) | 2012-08-03 | 2021-06-29 | Stryker Corporation | Surgical robotic system and method for commanding instrument position based on iterative boundary evaluation |
JP2018516107A (ja) * | 2015-04-10 | 2018-06-21 | Mako Surgical Corp. | System and method of controlling a surgical tool during autonomous movement of the surgical tool |
US9937014B2 (en) | 2015-04-10 | 2018-04-10 | Mako Surgical Corp. | System and method of controlling a surgical tool during autonomous movement of the surgical tool |
CN107427330A (zh) * | 2015-04-10 | 2017-12-01 | Mako Surgical Corp. | System and method of controlling a surgical tool during autonomous movement of the surgical tool |
EP3280345A1 (de) * | 2015-04-10 | 2018-02-14 | Mako Surgical Corp. | System and method of controlling a surgical tool during autonomous movement of the surgical tool |
WO2016164590A1 (en) * | 2015-04-10 | 2016-10-13 | Mako Surgical Corp. | System and method of controlling a surgical tool during autonomous movement of the surgical tool |
CN107427330B (zh) * | 2015-04-10 | 2020-10-16 | Mako Surgical Corp. | System and method of controlling a surgical tool during autonomous movement of the surgical tool |
AU2016246745B2 (en) * | 2015-04-10 | 2020-11-26 | Mako Surgical Corp. | System and method of controlling a surgical tool during autonomous movement of the surgical tool |
JP7112754B2 (ja) | 2016-09-04 | 2022-08-04 | Universität Bern | System for determining proximity of a surgical tool to key anatomical features |
JP2022109984A (ja) * | 2016-09-04 | 2022-07-28 | Universität Bern | System for determining proximity of a surgical tool to key anatomical features |
WO2018042400A1 (en) * | 2016-09-04 | 2018-03-08 | Universitat Bern | System for determining proximity of a surgical tool to key anatomical features |
JP2019531849A (ja) * | 2016-09-04 | 2019-11-07 | Universität Bern | System for determining proximity of a surgical tool to key anatomical features |
US11202682B2 (en) | 2016-12-16 | 2021-12-21 | Mako Surgical Corp. | Techniques for modifying tool operation in a surgical robotic system based on comparing actual and commanded states of the tool relative to a surgical site |
US11850011B2 (en) | 2016-12-16 | 2023-12-26 | Mako Surgical Corp. | Techniques for modifying tool operation in a surgical robotic system based on comparing actual and commanded states of the tool relative to a surgical site |
CN113855249A (zh) * | 2021-12-02 | 2021-12-31 | Jixian Artificial Intelligence Co., Ltd. | Machine control method and apparatus, surgical robot, and readable storage medium |
Also Published As
Publication number | Publication date |
---|---|
WO2013174801A2 (en) | 2013-11-28 |
US20150157419A1 (en) | 2015-06-11 |
CN104540467B (zh) | 2017-10-13 |
AU2013265396B2 (en) | 2017-05-25 |
EP2666428B1 (de) | 2015-10-28 |
US10342622B2 (en) | 2019-07-09 |
US9814532B2 (en) | 2017-11-14 |
AU2013265396A1 (en) | 2014-12-18 |
US20180085172A1 (en) | 2018-03-29 |
WO2013174801A3 (en) | 2014-01-16 |
CN104540467A (zh) | 2015-04-22 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US10342622B2 (en) | System and method for estimating the spatial position of a tool within an object | |
US20210038325A1 (en) | Drilling control system and drilling control method | |
CN108135529A (zh) | System and method for guiding insertion of a medical tool | |
US20140142429A1 (en) | Ultrasound guided robot for flexible needle steering | |
US11298186B2 (en) | Surgery assistive system and method for obtaining surface information thereof | |
US11998289B2 (en) | Method and system for autonomous therapy | |
US20210259781A1 (en) | Force based digitization for bone registration | |
JP2023511272A (ja) | System and method for monitoring offsets during navigation-assisted surgery | |
Nelson et al. | Positional Calibration of an Ultrasound Image‐Guided Robotic Breast Biopsy System | |
Li et al. | Comparative quantitative analysis of robotic ultrasound image calibration methods | |
US20220143366A1 (en) | Systems and methods for determining buckling and patient movement during a medical procedure | |
EP3738515A1 (de) | Ultrasound system and method for tracking movement of an object | |
US20220257320A1 (en) | Systems, devices, and methods for tool skive avoidance | |
Liu et al. | Preoperative surgical planning for robot-assisted liver tumour ablation therapy based on collision-free reachable workspaces | |
CN114732518A (zh) | System and method for single image registration update | |
Ahmad et al. | Development and 3D spatial calibration of a parallel robot for percutaneous needle procedures with 2D ultrasound guidance | |
Bhagvath et al. | Design and Accuracy Assessment of an Automated Image-Guided Robotic Osteotomy System | |
EP3811889B1 (de) | Surgery assistive system for obtaining surface information | |
US20240130811A1 (en) | Systems and methods for determining a safety layer for an anatomical element | |
CN118695810A (zh) | Method for tracking a medical tool during a medical procedure using deep learning | |
STÉPHANE LAVALLÉE, JOCELYNE TROCCAZ, LINE GABORIT, PHILIPPE CINQUIN, ALIM LOUIS BENABID, AND HOFFMANN | | |
Hofmann et al. | Calibration of the motor-assisted robotic stereotaxy system: MARS |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PUAI | Public reference made under article 153(3) epc to a published international application that has entered the european phase |
Free format text: ORIGINAL CODE: 0009012 |
|
17P | Request for examination filed |
Effective date: 20130529 |
|
AK | Designated contracting states |
Kind code of ref document: A1 Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR |
|
AX | Request for extension of the european patent |
Extension state: BA ME |
|
GRAP | Despatch of communication of intention to grant a patent |
Free format text: ORIGINAL CODE: EPIDOSNIGR1 |
|
INTG | Intention to grant announced |
Effective date: 20150430 |
|
GRAS | Grant fee paid |
Free format text: ORIGINAL CODE: EPIDOSNIGR3 |
|
GRAA | (expected) grant |
Free format text: ORIGINAL CODE: 0009210 |
|
AK | Designated contracting states |
Kind code of ref document: B1 Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR |
|
REG | Reference to a national code |
Ref country code: GB Ref legal event code: FG4D |
|
REG | Reference to a national code |
Ref country code: CH Ref legal event code: EP |
|
REG | Reference to a national code |
Ref country code: AT Ref legal event code: REF Ref document number: 757497 Country of ref document: AT Kind code of ref document: T Effective date: 20151115 |
|
REG | Reference to a national code |
Ref country code: IE Ref legal event code: FG4D |
|
REG | Reference to a national code |
Ref country code: DE Ref legal event code: R096 Ref document number: 602012011922 Country of ref document: DE |
|
REG | Reference to a national code |
Ref country code: LT Ref legal event code: MG4D |
|
REG | Reference to a national code |
Ref country code: NL Ref legal event code: MP Effective date: 20151028 |
|
REG | Reference to a national code |
Ref country code: AT Ref legal event code: MK05 Ref document number: 757497 Country of ref document: AT Kind code of ref document: T Effective date: 20151028 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: ES Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20151028 Ref country code: LT Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20151028 Ref country code: NL Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20151028 Ref country code: IS Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20160228 Ref country code: NO Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20160128 Ref country code: HR Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20151028 |
|
REG | Reference to a national code |
Ref country code: FR Ref legal event code: PLFP Year of fee payment: 5 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: PL Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20151028 Ref country code: RS Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20151028 Ref country code: PT Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20160229 Ref country code: LV Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20151028 Ref country code: GR Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20160129 Ref country code: SE Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20151028 Ref country code: AT Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20151028 Ref country code: FI Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20151028 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: CZ Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20151028 |
|
REG | Reference to a national code |
Ref country code: DE Ref legal event code: R097 Ref document number: 602012011922 Country of ref document: DE |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: SM Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20151028 Ref country code: BE Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES Effective date: 20160531 Ref country code: EE Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20151028 Ref country code: RO Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20151028 Ref country code: SK Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20151028 Ref country code: DK Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20151028 |
|
PLBE | No opposition filed within time limit |
Free format text: ORIGINAL CODE: 0009261 |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: NO OPPOSITION FILED WITHIN TIME LIMIT |
|
26N | No opposition filed |
Effective date: 20160729 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: SI Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20151028 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: BE Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20151028 Ref country code: LU Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20160521 |
|
REG | Reference to a national code |
Ref country code: IE Ref legal event code: MM4A |
|
REG | Reference to a national code |
Ref country code: FR Ref legal event code: PLFP Year of fee payment: 6 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: IE Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES Effective date: 20160521 |
|
REG | Reference to a national code |
Ref country code: FR Ref legal event code: PLFP Year of fee payment: 7 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: HU Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT; INVALID AB INITIO Effective date: 20120521 Ref country code: CY Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20151028 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: MK Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20151028 Ref country code: TR Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20151028 Ref country code: MT Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES Effective date: 20160531 Ref country code: MC Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20151028 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: BG Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20151028 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: AL Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20151028 |
|
PGFP | Annual fee paid to national office [announced via postgrant information from national office to epo] |
Ref country code: GB Payment date: 20240522 Year of fee payment: 13 |
|
PGFP | Annual fee paid to national office [announced via postgrant information from national office to epo] |
Ref country code: DE Payment date: 20240517 Year of fee payment: 13 |
|
PGFP | Annual fee paid to national office [announced via postgrant information from national office to epo] |
Ref country code: CH Payment date: 20240602 Year of fee payment: 13 |
|
PGFP | Annual fee paid to national office [announced via postgrant information from national office to epo] |
Ref country code: FR Payment date: 20240522 Year of fee payment: 13 |
|
PGFP | Annual fee paid to national office [announced via postgrant information from national office to epo] |
Ref country code: IT Payment date: 20240531 Year of fee payment: 13 |