WO2007136739A2 - A method and apparatus for controlling a haptic device - Google Patents
A method and apparatus for controlling a haptic device
- Publication number
- WO2007136739A2 (PCT/US2007/011891)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- actuator
- sensor
- haptic
- velocity
- load
- Prior art date
Classifications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1628—Programme controls characterised by the control loop
- B25J9/1638—Programme controls characterised by the control loop compensation for arm bending/inertia, pay load weight/inertia
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J13/00—Controls for manipulators
- B25J13/02—Hand grip control means
- B25J13/025—Hand grip control means comprising haptic means
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/016—Input arrangements with force or tactile feedback as computer generated output to the user
Definitions
- the invention relates generally to the field of haptic devices and more specifically to the field of haptic device controllers.
- Cable drive haptic devices incorporate a cable transmission having a proximal (or drive) end and a distal (or load) end.
- the proximal end includes actuators (such as but not limited to motors) that drive the transmission to thereby transmit load to an endpoint of the distal end.
- the endpoint of the haptic device is disposed in physical space and a haptic rendering algorithm generates virtual haptic surfaces (or haptic objects) that are located in the physical space.
- the haptic device enables a user to interact with the virtual haptic surfaces, for example, by controlling the actuators to transmit a load to the distal end of the transmission when the endpoint encounters a haptic surface.
- Conventional cable drive haptic devices may include sensors (e.g., position sensors such as encoders) mounted with the actuators at the proximal end of the cable transmission. Data from the actuator sensors (e.g., motor angles) is input to a forward kinematics process to calculate a position of the endpoint.
- the calculated position of the endpoint, and thus the haptic surfaces, in physical space may not correspond to the actual position of the endpoint due to compliance and hysteresis in the cable transmission.
- the cable transmission may flex, resulting in endpoint movement even if the controller maintains the actuator output position. That is, the compliance of the cables of the cable transmission permits some movement of the endpoint even if the actuator attempts to respond to maintain a desired position. This movement results in an error between the actual endpoint location and the location of the endpoint as computed by the controller based on the actuator output position.
- for many conventional haptic applications, the inaccuracy between the actual endpoint position of the haptic device and the calculated endpoint position is not important because it is not necessary to precisely locate the haptic surfaces in the physical workspace of the haptic device.
- the haptic surfaces can be initially positioned anywhere convenient within the workspace without affecting the user's interaction with the virtual environment. For this reason, the endpoint positioning accuracy of conventional cable drive haptic devices is rarely even considered as important.
- haptic devices are generally designed to be compact and have minimal moving mass and inertia, so they typically will not have extra position sensors, especially on the load end of the transmission, where the sensors will have a larger deleterious effect on the haptic performance.
- Some haptic applications may require a high degree of endpoint positioning accuracy.
- in computer aided surgery, for example, the haptic surfaces define a cutting boundary for a cutting tool attached to the haptic device and thus must be precisely positioned in the physical space of the patient.
- a haptic device with a stiff transmission, such as a geared transmission, may be used.
- stiff transmissions, however, may not be backdriveable and/or suitable for use in a haptic device.
- although conventional cable drive haptic devices are backdriveable, they present the endpoint positioning accuracy problem described above.
- One possibility for improving the endpoint positioning accuracy is to relocate the sensors from the proximal (or drive) end to the distal (or load) end of the cable transmission, such as relocating the sensor from the actuator to the joint. This permits a more accurate determination of the position of the endpoint.
- Relocating the sensors to the load end of the cable transmission may cause the controller to exhibit instability because the sensing and actuation are not located at the same place and are connected by a transmission that is not rigid and has dynamics that can be excited by the controller.
- when a haptic device includes sensors on only one side of the cable transmission, the controller lacks additional information useful for improving the stability of haptic control, which would allow for increased haptic wall stiffness. Increased haptic wall stiffness is important when the haptic device is used in computer aided surgery because the haptic surfaces must sufficiently convey to the surgeon the location of the tool with respect to the actual tissue surface.
- Other conventional positioning devices and industrial robots may also require precise endpoint positioning, but, unlike a haptic device, these devices usually have stiff transmissions and rely solely on actuator position sensors for control. In some cases, positioning systems use both drive and load end position sensors, but these systems are typically used for positioning and not for user interaction or rendering haptic objects.
- the use of both actuator and load position sensors improves haptic wall stiffness in two ways. First, without the load position sensors, when the user applies a force to the end of the device, the transmission will flex and the endpoint will move, even if the controller maintains the actuator output position. That is, the compliance of the cables of the system permits some movement even if the actuator attempts to respond to maintain haptic position.
- Second, the use of both actuator and load output position provides additional information that the controller can use to help improve the stability of the haptic control, allowing for increased haptic wall stiffness. While there are many ways in which to use two input sensors to compute a haptic control output, using the actuator output position sensor to provide a velocity signal and using the load output position sensor to provide the load output position signal to the control algorithm is a simple, fast method that enhances the stability and accuracy of the device compared to single sensor solutions. Increased haptic wall stiffness is particularly important, for example, when the haptic device is used in computer aided surgery because the haptic surface must accurately and robustly convey to the surgeon the location of the tool with respect to the actual tissue surface. The present invention addresses these needs.
- the invention relates to a method and apparatus for controlling a haptic device.
- the invention relates to a haptic device.
- the haptic device includes an actuator; an actuator sensor in communication with the actuator, the actuator sensor producing an actuator signal indicative of the actuator velocity; a load; a load sensor in communication with the load, the load sensor producing a load signal indicative of the load position; and a controller in electrical communication with the load sensor, the actuator sensor and the actuator.
- the controller controls the actuator in response to the actuator signal and the load signal to provide a haptic response to a user.
- the haptic device includes a cable drive transmission in communication with the actuator and the load.
- the controller determines a gravity compensation torque and a Cartesian endpoint position in response to the load signal.
- the controller controls the actuator by computing an endpoint velocity by filtering the actuator signal to form a filtered actuator velocity and multiplying the filtered actuator velocity by a Jacobian.
- the controller computes a damping force by subtracting a reference velocity from the endpoint velocity to form an endpoint velocity difference and multiplying the endpoint velocity difference by a damping gain.
- the controller computes a desired haptic force in response to the damping force.
- the load sensors are selected from the group comprising optical encoders, electric encoders, magnetic encoders, and potentiometers.
- the invention relates to a method for controlling an actuator of a haptic device.
- the method includes the steps of producing an actuator signal indicative of the velocity of an actuator; producing a load signal indicative of the position of a load; and controlling the actuator in response to the actuator signal and the load signal to produce a haptic response to a user.
- the method includes the step of determining a gravity compensation torque and a Cartesian endpoint position in response to the load signal.
- the controlling of the actuator includes the step of computing an endpoint velocity by filtering the actuator signal to form a filtered actuator velocity and multiplying the filtered actuator velocity by a Jacobian.
- the method includes the step of computing a damping force by subtracting a reference velocity from the endpoint velocity to form an endpoint velocity difference and multiplying the endpoint velocity difference by a damping gain.
- the method includes computing a desired haptic force in response to the damping force.
- the invention is a haptic device including an actuator; an actuator sensor in communication with the actuator, the actuator sensor producing an actuator signal indicative of the actuator velocity; a load; a load sensor in communication with the load, the load sensor producing a load signal indicative of the load position; and a controller in electrical communication with the load sensor, the actuator sensor and the actuator, the controller controlling the actuator in response to the actuator signal and the load signal to provide a haptic response to a user.
- the invention is a haptic device including a transmission having an input side and an output side; an actuator in communication with the input side; an actuator sensor in communication with the actuator, the actuator sensor producing an actuator signal indicative of the actuator velocity; a position sensor in communication with the output side, the position sensor producing a position signal indicative of the position of the output side; and a controller in electrical communication with the position sensor, the actuator sensor and the actuator, the controller controlling the actuator in response to the actuator signal and the position signal to provide a haptic response to a user.
- Fig. 1 is a perspective diagram of an embodiment of a cable drive system
- Fig. 2 is a diagram of an embodiment of a model of a one dimensional constraint
- FIG. 2A is a diagram of an embodiment of a model of a one dimensional constraint implemented according to the present invention.
- FIG. 3 is a block diagram of an embodiment of a system and process of the invention in Cartesian space
- Fig. 3 A is a block diagram of the haptic force calculator block of Fig. 3;
- FIG. 4 is a block diagram of an embodiment of a system and process of the invention in joint space.
- Fig. 4A is a block diagram of another embodiment of the system of Fig. 4.
- the cable drive haptic device comprises an arm
- the arm 100 is equipped with four drive end sensors 112, 122, 132, 142 installed on rotary actuators 110, 120, 130, 140 as well as four load end sensors 118, 128, 138, 148 installed on joint links 116, 126, 136, 146.
- the cable drive transmissions provide a gear reduction so that smaller actuators can be used without introducing backlash, friction, or other non-linear effects that make control difficult.
- because the cables introduce some compliance and hysteresis, it is advantageous to include the sensors 118, 128, 138, 148 on the load end of each transmission 114, 124, 134, 144 to provide sufficient endpoint positioning accuracy. Because of control problems caused when sensors and actuators are not located at the same position, it is advantageous to also include the sensors 112, 122, 132, 142 on the drive end of each transmission 114, 124, 134, 144.
- the sensors of the haptic device may be position sensors, such as, for example, optical encoders, electric encoders, resolvers, magnetic scale sensors, magnetostrictive sensors, potentiometers, RVDTs, synchros, and the like.
- the drive end sensors are actuator encoders
- the load end sensors are joint encoders.
- the sensors may be incremental and require a homing process. As is well known, the homing process initializes the sensor (e.g., an encoder) so that an initial position of the sensor is known. Homing may be accomplished, for example, by manually rotating the sensor to a reference position or until an index marker on the sensor is read. The reference position or index marker correlates to a known absolute position of the sensor.
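For illustration, a homing routine for an incremental encoder might look like the following sketch; the `encoder` and `joint` driver objects and their methods are hypothetical and are used only to show the sequence of operations.

```python
def home_incremental_encoder(encoder, joint, step=0.001):
    """Sweep the joint in small increments until the encoder's index pulse is
    detected, then latch that location as the known reference (zero) position.
    `encoder` and `joint` are hypothetical driver objects used for illustration."""
    while not encoder.index_seen():
        joint.move_by(step)   # small incremental motion toward the index marker
    encoder.set_zero()        # subsequent counts are measured from a known reference
```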
- the sensors may be absolute sensors (e.g., absolute encoders) that do not require a homing process.
- the position sensors provide position measurements. If desired, velocity, which is a derivative of position, may be calculated based on position data from the position sensors. Alternatively, velocity can be measured directly using velocity sensors such as, for example, tachometers. In embodiments where a particular sensor is used to determine only velocity, it is not necessary to use an absolute sensor or a sensor with a reference position or index marker as described above in connection with the homing process.
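For example, a velocity estimate can be obtained from two successive position samples with a backward difference (a minimal sketch; the sample period and any additional smoothing are application-dependent):

```python
def backward_difference_velocity(pos_now, pos_prev, dt):
    """Estimate velocity from two consecutive position samples taken dt seconds apart."""
    return (pos_now - pos_prev) / dt
```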
- a haptic device provides tactile feedback, such as vibration or force feedback, to a user in contact with the haptic device.
- a haptic device may activate actuators to produce force and/or torque (e.g., based on a haptic rendering algorithm) that is applied to the user as the user manipulates the haptic device.
- This force feedback is perceived by the user as a virtual constraint (for example a virtual wall) and constrains the user's movement of the haptic device in certain directions.
- the virtual wall, therefore, is capable of preventing motion in a direction that would be detrimental to the operation being performed.
- a virtual wall can be defined so that the haptic device will generate a force that prevents the surgeon from moving the burr beyond a certain depth into the bone.
- the haptic device is a haptic device as described in U.S. Patent Application Serial No. 11/357,197 (Pub. No. US 2006/0142657), filed February 21, 2006, and hereby incorporated by reference herein in its entirety, and/or the HAPTIC GUIDANCE SYSTEM manufactured by MAKO SURGICAL CORP.®, Ft. Lauderdale, Florida.
- Virtual constraints can restrict the operator in anywhere from one to six degrees of freedom.
- a model of a single degree of freedom virtual wall 210 is implemented with a virtual spring 214 and a virtual damper 218 for a rigid link manipulator such that a tool located at the distal end of the rigid link manipulator is prevented from penetrating a forbidden wall 222.
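As an illustration of the single degree of freedom model of Fig. 2, the wall force can be computed from the penetration depth and velocity as in the following sketch; the gain values and wall location are arbitrary examples, not values from the patent.

```python
def virtual_wall_force(x, v, wall_pos, kp=2000.0, kd=10.0):
    """One-dimensional virtual wall: a spring-damper force is generated only when
    the tool position x penetrates the forbidden region beyond wall_pos."""
    penetration = x - wall_pos
    if penetration <= 0.0:
        return 0.0                       # outside the wall: no constraint force
    return -kp * penetration - kd * v    # spring pushes the tool out, damper resists motion
```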
- in the model of Fig. 2, the actuator and sensor are co-located, which results in good haptic stability characteristics.
- when the actuator and sensor are not co-located, haptic rendering suffers instability as well as reduced haptic performance.
- while FIG. 2 is for a co-located actuator and sensor, in cases where the actuator and sensor are not in the same location and are connected by a flexible transmission, the dynamics due to the elastic transmission components introduce additional haptic control difficulties. This results in a non-minimum phase response.
- Referring to Fig. 2A, a diagram of a preferred implementation of a virtual wall model is shown for cases where there is physical compliance in the transmission 220.
- a position sensor is placed at the proximal (or drive) end.
- the proximal sensor may also be referred to, for example, as the actuator sensor.
- a position sensor is placed at the distal (or load) end.
- the distal sensor may also be referred to as the load sensor or the joint sensor.
- an encoder installed at the distal end provides position information while velocity is computed from signals of an encoder mounted at the proximal end (e.g., an actuator encoder).
- a virtual spring 214 is implemented with position information from the distal sensor
- a virtual damper 218 is implemented with velocity information computed from the proximal sensor.
- Using the proximal sensor to compute haptic controller velocity terms and the distal sensor to compute haptic controller position terms is more stable than when the distal sensor is used to compute the position as well as the velocity terms.
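A minimal single-axis sketch of this dual sensor arrangement follows: the spring term uses the distal (load) position and the damping term uses a velocity derived from the proximal (actuator) sensor. The variable names, gains, and explicit transmission-ratio scaling are illustrative assumptions rather than details taken from the patent.

```python
def dual_sensor_wall_force(x_load, v_actuator, transmission_ratio,
                           wall_pos, kp=2000.0, kd=10.0):
    """Spring term from the distal (load) position sensor; damping term from a
    velocity computed off the proximal (actuator) sensor, reflected through the
    transmission ratio to the load side."""
    v_load_est = v_actuator / transmission_ratio   # actuator velocity mapped to the load side
    penetration = x_load - wall_pos
    if penetration <= 0.0:
        return 0.0
    return -kp * penetration - kd * v_load_est
```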
- This "dual sensor" haptic control of Fig. 2A can be easily extended into multi-axis manipulators such as serial manipulators or parallel manipulators that have elastic load members.
- the invention utilizes a tracking system (e.g., global GPS, RF, laser tracking, high-speed camera, etc.) that tracks an end effector or other portion of the haptic device.
- This tracking system obviates the need for the load sensor, which may be difficult to design into the haptic device without adding mass, size, and complexity.
- the tracking system must be fast enough (haptic rates) and have low latency and good dynamic performance.
- one or more independent mechanical arms may be attached to portions of the haptic device and used instead of the integrated load sensors to provide the measurement of the position of the load side of the transmission.
- the mechanical arm may be an articulated linkage that includes position sensors to enable a position of an end of the mechanical arm to be determined or tracked.
- the mechanical arm comprises an articulating linkage as disclosed, for example, in U.S. Patent 6,322,567, which is hereby incorporated by reference herein in its entirety.
- one or more string potentiometers or fiber-optic position sensing devices may be used instead of a mechanical arm with linkages. Using these other technologies to track the end effector or endpoint of the haptic device has an advantage over sensing the individual joints of the device with a load sensor.
- such technologies also measure any physical compliance from the structure of the haptic device between where the load sensors are mounted and the endpoint of the device. As a result, this compliance can be compensated for by the control system in the same manner as the control system compensates for transmission compliance.
- High resolution position sensors include optical encoders with read heads that can interpolate 10x to 100x relative to the physical lines on the encoder disc; large diameter encoders with many lines, possibly combined with an interpolating read head; and interpolating read heads used with tape scales that can be applied to the outer diameter of a rotating portion of the joint of interest.
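As a rough illustration of the resolution such sensors provide, the angular resolution follows directly from the line count, quadrature decoding, and interpolation factor (a back-of-the-envelope sketch only):

```python
import math

def encoder_resolution_rad(lines_per_rev, interpolation=100, quadrature=4):
    """Angular resolution (radians per count) of an optical encoder whose read head
    interpolates each physical line and whose counter decodes in quadrature."""
    counts_per_rev = lines_per_rev * quadrature * interpolation
    return 2.0 * math.pi / counts_per_rev

# e.g. a 5000-line disc with 100x interpolation resolves about 3.1e-6 rad per count
print(encoder_resolution_rad(5000))
```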
- a control loop is depicted in Cartesian space.
- the control loop may be used, for example, in combination with the HAPTIC GUIDANCE SYSTEM™ manufactured by MAKO SURGICAL CORP.®, Ft. Lauderdale, Florida, and/or the haptic device disclosed in the above-referenced U.S. Pub. No. US 2006/0142657, each of which includes a robotic arm incorporating cable drive transmissions.
- a tool is installed on the distal end of the arm.
- an actuator encoder or sensor 410 measures the output position of the actuator.
- This output position is converted to a velocity by a velocity filter 414 by measuring the amount of actuator output position change per unit time.
- the velocity is operated on by a Jacobian process 418 to obtain a calculated endpoint velocity.
- This calculated endpoint velocity differs from the actual endpoint velocity because it does not take into account transmission and mechanical effects.
- the velocity filter 414 is a washout filter.
- a washout filter combines differentiating and smoothing functions into one filter.
- the washout filter can be represented as:
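The specific transfer function is not reproduced here. Purely as an assumed illustration, a first-order washout filter of the form H(s) = ωs/(s + ω) acts as a differentiator at low frequencies and rolls off (smooths) high-frequency noise; a discrete-time sketch of such a filter is shown below.

```python
class WashoutFilter:
    """Combined differentiating/smoothing filter (illustrative form only; not
    necessarily the filter defined in the patent). Assumed continuous-time
    equivalent: H(s) = omega*s / (s + omega), i.e. a differentiator cascaded
    with a first-order low-pass that rolls off at omega rad/s."""

    def __init__(self, omega, dt):
        self.alpha = (omega * dt) / (1.0 + omega * dt)  # low-pass blending factor
        self.dt = dt
        self.prev_pos = None
        self.vel = 0.0

    def update(self, pos):
        """Feed one position sample; return the filtered velocity estimate."""
        if self.prev_pos is None:
            self.prev_pos = pos
            return 0.0
        raw_vel = (pos - self.prev_pos) / self.dt        # backward-difference derivative
        self.vel += self.alpha * (raw_vel - self.vel)    # first-order smoothing
        self.prev_pos = pos
        return self.vel
```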
- a load encoder or sensor 422 determines the load position which is an input value to a forward kinematic process 426 (which computes the Cartesian endpoint position of the arm as a function of the load position) and a gravity compensation process 430 (which computes the actuator torque or force (as determined by the type of actuator) required to counteract the gravity load on the arm links as a function of the load position).
- the gravity compensation process 430 may compute the joint torque which is then converted to actuator torque before being sent to the actuator, which is part of the haptic device arm dynamics process 470, which causes the actuator to apply torque to the system.
- the output value of the forward kinematic process 426, which is the current tool tip location, is an input value to a haptic controller or haptic force calculator 432.
- the velocity from the Jacobian process 418 is a second input to the haptic controller 432.
- the output of the haptic controller 432 is the haptic force (Fhaptic).
- the haptic force (Fhaptic) is the input to a Jacobian transpose process 462 whose output is the haptic actuator torque and/or force (τhaptic). Alternatively, the output is the haptic joint torque and/or force, which is converted to a haptic actuator torque and/or force before being supplied to block 470.
- the haptic controller 432 includes both a haptic rendering process or algorithm 434 and an adder 438.
- The output of the haptic rendering process 434, which is the reference tool tip location, is one input to the adder 438.
- the other input to the adder 438 is the positional information from the forward kinematic process 426.
- the output of the adder 438 is the difference between the current tool tip location (x) and the reference tool tip location (xd) or the location deviation (dx).
- the haptic rendering process 434 may be, for example, a haptic rendering process as disclosed in U.S. Patent Application Serial No. 11/646,204, filed on December 27, 2006.
- the location deviation (dx) is multiplied by the spring constant (Kp) 450 to obtain the spring force (Fspring), and the velocity deviation (dv) is multiplied by the damping constant (Kd) 454 to obtain the damping force (Fdamping).
- the damping force (Fdamping) and the spring force (Fspring) are added by an adder 458 to produce the haptic force (Fhaptic).
- the haptic torque and/or force (τhaptic) is added to the output of the gravity compensation process 430, the gravitational compensation torque or force (τgravity_comp), by an adder 466 to obtain the total torque or force (τtotal) to be generated by the actuator.
- This total torque or force (τtotal) is the input to the arm dynamics process of block 470 which then responds to the user interaction, anatomy interaction, and actuator forces which cause the actuator to move.
- the motion of the actuator again causes changes which are detected by the actuator encoder 410 and the load encoder 422, closing the control loop.
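The Cartesian-space control loop of Fig. 3 can be summarized in the following sketch. The kinematics, gravity, and velocity-filtering functions are passed in as placeholders for the processes named above, the gains are illustrative, and the sign convention is chosen so that positive gains produce restoring forces; none of these details should be read as the patent's exact implementation.

```python
def cartesian_haptic_update(q_load, qdot_from_actuators, x_ref, v_ref,
                            forward_kinematics, jacobian, gravity_torque, Kp, Kd):
    """One cycle of the Cartesian-space dual-sensor loop of Fig. 3 (illustrative only).

    q_load              : joint angles from the load (distal) encoders, block 422
    qdot_from_actuators : joint-rate estimate derived from the actuator (drive end)
                          encoder via the velocity filter, blocks 410 and 414
    x_ref, v_ref        : reference tool tip position/velocity from the haptic
                          rendering process 434
    Kp, Kd              : Cartesian spring and damping gain matrices (blocks 450, 454)
    Vector/matrix arguments are assumed to be numpy arrays.
    """
    J = jacobian(q_load)

    # Calculated endpoint velocity (block 418) and current tool tip location (block 426).
    v = J @ qdot_from_actuators
    x = forward_kinematics(q_load)

    # Haptic force calculator (block 432): spring on the position deviation,
    # damper on the velocity deviation.
    F_haptic = Kp @ (x_ref - x) + Kd @ (v_ref - v)

    # Jacobian transpose (block 462) maps the Cartesian haptic force to actuator
    # torque; gravity compensation (block 430) is added to form the total torque
    # commanded to the arm dynamics (adder 466, block 470).
    tau_total = J.T @ F_haptic + gravity_torque(q_load)
    return tau_total
```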
- the load encoder 422 is replaced with a direct measurement of endpoint location.
- blocks 422 (load encoder) and 426 (forward kinematics) are not needed and a direct signal from the endpoint sensor is supplied to block 432.
- the gravity compensation process 430 takes its input from the actuator position output from block 410, which now must be converted to a joint angle by the gravity compensation process 430.
- the control loop of the previous figure is depicted in joint space.
- an actuator encoder or sensor 510 measures the output position of the actuator. This actuator output position is converted to a velocity, as in the Cartesian-space control loop of Fig. 3.
- a joint encoder or sensor 522 determines the joint (load) position (qL) which is an input value to a forward kinematic process 526, a gravity compensation process 530, and an adder 538.
- the output value of the forward kinematic process 526, which is the current tool tip location, is the input value to a haptic rendering process or algorithm 534.
- the output of the haptic rendering process 534 which is the reference tool tip location is the input to an inverse kinematics process 536 whose output, the reference joint angle (qLd), is both the second input to the adder 538 and the input to a differentiator or differential operator 542.
- the output of the adder 538 is the difference between the current joint position and the reference joint position, or the joint position deviations (dq).
- the output of the differentiator 542 is the desired joint velocity.
- the desired joint velocity output by the differentiator 542 is input to an adder 546, and the resulting velocity deviation, together with the joint position deviation (dq), is used to compute the haptic torque (τhaptic) using the damping and spring gains Kd and Kp.
- the haptic torque (τhaptic) is added to the output of the gravity compensation process 530, the gravitational compensation torque (τgravity_comp), by an adder 566 to obtain the total torque (τtotal) to be generated by the actuator.
- This total torque (τtotal) is the input to the arm dynamics process 570 which causes the actuator to apply torque to the system.
- the torque may cause motion which is detected by the actuator encoder 510 and the joint encoder 522, closing the control loop.
- the gains Kp and Kd are multiplied by the joint angle, not the tool tip location as in Fig. 3. This implementation may be advantageous if different types of actuators are used on the system and the gains Kp and Kd must be tuned for each individual joint.
- although Fig. 4 is described in terms of a single torque or force for a single degree of freedom system, in a multi-degree of freedom system the process may be replicated and the torques or forces added to generate the total torques or forces on the system.
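An analogous sketch of the joint-space formulation of Fig. 4 is shown below, with the same caveats as the Cartesian sketch above: placeholder functions, illustrative gains, and a sign convention chosen so that positive gains give restoring torques.

```python
def joint_space_haptic_update(q_load, qdot_from_actuators, q_ref, qdot_ref,
                              gravity_torque, Kp_joint, Kd_joint):
    """One cycle of the joint-space dual-sensor loop of Fig. 4 (illustrative only).

    q_load              : joint positions from the load encoders (block 522)
    qdot_from_actuators : joint-rate estimate derived from the actuator encoder (block 510)
    q_ref, qdot_ref     : reference joint position/velocity from the inverse kinematics
                          process 536 and differentiator 542
    Kp_joint, Kd_joint  : per-joint spring and damping gain matrices (numpy arrays)
    """
    dq = q_ref - q_load                              # joint position deviation
    dqdot = qdot_ref - qdot_from_actuators           # joint velocity deviation
    tau_haptic = Kp_joint @ dq + Kd_joint @ dqdot    # per-joint spring-damper torque
    return tau_haptic + gravity_torque(q_load)       # adder 566: total torque to the actuators
```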
- the gravity compensation block 530 obtains its input values from the output of the actuator encoder 510.
- the forward kinematics process 526 and the inverse kinematics process 536 are eliminated.
- the haptic rendering algorithm 534 is used to render joint-space haptic objects and outputs the desired joint angle rather than a tool tip location.
- a software-created joint stop or detent can be used to alter the physical behavior of the joint from the user's perspective without having to alter the physical hardware of the joint.
- These joint-space haptic objects can also be combined with other haptic objects by adding the τhaptic outputs from multiple controllers shown in Fig. 3 and Fig. 4 before supplying them to the adder 566.
- One advantage of the dual sensor control of the present invention is that the use of both drive and load end position sensors provides additional information that the controller can use to improve the stability of the haptic control, thereby allowing for increased haptic wall stiffness.
- Another advantage of the dual sensor control of the present invention is that data from the load end sensor can be compared to data from the drive end sensor to detect failures in the sensors or transmission, thereby enhancing safety of the system.
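A minimal sketch of such a consistency check follows; the transmission-ratio handling and the fault threshold are assumptions made for illustration and are not specified by the patent.

```python
def transmission_fault(theta_actuator, q_load, transmission_ratio, tolerance):
    """Flag a sensor or transmission fault when the load position implied by the
    drive end (actuator) encoder disagrees with the load end encoder by more than
    the deflection expected from normal transmission compliance and hysteresis."""
    q_expected = theta_actuator / transmission_ratio   # actuator angle reflected to the load side
    return abs(q_expected - q_load) > tolerance
```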
- the dual sensors may be used to compute a haptic control output in any suitable manner
- for example, a drive end output position sensor (e.g., an actuator encoder) may be used to provide a velocity signal, and a load end output position sensor (e.g., a joint encoder) may be used to provide a load output position signal to the control algorithm.
- data from the load end sensor can be compared to data from the drive end sensor to determine and correct for the impact of compliance and hysteresis in the cable transmission. As a result, endpoint positioning accuracy is improved.
- the present invention enables a haptic device to be controlled to compensate for compliance and hysteresis in a cable transmission to enable rendering of haptic surfaces in precise locations in physical space with sufficient wall stiffness to accurately and robustly guide the actions of a user.
- While the present invention has been described in terms of certain exemplary preferred embodiments, it will be readily understood and appreciated by one of ordinary skill in the art that it is not so limited, and that many additions, deletions and modifications to the preferred embodiments may be made within the scope of the invention as hereinafter claimed. Accordingly, the scope of the invention is limited only by the scope of the appended claims.
Priority Applications (5)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN2007800181132A CN101449229B (en) | 2006-05-19 | 2007-05-18 | A method and apparatus for controlling a haptic device |
CA2651780A CA2651780C (en) | 2006-05-19 | 2007-05-18 | A method and apparatus for controlling a haptic device |
JP2009511080A JP2009537228A (en) | 2006-05-19 | 2007-05-18 | Method and apparatus for controlling a haptic device |
EP07795023.6A EP2018606B1 (en) | 2006-05-19 | 2007-05-18 | A method and apparatus for controlling a haptic device |
AU2007254217A AU2007254217A1 (en) | 2006-05-19 | 2007-05-18 | A method and apparatus for controlling a haptic device |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US80185006P | 2006-05-19 | 2006-05-19 | |
US60/801,850 | 2006-05-19 |
Publications (2)
Publication Number | Publication Date |
---|---|
WO2007136739A2 true WO2007136739A2 (en) | 2007-11-29 |
WO2007136739A3 WO2007136739A3 (en) | 2008-07-24 |
Family
ID=38723850
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/US2007/011891 WO2007136739A2 (en) | 2006-05-19 | 2007-05-18 | A method and apparatus for controlling a haptic device |
Country Status (7)
Country | Link |
---|---|
US (1) | US7683565B2 (en) |
EP (1) | EP2018606B1 (en) |
JP (1) | JP2009537228A (en) |
CN (1) | CN101449229B (en) |
AU (1) | AU2007254217A1 (en) |
CA (1) | CA2651780C (en) |
WO (1) | WO2007136739A2 (en) |
Cited By (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9119655B2 (en) | 2012-08-03 | 2015-09-01 | Stryker Corporation | Surgical manipulator capable of controlling a surgical instrument in multiple modes |
US9226796B2 (en) | 2012-08-03 | 2016-01-05 | Stryker Corporation | Method for detecting a disturbance as an energy applicator of a surgical instrument traverses a cutting path |
US9480534B2 (en) | 2012-08-03 | 2016-11-01 | Stryker Corporation | Navigation system and method for removing a volume of tissue from a patient |
US9603665B2 (en) | 2013-03-13 | 2017-03-28 | Stryker Corporation | Systems and methods for establishing virtual constraint boundaries |
US9636185B2 (en) | 2002-03-06 | 2017-05-02 | Mako Surgical Corp. | System and method for performing surgical procedure using drill guide and robotic device operable in multiple modes |
US9652591B2 (en) | 2013-03-13 | 2017-05-16 | Stryker Corporation | System and method for arranging objects in an operating room in preparation for surgical procedures |
US9820818B2 (en) | 2012-08-03 | 2017-11-21 | Stryker Corporation | System and method for controlling a surgical manipulator based on implant parameters |
US9921712B2 (en) | 2010-12-29 | 2018-03-20 | Mako Surgical Corp. | System and method for providing substantially stable control of a surgical tool |
RU2718595C1 (en) * | 2019-11-25 | 2020-04-08 | Ассистирующие Хирургические Технологии (Аст), Лтд | Operator control unit for robotic surgical complex |
US11103315B2 (en) | 2015-12-31 | 2021-08-31 | Stryker Corporation | Systems and methods of merging localization and vision data for object avoidance |
RU2757969C1 (en) * | 2020-12-22 | 2021-10-25 | Акционерное общество "Казанский электротехнический завод" | Robotic surgical complex manipulator control device |
US11202682B2 (en) | 2016-12-16 | 2021-12-21 | Mako Surgical Corp. | Techniques for modifying tool operation in a surgical robotic system based on comparing actual and commanded states of the tool relative to a surgical site |
Families Citing this family (50)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8944070B2 (en) | 1999-04-07 | 2015-02-03 | Intuitive Surgical Operations, Inc. | Non-force reflecting method for providing tool force information to a user of a telesurgical system |
US8996169B2 (en) | 2011-12-29 | 2015-03-31 | Mako Surgical Corp. | Neural monitor-based dynamic haptics |
US11202676B2 (en) | 2002-03-06 | 2021-12-21 | Mako Surgical Corp. | Neural monitor-based dynamic haptics |
AU2003218010A1 (en) | 2002-03-06 | 2003-09-22 | Z-Kat, Inc. | System and method for using a haptic device in combination with a computer-assisted surgery system |
US9789608B2 (en) | 2006-06-29 | 2017-10-17 | Intuitive Surgical Operations, Inc. | Synthetic representation of a surgical robot |
US7938009B2 (en) * | 2006-02-03 | 2011-05-10 | Immersion Corporation | Haptic device testing |
US20090192523A1 (en) | 2006-06-29 | 2009-07-30 | Intuitive Surgical, Inc. | Synthetic representation of a surgical instrument |
US9718190B2 (en) | 2006-06-29 | 2017-08-01 | Intuitive Surgical Operations, Inc. | Tool position and identification indicator displayed in a boundary area of a computer display screen |
US10008017B2 (en) | 2006-06-29 | 2018-06-26 | Intuitive Surgical Operations, Inc. | Rendering tool information as graphic overlays on displayed images of tools |
US10258425B2 (en) | 2008-06-27 | 2019-04-16 | Intuitive Surgical Operations, Inc. | Medical robotic system providing an auxiliary view of articulatable instruments extending out of a distal end of an entry guide |
US7759894B2 (en) * | 2006-10-26 | 2010-07-20 | Honeywell International Inc. | Cogless motor driven active user interface haptic feedback system |
US8620473B2 (en) | 2007-06-13 | 2013-12-31 | Intuitive Surgical Operations, Inc. | Medical robotic system with coupled control modes |
US9138129B2 (en) | 2007-06-13 | 2015-09-22 | Intuitive Surgical Operations, Inc. | Method and system for moving a plurality of articulated instruments in tandem back towards an entry guide |
US9084623B2 (en) | 2009-08-15 | 2015-07-21 | Intuitive Surgical Operations, Inc. | Controller assisted reconfiguration of an articulated instrument during movement into and out of an entry guide |
US9089256B2 (en) | 2008-06-27 | 2015-07-28 | Intuitive Surgical Operations, Inc. | Medical robotic system providing an auxiliary view including range of motion limitations for articulatable instruments extending out of a distal end of an entry guide |
US9469034B2 (en) * | 2007-06-13 | 2016-10-18 | Intuitive Surgical Operations, Inc. | Method and system for switching modes of a robotic system |
US8209054B2 (en) * | 2008-05-09 | 2012-06-26 | William Howison | Haptic device grippers for surgical teleoperation |
US8864652B2 (en) | 2008-06-27 | 2014-10-21 | Intuitive Surgical Operations, Inc. | Medical robotic system providing computer generated auxiliary views of a camera instrument for controlling the positioning and orienting of its tip |
US8078440B2 (en) | 2008-09-19 | 2011-12-13 | Smith & Nephew, Inc. | Operatively tuning implants for increased performance |
US8344863B2 (en) * | 2008-12-10 | 2013-01-01 | Postech Academy-Industry Foundation | Apparatus and method for providing haptic augmented reality |
US8992558B2 (en) | 2008-12-18 | 2015-03-31 | Osteomed, Llc | Lateral access system for the lumbar spine |
US8918211B2 (en) | 2010-02-12 | 2014-12-23 | Intuitive Surgical Operations, Inc. | Medical robotic system providing sensory feedback indicating a difference between a commanded state and a preferred pose of an articulated instrument |
US9492927B2 (en) | 2009-08-15 | 2016-11-15 | Intuitive Surgical Operations, Inc. | Application of force feedback on an input device to urge its operator to command an articulated instrument to a preferred pose |
US8989898B2 (en) * | 2009-10-22 | 2015-03-24 | Electroimpact, Inc. | Robotic manufacturing system with accurate control |
US8679125B2 (en) | 2010-09-22 | 2014-03-25 | Biomet Manufacturing, Llc | Robotic guided femoral head reshaping |
DE102010043574A1 (en) * | 2010-11-08 | 2012-05-10 | Fresenius Medical Care Deutschland Gmbh | Manually open clamp holder with sensor |
US9101379B2 (en) | 2010-11-12 | 2015-08-11 | Intuitive Surgical Operations, Inc. | Tension control in actuation of multi-joint medical instruments |
US20130274712A1 (en) * | 2011-11-02 | 2013-10-17 | Stuart O. Schecter | Haptic system for balloon tipped catheter interventions |
JP5930753B2 (en) * | 2012-02-13 | 2016-06-08 | キヤノン株式会社 | Robot apparatus control method and robot apparatus |
JP5930754B2 (en) * | 2012-02-13 | 2016-06-08 | キヤノン株式会社 | Robot apparatus control method and robot apparatus |
WO2013192598A1 (en) * | 2012-06-21 | 2013-12-27 | Excelsius Surgical, L.L.C. | Surgical robot platform |
CN103902019A (en) * | 2012-12-25 | 2014-07-02 | 苏茂 | Data glove elbow joint detection device |
US10507066B2 (en) | 2013-02-15 | 2019-12-17 | Intuitive Surgical Operations, Inc. | Providing information of tools by filtering image areas adjacent to or on displayed images of the tools |
US9677840B2 (en) | 2014-03-14 | 2017-06-13 | Lineweight Llc | Augmented reality simulator |
US9880046B2 (en) * | 2014-05-15 | 2018-01-30 | Texas Instruments Incorporated | Method, apparatus and system for portable device surface and material analysis |
FR3037841B1 (en) * | 2015-06-26 | 2017-08-18 | Haption | ARTICULATED MOTORIZED ARM WITH SAFE CABLE CAPSTAN. |
DE102015009048B3 (en) * | 2015-07-13 | 2016-08-18 | Kuka Roboter Gmbh | Controlling a compliant controlled robot |
JP6652292B2 (en) * | 2015-09-24 | 2020-02-19 | キヤノン株式会社 | Control method, control program, robot system, control method of rotary drive device, and robot device |
DE102015012961B4 (en) * | 2015-10-08 | 2022-05-05 | Kastanienbaum GmbH | robotic system |
CN105404156B (en) * | 2015-12-31 | 2018-02-06 | 微创(上海)医疗机器人有限公司 | Haptic feedback devices and its variable damper control methods and applications |
TWI764891B (en) * | 2016-03-29 | 2022-05-21 | 瑞典商寇格尼博迪克斯有限公司 | Method, constraining device and system for determining geometric properties of a manipulator |
EP3503814B1 (en) | 2016-08-23 | 2024-07-10 | Stryker European Operations Holdings LLC | Instrumentation for the implantation of spinal implants |
JP7289834B2 (en) * | 2017-10-30 | 2023-06-12 | エシコン エルエルシー | Modular Surgical Instrument Control System Configuration |
US11191532B2 (en) | 2018-03-30 | 2021-12-07 | Stryker European Operations Holdings Llc | Lateral access retractor and core insertion |
US11745354B2 (en) | 2018-08-16 | 2023-09-05 | Mitutoyo Corporation | Supplementary metrology position coordinates determination system including an alignment sensor for use with a robot |
US11002529B2 (en) * | 2018-08-16 | 2021-05-11 | Mitutoyo Corporation | Robot system with supplementary metrology position determination system |
US11564674B2 (en) | 2019-11-27 | 2023-01-31 | K2M, Inc. | Lateral access system and method of use |
CN111426493A (en) * | 2020-03-25 | 2020-07-17 | 上海荣泰健康科技股份有限公司 | Massage chair fault detection method, system, terminal and medium |
CN112085052B (en) * | 2020-07-28 | 2024-07-16 | 中国科学院深圳先进技术研究院 | Training method of motor imagery classification model, motor imagery method and related equipment |
JP2022065646A (en) * | 2020-10-15 | 2022-04-27 | 株式会社ミツトヨ | Robot system using supplementary measurement position determination system |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20040034282A1 (en) * | 2002-03-06 | 2004-02-19 | Quaid Arthur E. | System and method for using a haptic device as an input device |
US6985133B1 (en) * | 1998-07-17 | 2006-01-10 | Sensable Technologies, Inc. | Force reflecting haptic interface |
US7035716B2 (en) * | 2001-01-29 | 2006-04-25 | The Acrobot Company Limited | Active-constraint robots |
Family Cites Families (29)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPS58151885A (en) * | 1982-03-03 | 1983-09-09 | Hitachi Ltd | Control method for position of motor |
US4481453A (en) * | 1982-07-23 | 1984-11-06 | Motornetics Corporation | Torque loop control system and method |
US4621332A (en) * | 1983-06-20 | 1986-11-04 | Hitachi, Ltd. | Method and apparatus for controlling a robot utilizing force, position, velocity, spring constant, mass coefficient, and viscosity coefficient |
US5023808A (en) * | 1987-04-06 | 1991-06-11 | California Institute Of Technology | Dual-arm manipulators with adaptive control |
US5086401A (en) | 1990-05-11 | 1992-02-04 | International Business Machines Corporation | Image-directed robotic system for precise robotic surgery including redundant consistency checking |
US5322320A (en) * | 1992-01-14 | 1994-06-21 | Nippondenso Co., Ltd. | Shock absorber damping force control system for vehicle |
US5629594A (en) * | 1992-12-02 | 1997-05-13 | Cybernet Systems Corporation | Force feedback system |
US5739811A (en) * | 1993-07-16 | 1998-04-14 | Immersion Human Interface Corporation | Method and apparatus for controlling human-computer interface systems providing force feedback |
US5625576A (en) * | 1993-10-01 | 1997-04-29 | Massachusetts Institute Of Technology | Force reflecting haptic interface |
US5691898A (en) * | 1995-09-27 | 1997-11-25 | Immersion Human Interface Corp. | Safe and low cost computer peripherals with force feedback for consumer applications |
US5999168A (en) * | 1995-09-27 | 1999-12-07 | Immersion Corporation | Haptic accelerator for force feedback computer peripherals |
US5828197A (en) * | 1996-10-25 | 1998-10-27 | Immersion Human Interface Corporation | Mechanical interface having multiple grounded actuators |
US6020876A (en) * | 1997-04-14 | 2000-02-01 | Immersion Corporation | Force feedback interface with selective disturbance filter |
US6104382A (en) * | 1997-10-31 | 2000-08-15 | Immersion Corporation | Force feedback transmission mechanisms |
US6281651B1 (en) * | 1997-11-03 | 2001-08-28 | Immersion Corporation | Haptic pointing devices |
US6067077A (en) * | 1998-04-10 | 2000-05-23 | Immersion Corporation | Position sensing for force feedback devices |
US6322567B1 (en) | 1998-12-14 | 2001-11-27 | Integrated Surgical Systems, Inc. | Bone motion tracking system |
US6084371A (en) * | 1999-02-19 | 2000-07-04 | Lockheed Martin Energy Research Corporation | Apparatus and methods for a human de-amplifier system |
US6762745B1 (en) * | 1999-05-10 | 2004-07-13 | Immersion Corporation | Actuator control providing linear and continuous force output |
US6522952B1 (en) * | 1999-06-01 | 2003-02-18 | Japan As Represented By Secretary Of Agency Of Industrial Science And Technology | Method and system for controlling cooperative object-transporting robot |
US8010180B2 (en) | 2002-03-06 | 2011-08-30 | Mako Surgical Corp. | Haptic guidance system and method |
US7308352B2 (en) * | 2003-08-07 | 2007-12-11 | Siemens Energy & Automation, Inc. | Enhanced braking system and method |
US7285932B2 (en) * | 2003-10-28 | 2007-10-23 | United States Of America As Represented By The Administrator Of The National Aeronautics And Space Administration | Method and apparatus for loss of control inhibitor systems |
US7982711B2 (en) * | 2003-12-19 | 2011-07-19 | Immersion Corporation | Haptic profiling system and method |
WO2006039403A1 (en) * | 2004-09-29 | 2006-04-13 | Northwestern University | System and methods to overcome gravity-induced dysfunction in extremity paresis |
US20080007517A9 (en) * | 2005-02-23 | 2008-01-10 | Northwestern University | Electrical damping system |
CA2651784C (en) | 2006-05-19 | 2015-01-27 | Mako Surgical Corp. | Method and apparatus for controlling a haptic device |
US7710061B2 (en) * | 2006-08-07 | 2010-05-04 | The Board Of Trustees Of The Leland Stanford Junior University | Motor control amplifier |
US7750593B2 (en) * | 2006-10-26 | 2010-07-06 | Honeywell International Inc. | Active human-machine interface system without a force sensor |
-
2007
- 2007-05-18 US US11/804,374 patent/US7683565B2/en active Active
- 2007-05-18 AU AU2007254217A patent/AU2007254217A1/en not_active Abandoned
- 2007-05-18 CN CN2007800181132A patent/CN101449229B/en active Active
- 2007-05-18 JP JP2009511080A patent/JP2009537228A/en not_active Withdrawn
- 2007-05-18 EP EP07795023.6A patent/EP2018606B1/en active Active
- 2007-05-18 WO PCT/US2007/011891 patent/WO2007136739A2/en active Application Filing
- 2007-05-18 CA CA2651780A patent/CA2651780C/en active Active
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6985133B1 (en) * | 1998-07-17 | 2006-01-10 | Sensable Technologies, Inc. | Force reflecting haptic interface |
US7035716B2 (en) * | 2001-01-29 | 2006-04-25 | The Acrobot Company Limited | Active-constraint robots |
US20040034282A1 (en) * | 2002-03-06 | 2004-02-19 | Quaid Arthur E. | System and method for using a haptic device as an input device |
Non-Patent Citations (1)
Title |
---|
ZINN M ET AL: "A new actuation approach for human friendly robot design" ROBOTICS AND AUTOMATION, 2004. PROCEEDINGS. ICRA '04. 2004 IEEE INTERNATIONAL CONFERENCE ON NEW ORLEANS, LA, USA APRIL 26-MAY 1, 2004, PISCATAWAY, NJ, USA,IEEE, US, vol. 1, 26 April 2004 (2004-04-26), pages 249-254, XP010768284 ISBN: 0-7803-8232-3 * |
Cited By (36)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9636185B2 (en) | 2002-03-06 | 2017-05-02 | Mako Surgical Corp. | System and method for performing surgical procedure using drill guide and robotic device operable in multiple modes |
US9921712B2 (en) | 2010-12-29 | 2018-03-20 | Mako Surgical Corp. | System and method for providing substantially stable control of a surgical tool |
US9820818B2 (en) | 2012-08-03 | 2017-11-21 | Stryker Corporation | System and method for controlling a surgical manipulator based on implant parameters |
US9226796B2 (en) | 2012-08-03 | 2016-01-05 | Stryker Corporation | Method for detecting a disturbance as an energy applicator of a surgical instrument traverses a cutting path |
US9566122B2 (en) | 2012-08-03 | 2017-02-14 | Stryker Corporation | Robotic system and method for transitioning between operating modes |
US12070288B2 (en) | 2012-08-03 | 2024-08-27 | Stryker Corporation | Robotic system and method for removing a volume of material from a patient |
US9480534B2 (en) | 2012-08-03 | 2016-11-01 | Stryker Corporation | Navigation system and method for removing a volume of tissue from a patient |
US12004836B2 (en) | 2012-08-03 | 2024-06-11 | Stryker Corporation | Surgical manipulator and method of operating the same using virtual rigid body modeling preliminary |
US9681920B2 (en) | 2012-08-03 | 2017-06-20 | Stryker Corporation | Robotic system and method for reorienting a surgical instrument moving along a tool path |
US9795445B2 (en) | 2012-08-03 | 2017-10-24 | Stryker Corporation | System and method for controlling a manipulator in response to backdrive forces |
US11179210B2 (en) | 2012-08-03 | 2021-11-23 | Stryker Corporation | Surgical manipulator and method for controlling pose of an instrument based on virtual rigid body modelling |
US9566125B2 (en) | 2012-08-03 | 2017-02-14 | Stryker Corporation | Surgical manipulator having a feed rate calculator |
US10314661B2 (en) | 2012-08-03 | 2019-06-11 | Stryker Corporation | Surgical robotic system and method for controlling an instrument feed rate |
US10350017B2 (en) | 2012-08-03 | 2019-07-16 | Stryker Corporation | Manipulator and method for controlling the manipulator based on joint limits |
US11672620B2 (en) | 2012-08-03 | 2023-06-13 | Stryker Corporation | Robotic system and method for removing a volume of material from a patient |
US10420619B2 (en) | 2012-08-03 | 2019-09-24 | Stryker Corporation | Surgical manipulator and method for transitioning between operating modes |
US10426560B2 (en) | 2012-08-03 | 2019-10-01 | Stryker Corporation | Robotic system and method for reorienting a surgical instrument moving along a tool path |
US10463440B2 (en) | 2012-08-03 | 2019-11-05 | Stryker Corporation | Surgical manipulator and method for resuming semi-autonomous tool path position |
US9119655B2 (en) | 2012-08-03 | 2015-09-01 | Stryker Corporation | Surgical manipulator capable of controlling a surgical instrument in multiple modes |
US11639001B2 (en) | 2012-08-03 | 2023-05-02 | Stryker Corporation | Robotic system and method for reorienting a surgical instrument |
US11471232B2 (en) | 2012-08-03 | 2022-10-18 | Stryker Corporation | Surgical system and method utilizing impulse modeling for controlling an instrument |
US11045958B2 (en) | 2012-08-03 | 2021-06-29 | Stryker Corporation | Surgical robotic system and method for commanding instrument position based on iterative boundary evaluation |
US10512509B2 (en) | 2013-03-13 | 2019-12-24 | Stryker Corporation | Systems and methods for establishing virtual constraint boundaries |
US11183297B2 (en) | 2013-03-13 | 2021-11-23 | Stryker Corporation | System and method for arranging objects in an operating room in preparation for surgical procedures |
US9603665B2 (en) | 2013-03-13 | 2017-03-28 | Stryker Corporation | Systems and methods for establishing virtual constraint boundaries |
US11464579B2 (en) | 2013-03-13 | 2022-10-11 | Stryker Corporation | Systems and methods for establishing virtual constraint boundaries |
US9652591B2 (en) | 2013-03-13 | 2017-05-16 | Stryker Corporation | System and method for arranging objects in an operating room in preparation for surgical procedures |
US11918305B2 (en) | 2013-03-13 | 2024-03-05 | Stryker Corporation | Systems and methods for establishing virtual constraint boundaries |
US10410746B2 (en) | 2013-03-13 | 2019-09-10 | Stryker Corporation | System and method for arranging objects in an operating room in preparation for surgical procedures |
US11806089B2 (en) | 2015-12-31 | 2023-11-07 | Stryker Corporation | Merging localization and vision data for robotic control |
US11103315B2 (en) | 2015-12-31 | 2021-08-31 | Stryker Corporation | Systems and methods of merging localization and vision data for object avoidance |
US11850011B2 (en) | 2016-12-16 | 2023-12-26 | Mako Surgical Corp. | Techniques for modifying tool operation in a surgical robotic system based on comparing actual and commanded states of the tool relative to a surgical site |
US11202682B2 (en) | 2016-12-16 | 2021-12-21 | Mako Surgical Corp. | Techniques for modifying tool operation in a surgical robotic system based on comparing actual and commanded states of the tool relative to a surgical site |
RU2718595C1 (en) * | 2019-11-25 | 2020-04-08 | Ассистирующие Хирургические Технологии (Аст), Лтд | Operator control unit for robotic surgical complex |
WO2021107819A1 (en) * | 2019-11-25 | 2021-06-03 | Общество С Ограниченной Ответственностью "Ассистирующие Хирургические Технологии" | Operator controller for controlling a robotic surgical complex |
RU2757969C1 (en) * | 2020-12-22 | 2021-10-25 | Акционерное общество "Казанский электротехнический завод" | Robotic surgical complex manipulator control device |
Also Published As
Publication number | Publication date |
---|---|
CA2651780C (en) | 2015-03-10 |
CN101449229A (en) | 2009-06-03 |
WO2007136739A3 (en) | 2008-07-24 |
EP2018606A2 (en) | 2009-01-28 |
US7683565B2 (en) | 2010-03-23 |
EP2018606B1 (en) | 2019-02-20 |
CN101449229B (en) | 2011-10-12 |
US20070296366A1 (en) | 2007-12-27 |
CA2651780A1 (en) | 2007-11-29 |
JP2009537228A (en) | 2009-10-29 |
AU2007254217A1 (en) | 2007-11-29 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CA2651780C (en) | A method and apparatus for controlling a haptic device | |
KR102584754B1 (en) | Robotic system and method of reversing it | |
US10688659B2 (en) | Robot | |
US20200282558A1 (en) | System and method for controlling a robot with torque-controllable actuators | |
CN107921623B (en) | Robot system | |
US7109678B2 (en) | Holding arrangement having an apparatus for balancing a load torque | |
Carignan et al. | Closed-loop force control for haptic simulation of virtual environments | |
Peer et al. | A new admittance-type haptic interface for bimanual manipulations | |
JP6111562B2 (en) | robot | |
US6330837B1 (en) | Parallel mechanism | |
US11850014B2 (en) | Control system, control method, and surgical arm system | |
US7170250B2 (en) | Holding arrangement having a device for actively damping vibration | |
CN104781050A (en) | Constraining robotic manipulators with redundant degrees of freedom | |
KR20140066544A (en) | Robot and friction compensation method for the robot | |
EP3946833B1 (en) | Reducing energy buildup in servo-controlled mechanisms | |
Rodriguez et al. | Hybrid control strategy for force and precise end effector positioning of a twisted string actuator | |
CN115972209B (en) | Main mobile manipulator and joint torque control method thereof | |
Ahmad et al. | Shape recovery from robot contour-tracking with force feedback | |
US20230320798A1 (en) | Joint control in a mechanical system | |
Peer et al. | Towards a mobile haptic interface for bimanual manipulations | |
CN115300110A (en) | Endoscopic surgery control system | |
JP2003157114A (en) | Method and device for lost motion correction | |
KR100332296B1 (en) | Method and apparatus for controlling parallel robots | |
Rovers et al. | Design of a robust master-slave controller for surgery applications with haptic feedback | |
Xue et al. | 2-DOF Haptic Device based on Closed-loop EBA Controller for Gastroscope Intervention |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
WWE | Wipo information: entry into national phase |
Ref document number: 200780018113.2 Country of ref document: CN |
|
DPE1 | Request for preliminary examination filed after expiration of 19th month from priority date (pct application filed from 20040101) | ||
WWE | Wipo information: entry into national phase |
Ref document number: 2007254217 Country of ref document: AU |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2651780 Country of ref document: CA |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2007795023 Country of ref document: EP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2009511080 Country of ref document: JP |
|
ENP | Entry into the national phase |
Ref document number: 2007254217 Country of ref document: AU Date of ref document: 20070518 Kind code of ref document: A |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 07795023 Country of ref document: EP Kind code of ref document: A2 |