US11751967B2 - Fingers-driven haptic interface for robot-assisted surgical system - Google Patents


Info

Publication number
US11751967B2
Authority
US
United States
Prior art keywords
sensors
surgical instrument
response
movement
processor
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active, expires
Application number
US16/931,442
Other versions
US20210038332A1 (en)
Inventor
Damien Brasset
Stefano Maghini
Riccardo Pozzato
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Asensus Surgical US Inc
Original Assignee
Asensus Surgical US Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Asensus Surgical US Inc
Priority to US16/931,442 (granted as US11751967B2)
Publication of US20210038332A1
Assigned to ASENSUS SURGICAL US, INC. Assignors: Stefano Maghini, Riccardo Pozzato, Damien Brasset
Priority to US18/457,199 (granted as US12059227B2)
Application granted
Publication of US11751967B2
Legal status: Active (adjusted expiration)


Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B 34/70 Manipulators specially adapted for use in surgery
    • A61B 34/76 Manipulators having means for providing feel, e.g. force or tactile feedback
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B 34/20 Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B 34/25 User interfaces for surgical systems
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B 34/30 Surgical robots
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B 34/30 Surgical robots
    • A61B 34/35 Surgical robots for telesurgery
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B 34/70 Manipulators specially adapted for use in surgery
    • A61B 34/74 Manipulators with manual electric input means
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 90/00 Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B 90/36 Image-producing devices or illumination devices not otherwise provided for
    • A61B 90/361 Image-producing devices, e.g. surgical cameras
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 17/00 Surgical instruments, devices or methods, e.g. tourniquets
    • A61B 2017/00017 Electrical control of surgical instruments
    • A61B 2017/00203 Electrical control of surgical instruments with speech control or speech recognition
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 17/00 Surgical instruments, devices or methods, e.g. tourniquets
    • A61B 2017/00017 Electrical control of surgical instruments
    • A61B 2017/00216 Electrical control of surgical instruments with eye tracking or head position tracking control
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 17/00 Surgical instruments, devices or methods, e.g. tourniquets
    • A61B 2017/0042 Surgical instruments, devices or methods, e.g. tourniquets with special provisions for gripping
    • A61B 2017/00438 Surgical instruments, devices or methods, e.g. tourniquets with special provisions for gripping connectable to a finger
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 17/00 Surgical instruments, devices or methods, e.g. tourniquets
    • A61B 2017/00535 Surgical instruments, devices or methods, e.g. tourniquets pneumatically or hydraulically operated
    • A61B 2017/00539 Surgical instruments, devices or methods, e.g. tourniquets pneumatically or hydraulically operated hydraulically
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 17/00 Surgical instruments, devices or methods, e.g. tourniquets
    • A61B 2017/00973 Surgical instruments, devices or methods, e.g. tourniquets pedal-operated
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B 34/20 Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B 2034/2046 Tracking techniques
    • A61B 2034/2048 Tracking techniques using an accelerometer or inertia sensor
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B 34/70 Manipulators specially adapted for use in surgery
    • A61B 34/74 Manipulators with manual electric input means
    • A61B 2034/741 Glove like input devices, e.g. "data gloves"



Abstract

A haptic interface for a robot-assisted surgical system makes use of a plurality of sensors and a plurality of bladders, each mountable to fingers of a human hand. A processor receives user input from the sensors in response to movement of the fingers and, in response to the input, causes movement or actuation of a surgical instrument on a robotic manipulator. The processor also receives force input from force sensors of the surgical system corresponding to forces against the surgical instrument and, in response to the force input, causes fluid to move into one or more of the bladders.

Description

This application claims the benefit of U.S. Provisional Application No. 62/874,979, filed Jul. 16, 2019.
BACKGROUND
Surgical robotic systems are typically comprised of one or more robotic manipulators and a user interface. The robotic manipulators carry surgical instruments or devices used for the surgical procedure. A typical user interface includes input devices, or handles, manually moveable by the surgeon to control movement of the surgical instruments carried by the robotic manipulators. The surgeon uses the interface to provide inputs into the system and the system processes that information to develop output commands for the robotic manipulator.
The ability to understand the forces that are being applied to the patient by the robotically controlled surgical devices during minimally invasive surgery is advantageous to the surgeon. Communication of information representing such forces to the surgeon via the surgeon interface is referred to as “haptic feedback.” In some systems, haptic feedback is communicated to the surgeon in the form of forces applied by motors to the surgeon interface, so that as the surgeon moves the handles of the surgeon interface, s/he feels resistance against movement representing the direction and magnitude of forces experienced by the robotically controlled surgical device. Forces represented can include both the forces at the tips of the robotically controlled devices and/or the forces being applied by the shaft of the robotically controlled device to the trocar at the entrance point to the body, giving the surgeon complete understanding of the forces applied to the device so s/he can better control the device during surgery.
The present application describes a haptic user input device that is worn on the hand, that generates input to the system via articulation of the user's fingers, and that communicates haptic feedback to the user using inflatable bladders positioned on the user's fingers.
BRIEF DESCRIPTION OF THE DRAWINGS
FIG. 1 shows an example of a robot-assisted surgical system;
FIG. 2 illustrates components of a first embodiment of a haptic user interface;
FIG. 3 shows a surgical instrument in a curved configuration;
FIG. 4 schematically illustrates degrees of freedom of finger motion that may be used to give input using a fingers-driven haptic interface of the type described herein, and further illustrates positioning of the haptic bladders relative to the degrees of freedom.
DETAILED DESCRIPTION
This application describes a haptic user interface for a robot-assisted surgical system. Although the inventions described herein may be used on a variety of robotic surgical systems, the embodiments will be described with reference to a system of the type shown in FIG. 1 .
In the illustrated system, a surgeon console 12 has two input devices, which may be of the new haptic interface type described below and/or handles 17, 18. The new haptic interface may be provided in lieu of, or in addition to, the handles 17, 18. Where both types are provided, the user might choose to utilize the more conventional user input devices 17, 18 to give input when performing certain surgical tasks, and to use the new haptic interface for other surgical tasks.
The input devices are configured to be manipulated by a user to generate signals that are used to command motion of a robotically controlled device in multiple degrees of freedom. In use, the user selectively assigns the two input devices to two of the robotic manipulators 13, 14, 15, allowing surgeon control of two of the surgical instruments 10 a, 10 b, and 10 c disposed at the working site at any given time. To control a third one of the instruments disposed at the working site, one of the two input devices is operatively disengaged from one of the initial two instruments and then operatively paired with the third instrument. A fourth robotic manipulator, not shown in FIG. 1 , may be optionally provided to support and maneuver an additional instrument.
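The pairing workflow described above, in which two input devices are shared across three or more instruments and one device is disengaged and re-paired to reach a third instrument, can be sketched as a small state tracker. The class, method, and instrument names below are illustrative assumptions, not part of the patent.

```python
# Hypothetical sketch of input-device-to-manipulator pairing: two input
# devices, three instruments, with re-pairing used to reach the third.

class PairingManager:
    """Tracks which input device is operatively paired with which instrument."""

    def __init__(self, instruments):
        self.instruments = set(instruments)
        self.pairings = {}  # input device name -> instrument name

    def pair(self, device, instrument):
        if instrument not in self.instruments:
            raise ValueError(f"unknown instrument: {instrument}")
        if instrument in self.pairings.values():
            raise ValueError(f"{instrument} is already controlled")
        self.pairings[device] = instrument

    def unpair(self, device):
        self.pairings.pop(device, None)

mgr = PairingManager(["instrument_a", "instrument_b", "instrument_c"])
mgr.pair("left_input", "instrument_a")
mgr.pair("right_input", "instrument_b")
# To control the third instrument, disengage one device and re-pair it:
mgr.unpair("right_input")
mgr.pair("right_input", "instrument_c")
```

This mirrors the constraint in the text that only two instruments are under surgeon control at any given time.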
One of the instruments 10 a, 10 b, 10 c is a camera that captures images of the operative field in the body cavity. The camera may be moved by its corresponding robotic manipulator using input from a variety of types of input devices, including, without limitation, one of the new haptic interface devices, the handles 17, 18, additional controls on the console, a foot pedal, an eye tracker 21, voice controller, etc. The console may also include a display or monitor 23 configured to display the images captured by the camera, and for optionally displaying system information, patient information, etc.
A control unit 30 is operationally connected to the robotic arms and to the user interface. The control unit receives user input from the input devices corresponding to the desired movement of the surgical instruments, and the robotic arms are caused to manipulate the surgical instruments accordingly.
The input devices are configured to be manipulated by a user to generate signals that are processed by the system to generate instructions used to command motion of the manipulators in order to move the instruments in multiple degrees of freedom.
Where handles 17, 18 are used in addition to the new haptic interface devices, one or more of the degrees of freedom of the handles may be coupled with an electromechanical system capable of providing gravity compensation for the user input, and/or providing haptic feedback to the surgeon.
The surgical system allows the operating room staff to remove and replace surgical instruments carried by the robotic manipulator, based on the surgical need. Once instruments have been installed on the manipulators, the surgeon moves the input devices to provide inputs into the system, and the system processes that information to develop output commands for the robotic manipulator in order to move the instruments and, as appropriate, operate the instrument end effectors.
New Fingers-Driven Haptic Interface
FIG. 2 schematically illustrates an embodiment of a fingers-driven haptic interface 100. Each interface 100 is worn on a hand of a user and includes acceleration and angular velocity sensors 102 (each of which may be an IMU, or separate accelerometer and gyroscope components; for simplicity the term IMU is used here to connote either configuration) that generate signals in response to movement of a digit of the user's hand, and an input switch 104, such as a push button, sliding switch, touch-sensitive surface, or joystick, allowing the user to cause an input signal to be delivered to the system. The interface further includes bladders 106. The bladders are designed to be filled with gas or other fluid to apply pressure that simulates touch. As such, they are fluidly coupled to a gas or fluid source, which may be one or more neighboring bladders containing the gas/fluid. Additionally, an electromechanical actuation system or pump is provided to drive the fluid into the relevant haptic bladder of the user interface device when the processor commands it to do so in order to give haptic feedback to the user.
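As a rough software sketch, the components just listed (per-digit IMUs, an input switch, and inflatable bladders) might be modeled as follows. The class and field names, units, and the pressure-clamping behavior are assumptions for illustration; the patent does not specify a software interface.

```python
# Illustrative data model for the interface components described above.
from dataclasses import dataclass, field

@dataclass
class ImuSample:
    accel: tuple  # linear acceleration (x, y, z), e.g. in m/s^2
    gyro: tuple   # angular velocity (x, y, z), e.g. in rad/s

@dataclass
class Bladder:
    location: str          # e.g. "thumb_pad", "middle_tip" (assumed names)
    pressure: float = 0.0  # last commanded pressure, arbitrary units

    def inflate(self, pressure: float) -> None:
        # In hardware, a pump or neighboring reservoir bladder would realize
        # this command; here we simply clamp to non-negative and store it.
        self.pressure = max(0.0, pressure)

@dataclass
class HapticInterface:
    imus: dict = field(default_factory=dict)      # digit name -> ImuSample
    bladders: dict = field(default_factory=dict)  # location -> Bladder
    switch_pressed: bool = False

# Example: register a thumb-pad bladder and command a feedback pressure.
device = HapticInterface()
device.bladders["thumb_pad"] = Bladder(location="thumb_pad")
device.bladders["thumb_pad"].inflate(2.5)
```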
The arrangement of these components that is illustrated in FIG. 2 will next be described. It should be understood, however, that in alternative embodiments different numbers and positions of these components may be used.
In the configuration illustrated in FIG. 2, there is an IMU for one phalanx of the thumb, and IMUs on each of two phalanges of the index finger. While not shown in FIG. 2, two additional IMUs may be positioned on the middle finger, one on each phalanx. The switch 104 is preferably positioned on the index finger so that the thumb may be used to activate it. As shown, the switch may be included on the IMU for that finger.
In the illustrated configuration, the haptic interface positions bladders 106 on each of the thumb, index finger and middle finger as represented in FIG. 2 . More specifically, bladders 106 are positioned to contact the pads of these fingers. In one configuration, the thumb and index finger each include one such bladder, and the system is programmed to deliver force feedback to the index finger and thumb by inflating one or both of these bladders when the end effector of the surgical instrument is closed into contact with tissue, a needle, or another instrument or material used in the surgical procedure.
The haptic interface of the FIG. 2 configuration utilizes five bladders positioned on the middle finger. These bladders are positioned on the user's hand to deliver force feedback to the user along three axes of motion of the surgical instrument as it is used during the surgical procedure. One pair of bladders, which generates force feedback corresponding to forces experienced by the surgical instrument along the pitch axis, is positioned with one bladder on the pad of the finger and one bladder on the back of the finger. A second pair of bladders, which generates force feedback corresponding to forces experienced by the surgical instrument along the yaw axis, is positioned on the sides of the middle finger. The fifth bladder, which generates force feedback along the instrument's insertion axis (along its longitudinal axis), is disposed at the tip of the finger.
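A minimal sketch of how a sensed instrument force vector might be routed to these five bladders: a signed pitch component selects between the pad/back pair, a signed yaw component selects between the side pair, and the insertion-axis magnitude drives the fingertip bladder. The function name, bladder names, and unit gain are hypothetical.

```python
# Hypothetical routing of instrument forces to the five middle-finger bladders.

def forces_to_bladder_pressures(f_pitch, f_yaw, f_insertion, gain=1.0):
    """Return a dict of bladder name -> commanded pressure (>= 0)."""
    return {
        # Signed pitch force inflates either the pad or the back bladder.
        "pitch_pad":  gain * max(0.0,  f_pitch),
        "pitch_back": gain * max(0.0, -f_pitch),
        # Signed yaw force inflates either of the two side bladders.
        "yaw_left":   gain * max(0.0,  f_yaw),
        "yaw_right":  gain * max(0.0, -f_yaw),
        # Insertion-axis force magnitude drives the fingertip bladder.
        "insertion_tip": gain * abs(f_insertion),
    }

p = forces_to_bladder_pressures(f_pitch=2.0, f_yaw=-1.5, f_insertion=0.5)
```

Only one bladder of each opposing pair is inflated at a time, which is what lets a pair encode a signed force along a single axis.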
The arrangement of IMUs may be used to generate signals used by a processor of the surgical system to control electromechanical motion of a surgical instrument held by a robotic manipulator. For example, movement of the thumb and index finger towards/away from one another may be mapped to motion closing the jaws of the surgical instrument. Motion of the user's hand in pitch and yaw directions (relative to the user's wrist) may be used to move the instrument in pitch and yaw. Forward/rearward motion of the user's arm along the axis of the user's forearm may be mapped to movement of the instrument along the insertion axis, and bending motion of the middle finger or forefinger may be mapped to bending (as shown in FIG. 3) or articulation of the bending or articulating portion of a surgical instrument.
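The motion mappings above can be sketched as a single translation function. The normalization of the pinch distance, the motion scale factor, and the command names are assumptions for illustration, not values from the patent.

```python
# Sketch of the gesture-to-instrument-command mappings described above.

def map_hand_to_instrument(thumb_index_distance, hand_pitch, hand_yaw,
                           forearm_translation, motion_scale=0.5):
    """Translate tracked hand quantities into instrument commands.

    thumb_index_distance: normalized 0 (touching) .. 1 (fully open)
    hand_pitch, hand_yaw: radians, measured relative to the wrist
    forearm_translation:  mm along the forearm axis
    """
    return {
        # Pinching thumb and index finger together closes the jaws.
        "jaw_closure": 1.0 - min(max(thumb_index_distance, 0.0), 1.0),
        # Hand pitch/yaw about the wrist map to instrument pitch/yaw, scaled.
        "pitch": motion_scale * hand_pitch,
        "yaw":   motion_scale * hand_yaw,
        # Forward/rearward arm motion maps to the insertion axis.
        "insertion": motion_scale * forearm_translation,
    }

cmd = map_hand_to_instrument(thumb_index_distance=0.25,
                             hand_pitch=0.2, hand_yaw=-0.1,
                             forearm_translation=10.0)
```

A scale factor below 1.0 reflects the common teleoperation practice of scaling down hand motion for finer instrument control.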
The IMUs, bladders, etc. may be mounted to the user's hand in a number of ways. In one embodiment, the IMUs are worn as rings on the identified digits, and the switch is positioned on one of the rings. Bladders are disposed in sleeves placed over the fingertips. In another embodiment, a hand-mounted device is provided that has the IMUs mounted to it at the corresponding phalanges, and the bladders and button mounted to it to contact the fingers at locations such as those shown in FIG. 2.
The components of the haptic interface are in wired or wireless communication with a control system, as discussed above in the description of the surgical robotic system.
Note that while IMUs are described as the sensors for measuring hand motion, other sensors might be used instead of, or in combination with, the IMUs.
For example, a hand-mounted haptic interface may include articulating linkages that track motion of the relevant fingers, and sensors that measure movement of the linkages. In this embodiment, input for some degrees of freedom or actions of the instrument might come from such sensors (e.g. instrument articulation, jaw open-close), while others (axial roll, pitch and yaw) might come from IMUs.
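A minimal dispatch for this hybrid arrangement might route each degree of freedom to its sensor family. The set contents follow the example in the text (linkage sensors for articulation and jaw open/close, IMUs for axial roll, pitch, and yaw), while the identifiers themselves are assumed.

```python
# Illustrative sensor-source dispatch for the hybrid linkage/IMU arrangement.

LINKAGE_DOFS = {"articulation", "jaw"}   # measured by linkage-mounted sensors
IMU_DOFS = {"roll", "pitch", "yaw"}      # measured by IMUs

def select_source(dof):
    """Return which sensor family provides input for a given degree of freedom."""
    if dof in LINKAGE_DOFS:
        return "linkage_sensor"
    if dof in IMU_DOFS:
        return "imu"
    raise ValueError(f"unmapped degree of freedom: {dof}")
```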
FIG. 4 schematically represents articulation degrees of freedom corresponding to thumb, index, and middle fingers of a human hand that may be used to generate input using the second type of haptic interface device described here. In this drawing, the proximal lines 200 represent fixed members. Line 202 represents the single articulation of the thumb portion, and thumb bladder 203 is disposed at the tip of the thumb. Lines 204 represent the two articulations of the index finger portion and include bladder 205 at the distal end. Lines 206 represent the two articulations of the middle finger portion and have at their end the bladders 207 (which may be the arrangement of five bladders described above).

Claims (9)

We claim:
1. A haptic interface for a robot-assisted surgical system that includes a manipulator, a surgical instrument on the manipulator, and one or more force sensors positioned to detect forces against the surgical instrument, the haptic interface comprising:
a plurality of sensors mountable to a human hand, wherein the plurality of sensors includes a plurality of accelerometers, gyroscopes, or IMUs;
a plurality of bladders mountable to fingers of the human hand;
at least one processor and at least one memory, the at least one memory storing instructions executable by said at least one processor to:
receive user input from the sensors in response to movement of the fingers and, in response to the input, cause movement or actuation of a surgical instrument,
receive force input from force sensors of the surgical system corresponding to forces against the surgical instrument, and, in response to the force input, cause fluid to move into one or more of the bladders to provide tactile feedback to the human hand corresponding to the forces against the surgical instrument.
2. The system of claim 1, wherein the haptic interface further includes a plurality of linkages including joints, the linkages moveable relative to the joints in response to articulating of a corresponding finger, and wherein the plurality of sensors includes sensors positioned to measure movement of the linkages.
3. The system of claim 1, wherein the instructions are executable by the processor to cause movement of the surgical instrument in at least one of the following: pitch, roll, yaw, articulation or bending, jaw actuation, insertion axis motion.
4. The system of claim 1, wherein the plurality of sensors includes first and second IMUs, the first IMU mountable on an index finger of a user and the second IMU mountable to a thumb of the user, wherein the instructions are executable by said at least one processor to close jaws of the surgical instrument in response to relative movement of the index finger towards the thumb detected by the first or second IMU.
5. The system of claim 4, wherein the instructions are further executable by said at least one processor to open jaws of the surgical instrument in response to relative movement of the index finger away from the thumb detected by the first or second IMU.
6. The system of claim 1, wherein the instructions are executable by said at least one processor to move the surgical instrument in pitch or yaw in response to movement of the user's hand in pitch or yaw relative to the wrist as detected by at least one of the plurality of sensors.
7. The system of claim 1, wherein the instructions are executable by said at least one processor to bend or articulate the surgical instrument in response to bending movement of a finger as detected by at least one of the plurality of sensors.
8. The system of claim 1, wherein the instructions are executable by said at least one processor to move the surgical instrument along its insertion axis in response to movement of the user's forearm along its axis as detected by at least one of the plurality of sensors.
9. A haptic interface for a robot-assisted surgical system that includes a manipulator, a surgical instrument on the manipulator, and one or more force sensors positioned to detect forces against the surgical instrument, the haptic interface comprising:
a plurality of linkages positionable on fingers of a human hand, the linkages including joints, the linkages moveable relative to the joints in response to articulating of a corresponding finger;
sensors mountable to the human hand, wherein the sensors include a first plurality of sensors positioned to measure movement of the linkages, and a second plurality of sensors comprising accelerometers, gyroscopes, or IMUs positioned to detect movement of the hand in roll, pitch, or yaw;
a plurality of bladders mountable to fingers of the human hand;
at least one processor and at least one memory, the at least one memory storing instructions executable by said at least one processor to:
receive user input from the first plurality of sensors in response to articulation of the linkages resulting from movement of the fingers and, in response to the input, cause movement or actuation of the surgical instrument in jaw actuation or instrument articulation;
receive user input from the second plurality of sensors and, in response to the input, cause movement of the surgical instrument in at least one of axial roll, pitch, and yaw; and
receive force input from force sensors of the surgical system corresponding to forces against the surgical instrument, and, in response to the force input, cause fluid to move into one or more of the bladders to provide tactile feedback to the human hand corresponding to the forces against the surgical instrument.
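The feedback path recited in claims 1 and 9 — force input from the instrument's force sensors driving fluid into fingertip bladders — can be sketched as a simple mapping. This is an illustrative sketch under stated assumptions: the function name, the linear force-to-pressure law, and the clamping limits (`max_force_n`, `max_pressure_kpa`) are all hypothetical, not values from the patent.

```python
# Illustrative sketch of the claimed feedback path: forces measured against
# the surgical instrument are converted to target fluid pressures in the
# fingertip bladders. The linear scaling and the clamping limits below are
# assumptions for demonstration, not the patented implementation.
def forces_to_bladder_pressures(tip_forces_n, max_force_n=5.0, max_pressure_kpa=30.0):
    """Map per-fingertip instrument forces (N) to bladder pressures (kPa)."""
    pressures = []
    for f in tip_forces_n:
        f = min(max(f, 0.0), max_force_n)  # clamp to the sensed range
        # Linear mapping: full-scale force commands full-scale bladder pressure.
        pressures.append(f / max_force_n * max_pressure_kpa)
    return pressures
```

In a real system this mapping would feed a pneumatic or hydraulic pressure regulator per bladder; a nonlinear law could also be chosen to match human tactile sensitivity.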

Priority Applications (2)

Application Number Priority Date Filing Date Title
US16/931,442 US11751967B2 (en) 2019-07-16 2020-07-16 Fingers-driven haptic interface for robot-assisted surgical system
US18/457,199 US12059227B2 (en) 2019-07-16 2023-08-28 Fingers-driven haptic interface for robot-assisted surgical system

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201962874979P 2019-07-16 2019-07-16
US16/931,442 US11751967B2 (en) 2019-07-16 2020-07-16 Fingers-driven haptic interface for robot-assisted surgical system

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US18/457,199 Continuation US12059227B2 (en) 2019-07-16 2023-08-28 Fingers-driven haptic interface for robot-assisted surgical system

Publications (2)

Publication Number Publication Date
US20210038332A1 (en) 2021-02-11
US11751967B2 (en) 2023-09-12

Family

ID=74501834

Family Applications (2)

Application Number Title Priority Date Filing Date
US16/931,442 Active 2041-05-29 US11751967B2 (en) 2019-07-16 2020-07-16 Fingers-driven haptic interface for robot-assisted surgical system
US18/457,199 Active US12059227B2 (en) 2019-07-16 2023-08-28 Fingers-driven haptic interface for robot-assisted surgical system


Country Status (1)

Country Link
US (2) US11751967B2 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
USD981558S1 (en) * 2019-05-14 2023-03-21 Quantum Surgical Medical robot

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5004391A (en) 1989-08-21 1991-04-02 Rutgers University Portable dextrous force feedback master for robot telemanipulation
US20140215684A1 (en) * 2013-02-07 2014-08-07 Timothy J. Hardy Pressure Sensing Glove
US20160296838A1 (en) 2015-04-07 2016-10-13 Virtuix Holdings Inc. Haptic glove for use in a virtual environment
US20180196515A1 (en) * 2017-01-11 2018-07-12 International Business Machines Corporation Simulating obstruction in a virtual environment
US20180243626A1 (en) * 2017-02-28 2018-08-30 P Tech, Llc Devices, systems, and methods
US20190083187A1 (en) * 2017-09-19 2019-03-21 Pulse Biosciences, Inc. Treatment instrument and high-voltage connectors for robotic surgical system


Also Published As

Publication number Publication date
US20210038332A1 (en) 2021-02-11
US12059227B2 (en) 2024-08-13
US20230397965A1 (en) 2023-12-14


Legal Events

Date Code Title Description
FEPP Fee payment procedure

Free format text: ENTITY STATUS SET TO UNDISCOUNTED (ORIGINAL EVENT CODE: BIG.); ENTITY STATUS OF PATENT OWNER: SMALL ENTITY

FEPP Fee payment procedure

Free format text: ENTITY STATUS SET TO SMALL (ORIGINAL EVENT CODE: SMAL); ENTITY STATUS OF PATENT OWNER: SMALL ENTITY

STPP Information on status: patent application and granting procedure in general

Free format text: APPLICATION DISPATCHED FROM PREEXAM, NOT YET DOCKETED

STPP Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: NOTICE OF ALLOWANCE MAILED -- APPLICATION RECEIVED IN OFFICE OF PUBLICATIONS

AS Assignment

Owner name: ASENSUS SURGICAL US, INC., NORTH CAROLINA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:BRASSET, DAMIEN;MAGHINI, STEFANO;POZZATO, RICCARDO;SIGNING DATES FROM 20230714 TO 20230726;REEL/FRAME:064384/0831

STPP Information on status: patent application and granting procedure in general

Free format text: PUBLICATIONS -- ISSUE FEE PAYMENT VERIFIED

STCF Information on status: patent grant

Free format text: PATENTED CASE