CN116761568A - Interaction between user interface and master controller - Google Patents

Info

Publication number
CN116761568A
CN116761568A CN202180089910.XA CN202180089910A CN116761568A CN 116761568 A CN116761568 A CN 116761568A CN 202180089910 A CN202180089910 A CN 202180089910A CN 116761568 A CN116761568 A CN 116761568A
Authority
CN
China
Prior art keywords
movement
user input
hand
rate
input device
Prior art date
Legal status
Pending
Application number
CN202180089910.XA
Other languages
Chinese (zh)
Inventor
E. Noohi Bezanjani
K. Vaza
A. Suresh
B. D. Itkowitz
S. Duffy
S. Ghosh
Current Assignee
Intuitive Surgical Operations Inc
Original Assignee
Intuitive Surgical Operations Inc
Priority date
Filing date
Publication date
Application filed by Intuitive Surgical Operations Inc filed Critical Intuitive Surgical Operations Inc
Priority claimed from PCT/US2021/060400 external-priority patent/WO2022119740A1/en
Publication of CN116761568A

Landscapes

  • Manipulator (AREA)

Abstract

A method is provided for controlling actuation of a click event by a hand-actuated selector movably mounted to a mounting structure, comprising: in a first control state, applying a maintenance force to the hand-actuated selector; in a second control state, applying to the hand-actuated selector a haptic force that increases as displacement of the hand-actuated selector from a neutral position increases; applying a click event signal to cause a click event to occur at a display system; and, in a third control state, applying a haptic force of decreasing magnitude to the hand-actuated selector.

Description

Interaction between user interface and master controller
Priority claim
The present application claims priority from U.S. patent application Ser. No. 63/120,202, filed on December 1, 2020, and U.S. patent application Ser. No. 63/187,879, filed on May 12, 2021, each of which is incorporated herein by reference in its entirety.
Background
Minimally invasive medical techniques aim to reduce the amount of tissue destroyed during diagnostic or surgical procedures, thereby reducing patient recovery time, discomfort and adverse side effects. Teleoperated surgical systems using robotics (so-called surgical robotic systems) can be used to overcome the limitations of manual laparoscopy and open surgery. Advances in telepresence systems provide surgeons with a view of the interior of the patient's body, an increased number of degrees of motion of the surgical instruments, and the ability for remote surgical collaboration. Teleoperational control of surgical robotic technology typically involves user interaction with a hand-controlled manipulator to control movement of the surgical instrument and user interaction with a finger-controlled selector to trigger occurrence of a robotic system event. Haptic feedback may improve the user's teleoperational control of surgical robotic technology.
Disclosure of Invention
In one aspect, a method of controlling actuation of a click event by a hand-actuated selector movably mounted to a mounting structure is provided. A sensor senses the displacement distance of the hand-actuated selector from a neutral position. When the hand-actuated selector is at a displacement distance less than a first threshold distance from the neutral position, one or more motors are controlled according to a first control state to apply a maintenance force. When the hand-actuated selector is at a displacement distance between the first threshold distance and a second threshold distance from the neutral position, the one or more motors are controlled according to a second control state to apply to the hand-actuated selector a haptic force that increases as the displacement of the hand-actuated selector from the neutral position increases. Once the hand-actuated selector meets the second threshold distance from the neutral position, a click event signal is applied to cause a click event to occur at the display system. In addition, once the hand-actuated selector meets the second threshold distance from the neutral position, the one or more motors are controlled according to a third control state to reduce the magnitude of the haptic force applied to the hand-actuated selector to a reduced magnitude that is less than the maximum magnitude of the haptic force applied during the second control state.
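The three control states above can be sketched as a single force-lookup routine. The threshold distances, maintenance force, stiffness gain, and reduced force below are illustrative placeholders, not values from the application:

```python
def haptic_force(displacement, d1, d2, hold_force, stiffness, reduced_force):
    """Return (force, click_fired) for a selector displaced from neutral.

    d1, d2     -- first and second threshold distances (d1 < d2)
    hold_force -- maintenance force applied in the first control state
    stiffness  -- gain for the force ramp in the second control state
    All parameter names and values are illustrative assumptions.
    """
    if displacement < d1:
        # First control state: constant maintenance force.
        return hold_force, False
    if displacement < d2:
        # Second control state: force grows with displacement beyond d1.
        return hold_force + stiffness * (displacement - d1), False
    # Second threshold met: the click event fires and the applied force
    # drops below the second-state maximum (third control state).
    return reduced_force, True
```

A servo loop would call this each control cycle with the sensed displacement and command the motors with the returned force.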
In another aspect, a method of controlling movement of a cursor in a first two-dimensional (2D) plane based on movement of a user input device in a second 2D plane and based on movement of a hand-actuated selector movably mounted to the user input device is provided. When the hand-actuated selector moves relative to the user input device at a rate less than a first threshold rate, the cursor is caused to move in the first 2D plane, following movement of the user input device in the second 2D plane, at a constant rate of movement. In response to the rate of movement of the hand-actuated selector relative to the user input device being between the first threshold rate and a second threshold rate, the cursor is caused to move in the first 2D plane, following movement of the user input device in the second 2D plane, at a rate of movement that decreases as the rate of movement of the hand-actuated selector relative to the user input device increases. In response to the rate of movement of the hand-actuated selector relative to the user input device decreasing below the second threshold rate, the cursor is caused to move in the first 2D plane, following movement of the user input device in the second 2D plane, at a rate of movement that increases as the rate of movement of the hand-actuated selector relative to the user input device decreases.
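The rate-dependent cursor behavior can be sketched as a gain applied to input-device motion, evaluated each control cycle as a function of the selector's rate of movement. A linear ramp between the two thresholds is an assumption (the application does not specify the curve shape), and re-evaluating it continuously yields both the slow-down as the rate rises and the speed-up as the rate falls back:

```python
def cursor_gain(selector_rate, r1, r2, base_gain=1.0, min_gain=0.25):
    """Scale factor applied to user-input-device motion before it moves
    the cursor. Thresholds r1 < r2, the gains, and the linear ramp are
    illustrative assumptions."""
    if selector_rate <= r1:
        return base_gain              # constant rate of movement
    if selector_rate >= r2:
        return min_gain               # fully attenuated
    # Between the thresholds, the gain falls linearly as the selector
    # moves faster and rises again as it slows back down.
    frac = (selector_rate - r1) / (r2 - r1)
    return base_gain - (base_gain - min_gain) * frac
```

Attenuating cursor motion while the selector is being squeezed would help keep the cursor steady over a control element during a click.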
Drawings
In the drawings, which are not necessarily drawn to scale, like numerals may describe similar components in different views. Like reference numerals having different letter suffixes may represent different instances of similar components. Some embodiments are illustrated by way of example, and not by way of limitation, in the figures of the accompanying drawings in which:
FIG. 1 is an illustrative schematic drawing illustrating an example teleoperated surgical system.
FIG. 2A is an illustrative drawing illustrating an example user input control system in accordance with an embodiment.
FIG. 2B is an illustrative drawing illustrating an example user input device of an example user input control system.
FIG. 2C is an illustrative drawing illustrating an example armrest of an example user input control system.
FIG. 3 illustrates an example virtual surgical site viewable at a display system viewing plane during 3D mode operation.
Fig. 4A-4D illustrate example graphical user interfaces viewable at a display system viewing plane during 2D mode operation.
FIG. 5 is an illustrative diagram showing an example viewing plane and an example haptic plane.
FIG. 6A is an illustrative view showing details of an example user input device as a mount for a hand actuated selector.
FIG. 6B is an illustrative functional block diagram showing a control system for controlling receiving user input at a user input device including a hand actuated selector and applying haptic feedback at the user input device.
FIG. 7 is an illustrative diagram showing a first control function curve representing haptic force versus displacement of a displaceable hand actuated selector position during a click event and also showing a time-aligned grip button displacement sequence and a time-aligned hand shape sequence.
FIG. 8 is an illustrative flow chart showing a control process for controlling the provision of a haptic force and triggering of a click event based on displacement of a hand actuated selector relative to a user input device.
Fig. 9 is an explanatory diagram showing a configuration of an input controller that implements the first transformation in the absence of a click event.
Fig. 10 is an explanatory diagram showing a configuration of an input controller that implements the second transformation in the case where there is a click event.
FIG. 11 is an illustrative diagram showing a second control function curve representing an example second transformation function for determining a relationship of controller motion filtering to time during a click event and also showing a sequence of grip button displacements aligned to time, a sequence of hand shapes aligned to time, and a sequence of view plane instances aligned to time.
FIG. 12 is an illustrative block diagram of an example computer system.
Detailed Description
The following description is presented to enable any person skilled in the art to make and use a medical device simulator system and method. Various modifications to the embodiments will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other embodiments and applications without departing from the scope of the inventive subject matter. Furthermore, in the following description, numerous details are set forth for the purpose of explanation. However, it will be appreciated by one of ordinary skill in the art that the inventive subject matter may be practiced without such specific details. In other instances, well-known machine components, processes, and data structures are shown in block diagram form in order to avoid obscuring the disclosure in unnecessary detail. The flowcharts in the figures referenced below represent processes. A computer system may be configured to perform some of these processes. Blocks of the flowchart illustrations that represent computer-implemented processes represent configurations of the computer system according to computer program code to perform the actions described with reference to the blocks. Thus, the present subject matter is not intended to be limited to the embodiments shown, but is to be accorded the widest scope consistent with the principles and features disclosed herein.
Teleoperated surgical system
Fig. 1 is an illustrative schematic drawing illustrating an example teleoperated surgical system 100. Teleoperated surgical system 100 includes an instrument manipulator assembly 102, which instrument manipulator assembly 102 may include one or more linkages for manipulating the operation of surgical instrument 104 in response to user input controls while performing various procedures on patient 106. Instrument manipulator assembly 102 is mounted to or near an operating table 108. User input control system 110 allows user 112 to view the surgical site and control instrument manipulator assembly 102.
In alternative embodiments, the example teleoperated surgical system 100 may include more than one instrument manipulator assembly 102. The exact number of manipulator assemblies may depend on the surgical procedure and space constraints within the operating room, among other factors.
User input control system 110 may be located in the same room as operating table 108. However, it should be understood that the user 112 (such as a surgeon or clinician) may be located in a different room or a completely different building than the patient 106. The user input control system 110 generally includes a vision system, including a visualization system 116 and a display system 120, and further includes one or more user input devices 204, one or more instrument manipulator assemblies 102, and an instrument motion and input device haptic feedback controller (referred to herein as an "input controller") 118.
The one or more user input devices 204 are operably coupled to the one or more instrument manipulator assemblies 102 to control movement of the one or more instruments 104 in response to user input provided at the user input devices 204. In the example teleoperated surgical system 100, the one or more user input devices 204 and the one or more instrument manipulator assemblies 102 are communicatively coupled to the input controller 118. The example input controller 118 processes user input received at the one or more user input devices 204 to control movement of the one or more instrument manipulator assemblies 102. The example input controller 118 also generates haptic feedback signals for adjusting the state of the haptic forces at the one or more user input devices 204 based on the motion of the one or more instrument manipulator assemblies 102 and/or based on the motion of the user input devices 204.
User input device 204 may include any number of various input devices such as a gravity balance arm, joystick, trackball, glove, trigger grip, twistable knob, twistable grip, slider, lever knob, or the like. In some embodiments, the user input device 204 may be provided with the same degrees of freedom as the associated surgical instrument 104 to provide telepresence to the user 112 or to provide the perception that the user input device 204 is integral with the instrument 104 so that the user 112 has a strong feel of directly controlling the instrument 104. In some embodiments, the user input device 204 is a manual input device that moves in six degrees of freedom or more, and may also include actuatable handles or other control features (e.g., one or more buttons, switches, etc.) for actuating the instrument (e.g., for closing grasping jaws, applying an electromotive force to an electrode, delivering a medication therapy, or the like).
Visualization system 116 provides concurrent two-dimensional (2D) or three-dimensional (3D) images of the surgical site to user 112 as user 112 manipulates one or more instruments. Visualization system 116 may include a viewing scope assembly such that visual images may be captured by an endoscope positioned within the surgical site. Visualization system 116 may be implemented as hardware, firmware, software, or a combination thereof, which interacts with or is otherwise executed by one or more computer processors, which may include processors of the control system 110.
The display system 120 may display visual images of the surgical site and the surgical instrument 104 captured by the visualization system 116. The display system 120 and the user input device 204 may be oriented such that the relative positions of the visual imaging device in the scope assembly and the surgical instrument 104 are similar to the relative positions of the surgeon's eyes and hands, so that an operator (e.g., user 112) may manipulate the surgical instrument 104 with the user input device 204 as if viewing the working volume adjacent the instrument 104 in substantially true presence. By "true presence" it is meant that the presented image is a true perspective image simulating the viewpoint of an operator physically manipulating the surgical instrument 104.
The instrument motion input controller 118 includes at least one processor circuit (not shown), and typically includes a plurality of processor circuits, for effecting control between the user input device 204, the user input control system 110, and the display system 120. The input controller 118 also includes software programming instructions to implement some or all of the methods described herein. Although the input controller 118 is shown as a single block in the simplified schematic of fig. 1, the input controller 118 may include multiple data processing circuits (e.g., on the user input device 204 and/or on the user input control system 110). Any of a variety of centralized or distributed data processing architectures may be employed. Further, one or more processing circuits may be implemented at a virtual machine. Similarly, the programming code may be embodied as a plurality of separate programs or subroutines, or may be integrated into a plurality of other aspects of the remote operating system described herein. In various embodiments, the input controller 118 may support wireless communication protocols such as Bluetooth, IrDA, HomeRF, IEEE 802.11, DECT, and wireless telemetry.
The example input controller 118 may include a servo controller to provide haptic force and/or haptic torque feedback at the user input device 204 based on forces and torques sensed at the surgical instrument 104, and/or based on forces and torques sensed at the user input device 204 itself. Any suitable conventional or specialized servo controller may be used. The servo controller may be separate from or integral with the instrument manipulator assembly 102, and may likewise be separate from or integral with the user input device 204. In the example medical system, the example servo controller and manipulator assembly 102 are provided as part of a robotic arm cart positioned adjacent to the patient 106, and the example servo controller and user input device 204 are positioned proximate to the user providing input at the user input device 204.
For the purposes of this document, surgical instrument 104 may be referred to as a "controlled device."
In the example teleoperated surgical system 100, the input controller 118 controls at least one controlled device 104 (e.g., a "surgical instrument") and may control movement of one or more linkages 102-1 of one or more instrument manipulator assemblies 102. The example instrument manipulator assembly 102 includes one or more motors coupled to control movement of one or more end effectors coupled to the instrument 104. The linkage 102-1 may be referred to as a setup structure, which includes one or more links coupled by joints 102-2 that allow the setup structure to be positioned and maintained in a spatial position and orientation. The motors coupled to control movement of the one or more end effectors are further coupled to the surgical instrument 104 such that the surgical instrument 104 may be advanced into a natural or surgically created anatomical orifice and moved, together with its end effectors, in multiple degrees of freedom that may include three degrees of linear motion (e.g., x, y, and z linear motion) and three degrees of rotational motion (e.g., roll, pitch, yaw). The motors of the example manipulator assembly 102 may be configured to actuate an actuator of the surgical instrument 104, such as an articulatable actuator for grasping tissue in the jaws of a biopsy device, an actuator for obtaining a tissue sample or delivering a medication, or another actuator for providing other treatments, for example, as described more fully below. Further information regarding camera reference control in minimally invasive surgical devices is contained in U.S. Patent No. 6,671,581, entitled "Camera Referenced Control in a Minimally Invasive Surgical Apparatus," which is incorporated by reference.
In the example teleoperated surgical system 100, for training purposes, the display system 120 may display a virtual environment simulating a surgical site within a patient. The virtual environment may include various biological structures in addition to the surgical instrument 104. The user 112 operates a virtual instrument within the virtual environment to train, obtain certification, or test various skills or procedures without the possibility of injuring a real patient.
In an in-situ surgical or simulated surgical procedure, the display system 120 may be used to present a user interface to a user (e.g., user 112). In an embodiment, the display system 120 provides 3D views, such as stereoscopic displays. In another example teleoperated surgical system, the display system 120 is used to project 3D images, such as images from a high definition endoscopic camera. The user interface may be displayed as an overlay, such as by using a semi-transparent interface, or may be displayed in place of a view of the surgical field.
FIG. 2A is a diagram illustrating an example user input control system 110. A user may sit at the user input control system 110 and may access the display system 120, the user input devices 204, and the foot pedal panel 206. The foot pedal panel 206 may act as a clutch, for example, enabling a user to switch between performing various tasks (such as exchanging between various surgical instruments) or controlling video or camera features. When seated at the user input control system 110, the user may rest their arms on the armrest 208. When operating in live surgery, the display system 120 displays the surgical field captured from a camera inserted through a small opening into the surgical site (sometimes referred to as a portal or cannula). For training purposes, a simulated environment may be displayed on the display system 120, where the simulated environment may be a stereoscopic display of the surgical site and a virtual controlled device (e.g., a surgical instrument). As the user moves the user input device 204, the virtual surgical instrument may move in a corresponding manner in the stereoscopic display.
FIG. 2B is an illustrative drawing showing an example user input device 204 operably coupled to the user input control system 110. The example user input device 204 includes a gimbal mount 225, the gimbal mount 225 including an articulated arm portion having a plurality of links 227 connected together by pivot joints 229. The user holds the finger rings 210 by positioning his or her thumb and forefinger on a displaceable hand-actuated selector 212, such as, for example, a squeezable grip button. In the example user input device 204, the user's thumb and forefinger are typically held on the displaceable hand-actuated selector 212 by straps that pass through slots to create the finger loops 210. The example selector 212 includes a first grip button 503a and a second grip button 503b that are spaced apart such that they may be gripped by a user's thumb and index finger, for example, with the thumb engaging one grip button and the index finger engaging the other, as explained below with reference to FIG. 5. At least one of the grip buttons is movable to reduce the displacement distance between the grip buttons in response to a squeezing force applied by the user. The joints 229 of the example user input device 204 are operably connected to motors or the like to provide, for example, force feedback, gravity compensation, and the like. Furthermore, a suitably positioned sensor (e.g., an encoder, a potentiometer, or the like) is positioned on each joint 229 of the example user input device 204 so that the joint positions of the example user input device 204 can be determined by the input controller 118 to control movement of one or more instruments operatively coupled to the user input device or to control haptic feedback forces applied to the one or more input devices 204.
The example teleoperated surgical system 100 includes two user input devices 204, each having two finger loops 210 into which a user may insert the index finger and thumb of a respective hand. The two user input devices 204 may each control a surgical instrument or a virtual surgical instrument. A user may be provided with a software or hardware mechanism to switch between multiple instruments for one or both user input devices 204. For example, the user may be provided with three instruments, such as two forceps and a retractor. One or both of the forceps may be an energy instrument capable of cauterizing tissue. The user may first use forceps at each user input device 204, then switch the right-hand example user input device 204 to control the retractor to expose a portion of the surgical field, and then switch the right-hand example user input device 204 back to the forceps to continue cutting, probing, or dissecting tissue.
In using the example user input device 204, the user is provided with a full 3D range of motion (x, y, and z axes) and rotational motion (roll, pitch, yaw) in addition to pinching motion with the index finger and thumb (or any two fingers inserted into the ring 210). Thus, by moving the appropriate user input device 204, the user is able to manipulate the corresponding surgical instrument through a full range of motion.
Fig. 2C is a diagram illustrating the armrest 208 of the user input control system 110 according to an embodiment. The armrest 208 may include one or more touch controls, such as a touch screen, soft buttons, mechanical buttons, or the like. In the example shown in fig. 2C, a single touch screen 214 is shown, through which a user can configure various video, audio, or other system settings.
Overview of graphical user interface controls
During operation, the user interface may be presented to the user at various times. For example, a user interface may be presented to allow a user to select from among the selections of training modules. As another example, a user interface may be presented to allow a user to configure various aspects of the operation of the user input control system 110. When a user operates the example user input device 204 with one or both hands, it may be inconvenient to release the example user input device 204 and then operate another input mechanism (e.g., a touch screen interface integrated into the armrest 208 of the user input control system 110).
Fig. 3 is an illustrative drawing showing a virtual surgical site displayed by display system 120 operating in 3D mode. The virtual surgical site 300 may be displayed on the display system 120 and include two virtual controlled devices 302. When operating in this mode, the user input device 204 is capable of 3D movement in free space (within the boundaries of the virtual surgical site 300) to control the controlled device 302. In the second mode, the user input device 204 is constrained to move within a virtual surface, which may be planar or may have contours such as, for example, a gentle arc. For example, the second mode may be used to present a graphical user interface that includes control elements (e.g., buttons, knobs, sliders, drop-down menus) that may be controlled, for example, using the user input device 204 as a pointing and clicking device. For example, the second mode may be used to present 2D images, such as, for example, preoperative images. The second mode helps provide the user input device 204 with an operating space that may be substantially aligned with the 2D virtual surface.
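Constraining the input device to a planar virtual surface, as in the second mode above, can be sketched as projecting the device's 3D position onto the plane and treating the out-of-plane distance as the error a haptic servo would push against. The plane parameters, function names, and use of NumPy are illustrative assumptions:

```python
import numpy as np

def constrain_to_plane(position, plane_point, plane_normal):
    """Return (projected_point, out_of_plane_distance) for a 3D input
    position constrained to a planar haptic surface. Illustrative sketch:
    a haptic controller could apply a restoring force proportional to
    the returned signed distance."""
    p = np.asarray(position, dtype=float)
    n = np.asarray(plane_normal, dtype=float)
    n = n / np.linalg.norm(n)                      # unit normal
    dist = float(np.dot(p - np.asarray(plane_point, dtype=float), n))
    return p - dist * n, dist
```

A contoured (gently arced) surface would replace the single plane with a local tangent-plane projection, but the servo-on-error idea is the same.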
Fig. 4A illustrates a first example graphical user interface screen display 400 displayed by the display system 120 operating in 2D mode. The first graphical user interface 400 is optionally displayed as an overlay of the surgical site view or as a separate interface. A cursor 402 is displayed within the first user interface 400 and is used to activate one or more user interface controls, such as buttons, sliders, lists of options, and the like. The cursor 402 may be controlled by the user input device 204. Using servo controls coupled to the user input device 204, haptic feedback may be provided to the user to provide a sensation of touching the first user interface 400. For example, when a user uses the input device 204 to implement virtual motion of a user interface control structure (e.g., selecting a virtual button, sliding a virtual slider control, or moving a virtual dial displayed in the user interface), a motor coupled to the input device 204 may cause the input device 204 to vibrate, shake, apply a reaction force to oppose the user's motion, or otherwise react to actuation of the user interface control to provide sensory feedback to the user. Fig. 4A illustrates an example login display 401 that includes a user-selectable menu drop-down arrow 404. The user may move the cursor 402 to overlay the menu drop-down arrow 404, at which point the user may select the overlaid arrow 404 by actuating (e.g., by squeezing) the displaceable hand-actuated selector 212 to apply a click event that causes a surgical system action to occur, such as, for example, displaying a drop-down menu (not shown) in the display system 120 or energizing an electrosurgical instrument (not shown) within the surgical site 300.
As used herein, a "click event" may refer to a displacement of the displaceable hand actuated selector 212 relative to the user input device 204 that causes the controller 118 to send a signal to the display system to select or actuate a selectable element in the user interface display. As used herein, a "click event" may also refer to a displacement of the displaceable hand actuated selector 212 relative to the user input device 204 that causes the controller 118 to send a signal to one or more motors of the manipulator assembly to cause movement of one or more of the real or virtual instrument, instrument end effector, or manipulator arrangement. In the example teleoperated surgical system 100, a click event causes selection or actuation of a selectable element in the user interface display during 2D operation of the system 100, and a click event causes movement of a real or virtual component of the manipulator assembly during 3D mode operation of the system 100.
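A click event as defined above fires once when the selector's displacement meets the second threshold. A small latch with hysteresis can keep one continuous squeeze from emitting repeated events; the re-arm-below-the-first-threshold behavior is an assumption for illustration, not a detail stated in the application:

```python
class ClickLatch:
    """Emit a single click event signal when the selector crosses the
    second threshold distance d2, and re-arm only after the selector
    returns below the first threshold distance d1 (hysteresis).
    Threshold names and re-arm behavior are illustrative assumptions."""

    def __init__(self, d1, d2):
        self.d1, self.d2 = d1, d2
        self.armed = True

    def update(self, displacement):
        """Call once per control cycle; returns True when a click fires."""
        if self.armed and displacement >= self.d2:
            self.armed = False        # fire once per squeeze
            return True
        if displacement < self.d1:
            self.armed = True         # selector released: re-arm
        return False
```

The fired signal would then be routed either to the display system (2D mode) or to the manipulator motors (3D mode), per the mode-dependent behavior described above.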
For example, during 2D mode operation, the user may select a menu item from a drop-down menu using the cursor. The login screen 401 also includes a keyboard with a plurality of control elements (i.e., virtual buttons) 406. The user may move the cursor 402 to overlay a keyboard control element, at which point the user may select the overlaid keyboard button by squeezing the displaceable hand-actuated selector 212 to apply a click event that causes the selected element to be displayed in the display area 408. In the example teleoperated surgical system 100, the keyboard control panel may be used, for example, to select an energy level for an electrosurgical instrument.
Fig. 4B illustrates an example second graphical user interface screen display 421 that includes a plurality of control element menus 420 and an instrument manipulator assembly image 422, the instrument manipulator assembly image 422 showing the example instrument manipulator assembly 102. In the example teleoperated surgical system 100, a user may select a control element from within the control element menu 420 using the user input device 204 in a 2D mode of operation, at which point a different menu (not shown) corresponding to the selected menu element is displayed.
Fig. 4C illustrates the example second graphical user interface screen display 421 of fig. 4B in which an inner wrist manipulation menu 425 is displayed overlaying instrument manipulator assembly image 422 in response to user selection of the inner wrist manipulation control element of fig. 4B-4C.
Fig. 4D illustrates the example second graphical user interface screen display 421 of Figs. 4B-4C, in which a cursor mode menu 427 is displayed overlaying the instrument manipulator assembly image 422 in response to user selection of the cursor mode control element. It will be appreciated that more or fewer screens may be used in the second user interface 400.
The example first graphical user interface screen display 400 and the second graphical user interface screen display 421 are presented as 2D interfaces. Thus, the user input device 204 is limited to a 2D area when the user is controlling a cursor in the graphical user interface. In contrast, when in the surgical simulation mode (e.g., the first mode), the user input device 204 is allowed complete or nearly complete 3D freedom of motion. However, in the graphical user interface mode (e.g., the second mode), the user input device 204 is limited to 2D motion. The 2D motion may be limited to a planar area or an area with a gentle curvature (e.g., a gentle convex curvature). The 2D region to which the motion of the user input device 204 is limited may be oriented in space such that the user's hand and the user input device 204 are at approximately the same angle as displayed in the display system 202. Such a correlation may help users orient their hands in 3D space relative to the displayed graphical user interface.
An example is illustrated in fig. 5, which shows a viewing plane 502 and a haptic plane 504. The viewing plane 502 represents a planar area in which graphical user interface images, such as the example first graphical user interface screen display 400 and the second graphical user interface screen display 421 of figs. 4A-4D, are presented to a user by the display system 202. The Cartesian coordinate locations in the viewing plane 502 correspond to the Cartesian coordinate locations in the haptic plane 504. The user may select a control element displayed at a Cartesian coordinate location in the viewing plane 502 by moving one of the user input devices 204 to the corresponding Cartesian coordinate location in the haptic plane and clicking at the corresponding location. More particularly, movement of the cursor in the graphical user interface displayed in the viewing plane 502 follows movement of the user input device 204 in the haptic plane 504. Thus, when the user causes the user input device 204 to move within the 2D haptic area 504, the displayed cursor correspondingly moves within one of the first graphical user interface screen display 400 and the second graphical user interface screen display 421 displayed within the visual image area 502. In the second mode, user hand movement causes corresponding movement of the user input device 204 within the 2D region 504, which causes corresponding movement of the cursor image in the viewing area 502. To select a desired control element displayed in the visual image area 502, the user may engage the user input device 204 with his or her hand and move the hand to cause a corresponding movement of the user input device 204 and thereby of the cursor, visually aligning the cursor with the desired control element.
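The correspondence between Cartesian coordinate locations in the haptic plane 504 and the viewing plane 502 described above can be sketched as a normalize-and-rescale mapping. The function name, the region dimensions, and the linear form of the mapping are illustrative assumptions for this sketch, not details taken from this disclosure:

```python
# Hypothetical sketch: map a 2D position on the haptic plane 504 to the
# corresponding cursor position on the viewing plane 502.

def haptic_to_viewing(haptic_xy, haptic_size, viewing_size):
    """Map Cartesian coordinates in the haptic plane to the viewing plane.

    haptic_xy    -- (x, y) position of the user input device in the haptic plane
    haptic_size  -- (width, height) of the 2D haptic region
    viewing_size -- (width, height) of the viewing plane, e.g. in pixels
    """
    hx, hy = haptic_xy
    hw, hh = haptic_size
    vw, vh = viewing_size
    # Normalize each axis to [0, 1], then rescale to viewing coordinates,
    # so cursor motion follows input-device motion point for point.
    return (hx / hw * vw, hy / hh * vh)

# A device at the center of an assumed 200 mm x 150 mm haptic region maps to
# the center of an assumed 1920 x 1080 viewing plane.
cursor = haptic_to_viewing((100.0, 75.0), (200.0, 150.0), (1920, 1080))
```

The same mapping run in reverse would locate the haptic-plane point a user must move to in order to reach a control element displayed at a given viewing-plane location.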
With the cursor aligned, the user may select the desired control element, for example, by applying finger motion to the displaceable hand actuated selector 212 to effect a click selection ("click") user interface action. In response to the click, the processor circuit performs an action associated with the selected control element.
The user input device 204 is confined within the 2D tactile planar area 504. When a user attempts to move user input device 204 "up" or "down" with respect to the Z-axis of haptic plane 504, the user may encounter resistance to such movement. If the user changes the orientation of viewing plane 502, such as by a display configuration setting, haptic plane 504 may be adjusted to maintain a substantially parallel orientation relative to viewing plane 502. In various embodiments, the haptic plane may be oriented at a fixed or dynamic angular offset relative to the viewing plane. Alternatively, the haptic plane may be oriented at a fixed or dynamic angular offset relative to the ground. The user may also alter constraints, such as the position or orientation of the haptic plane.
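The resistance a user encounters when moving the user input device 204 out of the haptic plane can be sketched as a spring-like restoring force along the plane normal. The stiffness gain, the vector form, and all names below are illustrative assumptions, not details from this disclosure:

```python
# Hypothetical sketch: a proportional restoring force that resists motion of
# the user input device out of the 2D haptic plane 504 along its Z-axis.

def plane_constraint_force(position, plane_point, plane_normal, k=50.0):
    """Return a force pushing `position` back onto the haptic plane.

    position, plane_point, plane_normal -- 3-tuples (x, y, z);
    plane_normal is assumed to be unit length; k is an assumed stiffness gain.
    """
    # Signed out-of-plane displacement, measured along the plane normal.
    d = sum((p - q) * n for p, q, n in zip(position, plane_point, plane_normal))
    # Spring-like force opposing the out-of-plane displacement.
    return tuple(-k * d * n for n in plane_normal)

# Device 2 units "above" a horizontal plane: the force points back down.
f = plane_constraint_force((0.0, 0.0, 2.0), (0.0, 0.0, 0.0), (0.0, 0.0, 1.0))
```

If the viewing plane is reoriented, only `plane_point` and `plane_normal` need updating for the haptic plane to track it at a parallel (or offset) orientation.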
The one or more processor circuits are configured to scale movement of the user input device 204 before applying a corresponding movement to the controlled device in the first (3D) mode and before applying a corresponding movement to the cursor in the second (2D) mode. In the first mode, scaling allows the user to perform complex medical procedures in a much easier manner than conventional open surgery. Scaling includes scaling the commanded movement of the user input device 204 in accordance with a scaling factor before applying the corresponding movement to the controlled device or cursor. Scaling takes into account changes in the speed and position of the user input device 204 and translates them into corresponding scaled changes in the position of the controlled device or cursor. The scaling factor is adjustable and may be different during operation in the first mode and the second mode. For example, in the second mode, the scaling factor may be unity (i.e., no scaling). U.S. patent No. 7,843,158, entitled "Medical Robotic System Adapted to Inhibit Motions Resulting in Excessive End Effector Forces," which is incorporated by reference, contains further information regarding scaling.
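The scaling of commanded movement by a scaling factor can be sketched as follows; the function name and the example factor values are illustrative assumptions, not values from this disclosure:

```python
# Hypothetical sketch: scale a commanded displacement of the user input
# device before applying it to the controlled device (3D mode) or to the
# cursor (2D mode).

def scale_motion(delta, scale_factor):
    """Scale each component of a commanded displacement vector."""
    return tuple(d * scale_factor for d in delta)

# 3D mode: fine manipulation, e.g. 10 units of hand motion become 2.5 units
# of instrument motion under an assumed 0.25 scaling factor.
instrument_delta = scale_motion((10.0, 0.0, -4.0), 0.25)

# 2D mode: a unity scaling factor leaves cursor motion unscaled.
cursor_delta = scale_motion((10.0, -4.0), 1.0)
```

Because the factor is just a parameter, it can be adjusted per mode, matching the adjustable, mode-dependent scaling factor described above.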
User input device with hand actuated click selector
FIG. 6A is an illustrative view showing details of an example user input device 204 that serves as a mount for a displaceable hand-actuated click selector 212. The example user input device 204 is mounted on the gimbal mounting assembly 225 and is coupled to the input controller 118 of FIG. 2A. The gimbal mount 225 includes a controller motion sensor (first sensor) 529 to sense motion of the entire user input device 204. The example user input device 204 may include a plurality of controller motion sensors 529 (only one shown) to sense motion of the user input device in multiple degrees of freedom. The example user input device 204 includes an example displaceable hand-actuated selector 212, the hand-actuated selector 212 including a grip mounting member configured as an elongated handle 530 having a longitudinal axis 531. The example displaceable hand actuated selector 212 is integrally formed with the example user input device 204 such that the hand actuated selector 212 moves in unison with the 2D motion of the user input device 204 and such that the hand actuated selector 212 moves in unison with the 3D motion of the user input device 204. The example hand actuated selector 212 includes a first articulatable grip button 530a and a second articulatable grip button 530b mounted on the handle 530. The handle 530 serves as a mounting member for mounting the grip buttons 530a, 530b of the hand actuated selector 212. The first grip button 530a and the second grip button 530b of the hand actuated selector 212 stand obliquely from opposite sides of the handle 530.
The first grip button 530a and the second grip button 530b are fixed to the handle 530 to hinge with respect to the handle 530. The first and second grip buttons 530a, 530b are inclined relative to the handle 530, with their distal ends spaced closer together and their proximal ends spaced farther apart. The term "proximal" as used herein indicates a location closer to the manipulator support structure and farther from the patient anatomy, and the term "distal" indicates a location farther from the manipulator support structure and closer to the patient. The first and second grip buttons 530a, 530b have an angle α between their distal ends that may vary depending on the force applied thereto by the user. In the example teleoperated surgical system 100, the grip buttons 530a, 530b are in a neutral position when the user does not apply a pinching force to move the grip buttons 530a, 530b toward each other. In the example user input device 204, the grip buttons 530a, 530b are maximally displaced from each other in the neutral position. In the example user input device 204, the angle α is an acute angle when the grip buttons are in the neutral position. In the example user input device 204, in the neutral position, the one or more motors 545 apply a haptic counterforce to the grip buttons 530a, 530b that resists a force applied by a user to move the grip buttons 530a, 530b toward each other, and the user must overcome the haptic counterforce to move the grip buttons 530a, 530b toward each other. In an alternative example user input device 204, in the neutral position, a biasing member (not shown), such as a spring, provides a reaction force to resist displacement of the grip buttons 530a, 530b toward each other, and the user must overcome the reaction force to move the grip buttons 530a, 530b toward each other.
In the example user input device 204, in a first condition (i) when the user is not applying a force to the grip buttons 530a, 530b, and in a second condition (ii) when the grip buttons 530a, 530b have a displacement position distance therebetween that meets a prescribed threshold, the one or more motors 545 cause the grip buttons 530a, 530b to move to the neutral position. In an alternative example user input device 204, in the first condition (i) when the user is not applying a force to the grip buttons 530a, 530b, and in the second condition (ii) when the grip buttons 530a, 530b have a displacement position distance therebetween that meets a prescribed threshold, a biasing member (not shown), such as a spring, causes the grip buttons 530a, 530b to move to the neutral position. In the example user input device 204, the second condition occurs when the grip buttons 530a, 530b have a displacement position distance therebetween that is less than a second grip threshold (TG2). By the time the grip buttons 530a, 530b reach a displacement position distance less than the second grip threshold (TG2), the click event has initiated an action, such as a display action at display system 120 or an instrument action within surgical site 300. Thus, by the time the displacement distance position is in the third control state 776 of the graph 770 described below, an action has been initiated in response to a reduced displacement distance between the grip buttons 530a, 530b. For example, a control element in the control element menu 420 of the second graphical user interface screen display 421 of figs. 4B-4C will be selected, and the screen display corresponding to the selected control element will be displayed. For example, if the click event is a selection of the wrist manipulation control element, then wrist manipulation menu 425 of FIG. 4C will be displayed.
In the example user input device 204, a user may apply a force to the respective grip buttons 530a, 530b in respective directions toward the handle 530 to reduce the displacement therebetween until the grip buttons contact the handle mount 530, which acts as a stop surface, at which point there is no displacement between the grip buttons and the handle mount. More specifically, according to some embodiments, the first grip button 530a and the second grip button 530b are fixed to the handle to pivot about the main pivot axis 536. One or more motors 545 or other biasing members urge the grip buttons 530a, 530b apart. In the example user input device 204, the one or more motors 545 are configured to apply a variable haptic force in a radially outward direction from the mounting member 530 toward the grip buttons 530a, 530b during user-applied movement of the grip buttons 530a, 530b in a radially inward direction toward the handle 530. In the example user input device 204, the one or more motors 545 may include a single motor (not shown) that applies haptic forces to both grip buttons 530a, 530b. In an alternative example user input device 204, the one or more motors 545 may include a first motor (not shown) that applies a haptic force to the first grip button 530a and a second motor (not shown) that applies a haptic force to the second grip button 530b. The handle 530 includes one or more displacement sensors (second sensors) 547, such as Hall effect devices, for sensing movement of the grip buttons 530a, 530b along the first path and their displacement from the neutral displacement position. A finger ring (not shown) may be attached to the handle to prevent the user's fingers from slipping off the grip buttons. A wide variety of grip button structures may be used within the scope of this disclosure, including any surgical instrument handle, for example, optionally including a rigid or flexible ring for the thumb and/or fingers.
The control relationship between the grip buttons and the controlled device is explained in more detail in U.S. patent No. 6,594,552 entitled "Grip Strength with Tactile Feedback for Robotic Surgery", the entire disclosure of which is expressly incorporated by reference.
In a first (3D) mode of operation, the user input device 204 and the grip buttons 530a, 530b are operatively coupled, for example, to control movement of the controlled device 104 in response to 3D movement of the user input device 204 and to movement of the grip buttons 530a, 530b about the main pivot axis 536. In a second (2D) mode of operation, the user input device 204 and the grip buttons 530a, 530b are operatively coupled to control 2D cursor movement within the viewing plane 502 and to control element selection within the viewing plane 502.
In the example teleoperated surgical system 100, one or more motors are optionally configured to apply a variable haptic force to the grip buttons 530a, 530b in a radially outward direction away from the handle 530. The user may apply a force to the grip buttons 530a, 530b using his or her fingers, urging them toward the handle 530 and toward each other so as to cause them to move closer together. As explained below, in the second (2D) mode, the user may use the hand-actuated selector 212 to effect a click event to select a graphical user interface control element by applying a finger force in a radially inward direction toward the handle 530 to overcome the neutral resistance and motor-controlled haptic forces and to cause the grip buttons 530a, 530b to move toward each other. As explained below, a variable haptic force is applied to the grip buttons 530a, 530b of the hand actuated selector 212 to provide tactile feedback to indicate when a click event occurs within the viewing plane 502.
The example user input device 204 includes a four-degree-of-freedom gimbal mount 225 that allows a user to rotate the actuatable mounting member handle 530 about three axes (axis 534a, axis 534b, and axis 534c). During operation in the first (3D) mode, a physical or virtual controlled device (such as instrument 104) follows the 3D motion of the user input device 204. During operation in the second (2D) mode, a controlled user interface element (such as a cursor) within the 2D viewing area 502 follows the 2D motion of the user input device 204 within the 2D area 504.
More specifically, the handle mount 530 portion of the user input device 204 is coupled to the first elbow link 514 through the first pivot joint 516. The first link 532 is coupled to the second elbow link 537 by a pivot joint 520. The second link 537 is pivotally coupled to the third elbow link 538 by pivot joint 524. In some embodiments, the motors of the arm 538 and gimbal 225 are capable of actively applying position and orientation forces to the mounting member handle 530, thereby providing haptic feedback to the surgeon. The gimbal 225 includes links 532, 537, 538. Gimbal 225 is mounted to platform 540 for rotation about axis 534d, and links 532, 537, 538 define additional axes 534a, 534b, and 534c. The handle 530 is mounted to the gimbal 225 by an actively driven joint for movement about axis 534d. Thus, the gimbal 225 provides four degrees of driven orientation freedom, including a redundant degree of orientation. The gimbal 225, arm 538, and the drive motors for these joints are described in more detail in U.S. patent No. 6,714,839 entitled "Master Having Redundant Degrees of Freedom," the entire disclosure of which is expressly incorporated herein by reference.
Fig. 6B is an illustrative functional block diagram showing a control system 680 for controlling the receipt of user input at the user input device 204, including the hand actuated selector 212, and the application of haptic feedback at the user input device 204. The input controller 118 is coupled to control the manipulator assembly 102 and is coupled to control the display system 120. The user input device 204 is configured to receive a first user input motion 652, such as a user hand motion, that imparts motion to the entire user input device 204. The hand actuated selector 212 is movably mounted to the user input device 204, the hand actuated selector 212 being configured to receive a second user input motion 654, such as a user finger movement, which imparts movement to the hand actuated selector 212 relative to the user input device 204. The one or more first sensors 547a are configured to sense movement of the entire user input device 204 and provide a corresponding first sensor signal (S1) 549a to the input controller 118. The one or more second sensors 547b are configured to sense movement of the hand actuated selector 212 relative to the user input device 204 and provide a corresponding second sensor signal (S2) 549b to the input controller 118. The one or more motors 545 are coupled to receive a motor control signal 551 from the input controller 118 and to apply a haptic feedback force 553 to the hand actuated selector 212. The input controller 118 is configured to provide the motor control signal (MC) 551 to cause the one or more motors to apply the haptic feedback force (FH) 553 to the hand actuated selector 212.
The user can provide a mode selection signal (SM) 555 to cause the input controller 118 to operate in one of a 3D mode and a 2D mode. In the 3D mode, the displaceable hand actuated selector 212 may move in 3D with movement of the user input device 204 in response to the first user input motion 652 and may be displaced relative to the user input device 204 in response to the second user input motion 654. In the 2D mode, the displaceable hand actuated selector 212 may move in 2D with the user input device 204 in response to the first user input motion 652 and may be displaced relative to the user input device 204 in response to the second user input motion 654.
In the 3D mode, the input controller 118 controls the manipulator assembly 102, including one or more motors (not shown) that control operation of one or more of the instrument 104, the instrument end effector, and the manipulator links, in response to applied user input motion causing one or more of displacement of the entire user input device 204 and displacement of the displaceable hand actuated selector 212 relative to the user input device 204 to which the selector is mounted. In the 2D mode, the input controller 118 controls the display system 120, including a graphical user interface screen display that includes one or more control elements, such as, for example, a menu, cursor, slider, knob, or button, in response to user input causing one or more of displacement of the entire user input device 204 and displacement of the displaceable hand-actuated selector 212 relative to the user input device 204.
The first displacement sensor 547a is coupled to sense displacement of the entire user input device 204 and provide a corresponding first sensor signal 549a to the input controller 118 indicative of the displacement of the user input device 204. The second sensor 547b is coupled to sense displacement of the hand actuated selector 212 relative to the user input device 204 and provide a corresponding second sensor signal 549b to the input controller 118 indicative of the displacement of the hand actuated selector 212 relative to the user input device 204 to which the selector 212 is movably mounted. The input controller 118 includes one or more processor circuits configured with executable instructions to provide control signals to control the manipulator assembly 102 in response to the first sensor signal and the second sensor signal when in the 3D mode, and to provide control signals to control the display system 120 in response to one or more of the first sensor signal and the second sensor signal when in the 2D mode. The input controller 118 is further configured to provide the motor control signal 551 to the one or more motors 545 to apply a haptic feedback force (FH) 553 to the hand actuated selector 212 based on one or more of the first sensor signal 549a and the second sensor signal 549b.
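The mode-dependent routing performed by the input controller, as described above, can be sketched as follows. The class, the signal shapes, and the command representation are illustrative assumptions for this sketch, not the actual controller implementation:

```python
# Hypothetical sketch: in 3D mode the sensor signals drive the manipulator
# assembly; in 2D mode they drive the display system (cursor and click logic).

class InputController:
    def __init__(self):
        self.mode = "3D"       # set by an assumed mode selection signal
        self.commands = []     # record of dispatched control commands

    def on_sensor_signals(self, s1, s2):
        """s1: motion of the whole input device; s2: selector displacement."""
        if self.mode == "3D":
            # Both signals command manipulator motion (instrument, end
            # effector, or manipulator links).
            self.commands.append(("manipulator", s1, s2))
        else:
            # 2D mode: s1 moves the cursor, s2 feeds the click-event logic.
            self.commands.append(("display", s1, s2))

ctrl = InputController()
ctrl.on_sensor_signals((1.0, 0.0, 0.0), 0.0)   # routed to the manipulator
ctrl.mode = "2D"
ctrl.on_sensor_signals((0.5, 0.5), 0.2)        # routed to the display system
```

The single dispatch point mirrors how the same two sensor signals serve both operating modes, with only the controlled target changing.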
Tactile feedback to indicate click events
Fig. 7 is an illustrative diagram showing a first control function curve 720 representing the relationship of haptic force to displacement of the displaceable hand actuated selector 212 position during a click event, the first control function curve 720 being aligned in time with a grip button displacement sequence and a hand shape sequence. The example user input device 204 of fig. 6A includes a handle 530 as a mount, with a pair of opposing grip buttons 530a, 530b mounted on the handle 530. As explained above, alternative example hand actuated selectors include gravity balance arms, joysticks, trackballs, gloves, trigger grips, twistable knobs, twistable grips, sliders, lever buttons, or the like. The user may cause a click event by applying a second user input motion that causes a prescribed displacement of the displaceable hand actuated selector 212. For the hand actuated selector 212 of fig. 6A, for example, a user may cause a click event by reducing the displacement distance between the grip buttons 530a, 530b from a neutral position displacement to a click event completion position displacement distance. In the example user input device 204 of fig. 6A, each grip button is displaced an equal amount from the handle mount 530 of the user input device throughout movement of the grip buttons 530a, 530b between the open and closed positions, although the displacement distance between the grip buttons decreases as the grip buttons 530a, 530b move from the open position to the closed position and increases as the grip buttons 530a, 530b move from the closed position to the open position. Further, in the example user input device 204 of fig. 6A, each individual grip button 530a, 530b has a displacement distance from the handle mount 530 of the user input device that decreases as the grip button moves from the neutral displacement position to the click event completion displacement position and increases as the grip button moves from the click event completion displacement position to the neutral displacement position.
FIG. 7 illustrates an example sequence of displacement positions of the grip buttons 530a, 530b of the example hand actuated selector 212 of FIG. 6A during grip closure due to a second user input motion 654 applied during a user finger click event, represented by a sequence of user finger positions 752, 754, 756. Fig. 7 also shows a corresponding example sequence of displacement positions 762, 764, 768, 770 of the grip buttons 530a, 530b of the example hand-actuated selector 212 of fig. 6A. For simplicity and clarity of illustration, the sequence of example user digits 758, 760 and finger positions 752, 754, 756 is shown separated from the corresponding sequence of respective grip buttons 530a, 530b and grip button displacement positions 762, 764, 768, 770 with which they are in contact; however, it will be appreciated that in actual use, the respective digits 758, 760 are in contact with the respective grip buttons 530a, 530b. For example, in an actual example use, the index finger 758 contacts the first grip button 530a and the thumb 760 contacts the second grip button 530b.
FIG. 7 illustrates an example sequence of displacement positions 762, 764, 768, 770 of the grip buttons 530a, 530b of FIG. 6A, each corresponding to a different distance of movement of the grip buttons relative to the handle 530 of the user input device traveled during a click event. Thus, at the illustrative displacement distance indicated by the grip button displacement position 762 and the corresponding finger position 752, the grip button has traveled a first (shortest) distance in the illustrative sequence relative to the handle 530 of the user input device. At the illustrative displacement distance indicated by the grip button displacement position 764 and the corresponding finger position 754, the grip button has traveled a second distance greater than the first distance relative to the handle 530 of the user input device. At the illustrative displacement distance indicated by the grip button displacement position 768 and the corresponding finger position 756, the grip button has traveled a third distance relative to the handle 530 of the user input device that is greater than the combined first and second distances. At the illustrative displacement distances indicated by the grip button displacement position 770 and the corresponding finger position 756, the grip button has traveled a fourth distance that is greater than the first, second, and third distances relative to the handle 530 of the user input device.
Fig. 7 also shows a corresponding sequence of finger positions of hand 759 applying the second user input motion 654 to the grip buttons 530a, 530b. The sequence of finger positions begins with a fully open finger position 752, followed by a partially closed/partially open finger position 754, followed by a substantially closed finger position 756. A corresponding sequence of grip button displacement positions around the handle mount 530 is shown. The grip button sequence begins at a fully open grip button displacement position 762, followed by a partially closed/partially open grip button displacement position 764, followed by a nearly fully closed grip button displacement position 768, followed by a fully closed grip button position 770.
Fig. 7 includes an illustrative graph 770 representing a control function curve that implements a combined click event trigger and haptic feedback control function using the hand actuated selector 212, the second sensor 547b, the input controller 118, the one or more motors 545, and a stop surface, such as the handle 530 of the input device. The input controller 118 includes one or more processor circuits programmed with executable instructions to implement the control function 770. The control function 770 controls the triggering of the click event and controls the haptic feedback force applied in response to the second user input motion 654 displacing the hand actuated selector 212. More particularly, the input controller 118 is configured to implement the control function in response to a second sensor signal S2 provided by the one or more second sensors, to control triggering of a click event at the display system 120 or manipulator assembly 102, and to cause the one or more motors 545 to apply a haptic force to the hand actuated selector 212. According to the control function 770, to trigger a click event, the user must displace the hand actuated selector 212 by at least a prescribed displacement distance. Further, according to the control function 770, a haptic feedback force is applied to the hand actuated selector 212 while the user is displacing the hand actuated selector 212. The haptic feedback force indicates to the user that a click event builds up and is triggered in response to increasing displacement of the hand actuated selector 212 by the user. After a prescribed amount of further displacement following the triggering of the click event, the stop surface exerts a sudden reaction force that stops further displacement of the hand actuated selector 212.
The control function curve 770 has a plurality of states. The first control state 772 is a neutral or rest state in which the displacement of the hand actuated selector 212 is less than a first threshold displacement TD1. In the case of the user input device 204 of fig. 6A, in the first control state 772, the displacement of the grip buttons 530a, 530b is between the maximum displacement from the handle 530 at the neutral position and the first threshold distance TD1. In the example teleoperated surgical system 100, in the first control state 772, a resilient member applies a maintenance force to the hand-actuated selector 212 to hold it in the neutral displacement position. In the case of the user input device 204 of fig. 6A, in the first control state 772, a biasing member (such as a spring member) urges the grip buttons 530a, 530b to be maximally displaced relative to each other and from the handle 530, while the input controller 118 causes the one or more motors to apply a constant zero force. In an alternative example teleoperated surgical system 100, in the first control state, the input controller 118 causes the one or more motors 545 to apply a spring-like maintenance force to the hand-actuated selector 212 to urge it to the neutral displacement position. In the alternative first control state 772, the one or more motors 545 are controlled to generate a spring-like maintenance force to maintain the hand-actuated selector 212 in the neutral displacement position. In the alternative first control state 772, the one or more motors provide a force to move the hand-actuated selector 212 back to the neutral position in response to a user applying, and then removing, a force that overcomes the maintenance force and displaces the selector 212 by less than the first threshold displacement distance.
While in the first control state 772, the user may apply motion to the hand-actuated selector to displace it somewhat from the neutral position, so long as the displacement is less than the first threshold distance TD1. For the example user input device of FIG. 6A, the magnitude and direction of this maintenance force with respect to the user's fingers is represented by arrows 752a, 752b. The first and second directions extend radially outward from the longitudinal axis 531 of the handle 530.
The second control state 774 is a haptic feedback force buildup state in which the haptic force increases at a first rate relative to the increase in displacement of the hand actuated selector 212. The second control state 774 includes displacement of the hand actuated selector 212 that satisfies the first threshold distance TD1 but does not yet satisfy the second threshold distance TD2. During the second control state 774, the one or more second sensors 547b sense the increase in displacement of the hand-actuated selector 212 from the neutral position and output a corresponding second sensor signal value S2 to the input controller 118 to report the displacement increase. The input controller 118, in turn, generates the motor control signal 551 to cause the one or more motors 545 to apply a haptic force that increases at the first rate relative to the increase in displacement of the hand-actuated selector 212 from the neutral position. The first rate may be linear or non-linear, so long as the rate allows the user time to identify and react to the tactile sensation of increasing haptic force by following through with or aborting an impending click event. In the case of the example user input device 204 of fig. 6A, the grip buttons 530a and 530b are displaced from each other by increasingly smaller amounts as they are progressively displaced from their widely spaced neutral positions. During the second control state 774, the buildup of haptic force, increasing in magnitude at the first rate relative to the increase in displacement of the hand actuated selector 212, provides a tactile indication or warning to the user that a click event is imminent: the larger the magnitude, the closer the current displacement is to causing a click event to occur.
The first rate is selected so that the user has time to react to the indication that a click event is imminent, so that, for example, the user may make an informed decision to continue the displacement and follow through with the click event, or to stop the displacement and abort the click event. Thus, the increase in haptic feedback during the second control state 774 alerts the user that a click event is increasingly imminent as the displacement of the hand-actuated selector 212 increases.
In the example user input device 204 of FIG. 6A, the magnitude of the second feedback force during the second control state is represented by the length of the arrows 754a, 754b. The grip button 530a applies a force 754a to the index finger 758 in a first direction, and the grip button 530b applies a force 754b to the thumb 760 in a second direction opposite the first direction. The first and second directions extend radially outward from the longitudinal axis 531 of the handle 530. It will be appreciated that, to move the grip buttons 530a, 530b closer together during the second control state 774, the user's fingers 758, 760 apply corresponding displacement forces in directions opposite the haptic force directions 754a, 754b. The force applied by the user must be of a magnitude large enough to overcome the tactile forces 754a, 754b. Additionally, it will be understood that the magnitude of the tactile forces 754a, 754b applied by the respective grip buttons 530a, 530b at any instant in the second control state 774 is greater than that of the maintenance forces 752a, 752b applied during the first control state 772, as represented by the length of the arrows 754a, 754b being greater than the length of the arrows 752a, 752b.
The third control state 776 is a click event trigger state entered when the displacement of the hand-actuated selector 212 meets the second threshold distance T_D2. The second sensor 547b sends a second sensor signal to the input controller 118 indicating when the displacement of the hand-actuated selector 212 reaches the second threshold displacement T_D2 from the neutral position.
When operating in the 2D mode, the example input controller 118 responds to the hand-actuated selector 212 reaching the second threshold displacement T_D2 by sending a click event trigger signal to the display system 120, which, for example, causes selection of the visual UI control element in the control element menu 420 that is overlaid by the cursor 402. When operating in the 3D mode, the example input controller 118 responds to the hand-actuated selector 212 reaching the second threshold displacement T_D2 by sending a click event trigger signal to the manipulator assembly 102, which causes actuation of one or more motors (not shown) to actuate a linkage or an instrument end effector, for example.
Further, when operating in either the 2D mode or the 3D mode, the example input controller 118 responds to the hand-actuated selector 212 reaching the second threshold displacement T_D2 by applying a motor control signal M_C to the line 551, causing the one or more motors 545 to apply a stepped-down haptic feedback force to the hand-actuated selector 212. The step-down reduces the haptic feedback force from its peak, reached at the moment the displacement of the hand-actuated selector 212 crosses the second threshold displacement T_D2, to a level that matches or substantially matches the force applied during the first control state 772. More specifically, the haptic force decreases at a second rate relative to the displacement of the hand-actuated selector 212. The magnitude of the second rate is greater than the magnitude of the first rate. In an example system, the second rate has a magnitude selected to provide a substantially instantaneous tactile indication to the user that the click event has been triggered. In the example user input device 204 of FIG. 6A, the magnitude of the third feedback force during the third control state 776 is represented by the length of the arrows 756a, 756b. The grip button 530a applies a force 756a to the index finger 758 in a first direction, and the grip button 530b applies a force 756b to the thumb 760 in a second direction opposite the first direction. In an example system, in response to the one or more sensors 547b providing a signal indicating that the displacement has returned to the neutral displacement distance, the control function returns from the third control state 776 to the first control state 772.
In the example user input device 204, at or about the moment the hand-actuated selector 212 reaches a third threshold displacement T_D3 during control stage 778, the selector 212 impacts a stop surface. In the example input device 204 of FIG. 6A, the hand-actuated selector 212 includes the displaceable grip buttons 530a, 530b, and the handle 530 acts as the stop surface. The impact of the hand-actuated selector 212 against the stop surface produces a sudden reaction force that provides the user with an additional tactile sensation indicating that the click event is complete and that the hand-actuated selector 212 is ready to return to the first control state 772, i.e., the rest or neutral state. Thus, the stop surface 530 provides tactile feedback when the control function 700 has actually returned to the first control state 772 (where the one or more motors 545 provide the maintenance force).
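The three-state force profile described above — a maintenance force, a build-up at the first rate, and a step-down at the faster second rate — can be sketched as a simple displacement-to-force mapping. This is an illustrative model only; the threshold distances, force levels, units, and linear rates are assumed values, not taken from this disclosure:

```python
def haptic_force(displacement, t_d1=2.0, t_d2=6.0, f_hold=0.5,
                 rate1=0.8, rate2=8.0):
    """Map selector displacement (assumed mm) to a commanded haptic
    force (assumed N) per the three control states."""
    if displacement < t_d1:
        # first control state 772: maintenance force only
        return f_hold
    if displacement < t_d2:
        # second control state 774: force builds at the first rate
        return f_hold + rate1 * (displacement - t_d1)
    # third control state 776: force steps down at the second (faster) rate
    peak = f_hold + rate1 * (t_d2 - t_d1)
    return max(f_hold, peak - rate2 * (displacement - t_d2))
```

Because `rate2` is much larger than `rate1`, the force falls from its peak back to the maintenance level over a small additional displacement, which is what produces the near-instantaneous "click" sensation.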
FIG. 8 is an illustrative flow chart showing a control process 800, according to the control function of FIG. 7, for controlling the provision of force and the triggering of a click event based on displacement of the hand-actuated selector 212 relative to the user input device 204. The input controller 118 includes one or more processor circuits programmed with executable instructions to implement the control process 800. At operation 802, during the first control state 772, when the one or more second sensors 547b indicate that the user has not provided a second user input movement 654 causing the hand-actuated selector 212 to be displaced from the neutral position by at least the first threshold distance T_D1, the input controller 118 causes the one or more motors 545 to provide a neutral holding force (F_HN) to the hand-actuated selector 212. A first threshold decision operation 804 monitors the one or more second sensors 547b to detect when the displacement of the hand-actuated selector satisfies the first threshold displacement T_D1. In the example hand-actuated selector 212 of FIG. 6A, the first threshold decision operation 804 determines when the displacement of the grip buttons 530a, 530b from their neutral positions meets the first displacement threshold. The example input controller 118 provides motor control signals that cause the one or more motors 545 to generate a haptic force of zero magnitude when the selector 212 displacement is less than the first threshold haptic distance; in that case, a biasing member (such as a spring) maintains the selector 212 in its neutral position.
In response to the displacement satisfying the first threshold displacement distance T_D1, control transitions to the second control state 774. When the displacement has not yet met the first threshold displacement distance, the process 800 remains in the first control state 772. At operation 806, during the second control state 774, the input controller 118 causes the one or more motors 545 to apply to the hand-actuated selector 212 a haptic feedback force that increases at a first rate as a function of the increase in displacement of the grip buttons from their neutral positions. In an example system, operation 806 may set the build-up rate of the haptic feedback force, e.g., as a linear or nonlinear function of displacement. A second threshold decision operation 808 monitors the one or more second sensors 547b to detect when the displacement of the hand-actuated selector satisfies the second threshold displacement T_D2. In the example hand-actuated selector 212 of FIG. 6A, the second threshold decision operation 808 determines when the displacement of the grip buttons 530a, 530b from their neutral positions meets the second displacement threshold distance.
In response to the displacement satisfying the second threshold displacement distance T_D2, control transitions to the third control state 776. When the displacement has not yet met the second threshold displacement distance, the process 800 remains in the second control state 774. At operation 809, the input controller 118 sends a control signal to the display system 120 to trigger the occurrence of a click event. At operation 810, during the third control state 776, the input controller 118 causes the one or more motors 545 to apply to the hand-actuated selector 212 a haptic feedback force that decreases at a second rate, from its peak at the transition from the second control state to the third control state, down to a level that matches or substantially matches the force applied during the first control state 772. The magnitude of the second rate is greater than the magnitude of the first rate. During decision operation 812, the input controller 118 monitors the sensor signals provided by the one or more sensors 547b to determine when the displacement of the selector 212 satisfies (e.g., is less than or equal to) the first threshold displacement T_D1. In response to the displacement satisfying the first displacement distance, the control process 800 transitions back to operation 802.
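The flow of process 800 can be sketched as a small state machine driven by the sensed displacement. This is a hedged illustration only: the state constants, threshold values, and the `step` interface are assumptions, not the claimed implementation:

```python
FIRST_STATE, SECOND_STATE, THIRD_STATE = 772, 774, 776  # labels from FIG. 7

def step(state, displacement, t_d1=2.0, t_d2=6.0):
    """Advance the click-control state machine by one sensor sample.

    Returns (next_state, click_triggered)."""
    if state == FIRST_STATE:
        # operations 802/804: wait for displacement to meet T_D1
        return (SECOND_STATE, False) if displacement >= t_d1 else (FIRST_STATE, False)
    if state == SECOND_STATE:
        # operations 806/808/809: force builds until displacement meets T_D2,
        # at which point the click event fires
        return (THIRD_STATE, True) if displacement >= t_d2 else (SECOND_STATE, False)
    # operations 810/812: force steps down; return once displacement <= T_D1
    return (FIRST_STATE, False) if displacement <= t_d1 else (THIRD_STATE, False)
```

Feeding a squeeze-and-release displacement trace through `step` yields exactly one click trigger per squeeze, mirroring the single pass through states 772, 774, and 776 in FIG. 8.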
The changes in haptic force provide the user with a tactile indication of the click event state throughout the first, second, and third control states. During the second control state 774, the build-up of haptic force indicates to the user that a triggering event is increasingly imminent. During the third control state 776, the rapid decrease in haptic force indicates that the click event has been triggered. During the first control state 772, the maintenance force reinforces to the user that the click event has completed. The hard stop at the physical stop surface 530 indicates to the user that the hand-actuated selector 212 has returned to the neutral first control state 772.
Isolating cursor movement from push button movement
During operation of the display system 120 in the second (2D) mode, the fingers of a user's hand may be used to select control elements displayed in the viewing plane 502. During the second mode of operation, the input controller 118 causes movement of the cursor 402 in the graphical user interface displayed in the viewing plane 502 to follow movement of the user input device 204 in the haptic plane 504. Cartesian coordinate locations in the viewing plane 502 correspond to Cartesian coordinate locations in the haptic plane 504. The user selects a control element by first moving the user input device 204 to a coordinate location in the haptic plane 504 corresponding to the control element's coordinate location in the viewing plane 502, to visually align the cursor 402 with the control element, and by second applying an actuation motion to the hand-actuated selector 212. For example, the user's fingers may apply motion to the hand-actuated selector 212 to increase its displacement from the neutral position to trigger a click event, as described above with reference to FIGS. 7-8.
Movement of the user's fingers may affect movement of other portions of the user's hand. Referring to the example hand-actuated selector 212 of FIG. 6A, when a user applies a closing force to the grip buttons 530a, 530b to effect a click, the user's finger movement may cause a corresponding movement of the controller 204 to which the grip buttons 530a, 530b are mounted. The user-applied motion of the grip buttons 530a, 530b may cause a slight unintended position change, or jump, of the controller 204. Such an unintended change in position of the controller 204 in the haptic plane 504 may cause a corresponding unintended change in position of the cursor 402 in the viewing plane 502. Thus, when the buttons 530a, 530b are squeezed to click on a control element in a graphical user interface menu, for example, the cursor 402 may tend to skip off the control element during the clicking process. A user's abrupt actuation of a click event may therefore cause an abrupt, unintended jump in cursor movement, resulting in misalignment of the cursor with the target control element, which may result in selection of the wrong target. This can be particularly frustrating when the target clickable control element is small and the user ends up missing the target element.
Missing a target can be particularly detrimental in a surgical environment. For example, the viewing plane 502 may display a pre-operative MRI or CT scan instead of, or in addition to, control elements. The target element that the surgeon intends to visually align with the cursor for selection may be a subtle anatomical feature, such as a nerve or blood vessel. The surgeon may want to zoom in on the view of the feature, or transition to other anatomical features represented in the MRI or CT scan above or below the target feature. Thus, precise alignment between the cursor and the target element may be required.
Referring again to the control system of FIG. 6B, the example input controller 118 is configured to calculate a motion transformation function that, based on first and second cursor motion input signals, provides a cursor motion control signal to the display system 120 on line 561, the cursor motion control signal being indicative of motion of the user input device 204 as sensed by the one or more first sensors 547a. The display system displays movement of the cursor 402 within the graphical user interface 400 that follows movement of the user input device 204.
In the example user input device 204, a click event generally involves rapid movement of the hand-actuated selector 212. The input controller 118 is configured with instructions to adjust the transformation function during the occurrence of user motion input that effects a click event at the hand-actuated selector 212 mounted to the user input device 204, where such user motion input may also affect movement of the user input device itself. FIG. 9 is an illustrative diagram showing a configuration of the input controller 118 that implements a first transformation F1 in the absence of a click event. FIG. 10 is an illustrative diagram showing a configuration of the input controller 118 that implements a second transformation F2 in the presence of a click event.
Referring to fig. 9, based on the input controller 118 determining that movement of a user's finger (e.g., finger and/or thumb) and/or overall hand movement indicates that the user is not in the process of causing a click event signal to be initiated, the input controller 118 applies a first transformation function F1 that causes movement of the cursor 402 in the viewing plane 502 to follow movement of the user input device 204 in the haptic plane 504. More particularly, the example input controller 118 is configured to cause the Cartesian coordinates of the cursor 402 in the viewing plane 502 to match the Cartesian coordinates of the user input device 204 in the haptic plane 504. Further, in the absence of a click event, the example input controller 118 may scale the movement of the user input device 204 to the movement of the cursor 402 by a prescribed amount. For example, the input controller 118 may be calibrated to a predetermined one-to-one (1:1) movement ratio of user input device motion to cursor motion, where the distance the user input device 204 moves in the haptic plane 504 exactly matches the distance the cursor 402 moves in the viewing plane 502. Alternatively, for example, the input controller 118 may be calibrated to a predetermined two-to-one (2:1) movement ratio, wherein the user input device 204 moves exactly twice the distance in the haptic plane 504 that the cursor 402 moves in the viewing plane 502.
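A minimal sketch of the first transformation F1 follows, mapping the device's position in the haptic plane to a cursor position in the viewing plane through a fixed movement ratio (`ratio=1.0` for the 1:1 calibration, `ratio=0.5` for the 2:1 device-to-cursor calibration). The function name, anchor-based formulation, and coordinate convention are illustrative assumptions:

```python
def f1_cursor_position(device_xy, device_origin, cursor_origin, ratio=1.0):
    """First transformation F1: the cursor's Cartesian position in the
    viewing plane tracks the device's position in the haptic plane,
    scaled by a fixed movement ratio (cursor distance per device distance)."""
    dx = device_xy[0] - device_origin[0]
    dy = device_xy[1] - device_origin[1]
    return (cursor_origin[0] + ratio * dx, cursor_origin[1] + ratio * dy)
```

With `ratio=1.0`, a 2-unit device move in the haptic plane produces a 2-unit cursor move in the viewing plane; with `ratio=0.5`, the same device move produces a 1-unit cursor move (the 2:1 calibration).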
Referring to fig. 10, based on the input controller 118 determining that the movement of the user's finger (e.g., finger and/or thumb) and/or the entire hand movement indicates that the user is in the process of causing a click event to be initiated, the input controller 118 applies a second transformation function F2 that reduces the rate of cursor movement in response to user input device movement during the time interval when the user causes a click event to be initiated. The reduced movement ratio results in reduced cursor movement in the viewing plane 502 during clicking in response to user input device movement in the haptic plane 504.
FIG. 11 is an illustrative drawing showing a second control function curve 820 representing an example second transformation function F2 that determines the relationship of controller motion filtering to time during a click event, and also showing a sequence of grip button displacements aligned to time, a sequence of hand shapes aligned to time, and a sequence of viewing plane instances aligned to time. As used herein, selector speed refers to the rate of increase of displacement of the hand-actuated selector 212 from its neutral position relative to the base of the user input device 204. In the example case of FIG. 6A, the selector speed refers to the rate of increase in displacement of the grip buttons 530a, 530b relative to the handle 530. As used herein, an increase in controller motion filtering corresponds to a decrease in the movement ratio of the motion of the user input device 204 to the motion of the cursor 402. That is, the greater the magnitude of the controller motion filtering, the less a movement of the input device 204 in the haptic plane 504 causes a corresponding movement of the cursor 402 in the viewing plane 502.
The example second transformation function F2 includes a first filter function, a second filter function, and a third filter function. As depicted by the second control function curve 820, the input controller 118 is configured to apply a first filter function 822 spanning a first time interval T_1, during which the displacement of the hand-actuated selector 212 from its neutral position relative to the user input device 204 changes at a rate less than a first threshold rate T_R1. The input controller 118 is configured to apply a second filter function 824 spanning a second time interval T_2, during which the displacement of the hand-actuated selector 212 from its neutral position relative to the user input device 204 changes at a rate at or above the first threshold rate T_R1. The input controller 118 is configured to apply a third filter function 826 spanning a third time interval T_3 after the second time interval T_2, during which the grip buttons 530a, 530b continue to move, but at a rate no longer equal to or greater than the first threshold rate T_R1. In the case of the example selector 212 of FIG. 6A, during each of the first, second, and third time intervals, the displacement of the grip buttons 530a, 530b relative to the handle 530 from their neutral positions increases.
Referring to the second control function curve 820, during the first time interval T_1, when the hand-actuated selector 212 moves relative to the user input device 204 at a rate less than T_R1 (e.g., the grip buttons 530a, 530b move relative to the handle 530), the first filter function 822 increases the user input device motion filtering in correspondence with an increase in the rate of input device motion; the faster the hand-actuated selector 212 moves relative to the input device 204 (e.g., the faster the grip buttons 530a, 530b move relative to the handle 530), the less a movement of the user input device 204 affects the corresponding movement of the cursor 402, which amounts to a decrease in the movement ratio as the rate of grip button movement increases. Thus, during the first time interval T_1, the movement of the cursor 402 follows the movement of the user input device 204 according to a first dynamic movement ratio. Referring to the examples of FIGS. 6A and 11, the directions of arrows 852a, 852b indicate a reduction in displacement distance between the fingers 758, 760 and between the grip buttons 530a, 530b. The lengths of arrows 852a, 852b indicate the rate at which the grip buttons move closer together. The example input controller 118 may be configured to increase the user input device motion filtering (which correspondingly decreases the movement ratio) using the first filter function 822, e.g., as a linear, logarithmic, or exponential function of time. Thus, during the first time interval T_1, while the selector displacement rate is less than the first threshold rate T_R1, the filtering of the user input device motion increases as the selector rate increases.
During the second time interval T_2, when the hand-actuated selector 212 moves relative to the user input device 204 at a rate at or above the first threshold rate T_R1, the second filter function 824 causes a substantially constant minimum movement ratio. In an alternative example surgical system 100, when the hand-actuated selector 212 moves at a rate equal to or greater than the first threshold rate T_R1, the second filter function 824 causes the movement of the cursor 402 to stop, such that the cursor movement does not follow the movement of the user input device 204. Referring to the examples of FIGS. 6A and 11, the directions of arrows 854a, 854b indicate a reduction in displacement distance between the fingers 758, 760 and between the grip buttons 530a, 530b. Because the grip buttons 530a, 530b move at a faster rate during the second time interval T_2 than during the first time interval, the arrows 854a, 854b have a greater length than the arrows 852a, 852b. The example input controller 118 may be configured to cause the movement ratio to be zero during the second time interval; that is, the cursor 402 does not move in response to movement of the controller 204. More particularly, for example, the example input controller 118 may be configured to cause the cursor movement to stop during the second time interval T_2 (i.e., to transition to a movement ratio of 1:0). Alternatively, for example, the example input controller 118 may be configured to cause the movement ratio to decrease to 1:0.1 during the second time interval T_2. Thus, during the second time interval T_2, while the rate of movement of the grip buttons 530a, 530b is at or exceeds T_R1, the filtering of the user input device movement and the corresponding movement ratio remain constant.
The length of the second time interval T_2 is determined by the length of time during which the rate of movement of the selector 212 (e.g., grip buttons 530a, 530b) matches or exceeds the first threshold rate T_R1.
In the example surgical system 100, the click event 809 is triggered during the second time interval T_2. For example, the triggering of the click event may be controlled according to the first control function curve 720 of FIG. 7 and the control process 800 of FIG. 8.
During the third time interval T_3, when the rate of movement of the hand-actuated selector 212 has decreased to less than T_R1, the third filter function 826 causes the controller motion filtering to decrease over time, which corresponds to an increase in the movement ratio over time. Thus, during the third time interval T_3, the movement of the cursor 402 follows the movement of the user input device 204 according to a second dynamic movement ratio. Referring to the example hand-actuated selector 212 and user input device 204 of FIG. 6A, because the rate of movement of the grip buttons 530a, 530b relative to the handle 530 is greater during the second time interval T_2 than during the third time interval T_3, the arrows 856a, 856b have a shorter length than the arrows 854a, 854b. The example input controller 118 may be configured to decrease the controller motion filtering (which correspondingly increases the movement ratio) using the third filter function 826, e.g., as a linear, logarithmic, or exponential function of time. Further, the example input controller 118 may be configured to use the third filter function 826 to cause the movement ratio to increase over time until it regains, for example, a 1:1 pre-click movement ratio. Thus, during the third time interval T_3, the filtering of the user input device motion decreases over time, which means that the movement ratio increases over time.
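The three filter functions of F2 can be collapsed into a single sketch that returns the movement ratio as a function of selector speed and, after a fast squeeze, of elapsed recovery time. All constants (threshold rate, slopes, minimum ratio) are assumed values; linear forms are used here, though, as noted above, logarithmic or exponential forms are equally possible:

```python
def f2_movement_ratio(selector_speed, time_since_fast, t_r1=5.0,
                      base_ratio=1.0, min_ratio=0.0, recovery_rate=2.0,
                      slope=0.15):
    """Second transformation F2 (sketch): cursor-to-device movement ratio.

    - speed >= T_R1 (second interval): clamp to the constant minimum ratio
      (0.0 freezes the cursor; use 0.1 for the alternative 1:0.1 ratio).
    - speed < T_R1 with no fast motion yet (first interval): the ratio falls
      linearly as selector speed rises, i.e. filtering increases with speed.
    - speed < T_R1 after fast motion (third interval): the ratio recovers
      toward the pre-click value linearly in elapsed time."""
    if selector_speed >= t_r1:
        return min_ratio
    if time_since_fast is None:
        return max(min_ratio, base_ratio - slope * selector_speed)
    return min(base_ratio, min_ratio + recovery_rate * time_since_fast)
```

A caller would pass `time_since_fast=None` before any above-threshold motion has occurred, and the elapsed time since the speed last dropped below `t_r1` afterward, so the cursor thaws gradually rather than snapping back to the 1:1 ratio.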
Still referring to FIG. 11, the sequence of viewing window display instances 502a, 502b, 502c, 502d graphically represents the change in the rate of cursor movement within the viewing plane 502 at different points along the second control function curve 820. In the viewing window instances 502a, 502b, and 502d, a respective arrow 503a, 503b, 503d is associated with the cursor 402. In each respective viewing window display instance, the length of the respective arrow represents the magnitude of the movement ratio, i.e., the ratio of user input device movement to corresponding cursor movement during the display time of the respective viewing window.
Before the example click event begins, the movement ratio filter function applies a maximum movement ratio, represented by the long length of the corresponding arrow 503a in the corresponding viewing window display 502a.
During the first time interval T_1, in response to the hand-actuated selector 212 moving relative to the user input device 204 at a rate less than T_R1, the example input controller 118 is configured by the second transformation function F2 to increase the user input device motion filtering as a function of the rate of movement of the hand-actuated selector 212 relative to the user input device 204, which corresponds to decreasing the movement ratio as the rate of grip button displacement increases. The decrease in the movement ratio is represented by the shorter length of arrow 503b in the viewing window display 502b.
During the second time interval T_2, in response to the hand-actuated selector 212 moving relative to the user input device 204 at a rate equal to or greater than T_R1, the input controller 118 is configured by the second transformation function F2 to cause a constant minimum movement ratio, even though the rate of movement continues to increase. The minimum movement ratio is represented by the absence of an arrow in the viewing window display 502c.
During the third time interval T_3, in response to the hand-actuated selector 212 no longer moving relative to the user input device 204 at a rate equal to or greater than T_R1, the input controller 118 is configured by the second transformation function F2 to cause the controller motion filtering to decrease over time (which correspondingly increases the movement ratio). The increase in the movement ratio is represented by the reappearance of arrow 503d in the viewing window display 502d. Note that the shorter length of arrow 503d indicates that the movement ratio has not yet recovered to its pre-click level.
Computer hardware and storage device
FIG. 12 is an illustrative block diagram showing an example machine on which any one or more of the techniques (e.g., methods) discussed herein may be performed, according to an example embodiment. Specifically, FIG. 12 shows an illustrative diagrammatic representation of a computer system 1200 that can be used to implement, for example, the controller system 118. The computer system 1200 may be configured to implement, for example, a computerized training module. In alternative embodiments, the computer system 1200 operates as a standalone device or may be connected (e.g., networked) to other machines. In a networked deployment, the computer system 1200 may operate in the capacity of a server or a client machine in a server-client network environment, or as a peer machine in a peer-to-peer (or distributed) network environment. The computer system 1200 may be a server computer, a client computer, a personal computer (PC), a tablet PC, a personal digital assistant (PDA), a cellular telephone, or any machine capable of executing a set of instructions (sequential or otherwise) that specify actions to be taken by that machine. Furthermore, while only one machine (i.e., computer system 1200) is illustrated, the term "machine" shall also be taken to include any collection of machines that individually or jointly execute a set (or multiple sets) of instructions to perform any one or more of the methodologies discussed herein.
The example computer system 1200 includes a processor 1202 (e.g., a Central Processing Unit (CPU), a Graphics Processing Unit (GPU), or both), a main memory 1204, and a static memory 1206, which communicate with each other via a bus 1208. Computer system 1200 may further include a video display unit 1210 (e.g., a Liquid Crystal Display (LCD), an Organic Light Emitting Diode (OLED) display, a touch screen, or a Cathode Ray Tube (CRT)), for example, the video display unit 1210 may be used to display the positions of surgical instrument 124 and flexible instrument 120. Computer system 1200 also includes an alphanumeric input device 1212 (e.g., a keyboard, physical keyboard, virtual keyboard using software), a cursor control device or input sensor 1214 (e.g., a mouse, track pad, trackball, sensor or reader, machine-readable information reader, bar code reader), a disk drive unit 1216, a signal generation device 1218 (e.g., a speaker), and a network interface device or transceiver 1220.
The disk drive unit 1216 includes a non-transitory machine-readable storage medium 1222 on which is stored one or more sets of instructions 1224 (e.g., software) embodying any one or more of the methodologies or functions described herein. The instructions 1224 may also reside, completely or at least partially, within the main memory 1204, the static memory 1206, and/or within the processor 1202 during execution thereof by the computer system 1200, the main memory 1204 and the processor 1202 also constituting non-transitory machine-readable storage device media. The non-transitory machine-readable storage medium 1222 may also store an integrated circuit design and waveform structure. The instructions 1224 may further be transmitted or received over a network 1226 via a network interface device or transceiver 1220. While the machine-readable storage medium 1222 is shown in an example embodiment to be a single medium, the terms "machine-readable medium," "computer-readable medium," and similar terms should be construed to include a single medium or multiple media (e.g., a centralized or distributed database, and/or associated caches and servers) that store the one or more sets of instructions 1224. The term "machine-readable medium" shall also be taken to include any medium that is capable of storing, encoding or carrying a set of instructions for execution by the machine and that cause the machine to perform any one or more of the methodologies of the present disclosure. The term "machine-readable medium" shall accordingly include, but not be limited to, solid-state memories, optical and magnetic media, and carrier wave signals.
It will be appreciated that for clarity purposes, the above description may describe some embodiments with reference to different functional units or processors. However, it will be apparent that any suitable allocation of functionality between different functional units, processors or domains may be used without detracting from the disclosure. For example, functions illustrated as being performed by separate processors or controllers may be performed by the same processor or controllers. Thus, references to specific functional units are only to be seen as references to suitable means for providing the described functionality rather than indicative of a strict logical or physical structure or organization.
Although the present disclosure has been described in connection with some embodiments, it is not intended to be limited to the specific form set forth herein. Those skilled in the art will recognize that various features of the described embodiments may be combined in accordance with the present disclosure. Further, it will be understood that various modifications and changes may be made by those skilled in the art without departing from the scope of the disclosure.
Furthermore, in the foregoing detailed description, it can be seen that various features are grouped together in one embodiment for the purpose of streamlining the disclosure. This method of disclosure is not to be interpreted as reflecting an intention that the claimed embodiments require more features than are expressly recited in each claim. Rather, as the following claims reflect, inventive subject matter lies in less than all features of a single disclosed embodiment. Thus the following claims are hereby incorporated into the detailed description, with each claim standing on its own as a separate embodiment.
The foregoing description of the embodiments according to the invention and the accompanying drawings are only illustrative of the principles of the inventive subject matter. It will therefore be appreciated that various modifications may be made to the embodiments by those skilled in the art without departing from the scope of the inventive subject matter, which is defined in the appended claims.
Thus, while certain exemplary embodiments of the present invention have been described and shown in the accompanying drawings, it is to be understood that such embodiments are merely illustrative of and not restrictive on the broad inventive subject matter, and that these embodiments are not to be limited to the specific constructions and arrangements shown and described, since various other modifications may occur to those ordinarily skilled in the art.
Various examples
Example 1 may include a method of controlling actuation of a click event by a hand actuated selector movably mounted to a mounting structure, comprising: sensing an amount of displacement distance of the hand actuated selector from a neutral position using one or more sensors; in response to sensing that the hand actuated selector is at a displacement distance less than a first threshold distance from the neutral position, controlling one or more motors to apply a maintenance force according to a first control state; in response to sensing that the hand actuated selector is at a displacement distance between the first threshold distance from the neutral position and a second threshold distance from the neutral position, controlling the one or more motors according to a second control state to apply a haptic force to the hand actuated selector that increases as the displacement of the hand actuated selector from the neutral displacement position increases; and in response to sensing that the hand actuated selector satisfies the second threshold distance from the neutral position, applying a click event signal to cause the click event to occur at a display system, and controlling the one or more motors according to a third control state to reduce the magnitude of the haptic force applied to the hand actuated selector to a reduced magnitude that is less than a maximum magnitude of the haptic force applied during the second control state.
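The three control states of example 1 amount to a displacement-to-force mapping with a click latch at the second threshold. The following is a minimal sketch of such a controller; the function names, thresholds, stiffness, and force values (d1, d2, k_ramp, f_reduced) are illustrative assumptions of this sketch, not values from the disclosure.

```python
def haptic_force(displacement, d1, d2, f_hold=0.0, k_ramp=8.0, f_reduced=0.5):
    """Commanded motor force as a function of selector displacement.

    d1 and d2 are the first and second threshold distances (d1 < d2).
    f_hold is the maintenance force of the first control state, k_ramp the
    stiffness of the increasing ramp in the second state, and f_reduced the
    reduced post-click magnitude of the third state. All values are
    illustrative, not taken from the disclosure.
    """
    if displacement < d1:
        return f_hold                        # first control state: maintenance force
    if displacement < d2:
        return k_ramp * (displacement - d1)  # second state: force grows with displacement
    return f_reduced                         # third state: reduced magnitude after the click


def control_step(displacement, d1, d2, clicked):
    """One control tick: returns (force, clicked_latch, fire_click).

    The click event signal fires once, the first time the displacement
    satisfies the second threshold distance.
    """
    fire = displacement >= d2 and not clicked
    return haptic_force(displacement, d1, d2), clicked or fire, fire
```

With d1 = 1.0 and d2 = 2.0 the peak ramp force just below d2 is k_ramp * (d2 - d1), so choosing f_reduced below that value reproduces the "reduced magnitude less than the maximum magnitude" relationship of the third control state; example 5's asymmetric first and second rates would correspond to the ramp-up slope being shallower than the drop at d2.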
Example 2 may include the subject matter of example 1, wherein the maintenance force is zero.
Example 3 may include the subject matter of example 1, wherein the maintenance force is less than a haptic force applied by the one or more motors according to the second control state.
Example 4 may include the subject matter of example 1, wherein controlling the one or more motors according to the third control state includes controlling the one or more motors to apply a haptic force to the hand actuated selector that decreases in magnitude to the reduced magnitude as a displacement of the hand actuated selector from the neutral displacement position increases.
Example 5 may include the subject matter of example 4, wherein controlling the one or more motors according to the second control state includes controlling the one or more motors to increase the magnitude of the haptic force at a first rate; and wherein controlling the one or more motors according to the third control state comprises controlling the one or more motors to reduce the magnitude of the haptic force at a second rate; and wherein the magnitude of the first rate is less than the magnitude of the second rate.
Example 6 may include the subject matter of example 4, wherein controlling the one or more motors according to the third control state includes controlling the one or more motors to apply a haptic force to the hand actuated selector that decreases in magnitude to the reduced magnitude as a displacement of the hand actuated selector from the neutral displacement position increases.
Example 7 may include the subject matter of example 4, wherein controlling the one or more motors according to the third control state includes controlling the one or more motors to apply a haptic force to the hand actuated selector that decreases momentarily to the reduced magnitude.
Example 8 may include the subject matter of example 1, further comprising: using the one or more motors to cause the hand actuated selector to move to the neutral position.
Example 9 may include the subject matter of example 1, further comprising: using a resilient member to cause the hand actuated selector to move to the neutral position.
Example 10 may include the subject matter of example 1, further comprising: using a stop surface to stop movement of the hand actuated selector.
Example 11 may include the subject matter of example 1, further comprising: using a stop surface at the mounting structure to stop movement of the hand actuated selector.
Example 12 may include the subject matter of example 1, further comprising: stopping further displacement of the hand actuated selector, using a stop surface positioned to apply a reaction force to the hand actuated selector, when the hand actuated selector reaches a third threshold displacement distance from the neutral position.
Example 13 may include the subject matter of example 4, further comprising: controlling the one or more motors according to the third control state to apply the reduced magnitude of haptic force to the hand actuated selector at displacement distances between the displacement distance at which the reduced magnitude is first applied to the hand actuated selector and the third threshold distance.
Example 14 may include the subject matter of example 1, wherein the one or more sensors are configured to sense a displacement of the hand actuated selector relative to the mounting structure.
Example 15 may include the subject matter of example 1, wherein the one or more sensors are used to sense an amount of displacement of the hand actuated selector from a neutral position during movement of the hand actuated selector.
Example 16 may include an apparatus to control actuation of a click event by a hand actuated selector movably mounted to a mounting structure, comprising: one or more sensors configured to sense a position of the hand actuated selector; one or more motors configured to apply a haptic force to the hand actuated selector; processing circuitry; and a memory system storing instructions that, when executed by the processing circuitry, cause the processing circuitry to perform operations comprising: sensing an amount of displacement of the hand actuated selector from a neutral position using the one or more sensors; in response to sensing that the hand actuated selector is at a displacement distance less than a first threshold distance from the neutral position, controlling the one or more motors to apply a maintenance force according to a first control state; in response to sensing that the hand actuated selector is at a displacement distance between the first threshold distance from the neutral position and a second threshold distance from the neutral position, controlling the one or more motors according to a second control state to apply a haptic force to the hand actuated selector that increases as the displacement of the hand actuated selector from the neutral displacement position increases; and in response to sensing that the hand actuated selector satisfies the second threshold distance from the neutral position, applying a click event signal to cause the click event to occur at a display system, and controlling the one or more motors according to a third control state to reduce the magnitude of the haptic force applied to the hand actuated selector to a reduced magnitude that is less than a maximum magnitude of the haptic force applied during the second control state.
Example 17 may include the subject matter of example 16, wherein the maintenance force is zero.
Example 18 may include the subject matter of example 16, wherein the maintenance force is less than a haptic force applied by the one or more motors according to the second control state.
Example 19 may include the subject matter of example 16, wherein the act of controlling the one or more motors according to the third control state occurs after the act of applying the click event signal.
Example 20 may include the subject matter of example 16, wherein controlling the one or more motors according to the third control state includes controlling the one or more motors to apply a haptic force to the hand actuated selector that decreases in magnitude to the reduced magnitude as a displacement of the hand actuated selector from the neutral displacement position increases.
Example 21 may include the subject matter of example 20, wherein controlling the one or more motors according to the second control state includes controlling the one or more motors to increase the magnitude of the haptic force at a first rate; and wherein controlling the one or more motors according to the third control state comprises controlling the one or more motors to reduce the magnitude of the haptic force at a second rate; and wherein the magnitude of the first rate is less than the magnitude of the second rate.
Example 22 may include the subject matter of example 20, wherein controlling the one or more motors according to the third control state includes controlling the one or more motors to apply a haptic force to the hand actuated selector that decreases in magnitude to the reduced magnitude as a displacement of the hand actuated selector from the neutral displacement position increases.
Example 23 may include the subject matter of example 20, wherein controlling the one or more motors according to the third control state includes controlling the one or more motors to apply a haptic force to the hand actuated selector that decreases momentarily to the reduced magnitude.
Example 24 may include the subject matter of example 16, further comprising: instructions that, when executed, cause the processing circuitry to perform operations comprising: controlling the one or more motors to cause the hand actuated selector to move to the neutral position.
Example 25 may include the subject matter of example 16, further comprising: a resilient member configured to cause the hand actuated selector to move to the neutral position.
Example 26 may include the subject matter of example 16, further comprising: a stop surface configured to stop movement of the hand actuated selector when the hand actuated selector reaches a third threshold displacement distance from the neutral position.
Example 27 may include a method of controlling movement of a cursor in a first two-dimensional (2D) plane based on movement of a user input device in a second 2D plane and based on movement of a hand actuated selector movably mounted to the user input device, comprising: when the hand actuated selector moves relative to the user input device at a rate less than a first threshold rate, causing movement of the cursor within the first 2D plane to follow movement of the user input device within the second 2D plane according to a first constant rate of movement; in response to the rate of movement of the hand actuated selector relative to the user input device being between the first threshold rate and a second threshold rate, causing movement of the cursor within the first 2D plane to follow movement of the user input device within the second 2D plane according to a rate of movement that decreases as the rate of movement of the hand actuated selector relative to the user input device increases; and in response to the rate of movement of the hand actuated selector relative to the user input device decreasing below the second threshold rate, causing movement of the cursor within the first 2D plane to follow movement of the user input device within the second 2D plane according to a rate of movement that increases as the rate of movement of the hand actuated selector relative to the user input device decreases.
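Examples 27 through 29 amount to a velocity-dependent gain between user-input-device motion and cursor motion: full gain below the first threshold rate, a gain that falls as the selector speeds up between the thresholds, and a gain that recovers as the selector slows back down. A sketch under those assumptions follows; the linear interpolation and the names (r1, r2, g_full, g_min) are choices of this sketch, not of the disclosure.

```python
def cursor_gain(selector_rate, r1, r2, g_full=1.0, g_min=0.0):
    """Scale factor applied to user input device motion to move the cursor.

    selector_rate is the speed of the hand actuated selector relative to
    the user input device; r1 < r2 are the first and second threshold
    rates. Linear interpolation between the thresholds is an illustrative
    choice; the disclosure only requires the gain to decrease as the
    selector rate increases and to increase again as it decreases.
    """
    if selector_rate <= r1:
        return g_full               # constant-rate following below the first threshold
    if selector_rate >= r2:
        return g_min                # cursor stops (or crawls) during a fast press
    t = (selector_rate - r1) / (r2 - r1)
    return g_full + t * (g_min - g_full)  # falls toward g_min, recovers as rate drops


def cursor_step(cursor_xy, device_delta_xy, selector_rate, r1, r2):
    """Move the cursor in the first 2D plane by the scaled device motion."""
    g = cursor_gain(selector_rate, r1, r2)
    return (cursor_xy[0] + g * device_delta_xy[0],
            cursor_xy[1] + g * device_delta_xy[1])
```

Setting g_min to a small nonzero value corresponds to example 28's "second constant movement rate", while g_min = 0.0 corresponds to example 29's cease-following behavior; suppressing cursor motion during a fast selector press avoids unintended cursor drift while clicking.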
Example 28 may include the subject matter of example 27, further comprising: in response to the hand actuated selector moving relative to the user input device at a rate greater than the second threshold rate, causing movement of the cursor within the first 2D plane to follow movement of the user input device within the second 2D plane according to a second constant movement rate that is less than the first constant movement rate.
Example 29 may include the subject matter of example 27, further comprising: in response to the hand actuated selector moving relative to the user input device at a rate greater than the second threshold rate, causing movement of the cursor within the first 2D plane to cease following movement of the user input device within the second 2D plane.
Example 30 may include an apparatus to control movement of a cursor within a first two-dimensional (2D) image display plane in a display system based on movement of a user input device within a second 2D tactile plane and based on movement of a hand actuated selector movably mounted to the user input device, comprising: one or more sensors configured to sense movement of the hand actuated selector; processing circuitry; and a memory system storing instructions that, when executed by the processing circuitry, cause the processing circuitry to perform operations comprising: when the hand actuated selector moves relative to the user input device at a rate less than a first threshold rate, causing movement of the cursor within the first 2D plane to follow movement of the user input device within the second 2D plane according to a first constant rate of movement; in response to the rate of movement of the hand actuated selector relative to the user input device being between the first threshold rate and a second threshold rate, causing movement of the cursor within the first 2D plane to follow movement of the user input device within the second 2D plane according to a rate of movement that decreases as the rate of movement of the hand actuated selector relative to the user input device increases; and in response to the rate of movement of the hand actuated selector relative to the user input device decreasing below the second threshold rate, causing movement of the cursor within the first 2D plane to follow movement of the user input device within the second 2D plane according to a rate of movement that increases as the rate of movement of the hand actuated selector relative to the user input device decreases.
Example 31 may include the subject matter of example 30, further comprising: instructions that, when executed, cause the processing circuitry to perform operations comprising: in response to the hand actuated selector moving relative to the user input device at a rate greater than the second threshold rate, causing movement of the cursor within the first 2D plane to follow movement of the user input device within the second 2D plane according to a second constant movement rate that is less than the first constant movement rate.
Example 32 may include the subject matter of example 30, further comprising: instructions that, when executed, cause the processing circuitry to perform operations comprising: in response to the hand actuated selector moving relative to the user input device at a rate greater than the second threshold rate, causing movement of the cursor within the first 2D plane to cease following movement of the user input device within the second 2D plane.

Claims (32)

1. A method of controlling actuation of a click event by a hand actuated selector movably mounted to a mounting structure, comprising:
sensing an amount of displacement distance of the hand actuated selector from a neutral position using one or more sensors;
in response to sensing that the hand actuated selector is at a displacement distance less than a first threshold distance from the neutral position, controlling one or more motors to apply a maintenance force according to a first control state;
in response to sensing that the hand actuated selector is at a displacement distance between the first threshold distance from the neutral position and a second threshold distance from the neutral position, controlling the one or more motors according to a second control state to apply a haptic force to the hand actuated selector that increases as the displacement of the hand actuated selector from the neutral displacement position increases; and
in response to sensing that the hand actuated selector meets the second threshold distance from the neutral position,
applying a click event signal to cause the click event to occur at a display system, and
controlling the one or more motors according to a third control state to reduce the magnitude of the haptic force applied to the hand actuated selector to a reduced magnitude that is less than a maximum magnitude of the haptic force applied during the second control state.
2. The method according to claim 1,
wherein the maintenance force is zero.
3. The method according to claim 1,
wherein the maintenance force is less than a haptic force applied by the one or more motors in accordance with the second control state.
4. The method according to claim 1,
wherein controlling the one or more motors according to the third control state includes controlling the one or more motors to apply a haptic force to the hand actuated selector that decreases in magnitude to the reduced magnitude as the displacement of the hand actuated selector from the neutral displacement position increases.
5. The method according to claim 4, wherein the method comprises,
wherein controlling the one or more motors according to the second control state includes controlling the one or more motors to increase the magnitude of the haptic force at a first rate;
wherein controlling the one or more motors according to the third control state includes controlling the one or more motors to reduce the magnitude of the haptic force at a second rate; and
wherein the magnitude of the first rate is less than the magnitude of the second rate.
6. The method according to claim 4, wherein the method comprises,
wherein controlling the one or more motors according to the third control state includes controlling the one or more motors to apply a haptic force to the hand actuated selector that decreases in magnitude to the reduced magnitude as the displacement of the hand actuated selector from the neutral displacement position increases.
7. The method according to claim 4, wherein the method comprises,
wherein controlling the one or more motors according to the third control state includes controlling the one or more motors to apply a haptic force to the hand actuated selector that momentarily decreases to the reduced magnitude.
8. The method of claim 1, further comprising:
using the one or more motors to cause the hand actuated selector to move to the neutral position.
9. The method of claim 1, further comprising:
using a resilient member to cause the hand actuated selector to move to the neutral position.
10. The method of claim 1, further comprising:
using a stop surface to stop movement of the hand actuated selector.
11. The method of claim 1, further comprising:
using a stop surface at the mounting structure to stop movement of the hand actuated selector.
12. The method of claim 1, further comprising:
when the hand actuated selector reaches the third threshold displacement distance from the neutral position, further displacement of the hand actuated selector is stopped using a stop surface positioned to apply a reaction force to the hand actuated selector.
13. The method of claim 4, further comprising:
controlling the one or more motors according to the third control state to apply the reduced magnitude of haptic force to the hand actuated selector at displacement distances between the displacement distance at which the reduced magnitude is first applied to the hand actuated selector and the third threshold distance.
14. The method according to claim 1,
wherein the one or more sensors are configured to sense displacement of the hand actuated selector relative to the mounting structure.
15. The method according to claim 1,
wherein the one or more sensors are used to sense an amount of displacement of the hand actuated selector from a neutral position during movement of the hand actuated selector.
16. An apparatus for controlling actuation of a click event by a hand actuated selector movably mounted to a mounting structure, comprising:
one or more sensors configured to sense a position of the hand actuated selector;
one or more motors configured to apply a haptic force to the hand actuated selector;
processing circuitry; and
a memory system storing instructions that, when executed by the processing circuitry, cause the processing circuitry to perform operations comprising:
sensing an amount of displacement of the hand actuated selector from a neutral position using the one or more sensors;
in response to sensing that the hand actuated selector is at a displacement distance less than a first threshold distance from the neutral position, controlling the one or more motors to apply a maintenance force according to a first control state;
in response to sensing that the hand actuated selector is at a displacement distance between the first threshold distance from the neutral position and a second threshold distance from the neutral position, controlling the one or more motors according to a second control state to apply a haptic force to the hand actuated selector that increases as the displacement of the hand actuated selector from the neutral displacement position increases; and
in response to sensing that the hand actuated selector meets the second threshold distance from the neutral position,
applying a click event signal to cause the click event to occur at a display system, and
controlling the one or more motors according to a third control state to reduce the magnitude of the haptic force applied to the hand actuated selector to a reduced magnitude that is less than a maximum magnitude of the haptic force applied during the second control state.
17. The apparatus according to claim 16,
wherein the maintenance force is zero.
18. The apparatus according to claim 16,
wherein the maintenance force is less than a haptic force applied by the one or more motors according to the second control state.
19. The apparatus according to claim 16,
wherein the act of controlling the one or more motors in accordance with the third control state occurs after the act of applying the click event signal.
20. The apparatus according to claim 16,
wherein controlling the one or more motors according to the third control state includes controlling the one or more motors to apply a haptic force to the hand actuated selector that decreases in magnitude to the reduced magnitude as the displacement of the hand actuated selector from the neutral displacement position increases.
21. The apparatus according to claim 20,
wherein controlling the one or more motors according to the second control state includes controlling the one or more motors to increase the magnitude of the haptic force at a first rate;
wherein controlling the one or more motors according to the third control state includes controlling the one or more motors to reduce the magnitude of the haptic force at a second rate; and
wherein the magnitude of the first rate is less than the magnitude of the second rate.
22. The apparatus according to claim 20,
wherein controlling the one or more motors according to the third control state includes controlling the one or more motors to apply a haptic force to the hand actuated selector that decreases in magnitude to the reduced magnitude as the displacement of the hand actuated selector from the neutral displacement position increases.
23. The apparatus according to claim 16,
wherein controlling the one or more motors according to the third control state includes controlling the one or more motors to apply a haptic force to the hand actuated selector that momentarily decreases to the reduced magnitude.
24. The apparatus of claim 16, further comprising:
instructions that, when executed, cause the processing circuitry to perform operations comprising:
the one or more motors are controlled to cause the hand actuated selector to move to the neutral position.
25. The apparatus of claim 16, further comprising:
a resilient member configured to cause the hand actuated selector to move to the neutral position.
26. The apparatus of claim 16, further comprising:
a stop surface configured to stop movement of the hand actuated selector when the hand actuated selector reaches a third threshold displacement distance from the neutral position.
27. A method of controlling movement of a cursor in a first two-dimensional plane, a first 2D plane, based on movement of a user input device in a second two-dimensional plane, a second 2D plane, and based on movement of a hand actuated selector movably mounted to the user input device, comprising:
causing movement of the cursor within the first 2D plane to follow movement of the user input device within the second 2D plane according to a first constant rate of movement while the hand actuated selector is moving relative to the user input device at a rate less than a first threshold rate;
in response to the rate of movement of the hand actuated selector relative to the user input device being between the first threshold rate and a second threshold rate, causing movement of the cursor within the first 2D plane to follow movement of the user input device within the second 2D plane according to a rate of movement that decreases as the rate of movement of the hand actuated selector relative to the user input device increases; and
in response to the rate of movement of the hand actuated selector relative to the user input device decreasing below the second threshold rate, causing movement of the cursor within the first 2D plane to follow movement of the user input device within the second 2D plane according to a rate of movement that increases as the rate of movement of the hand actuated selector relative to the user input device decreases.
28. The method of claim 27, further comprising:
in response to the hand actuated selector moving relative to the user input device at a rate greater than the second threshold rate, causing movement of the cursor within the first 2D plane to follow movement of the user input device within the second 2D plane according to a second constant movement rate that is less than the first constant movement rate.
29. The method of claim 28, further comprising:
in response to the hand actuated selector moving relative to the user input device at a rate greater than the second threshold rate, causing movement of the cursor within the first 2D plane to cease following movement of the user input device within the second 2D plane.
30. An apparatus for controlling movement of a cursor within a first two-dimensional image display plane, a first 2D image display plane, in a display system based on movement of a user input device within a second two-dimensional haptic plane, a second 2D haptic plane, and based on movement of a hand actuated selector movably mounted to the user input device, comprising:
one or more sensors configured to sense movement of the hand actuated selector;
processing circuitry; and
a memory system storing instructions that, when executed by the processing circuitry, cause the processing circuitry to perform operations comprising:
when the hand actuated selector moves relative to the user input device at a rate less than a first threshold rate, causing movement of the cursor within the first 2D plane to follow movement of the user input device within the second 2D plane according to a first constant rate of movement;
in response to the rate of movement of the hand actuated selector relative to the user input device being between the first threshold rate and a second threshold rate, causing movement of the cursor within the first 2D plane to follow movement of the user input device within the second 2D plane according to a rate of movement that decreases as the rate of movement of the hand actuated selector relative to the user input device increases; and
in response to the rate of movement of the hand actuated selector relative to the user input device decreasing below the second threshold rate, causing movement of the cursor within the first 2D plane to follow movement of the user input device within the second 2D plane according to a rate of movement that increases as the rate of movement of the hand actuated selector relative to the user input device decreases.
31. The apparatus of claim 30, further comprising:
instructions that, when executed, cause the processing circuitry to perform operations comprising:
in response to the hand actuated selector moving relative to the user input device at a rate greater than the second threshold rate, causing movement of the cursor within the first 2D plane to follow movement of the user input device within the second 2D plane according to a second constant movement rate that is less than the first constant movement rate.
32. The apparatus of claim 30, further comprising:
instructions that, when executed, cause the processing circuitry to perform operations comprising:
in response to the hand actuated selector moving relative to the user input device at a rate greater than the second threshold rate, causing movement of the cursor within the first 2D plane to cease following movement of the user input device within the second 2D plane.
CN202180089910.XA 2020-12-01 2021-11-22 Interaction between user interface and master controller Pending CN116761568A (en)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US63/120,202 2020-12-01
US202163187879P 2021-05-12 2021-05-12
US63/187,879 2021-05-12
PCT/US2021/060400 WO2022119740A1 (en) 2020-12-01 2021-11-22 Interaction between user-interface and master controller

Publications (1)

Publication Number Publication Date
CN116761568A true CN116761568A (en) 2023-09-15

Family

ID=87952007

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202180089910.XA Pending CN116761568A (en) 2020-12-01 2021-11-22 Interaction between user interface and master controller

Country Status (1)

Country Link
CN (1) CN116761568A (en)


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination