CN110236693B - System and method for off-screen indication of instruments in teleoperated medical systems - Google Patents



Publication number
CN110236693B
CN110236693B
Authority
CN
China
Prior art keywords
instrument
instrument tip
error
imaging
view
Prior art date
Legal status
Active
Application number
CN201910406219.4A
Other languages
Chinese (zh)
Other versions
CN110236693A
Inventor
B. D. Itkowitz
B. D. Hoffman
P. W. Mohr
Current Assignee
Intuitive Surgical Operations Inc
Original Assignee
Intuitive Surgical Operations Inc
Priority date
Filing date
Publication date
Application filed by Intuitive Surgical Operations Inc filed Critical Intuitive Surgical Operations Inc
Publication of CN110236693A publication Critical patent/CN110236693A/en
Application granted granted Critical
Publication of CN110236693B publication Critical patent/CN110236693B/en

Classifications

    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B - DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00 - Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/36 - Image-producing devices or illumination devices not otherwise provided for
    • A61B90/37 - Surgical systems with images on a monitor during operation
    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B - DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00 - Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/30 - Surgical robots
    • A61B34/35 - Surgical robots for telesurgery
    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B - DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00 - Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/30 - Surgical robots
    • A61B34/37 - Master-slave robots
    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B - DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B17/00 - Surgical instruments, devices or methods, e.g. tourniquets
    • A61B2017/00017 - Electrical control of surgical instruments
    • A61B2017/00115 - Electrical control of surgical instruments with audible or visual output
    • A61B2017/00119 - Electrical control of surgical instruments with audible or visual output alarm; indicating an abnormal situation
    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B - DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00 - Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/08 - Accessories or related features not otherwise provided for
    • A61B2090/0807 - Indication means
    • A61B2090/0811 - Indication means for the position of a particular part of an instrument with respect to the rest of the instrument, e.g. position of the anvil of a stapling instrument
    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B - DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00 - Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/36 - Image-producing devices or illumination devices not otherwise provided for
    • A61B90/37 - Surgical systems with images on a monitor during operation
    • A61B2090/371 - Surgical systems with images on a monitor during operation with simultaneous use of two cameras

Landscapes

  • Health & Medical Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Surgery (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Robotics (AREA)
  • Medical Informatics (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Biomedical Technology (AREA)
  • Molecular Biology (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Gynecology & Obstetrics (AREA)
  • Radiology & Medical Imaging (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Pathology (AREA)
  • Endoscopes (AREA)

Abstract

The present application relates to systems and methods for off-screen indication of instruments in teleoperated medical systems. A medical imaging system includes a teleoperated assembly configured to control movement of a medical instrument including an instrument tip and a processing unit including one or more processors. The processing unit is configured to determine an instrument tip position and to determine a position error associated with the instrument tip position. The processing unit is further configured to determine at least one instrument tip bounding volume based on the positional error and determine whether the instrument tip is within a field of view of the imaging instrument.

Description

System and method for off-screen indication of instruments in teleoperated medical systems
The present application is a divisional application of Chinese patent application 201580025502.2 (PCT/US2015/021109), entitled "System and Method for Off-Screen Indication of Instruments in a Teleoperated Medical System," filed on March 17, 2015.
Priority
This patent application claims priority to and the benefit of the filing date of U.S. Provisional Patent Application No. 61/954,442, entitled "Systems and Methods for Off-Screen Indication of Instruments in a Teleoperational Medical System," filed March 17, 2014, the entire contents of which are incorporated herein by reference.
Technical Field
The present disclosure relates to systems and methods for performing teleoperated medical procedures, and more particularly to systems and methods for providing an indication of the position of a teleoperated instrument located outside the field of view of an endoscope.
Background
Minimally invasive medical techniques attempt to reduce the amount of tissue damaged during invasive medical procedures, thereby reducing patient recovery time, discomfort, and harmful side effects. Such minimally invasive techniques may be performed through natural orifices in a patient anatomy or through one or more surgical incisions. Through these natural orifices or incisions, clinicians may insert medical tools to reach a target tissue site. Minimally invasive medical tools include instruments such as therapeutic instruments, diagnostic instruments, and surgical instruments. Minimally invasive medical tools may also include imaging instruments such as endoscopic instruments. The imaging instruments provide the user with a field of view within the patient anatomy. Some minimally invasive medical tools and imaging instruments may be teleoperated or otherwise computer-assisted. In a teleoperated medical system, an instrument may be controlled without being visible to the user in the field of view provided by the imaging instrument. Inadvertent movement of an instrument out of the field of view can pose a safety risk. In addition, the clinician may lose track of instruments located outside of the field of view. Systems and methods are needed to provide the clinician with an indication of the position of instruments outside of the field of view while minimizing instances of false indications.
Disclosure of Invention
Embodiments of the invention are summarized by the claims that follow.
In one embodiment, a medical imaging system includes a teleoperated assembly configured to control movement of a medical instrument including an instrument tip and a processing unit including one or more processors. The processing unit is configured to determine an instrument tip position and determine a position error associated with the instrument tip position. The processing unit is further configured to determine at least one instrument tip bounding volume based on the positional error and determine whether the instrument tip is within a field of view of the imaging instrument.
In another embodiment, a method of imaging includes determining an instrument tip position of a medical instrument controlled by a teleoperated assembly and determining a position error associated with the instrument tip position. The method further includes determining at least one instrument tip bounding volume based on the position error and determining whether the instrument tip is within a field of view of the imaging instrument.
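The in-view test described by these embodiments can be roughly illustrated as follows: grow the instrument tip into a bounding sphere whose radius equals the position error, then test that sphere against the imaging instrument's view frustum. The conical frustum, the cone half-angle, the near/far clip distances, and the spherical error bound are illustrative assumptions for this sketch, and `tip_in_view` is a hypothetical name; the patent's own computation is developed later with reference to Figs. 6-10.

```python
import numpy as np

def tip_in_view(tip_pos, pos_error, half_angle_rad, near, far):
    """Test whether a tip's error-bounding sphere intersects a conical
    view frustum whose apex is at the camera origin and whose axis is +Z
    (endoscope coordinates).

    tip_pos:        (x, y, z) tip position in endoscope space, meters
    pos_error:      radius of the assumed spherical position-error bound
    half_angle_rad: half-angle of the assumed conical field of view
    near, far:      assumed clip distances along the optical axis
    """
    x, y, z = tip_pos
    # Axial containment, padded by the error radius.
    if z + pos_error < near or z - pos_error > far:
        return False
    # Radial distance from the optical axis vs. the cone radius at depth z.
    radial = np.hypot(x, y)
    cone_radius = z * np.tan(half_angle_rad)
    return bool(radial - pos_error <= cone_radius)
```

Padding the test by the position error is what suppresses false out-of-view indications: a tip is only declared off-screen when its entire bounding volume falls outside the field of view.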
Drawings
Aspects of the present disclosure are best understood from the following detailed description when read with the accompanying drawing figures. It is emphasized that, in accordance with the standard practice in the industry, various features are not drawn to scale. In fact, the dimensions of the various features may be arbitrarily increased or decreased for clarity of discussion. Moreover, the present disclosure may repeat reference numerals and/or letters in the various examples. This repetition is for the purpose of simplicity and clarity and does not in itself dictate a relationship between the various embodiments and/or configurations discussed.
Fig. 1A is a schematic diagram of a teleoperated medical system according to an embodiment of the present disclosure.
FIG. 1B is a perspective view of a surgeon's console for a teleoperated medical system according to many embodiments.
FIG. 1C is a perspective view of a teleoperated medical system electronics cart, in accordance with many embodiments.
Fig. 1D is a perspective view of a patient side cart according to one example of principles described herein.
Fig. 2A illustrates the off-axis endoscopic imaging instrument in a first orientation.
Fig. 2B illustrates the off-axis endoscopic imaging instrument of fig. 2A in a second orientation.
Fig. 2C illustrates the distal end of the off-axis endoscopic imaging instrument of fig. 2A.
Fig. 3 illustrates a surgical workspace around the field of view of the imaging instrument.
Fig. 4A illustrates a field of view of a surgical workspace with two instruments in the field of view.
Fig. 4B-4D illustrate additional views of the surgical workspace of fig. 4A with one instrument out of view.
FIG. 4E illustrates another view of the surgical workspace of FIG. 4A with both instruments out of view.
FIG. 4F illustrates another view of the surgical workspace of FIG. 4A with an instrument located near a boundary of the view.
FIG. 5 illustrates a surgical workspace that surrounds the imaging instrument field of view and includes instruments with error bounding boxes.
FIG. 6 is a method for determining whether an instrument is out of view.
FIG. 7 is a method for assessing instrument tip position.
Fig. 8 is a method for estimating instrument tip position error in the imaging instrument coordinate space.
FIG. 9 is a method for determining an instrument tip bounding box in screen coordinate space.
Fig. 10 is a method for determining whether an instrument tip is outside the field of view of an imaging instrument.
FIG. 11 is a deviation curve used in the method of FIG. 6.
Figs. 12-15 illustrate the effect of the distance of the instrument tip from the imaging instrument in determining whether the instrument tip is outside the field of view of the imaging instrument.
Detailed Description
For the purposes of promoting an understanding of the principles of the disclosure, reference will now be made to the embodiments illustrated in the drawings and specific language will be used to describe the same. Nevertheless, it will be understood that no limitation of the scope of the disclosure is intended. In the following detailed description of various aspects of the invention, numerous specific details are set forth in order to provide a thorough understanding of the disclosed embodiments. However, it will be apparent to one skilled in the art that embodiments of the present disclosure may be practiced without these specific details. In other instances, well-known methods, procedures, components, and circuits have not been described in detail as not to unnecessarily obscure aspects of the embodiments of the present invention.
Any alterations and further modifications in the described devices, apparatus, and methods, and any further applications of the principles of the disclosure are contemplated as would normally occur to one skilled in the art to which the disclosure relates. In particular, it is fully contemplated that the features, components, and/or steps described in connection with one embodiment may be combined with the features, components, and/or steps described in connection with other embodiments of the disclosure. In addition, the dimensions provided herein are for specific examples, and it is contemplated that different sizes, dimensions, and/or proportions may be used to implement the concepts of the present disclosure. To avoid unnecessary repetition, one or more components or actions described with reference to one illustrative embodiment can be used or omitted when applicable to other illustrative embodiments. For the sake of brevity, these combinations will not be described separately. For simplicity, in some instances, the same reference numbers are used throughout the drawings to refer to the same or like parts.
The following embodiments will describe various instruments and portions of instruments in terms of their state in three-dimensional space. As used herein, the term "position" refers to the location of an object or a portion of an object in three-dimensional space (e.g., three translational degrees of freedom along Cartesian X, Y, Z coordinates). As used herein, the term "orientation" refers to the rotational placement of an object or a portion of an object (three rotational degrees of freedom, e.g., roll, pitch, and yaw). As used herein, the term "pose" refers to the position of an object or a portion of an object in at least one translational degree of freedom and to the orientation of that object or portion of the object in at least one rotational degree of freedom (up to six total degrees of freedom). As used herein, the term "shape" refers to a set of poses, positions, or orientations measured along an object.
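The terms defined above can be captured in a small data structure; this is a generic sketch for the reader's orientation, not a structure taken from the patent.

```python
from dataclasses import dataclass
from typing import Tuple

@dataclass
class Pose:
    """A pose: position (3 translational DOF) plus orientation (3 rotational
    DOF), i.e., up to six total degrees of freedom."""
    position: Tuple[float, float, float]      # X, Y, Z (e.g., meters)
    orientation: Tuple[float, float, float]   # roll, pitch, yaw (radians)

# A "shape" is then a set of poses measured along the object, e.g., points
# sampled every 10 mm along a straight instrument shaft:
shape = [Pose((0.0, 0.0, 0.01 * i), (0.0, 0.0, 0.0)) for i in range(5)]
```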
Referring to FIG. 1A of the drawings, a teleoperated medical system for use in, for example, medical procedures including diagnostic, therapeutic, or surgical procedures, is generally indicated by the reference numeral 10. As will be described, the teleoperated medical systems of this disclosure are under the teleoperational control of a surgeon. In an alternative embodiment, the teleoperated medical system may be under the partial control of a computer programmed to execute a program or subroutine. In other alternative embodiments, a fully automated medical system, under the full control of a computer programmed to perform the procedure or sub-procedure, may be used to perform procedures or sub-procedures. As shown in FIG. 1A, the teleoperated medical system 10 generally includes a teleoperated assembly 12 mounted to or near an operating table O on which a patient P is placed. The teleoperated assembly 12 may be referred to as a patient side cart. A medical instrument system 14 and an endoscopic imaging system 15 are operably coupled to the teleoperated assembly 12. An operator input system 16 allows a surgeon or other clinician S to view images of or representing the surgical site and to control the operation of the medical instrument system 14 and/or the endoscopic imaging system 15.
Operator input system 16 may be located at a surgeon's console, which is typically located in the same room as operating table O. However, it should be understood that the surgeon S may be in a different room or a completely different building from the patient P. Operator input system 16 generally includes one or more control devices for controlling the medical instrument system 14. The control devices may include one or more of any number of various input devices, such as hand grips, joysticks, trackballs, data gloves, trigger guns, manual controllers, voice recognition devices, touch screens, body motion or presence sensors, and the like. In some embodiments, the control devices will be provided with the same degrees of freedom as the medical instruments of the teleoperated assembly to provide the surgeon with telepresence, the perception that the control devices are integral with the instruments, so that the surgeon has a strong sense of directly controlling the instruments as if present at the surgical site. In other embodiments, the control devices may have more or fewer degrees of freedom than the associated medical instruments and still provide the surgeon with telepresence. In some embodiments, the control devices are manual input devices that move with six degrees of freedom and that may also include an actuatable handle for actuating the instruments (e.g., for closing grasping jaws, applying an electrical potential to an electrode, delivering a drug therapy, etc.).
The teleoperated assembly 12 supports and manipulates the medical instrument system 14 while the surgeon S views the surgical site through the console 16. Images of the surgical site can be obtained by the endoscopic imaging system 15, such as a stereoscopic imaging endoscope, which can be manipulated by the teleoperated assembly 12 to orient the endoscope 15. The electronics cart 18 can be used to process images of the surgical site for subsequent display to the surgeon S via the surgeon's console 16. The number of medical instrument systems 14 used at one time generally depends on the diagnostic or surgical procedure and on space constraints within the operating room, among other factors. The teleoperated assembly 12 may include one or more non-servo controlled kinematic linkage structures (e.g., one or more linkages that may be manually positioned and locked in place, commonly referred to as a set-up structure) and teleoperated manipulators. The teleoperated assembly 12 includes a plurality of motors that drive inputs on the medical instrument system 14. These motors move in response to commands from a control system (e.g., control system 20). The motors include drive systems that, when coupled to the medical instrument system 14, can advance the medical instrument into a naturally or surgically created anatomical orifice. Other motorized drive systems may move the distal end of the medical instrument in multiple degrees of freedom, which may include three degrees of linear motion (e.g., linear motion along the X, Y, Z Cartesian axes) and three degrees of rotational motion (e.g., rotation about the X, Y, Z Cartesian axes). Additionally, the motors may be used to actuate an articulated end effector of the instrument, for example, for grasping tissue in the jaws of a biopsy device or the like.
Teleoperated medical system 10 also includes a control system 20. Control system 20 includes at least one memory and at least one processor (not shown), and typically a plurality of processors, for effecting control among the medical instrument system 14, the operator input system 16, and the electronics cart 18. The control system 20 also includes programming instructions (e.g., instructions stored on a computer-readable medium) to implement some or all of the methods described in accordance with the aspects disclosed herein. Although the control system 20 is shown as a single block in the simplified illustration of FIG. 1A, the system may include two or more data processing circuits, with a portion of the processing optionally implemented on or near the teleoperated assembly 12, another portion implemented at the operator input system 16, and the like. Any of a wide variety of centralized or distributed data processing architectures may be employed. Likewise, the programming instructions may be implemented as a number of separate programs or subroutines, or they may be integrated into a number of other aspects of the teleoperated systems described herein. In one embodiment, control system 20 supports wireless communication protocols such as Bluetooth, Infrared Data Association (IrDA), HomeRF, IEEE 802.11, DECT, and wireless telemetry.
In some embodiments, control system 20 may include one or more servo controllers that receive force and/or torque feedback from medical instrument system 14. In response to the feedback, the servo controller transmits a signal to the operator input system 16. The servo controller(s) may also transmit signals that instruct the teleoperated assembly 12 to move the medical instrument system(s) 14 and/or the endoscopic imaging system 15 through an opening in the body to an internal surgical site within the patient's body. Any suitable conventional or dedicated servo controller may be used. The servo controller may be separate from or integral with the teleoperated assembly 12. In some embodiments, the servo controller and teleoperated assembly may be provided as part of a teleoperated arm cart located near the patient's body.
Teleoperated medical system 10 may further include optional operation and support systems (not shown), such as illumination systems, steering control systems, irrigation systems, and/or suction systems. In alternative embodiments, the teleoperated system may include more than one teleoperated assembly and/or more than one operator input system. The exact number of manipulator assemblies will depend on the surgical procedure and on space constraints within the operating room, among other factors. The operator input systems may be collocated, or they may be positioned in separate locations. Multiple operator input systems allow more than one operator to control one or more manipulator assemblies in various combinations.
FIG. 1B is a perspective view of the surgeon's console 16. The surgeon's console 16 includes a left eye display 32 and a right eye display 34 for presenting the surgeon S with a coordinated stereoscopic view of the surgical site that enables depth perception. The console 16 further includes one or more input control devices 36, which in turn cause the teleoperated assembly 12 to manipulate one or more instruments or the endoscopic imaging system. The input control devices 36 can provide the same degrees of freedom as their associated instruments 14 to provide the surgeon S with telepresence, the perception that the input control devices 36 are integral with the instruments 14, so that the surgeon has a strong sense of directly controlling the instruments 14. To this end, position, force, and tactile feedback sensors (not shown) may be employed to transmit position, force, and tactile sensations from the instruments 14 back to the surgeon's hands through the input control devices 36.
Fig. 1C is a perspective view of the electronics cart 18. The electronics cart 18 may be coupled with the endoscope 15 and may include a processor to process the captured images for subsequent display, such as to a surgeon at a surgeon console or to a surgeon at another suitable display located locally and/or remotely. For example, when a stereoscopic imaging endoscope is used, the electronics cart 18 may process the captured images to present the surgeon with coordinated stereoscopic images of the surgical site. Such coordination may include alignment between the opposing images and may include adjusting a stereoscopic working distance of the stereoscopic imaging endoscope. As another example, image processing may include using predetermined camera calibration parameters to compensate for imaging errors (e.g., optical aberrations) of the image capture device. The electronics cart 18 may also include display monitors and components of a control system 20.
Fig. 1D is a perspective view of one embodiment of the teleoperated assembly 12, which may be referred to as a patient side cart. The illustrated patient side cart 12 provides for the manipulation of three surgical tools 26 (e.g., instrument system 14) and an imaging device 28 (e.g., endoscopic imaging system 15), such as a stereoscopic imaging endoscope, which is used to capture images of the surgical site. The imaging device may transmit signals to the electronics cart 18 via the cable 56. Steering is provided by a teleoperated mechanism having a plurality of joints. The imaging device 28 and the surgical tool 26 may be positioned and manipulated through an incision in a patient such that the remote center of motion is maintained at the incision to minimize the size of the incision. When the distal end of the surgical tool 26 is positioned within the field of view of the imaging device 28, the image of the surgical site may include an image of the distal end of the surgical tool 26.
The patient side cart 22 includes a drivable base 58. The drivable base 58 is connected to a telescoping column 57, which allows the height of the arms 54 to be adjusted. The arms 54 may include a rotating joint 55 that both rotates and moves up and down. Each arm 54 may be connected to an orienting platform 53. The orienting platform 53 may be capable of 360 degrees of rotation. The patient side cart 22 may also include a telescoping horizontal boom 52 for moving the orienting platform 53 in a horizontal direction.
In this example, each arm 54 is connected to a manipulator arm 51. The manipulator arm 51 may be directly connected to the medical instrument 26 via a manipulator beam 59. The manipulator arm 51 may be teleoperated. In some examples, the arms 54 connected to the orienting platform are not teleoperated. Rather, such arms 54 are positioned as desired before the surgeon begins operating with the teleoperated components.
Endoscopic imaging systems (e.g., systems 15, 28) may be provided in a variety of configurations, including rigid or flexible endoscopes. Rigid endoscopes include a rigid tube that houses a relay lens system for transmitting an image from the distal end to the proximal end of the endoscope. Flexible endoscopes use one or more flexible optical fibers to transmit images. Endoscopes may be provided with different viewing angles, including a 0° viewing angle for forward axial viewing or a viewing angle between 0° and 90° for oblique viewing. Digital image-based endoscopes have a "chip-on-tip" design in which a distal digital sensor, such as one or more charge-coupled device (CCD) or complementary metal oxide semiconductor (CMOS) devices, captures image data. Endoscopic imaging systems may provide two-dimensional or three-dimensional images to a viewer. Two-dimensional images provide limited depth perception. Three-dimensional stereoscopic endoscopic images may provide the viewer with a more accurate perception of depth. Stereoscopic endoscopic instruments employ stereo cameras to capture stereo images of the patient anatomy. Fig. 2A illustrates a rigid off-axis stereoscopic endoscopic imaging instrument 100 including a handle 102 and a shaft 104 rigidly coupled to the handle. A roll adapter 106 is rotatably coupled to the shaft 104 and/or the handle 102. The shaft includes a distal end 108 and a proximal end 110 and houses a distal image sensor, lens system, optical fibers, or other stereoscopic image capture and transmission components (not shown). The shaft 104 extends along an optical axis OA. As shown in fig. 2C, the instrument 100 has a viewing angle 112 and a tapered optical field of view 114. In this embodiment, the viewing angle is about 30°, but it may be any angle suitable for viewing at an oblique angle relative to the optical axis OA.
In response to manual or teleoperated control, the teleoperated assembly (e.g., assembly 12) may be operated to rotate the imaging instrument 100 about the optical axis OA, including rotating the handle 102 and shaft 104. Fig. 2A illustrates the imaging instrument 100 with a -30° viewing angle, i.e., angled downward relative to the optical axis OA. Fig. 2B illustrates the imaging instrument 100 rotated 180°, with a +30° viewing angle, i.e., angled upward relative to the optical axis OA. The terms "above," "below," "upward," "downward," and the like are used for descriptive purposes only to denote generally opposite directions and are not intended to be limiting.
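The effect of rolling a 30° off-axis scope between the downward and upward views can be sketched numerically. The axis conventions here (+Z along the optical axis OA, view tilted toward -Y at zero roll) and the function name `view_direction` are assumptions for illustration, not taken from the patent.

```python
import numpy as np

def view_direction(tilt_deg, roll_deg):
    """Unit view vector of an off-axis endoscope in shaft coordinates.

    Assumed conventions: +Z points along the optical axis OA, and at zero
    roll the view is tilted tilt_deg toward -Y ("downward"). Rolling the
    shaft rotates that tilted view about the optical axis.
    """
    t = np.radians(tilt_deg)
    v = np.array([0.0, -np.sin(t), np.cos(t)])  # tilted "down" at zero roll
    r = np.radians(roll_deg)
    # Rotation about the optical (Z) axis by the roll angle.
    rz = np.array([[np.cos(r), -np.sin(r), 0.0],
                   [np.sin(r),  np.cos(r), 0.0],
                   [0.0,        0.0,       1.0]])
    return rz @ v

down = view_direction(30, 0)    # looking 30° below the optical axis
up   = view_direction(30, 180)  # rolled 180°: looking 30° above the axis
```

Rolling by 180° flips only the component perpendicular to the optical axis; the forward component along OA is unchanged, which matches the -30°/+30° views of Figs. 2A and 2B.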
Fig. 3 illustrates a surgical workspace 200 around an imaging instrument field of view 202 of an imaging instrument 204 (e.g., imaging instruments 15, 100). The imaging instrument 204 has a distal end 205. Two surgical instruments 206, 208 (e.g., instrument 14) are operable within the workspace 200. In this embodiment, the field of view 202 has the shape of a frustum of a three-dimensional pyramid. If the imaging instrument 204 is a stereoscopic imaging instrument having two imaging devices, the field of view of the imaging instrument is the composite volume of the frustums of the two imaging devices. In an alternative embodiment, the three-dimensional field of view of the imaging instrument or imaging device may have the shape of a conical frustum. The medical instrument 206 includes a distal instrument tip 210, which in this embodiment is a two-finger actuatable tip including a proximal tip portion 212, a first distal tip portion 214, and a second distal tip portion 216. Likewise, the medical instrument 208 includes a distal instrument tip 218, which in this embodiment is a two-finger actuatable tip including a proximal tip portion 220, a first distal tip portion 222, and a second distal tip portion 224.
In the configuration of fig. 3, the medical instruments 206, 208 are outside the field of view of the imaging instrument 204. The determination of whether the instruments 206, 208 are within or outside the field of view of the imaging instrument 204 may be based on the calculated position of the instrument distal tips 210, 218, as described in further detail in fig. 7. The surgical workspace of fig. 3 has a world coordinate system (X_W, Y_W, Z_W) for the world space, and the distal end of the imaging instrument has an imaging or endoscopic coordinate system (X_I, Y_I, Z_I) for the endoscopic space.
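A point known in the world coordinate system can be mapped into the endoscopic coordinate system with a homogeneous transform. This is a generic rigid-body sketch; the 4x4 pose `T_world_endo` and the example numbers are hypothetical, not kinematic results from the system described here.

```python
import numpy as np

def world_to_endoscope(T_world_endo, p_world):
    """Map a point from world coordinates (X_W, Y_W, Z_W) into endoscope
    coordinates (X_I, Y_I, Z_I).

    T_world_endo is the assumed 4x4 pose of the endoscope's distal frame
    expressed in the world frame; inverting it maps world points into the
    endoscope frame.
    """
    p = np.append(np.asarray(p_world, dtype=float), 1.0)  # homogeneous point
    return (np.linalg.inv(T_world_endo) @ p)[:3]

# Example: endoscope frame translated 0.1 m along world Z, no rotation.
T = np.eye(4)
T[2, 3] = 0.1
p_endo = world_to_endoscope(T, [0.0, 0.0, 0.15])
# The tip lies 0.05 m ahead of the scope's distal end along its Z axis.
```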
Fig. 4A illustrates a displayed image 230 of a field of view 231 of the imaging instrument 204 within the surgical workspace 200. The instruments 206, 208 are visible in the displayed image 230. In this configuration, the instruments 206, 208 are considered to be "on-screen," that is, on the display screen on which the image 230 is displayed. The field of view 231 has a boundary 232. The displayed image has a display or screen coordinate system (X_S, Y_S, Z_S) for the display or screen space. In a stereoscopic imaging system, there is a separate screen coordinate system for each of the left and right screens.
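Mapping a tip position from endoscope space into the separate left and right screen coordinate systems can be sketched with a simple pinhole model. The focal length, principal point, and interocular baseline below are illustrative assumptions, not calibrated values, and the function names are hypothetical.

```python
def project(p_endo, f_px, cx, cy):
    """Pinhole projection of an endoscope-space point onto one screen.

    f_px is the focal length in pixels; (cx, cy) is the principal point.
    """
    x, y, z = p_endo
    return (f_px * x / z + cx, f_px * y / z + cy)

def stereo_screens(p_endo, f_px, cx, cy, baseline):
    """Project one 3D point into the left and right screen coordinate
    systems by shifting the point half the baseline toward each eye."""
    x, y, z = p_endo
    left  = project((x + baseline / 2.0, y, z), f_px, cx, cy)
    right = project((x - baseline / 2.0, y, z), f_px, cx, cy)
    return left, right

# Illustrative numbers: tip 0.1 m ahead of the scope, 4 mm baseline.
left, right = stereo_screens((0.0, 0.0, 0.1), 500.0, 320.0, 240.0, 0.004)
```

The horizontal offset between the two projections (the disparity) is what gives the stereoscopic view its depth perception.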
Fig. 4B illustrates a displayed image 240 of a field of view 241 of the imaging instrument 204 within the surgical workspace 200. The field of view 241 has a boundary 242. The field of view of the instrument 204 may be changed, for example, zoomed in, zoomed out, moved in a pitch motion, moved in a yaw motion, or rolled between an upward perspective and a downward perspective. The instrument 206 remains within the field of view 241 and is therefore visible in the displayed image 240. Thus, the instrument 206 is considered to be on the screen. The instrument 208 is not within the field of view 241 and is therefore not visible in the displayed image 240. Thus, the instrument 208 is considered off-screen.
Because the instruments 206, 208 may be teleoperated while invisible to the clinician in the field of view, inadvertent movement of the instruments out of the field of view poses a safety risk. In addition, the clinician may lose track of instruments located outside of the field of view. To reduce these risks, out-of-view instrument indicators may be presented visually or audibly to increase the clinician's awareness of the location of instruments that are not visible within the field of view. For example, as shown in fig. 4B, an out-of-view instrument indicator 244 is provided along the boundary 242 of the field of view 241 to indicate that the instrument 208 is located out of view generally in the direction of the indicator. In this embodiment, the indicator 244 is a graphical bar, but in other embodiments the indicator may be a series of dots or icons, or an alphanumeric indicator. In addition to or instead of the visual indicator 244, an audible out-of-view indicator, such as an alert sound or a language-based instruction, may alert the clinician that the instrument 208 is out of view. This audible cue may be panned between the left and right speakers of the surgeon's console to convey the orientation of the instrument relative to the field of view. Alternatively, the audible prompt may emanate from the left or right speaker corresponding to the left or right hand control associated with the instrument. In addition to or as an alternative to the visual indicator 244, textual information 246 associated with the out-of-view instrument may be provided to alert the clinician and/or to provide identifying information about the instrument or instructions for bringing it into view.
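The placement of a directional indicator such as 244 can be sketched as a clamp of the projected tip position onto the view boundary. This is a minimal illustration, assuming screen coordinates normalized so the visible image spans [0, 1] × [0, 1]; the helper name is not from the patent.

```python
def indicator_anchor(tip_screen_xy):
    """Clamp a projected, possibly off-screen tip position to the view
    boundary; the clamped point gives the approximate direction of the
    out-of-view instrument. Screen coordinates are assumed normalized so
    the visible image spans [0, 1] x [0, 1] (an illustrative convention)."""
    x, y = tip_screen_xy
    return (min(max(x, 0.0), 1.0), min(max(y, 0.0), 1.0))
```

For an on-screen point the clamp is the identity, so the same helper can be applied unconditionally; only off-screen tips produce an anchor on the boundary.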
In various embodiments, the use of out-of-view indicators may be limited to avoid distracting the clinician. The use of out-of-view indicators may be context sensitive, such that out-of-view indicators are displayed only during certain modes of operation of the teleoperated system. For example, the out-of-view indicator may be displayed during a mode of the system in which the operator controls movement of the imaging system, which may be referred to as a camera control mode. As another example, out-of-view indicators may be displayed while the system is awaiting input from an operator to control an associated instrument. As another example, the out-of-view indicator may be displayed for a few seconds after the start of a system mode in which the operator controls movement of the instrument, which may be referred to as a following mode. In other alternative embodiments, the out-of-view indicator may be disabled, or selectively enabled when the clinician wants to know the position of an out-of-view instrument. In some embodiments, the clinician must provide confirmation that the instrument tip is out of the field of view before operation of the instrument is enabled. Additional warnings or confirmations may be used for energy delivery devices, sharp devices, or other devices that would increase patient risk if used without visualization. In some embodiments, an out-of-view indicator may be provided for instruments that are within the field of view but not visible due to occlusion by tissue or other structures.
Fig. 4C illustrates a displayed image 250 of the field of view 251 of the imaging instrument 204 in the surgical workspace 200 after moving the imaging instrument 204 from the position or orientation in fig. 4B. The field of view 251 has a boundary 252. The instrument 206 remains within the field of view 251 and is therefore visible in the displayed image 250. The instrument 208 is outside the field of view 251, and therefore not visible in the displayed image 250. The out-of-view indicator 244 has been repositioned to a different location along the boundary 252 to reflect the location of the instrument outside the field of view after the imaging instrument has been moved. In some embodiments, the overall length of the indicator 244 may remain the same as the imaging instrument is moved. In an alternative embodiment, the indicator may indicate with a scale the distance of the out-of-view instrument from the field of view.
Fig. 4D illustrates a displayed image 260 of a field of view 261 of the imaging instrument 204 in the surgical workspace 200 after moving the imaging instrument 204 from the position or orientation in fig. 4C. The field of view 261 has a boundary 262. The instrument 206 remains within the field of view 261 and is therefore visible in the displayed image 260. The instrument 208 is outside the field of view 261, and therefore is not visible in the displayed image 260. The out-of-view indicator 244 is further repositioned along the boundary 262 to reflect the position of the instrument outside the field of view after the imaging instrument has been moved.
Fig. 4E illustrates a displayed image 270 of the field of view 271 of the imaging instrument 204 in the surgical workspace 200 after rolling the imaging instrument 204 from the upward-viewing configuration of fig. 4A to a downward-viewing configuration. The field of view 271 has a boundary 272. Both instruments 206 and 208 are outside the field of view 271 and, therefore, are not visible in the displayed image 270. Out-of-view indicator 244 indicates the approximate direction of the out-of-view instrument 208, and out-of-view indicator 274 indicates the approximate direction of the out-of-view instrument 206.
Fig. 4F illustrates a displayed image 280 of a field of view 281 of the imaging instrument 204 in the surgical workspace 200. The field of view 281 has a boundary 282. Both instruments 206 and 208 are within the field of view 281 and, therefore, visible in the displayed image 280. An alert indicator 284 may be provided to indicate that the instrument tip 218 is approaching the boundary 282. In this embodiment, the alert indicator 284 is a ring, but in alternative embodiments it may be another type of graphical symbol or an audible indicator. The indicator may change (e.g., blink) as the tip 218 moves closer to the boundary 282. The alert indicator 284 may appear in the displayed image 280 when either instrument tip moves within a predefined distance (e.g., 1 cm) of the boundary 282.
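The boundary-proximity check that triggers an alert such as indicator 284 can be sketched as follows. This is an illustrative simplification in normalized screen coordinates; in practice the predefined workspace distance (e.g., 1 cm) would be converted into screen units via the camera model, and the margin value here is an assumption.

```python
def near_boundary(tip_screen_xy, margin=0.05):
    """Return True when an on-screen tip is within `margin` (normalized
    screen units) of the view boundary, triggering a proximity alert.
    Off-screen tips return False; they are handled by the separate
    out-of-view indicator logic."""
    x, y = tip_screen_xy
    if not (0.0 <= x <= 1.0 and 0.0 <= y <= 1.0):
        return False
    # Distance to the nearest of the four screen edges.
    return min(x, 1.0 - x, y, 1.0 - y) <= margin
```

A blinking behavior as the tip approaches could be driven by the same edge distance, e.g., by increasing blink rate as the distance shrinks.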
As described above, the determination of whether the instruments 206, 208 are within or outside the field of view of the imaging instrument 204 may be based on the calculated positions of the instrument distal tips 210, 218. Because of small error factors associated with the teleoperated system, instruments, and/or imaging system, the determined position of the instrument distal tips 210, 218 relative to the imaging instrument 204 has an associated cumulative error. To avoid providing the clinician with false out-of-view indicators, the determination of whether the instrument tip is outside the field of view of the imaging system may be biased by estimating a range of possible positions of the distal tip and suppressing the out-of-view indicator if any (or a specified percentage) of the possible estimated positions of the distal tip are within the field of view. The sensitivity of the bias can be adjusted according to the clinician's tolerance for false positive out-of-view indicators.
Fig. 5 illustrates the surgical workspace 200 around the imaging instrument field of view 202 of the imaging instrument 204. In this illustration, the position error associated with the instrument tips 210, 218 of the medical instruments 206, 208 is shown. A set of error bounding boxes is associated with each instrument tip 210, 218. Each error bounding box may represent the expected position of the distal or proximal tip portion to a high degree of certainty, such as 90-99%. Error bounding box 290 represents the expected position of the proximal tip portion 212 of the instrument tip 210. Error bounding box 292 represents the expected position of the distal tip portions 214, 216 of the instrument tip 210. In this configuration, the distal tip portions 214, 216 are arranged in a closed jaw configuration (e.g., the angle between the tip portions is approximately zero), in which the distal tip portions are closed together. Thus, the single bounding box 292 may be used to approximate both distal tip portions 214, 216. If the distal tip portions 214, 216 were arranged in an open jaw configuration (e.g., the angle between the tip portions is substantially greater than zero), separate bounding boxes could be used to approximate the position of each distal tip portion. Error bounding box 294 represents the expected position of the proximal tip portion 220 of the instrument tip 218, and error bounding box 296 represents the expected position of the distal tip portions 222, 224 of the instrument tip 218. The bounding boxes may be used as a bias in determining whether to display an out-of-view indicator to the clinician. For example, both bounding boxes 290, 292 are outside the field of view 202, and thus an out-of-view indicator is generated for the instrument 206. At least one of the bounding boxes 294, 296 is within the field of view 202, and thus no out-of-view indicator is generated for the instrument 208.
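The suppression rule of fig. 5 can be sketched with a simple containment test. The conical-frustum parameterization (near/far planes along the view axis, cone half-angle) and the function names below are assumptions for illustration; the text describes the field-of-view shape but not this exact parameterization.

```python
import math

def in_conical_frustum(p, near, far, half_angle):
    """Test whether a point (in the endoscope tip frame, with +z along
    the view axis) lies inside a conical-frustum field of view such as
    202. `half_angle` is the cone half-angle in radians."""
    x, y, z = p
    return near <= z <= far and math.hypot(x, y) <= z * math.tan(half_angle)

def suppress_indicator(box_corners, near, far, half_angle):
    """Bias rule of fig. 5: suppress the out-of-view indicator for an
    instrument when ANY corner of its error bounding boxes falls within
    the field of view."""
    return any(in_conical_frustum(c, near, far, half_angle)
               for c in box_corners)
```

With this rule, instrument 208 (one box partially in view) is suppressed while instrument 206 (all boxes outside) still produces an indicator.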
In an alternative embodiment, a bounding volume may also be provided for the field of view itself, to account for errors associated with the boundaries of the field of view. In this case, if the bounding volume of the field of view intersects a bounding box of the instrument, the out-of-view indicator is not displayed to the clinician.
Fig. 6 illustrates a method 300 for determining whether an instrument is out of view. At process 302, method 300 includes evaluating instrument tip position in endoscope tip coordinate space, as described in further detail in fig. 7. At process 304, method 300 includes estimating instrument tip position error relative to the distal tip of the endoscope, as described in further detail in fig. 8. At process 306, method 300 includes determining an instrument tip bounding box in screen coordinate space, as described in further detail in fig. 9. At process 308, the method 300 includes determining whether the instrument tip is outside of the endoscopic view, as described in further detail in fig. 10. Although the imaging modality is referred to as an endoscope in fig. 6-10, other imaging modalities may be used.
Fig. 7 illustrates a method 310 for evaluating instrument tip position in endoscope tip coordinate space. In method 310, the forward kinematics of the teleoperated system and medical instrument are evaluated to determine the locations of landmarks on the instrument tip (such as the proximal and distal tip portions) within the endoscope coordinate system. The evaluation uses forward kinematics and rigid-body transformations. Process 312 includes evaluating forward kinematic position data of a manipulator (e.g., manipulator 51) and an instrument (e.g., instruments 206, 208) in world coordinate space. Process 314 includes evaluating forward kinematic position data of the manipulator and endoscope (e.g., imaging instrument 204) in world coordinate space. Process 316 includes transforming the instrument tip position data into endoscope tip coordinate space. Process 320 includes determining a jaw angle of the distal tip portions (e.g., the angle between the distal tip portions 214, 216). Process 322 includes determining a length of the instrument tip (e.g., the length between the proximal tip portion 212 and the distal tip portions 214, 216). At process 318, the jaw angle and instrument tip length are used to evaluate the positions of the proximal and distal instrument tip points. At process 324, the proximal tip portion position in endoscope tip coordinate space is determined. At process 326, the first distal tip portion (e.g., portion 214) position in endoscope tip coordinate space is determined. At process 328, the second distal tip portion (e.g., portion 216) position in endoscope tip coordinate space is determined.
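The change of coordinates in processes 312-316 can be sketched with homogeneous rigid-body transforms. The function and frame names below are illustrative (the patent does not specify a representation); both input poses are assumed to come from forward kinematics in world space.

```python
def apply(T, p):
    """Apply a 4x4 homogeneous rigid-body transform to a 3-D point."""
    x, y, z = p
    v = (x, y, z, 1.0)
    return tuple(sum(T[i][j] * v[j] for j in range(4)) for i in range(3))

def invert(T):
    """Invert a rigid-body transform: transpose the rotation block and
    negate the rotated translation."""
    R = [[T[j][i] for j in range(3)] for i in range(3)]
    t = [-sum(R[i][j] * T[j][3] for j in range(3)) for i in range(3)]
    return [R[i] + [t[i]] for i in range(3)] + [[0.0, 0.0, 0.0, 1.0]]

def tip_in_endoscope_space(T_world_tip, T_world_endo, p_local=(0.0, 0.0, 0.0)):
    """Express an instrument tip landmark in the endoscope tip frame,
    given world-space poses of the tip and the endoscope tip."""
    return apply(invert(T_world_endo), apply(T_world_tip, p_local))
```

Offsetting `p_local` along the tip frame by the jaw angle and tip length (processes 318-328) would yield the proximal and distal landmark points in the same way.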
Fig. 8 illustrates a method 330 for estimating instrument tip position error relative to the endoscope. Because of the many links and the variability of manipulator pose in the kinematic chain of a teleoperated system, the instrument tip position error can depend strongly on configuration, ranging, for example, from 0.5 cm to 2.0 cm. An overall upper bound on the error could be assumed for all configurations, but this approach would generally overestimate the error compared to an estimate for the particular configuration. In method 330, the tip position error is modeled as the sum of the errors contributed by the major constituent parts of the kinematic chain. For example, when the instrument is working close to the remote center (e.g., at a small insertion depth), a major portion of the tip position error may be contributed by the set-up joints while only a minor error is contributed by the manipulator.
Method 330 includes a process 332 for determining an instrument manipulator arm remote center position error and a process 334 for determining an endoscope manipulator arm remote center position error. The remote center position of a manipulator arm is determined from its set-up joint sensors and the fixed mechanical offsets between the links of the kinematic chain of the teleoperated assembly. An end-to-end kinematic calibration procedure may be performed to eliminate manufacturing and assembly variability. The residual position error may be due to non-linearity of the set-up joint sensors, the resolution of the sensors, and deflection of components during calibration. The remote center position of a manipulator arm is determined in the same way whether the arm carries a medical instrument or an imaging instrument, and the manipulator arm remote center position error is a relative error between the two arms, one carrying the instrument and the other carrying the imaging instrument.
Method 330 further includes a process 336 for determining an instrument manipulator arm orientation error and a process 342 for determining an endoscope manipulator arm orientation error. The arm orientation error is the error in the pointing direction of the manipulator beam, which is affected by the accuracy of the outer pitch and outer yaw joints of the manipulator and by any misalignment occurring during installation of the manipulator on the mounting structure of the teleoperated assembly. At process 338, the instrument arm orientation error is combined with the instrument insertion depth. At process 340, the endoscope arm orientation error is combined with the endoscope insertion depth. At process 344, the instrument arm non-observable error is determined, and at process 346, the endoscope arm non-observable error is determined. Non-observable errors are errors that cannot be observed from the joint position sensors. The primary such error is compliance of the set-up joints under load at the patient's body wall, which directly affects the position of the remote center. Another error factor is compliance of the instrument shaft, which may cause shaft deflection as a function of insertion depth. At process 348, all of the error factors determined in processes 332-346 are combined to determine the proximal tip portion error relative to the endoscope. The method further includes a process 350 for determining the instrument wrist error and a process 352 for determining the instrument tip length. At process 354, the error factors from processes 350 and 352 are combined with the proximal tip portion error from process 348 to determine the distal instrument tip error relative to the endoscope.
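The error budget of processes 332-354 can be sketched as below. The patent models the tip error as a sum of contributions but does not give the exact combination rule; plain summation (a conservative upper bound), angular errors in radians scaled by insertion depth, and all parameter names are assumptions of this sketch.

```python
def proximal_tip_error(rc_err_instr, rc_err_endo,
                       ang_err_instr, depth_instr,
                       ang_err_endo, depth_endo,
                       unobs_instr, unobs_endo):
    """Process 348 sketch: combine remote-center position errors,
    orientation errors scaled by insertion depth (angles in radians,
    lengths in meters), and non-observable error terms by summation."""
    return (rc_err_instr + rc_err_endo
            + ang_err_instr * depth_instr
            + ang_err_endo * depth_endo
            + unobs_instr + unobs_endo)

def distal_tip_error(proximal_err, wrist_ang_err, tip_length, wrist_err=0.0):
    """Process 354 sketch: the wrist angular error acts over the tip
    length as a lever arm and adds to the proximal tip error."""
    return proximal_err + wrist_err + wrist_ang_err * tip_length
```

Note how the depth-scaled terms reproduce the observation above: at small insertion depths, the depth-scaled manipulator terms shrink and the set-up-joint (remote center) terms dominate.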
Fig. 9 illustrates a method 360 for determining an instrument tip bounding box in screen coordinate space. The method 360 combines the instrument tip position from method 310 with the estimated position error from method 330. In method 360, the position error is mapped to the screen, and the amount of error to be used as a bias for the field-of-view test is determined. This position error is the basis of the bias for the in-view/out-of-view test. The bias can thus artificially grow or shrink the bounding box of the instrument tip portion, trading off false positive errors against false negative errors when detecting whether the instrument tip is within or outside the imaging instrument field of view.
Method 360 includes a process 362 of retrieving the instrument tip position in endoscope tip space, a process 364 of retrieving a calibrated stereo camera model, and a process 366 of retrieving the estimated instrument tip position error in endoscope tip coordinate space. At process 370, the tip position error is projected (e.g., using the stereo camera model) to a position error in screen space coordinates. At process 372, the projected position error is mapped to a bias. When the bias is 0, the system will report a mix of false positive errors (i.e., improperly reporting an out-of-view indicator) and false negative errors (i.e., not displaying an out-of-view indicator when one should be displayed). When the bias is full (e.g., 100% bias), the system will detect that the instrument is outside the field of view only when the instrument is significantly outside the field of view. The process of mapping the estimated position error to a bias allows the trade-off between no bias and full bias to be adjusted. Minimizing visible false positive errors is important because an out-of-view indicator that appears while the instrument tip is clearly visible in the field of view can distract the clinician and create confusion about whether the instrument tip is in the field of view. Too many false positive errors may also desensitize the clinician to such alerts, so that true positive detections are ignored. When the instrument tip is close to the endoscope tip (i.e., the instrument tip occupies most of the field of view), the size of the tip position error approaches the cross-sectional dimension of the view volume. In this configuration, there would be a high false positive detection rate if no bias were used. Figs. 11-15 illustrate the use of the bias in more detail.
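The projection of process 370 can be sketched with a single-camera pinhole model standing in for the calibrated stereo camera model of process 364 (a simplifying assumption; the parameter names are illustrative).

```python
def projected_error_fraction(err, depth, focal_px, screen_px):
    """Process 370 sketch: project a 3-D tip position error (meters) at a
    given depth in front of the camera (meters) to a screen-space error
    radius, expressed as a fraction of the screen size, using a pinhole
    model with focal length and screen size in pixels."""
    if depth <= 0:
        raise ValueError("tip must lie in front of the camera")
    return (focal_px * err / depth) / screen_px
```

With these illustrative numbers, halving the depth doubles the on-screen error fraction, matching the observation that the error grows toward the view-volume cross-section as the tip nears the endoscope tip.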
At process 368, the insertion depth of the instrument tip is determined. At process 374, a bounding box (e.g., 290, 292, 294, 296, 522, 524) is created around each instrument tip point. At process 376, the instrument tip bounding box is determined in the screen space coordinate system. Typically, the bounding boxes are not displayed to the user, but alternatively they may be displayed.
Fig. 10 illustrates a method for determining whether an instrument tip is outside the field of view of the imaging instrument. At process 382, the instrument tip bounding box is retrieved for the left eye screen coordinate space, and at process 384, the instrument tip bounding box is retrieved for the right eye screen coordinate space. At process 386, a test is performed to determine whether the bounding box is within or outside the left eye field of view. At process 388, a test is performed to determine whether the bounding box is within or outside the right eye field of view. At process 390, if all bounding boxes are outside both views, an out-of-view indicator is displayed to the clinician.
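The stereo combination of processes 386-390 reduces to a conjunction: the indicator is shown only when the instrument is outside both eye views. A minimal sketch, with per-eye containment predicates passed in (names illustrative):

```python
def show_out_of_view(left_corners, right_corners, in_left, in_right):
    """Report out-of-view only when every bounding-box corner is outside
    BOTH the left-eye and right-eye views. `in_left` / `in_right` are
    predicates testing one corner against one eye's view volume."""
    visible = (any(in_left(c) for c in left_corners)
               or any(in_right(c) for c in right_corners))
    return not visible
```

Testing both eyes avoids a false indicator when a tip sits near the edge of the stereo overlap and is visible to only one eye.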
Fig. 11 illustrates a bias curve 400 for mapping the projected position error to a bias factor that determines how much of the position error is used to bias the out-of-view indicator. When the displayed size of the position error is large (for example, when the error size is 20% or more of the screen size), the bias curve 400 applies the full bias (bias factor = 1). This typically occurs when the instrument tip is very close to the endoscope tip. When the displayed size of the error is small (e.g., when the error size is less than 20% of the screen size), the bias curve reduces the bias (e.g., bias factor < 1). This typically occurs when the instrument tip is far from the endoscope. This mapping provides a direct way to adjust the out-of-view detection process based on the perceived error, independent of assumptions about instrument depth.
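Curve 400 can be sketched as a piecewise-linear mapping. The 10% and 20% breakpoints come from the configurations of figs. 12-14; the exact shape of the curve between the breakpoints is not specified in the text and is an assumption here.

```python
def bias_factor(error_fraction, lo=0.10, hi=0.20):
    """Piecewise-linear sketch of bias curve 400: no bias while the
    projected error radius is below `lo` (fraction of screen size), full
    bias at or above `hi`, and a linear ramp in between."""
    if error_fraction <= lo:
        return 0.0
    if error_fraction >= hi:
        return 1.0
    return (error_fraction - lo) / (hi - lo)
```

The returned factor scales how much of the projected error radius inflates the tip bounding box before the in-view/out-of-view test.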
Figs. 12-15 illustrate the use of the bias curve 400. Fig. 12 illustrates a position error 500 of the proximal tip portion 212 of the instrument tip 210. A position error 502 is shown for the distal tip portion 214, and a position error 504 is shown for the distal tip portion 216. The position error rings 500, 502, 504 represent estimates of the upper bound of the instrument position error for each tip portion. An error radius 506 is also identified. Dimension 508 is 10% of the screen size, and dimension 510 is 20% of the screen size. Points 512, 514, 516 correspond to tip portions 212, 214, and 216, respectively, and represent the projected instrument tip positions relative to the endoscope tip coordinate system. In this configuration, in which the instrument tip is relatively far from the endoscope tip, the error radius 506 is less than 10% of the screen size. Based on the bias curve 400, no bias is applied to the tip locations 512-516; instead, the locations 512-516 themselves are used, with the segments connecting them serving as the bounding box to determine whether the instrument tip is in or out of view.
Fig. 13 illustrates the instrument tip 210 at an intermediate distance from the endoscope tip. In this configuration, the error radius 520 of the position error is between the 10% dimension 508 and the 20% dimension 510. Based on the bias curve 400, the bounding box 522 is grown by a percentage of the full error (between 0% and 100%). This bounding box 522 is used to determine whether the instrument tip is in or out of view.
Fig. 14 illustrates the instrument tip 210 at a close distance from the endoscope tip. In this configuration, the error radius is greater than the 20% dimension 510. Based on the bias curve 400, the bounding box 524 is grown by the full error. This bounding box 524 is used to determine whether the instrument tip is in or out of view. As shown in fig. 15, if only the points 512, 514, 516 were used to determine whether the instrument tip is in or out of view, the system would falsely report an out-of-view indicator because the points 512, 514, 516 are outside the field of view. When the bounding box 524 is used for the in-view/out-of-view test, the system determines that the bounding box 524 is within the field of view and therefore does not report an out-of-view indicator.
One or more elements of embodiments of the invention may be implemented in software for execution on a processor of a computer system (a control processing system). When implemented in software, the elements of an embodiment of the invention are essentially code segments that perform the necessary tasks. The program or code segments can be stored in a processor-readable storage medium or device, and may be transmitted by way of a computer data signal embodied in a carrier wave over a transmission medium or communication link. The processor-readable storage device may include any medium that can store information, including optical media, semiconductor media, and magnetic media. Examples of processor-readable storage devices include an electronic circuit; a semiconductor device; a semiconductor memory device; a read-only memory (ROM), a flash memory, an erasable programmable read-only memory (EPROM); a floppy disk, a CD-ROM, an optical disk, a hard disk, or another storage device. The code segments may be downloaded via computer networks such as the Internet or an intranet.
It is noted that the processes and displays presented may not be inherently related to any particular computer or other apparatus. Various general-purpose systems may be used with programs in accordance with the teachings herein, or it may prove convenient to construct a more specialized apparatus to perform the operations described. The required structure for a variety of these systems will appear as elements in the claims. In addition, embodiments of the present invention are not described with reference to any particular programming language. It will be appreciated that a variety of programming languages may be used to implement the teachings of the invention as described herein.
While certain exemplary embodiments of the invention have been described and shown in the accompanying drawings, it is to be understood that such embodiments are merely illustrative of, and not restrictive on, the broad invention, and that embodiments of the invention are not limited to the specific constructions and arrangements shown and described, since various other modifications may occur to those of ordinary skill in the art.

Claims (10)

1. A medical imaging system, comprising:
a teleoperated assembly, comprising:
a medical instrument comprising an instrument tip; and
an imaging instrument comprising an imaging instrument tip; and
a processing unit comprising one or more processors, wherein the processing unit is configured to:
determining an instrument tip position of the instrument tip;
determining an instrument tip position error associated with the imaging instrument; and
determining at least one instrument tip bounding volume to determine whether the instrument tip is within or outside of a field of view based on the determined instrument tip position, the determined instrument tip position error, and a ratio between an error radius of the instrument tip position error and a size of a display screen.
2. The system of claim 1, wherein the processing unit is further configured to indicate when the instrument tip is outside the field of view of the imaging instrument based on the at least one instrument tip bounding volume.
3. The system of claim 1, wherein determining the instrument tip position includes:
determining instrument tip position data in a world coordinate system;
transforming the instrument tip position data from the world coordinate system to an imaging instrument coordinate system; and
determining instrument tip position data in the imaging instrument coordinate system based on the instrument tip position data in the world coordinate system.
4. The system of claim 1, wherein the teleoperated component further comprises:
a first manipulator arm configured to control movement of the medical instrument; and
a second manipulator arm configured to control movement of the imaging instrument.
5. The system of claim 4, wherein determining the instrument tip position error associated with the imaging instrument includes:
determining a position error and an orientation error for each of the first and second manipulator arms;
determining an insertion depth for each of the medical instrument and the imaging instrument; and
determining a positional error of a proximal portion of the instrument tip based on the determined positional error and orientation error for each of the first and second manipulator arms and based on the determined insertion depth for each of the medical instrument and the imaging instrument.
6. The system of claim 5, wherein determining the instrument tip position error associated with the imaging instrument further comprises:
determining a medical instrument wrist error;
determining a length of the instrument tip; and
determining a positional error of a distal portion of the instrument tip based on the determined medical instrument wrist error, the determined length of the instrument tip, and the determined positional error of the proximal portion of the instrument tip.
7. The system of claim 1, wherein the at least one instrument tip bounding volume includes a first volume associated with a proximal end of the instrument tip and a second volume associated with a distal end of the instrument tip.
8. The system of claim 1, wherein determining the at least one instrument tip bounding volume includes determining whether to apply a bias to the determined instrument tip position, wherein the bias is based on the ratio between the error radius of the instrument tip position error and the size of the display screen.
9. The system of claim 8, wherein determining whether to apply the bias includes determining the ratio between the error radius of the instrument tip position error and the size of the display screen, and wherein:
when the ratio between the error radius of the instrument tip position error and the size of the display screen is a first ratio, a distance between the instrument tip and a distal end of the imaging instrument tip is a first distance, and the instrument tip position is used to determine whether the instrument tip is outside a field of view of the imaging instrument;
when the ratio between the error radius of the instrument tip position error and the size of the display screen is a second ratio, the distance between the instrument tip and the distal end of the imaging instrument tip is a second distance that is less than the first distance, and a portion of the bias is applied to the determined instrument tip position; and
when the ratio between the error radius of the instrument tip position error and the size of the display screen is a third ratio, the distance between the instrument tip and the distal end of the imaging instrument tip is a third distance that is less than the second distance, and the full amount of the bias is applied to the determined instrument tip position.
10. The system of claim 9, wherein:
when the portion of the bias is applied to the determined instrument tip position, the at least one instrument tip bounding volume is modified by the portion of the bias and used to determine whether the instrument tip is outside the field of view of the imaging instrument; and
when the full amount of the bias is applied to the determined instrument tip position, the at least one instrument tip bounding volume is used to determine whether the instrument tip is outside the field of view of the imaging instrument.

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US201461954442P 2014-03-17 2014-03-17
US61/954,442 2014-03-17
CN201580025502.2A CN106470634B (en) 2014-03-17 2015-03-17 System and method for being indicated outside the screen of the instrument in remote control operation medical system
PCT/US2015/021109 WO2015142956A1 (en) 2014-03-17 2015-03-17 Systems and methods for offscreen indication of instruments in a teleoperational medical system

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
CN201580025502.2A Division CN106470634B (en) 2014-03-17 2015-03-17 System and method for being indicated outside the screen of the instrument in remote control operation medical system

Publications (2)

Publication Number Publication Date
CN110236693A CN110236693A (en) 2019-09-17
CN110236693B (en) 2022-11-04

Family

ID=54145247

Family Applications (2)

Application Number Title Priority Date Filing Date
CN201580025502.2A Active CN106470634B (en) 2014-03-17 2015-03-17 System and method for being indicated outside the screen of the instrument in remote control operation medical system
CN201910406219.4A Active CN110236693B (en) 2014-03-17 2015-03-17 System and method for off-screen indication of instruments in teleoperated medical systems


Country Status (6)

Country Link
US (2) US11317979B2 (en)
EP (2) EP3566672B1 (en)
JP (3) JP6542252B2 (en)
KR (1) KR102363661B1 (en)
CN (2) CN106470634B (en)
WO (1) WO2015142956A1 (en)

Families Citing this family (21)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106470634B (en) 2014-03-17 2019-06-14 Intuitive Surgical Operations, Inc. Systems and methods for offscreen indication of instruments in a teleoperational medical system
EP3189768B1 (en) * 2014-09-05 2019-04-17 Olympus Corporation Endoscope system
KR20170127561A (en) 2015-03-17 2017-11-21 Intuitive Surgical Operations, Inc. System and method for on-screen identification of instruments in a teleoperational medical system
KR102501099B1 (en) 2015-03-17 2023-02-17 Intuitive Surgical Operations, Inc. Systems and methods for rendering on-screen identification of instruments in teleoperated medical systems
CN109496135B (en) * 2016-06-01 2021-10-26 恩达马斯特有限公司 Endoscopy system component
US11730550B2 (en) 2016-08-12 2023-08-22 Intuitive Surgical Operations, Inc. Systems and methods for onscreen menus in a teleoperational medical system
US11370113B2 (en) * 2016-09-06 2022-06-28 Verily Life Sciences Llc Systems and methods for prevention of surgical mistakes
CN117885097A (en) 2017-03-07 2024-04-16 Intuitive Surgical Operations, Inc. Systems and methods for controlling tools having articulatable distal portions
JP2018202032A (en) * 2017-06-08 2018-12-27 Medicaroid Corporation Remote control apparatus for medical equipment
US20200163731A1 (en) * 2017-07-13 2020-05-28 Intuitive Surgical Operations, Inc. Systems and methods for switching control between multiple instrument arms
WO2019032450A1 (en) * 2017-08-08 2019-02-14 Intuitive Surgical Operations, Inc. Systems and methods for rendering alerts in a display of a teleoperational system
JP6880324B2 (en) * 2017-11-10 2021-06-02 インテュイティブ サージカル オペレーションズ, インコーポレイテッド Systems and methods for controlling robot manipulators or related tools
JP7367990B2 (en) * 2018-02-02 2023-10-24 Intellijoint Surgical Inc. Remote monitoring of the operating room
US20210030497A1 (en) * 2019-07-31 2021-02-04 Auris Health, Inc. Apparatus, systems, and methods to facilitate instrument visualization
EP3804630A1 (en) * 2019-10-10 2021-04-14 Koninklijke Philips N.V. Ultrasound object zoom tracking
US11844497B2 (en) * 2020-02-28 2023-12-19 Covidien Lp Systems and methods for object measurement in minimally invasive robotic surgery
CN114099005B (en) * 2021-11-24 2023-09-15 Chongqing Jinshan Medical Robot Co., Ltd. Method for determining whether an instrument is within the field of view or occluded, and an energy display method
WO2023100234A1 (en) * 2021-11-30 2023-06-08 Olympus Corporation Endoscope system and method for correcting a coordinate system
US20230210579A1 (en) * 2021-12-30 2023-07-06 Verb Surgical Inc. Real-time surgical tool presence/absence detection in surgical videos
JP2023180371A (en) * 2022-06-09 2023-12-21 Medicaroid Corporation Surgery system
WO2024033828A1 (en) * 2022-08-10 2024-02-15 Medical Microinstruments Inc. Method for controlling a slave device, controlled by a master device in a robotic system for medical or surgical teleoperation, taking into account limits of a field of view, and related robotic system

Family Cites Families (39)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4150326A (en) * 1977-09-19 1979-04-17 Unimation, Inc. Trajectory correlation and error detection method and apparatus
US6659939B2 (en) * 1998-11-20 2003-12-09 Intuitive Surgical, Inc. Cooperative minimally invasive telesurgical system
US6850794B2 (en) * 2000-09-23 2005-02-01 The Trustees Of The Leland Stanford Junior University Endoscopic targeting method and system
US7607440B2 (en) * 2001-06-07 2009-10-27 Intuitive Surgical, Inc. Methods and apparatus for surgical planning
JP4047567B2 (en) * 2001-10-05 2008-02-13 Olympus Corporation Surgical system
US7170677B1 (en) * 2002-01-25 2007-01-30 Everest Vit Stereo-measurement borescope with 3-D viewing
US7135992B2 (en) * 2002-12-17 2006-11-14 Evolution Robotics, Inc. Systems and methods for using multiple hypotheses in a visual simultaneous localization and mapping system
JP2005040205A (en) * 2003-07-23 2005-02-17 Olympus Corp Three-dimensional endoscope apparatus
JP2005143918A (en) 2003-11-17 2005-06-09 Olympus Corp Remote operation support system
US20050137751A1 (en) * 2003-12-05 2005-06-23 Cox Damon K. Auto-diagnostic method and apparatus
EP1720480A1 (en) * 2004-03-05 2006-11-15 Hansen Medical, Inc. Robotic catheter system
US7626569B2 (en) * 2004-10-25 2009-12-01 Graphics Properties Holdings, Inc. Movable audio/video communication interface system
US9789608B2 (en) 2006-06-29 2017-10-17 Intuitive Surgical Operations, Inc. Synthetic representation of a surgical robot
US8073528B2 (en) * 2007-09-30 2011-12-06 Intuitive Surgical Operations, Inc. Tool tracking systems, methods and computer products for image guided surgery
US8147503B2 (en) * 2007-09-30 2012-04-03 Intuitive Surgical Operations Inc. Methods of locating and tracking robotic instruments in robotic surgical systems
US8398541B2 (en) * 2006-06-06 2013-03-19 Intuitive Surgical Operations, Inc. Interactive user interfaces for robotic minimally invasive surgical systems
US20070005002A1 (en) * 2005-06-30 2007-01-04 Intuitive Surgical Inc. Robotic surgical instruments for irrigation, aspiration, and blowing
EP3162318B1 (en) * 2005-10-20 2019-10-16 Intuitive Surgical Operations, Inc. Auxiliary image display and manipulation on a computer display in a medical robotic system
US8190238B2 (en) * 2005-12-09 2012-05-29 Hansen Medical, Inc. Robotic catheter system and methods
US8469945B2 (en) 2006-01-25 2013-06-25 Intuitive Surgical Operations, Inc. Center robotic arm with five-bar spherical linkage for endoscopic camera
US9782229B2 (en) * 2007-02-16 2017-10-10 Globus Medical, Inc. Surgical robot platform
US7841980B2 (en) * 2006-05-11 2010-11-30 Olympus Medical Systems Corp. Treatment system, trocar, treatment method and calibration method
US20090192523A1 (en) * 2006-06-29 2009-07-30 Intuitive Surgical, Inc. Synthetic representation of a surgical instrument
US9718190B2 (en) 2006-06-29 2017-08-01 Intuitive Surgical Operations, Inc. Tool position and identification indicator displayed in a boundary area of a computer display screen
US10008017B2 (en) * 2006-06-29 2018-06-26 Intuitive Surgical Operations, Inc. Rendering tool information as graphic overlays on displayed images of tools
US9089256B2 (en) * 2008-06-27 2015-07-28 Intuitive Surgical Operations, Inc. Medical robotic system providing an auxiliary view including range of motion limitations for articulatable instruments extending out of a distal end of an entry guide
DE102007030907A1 (en) 2007-07-03 2009-01-08 Manroland Ag Method for driving a paddle wheel and control device and drive device therefor
US8808164B2 (en) * 2008-03-28 2014-08-19 Intuitive Surgical Operations, Inc. Controlling a robotic surgical tool with a display monitor
US8086026B2 (en) * 2008-06-27 2011-12-27 Waldean Schulz Method and system for the determination of object positions in a volume
US8418073B2 (en) * 2009-03-09 2013-04-09 Intuitive Surgical Operations, Inc. User interfaces for electrosurgical tools in robotic surgical systems
US8579012B2 (en) 2009-03-27 2013-11-12 Novelis Inc. Continuous casting apparatus for casting strip of variable width
US8423186B2 (en) * 2009-06-30 2013-04-16 Intuitive Surgical Operations, Inc. Ratcheting for master alignment of a teleoperated minimally-invasive surgical instrument
US8918211B2 (en) * 2010-02-12 2014-12-23 Intuitive Surgical Operations, Inc. Medical robotic system providing sensory feedback indicating a difference between a commanded state and a preferred pose of an articulated instrument
US9125669B2 (en) * 2011-02-14 2015-09-08 Mako Surgical Corporation Haptic volumes for reaming during arthroplasty
CN102909728B (en) * 2011-08-05 2015-11-25 Hongfujin Precision Industry (Shenzhen) Co., Ltd. Vision-based calibration method for a robot tool center point
AU2012301718B2 (en) * 2011-09-02 2017-06-15 Stryker Corporation Surgical instrument including a cutting accessory extending from a housing and actuators that establish the position of the cutting accessory relative to the housing
CN106470634B (en) 2014-03-17 2019-06-14 Intuitive Surgical Operations, Inc. Systems and methods for offscreen indication of instruments in a teleoperational medical system
CN111184577A (en) * 2014-03-28 2020-05-22 直观外科手术操作公司 Quantitative three-dimensional visualization of an instrument in a field of view
US11737663B2 (en) * 2020-03-30 2023-08-29 Auris Health, Inc. Target anatomical feature localization

Also Published As

Publication number Publication date
CN110236693A (en) 2019-09-17
EP3566672A1 (en) 2019-11-13
EP3119339A4 (en) 2017-11-29
US11903665B2 (en) 2024-02-20
CN106470634B (en) 2019-06-14
US20170165013A1 (en) 2017-06-15
JP2019162511A (en) 2019-09-26
EP3566672B1 (en) 2023-08-23
US20220218426A1 (en) 2022-07-14
JP6851427B2 (en) 2021-03-31
JP2021072900A (en) 2021-05-13
JP6542252B2 (en) 2019-07-10
KR102363661B1 (en) 2022-02-17
EP3119339A1 (en) 2017-01-25
EP3119339B1 (en) 2019-08-28
US11317979B2 (en) 2022-05-03
WO2015142956A1 (en) 2015-09-24
KR20160133515A (en) 2016-11-22
JP2017515523A (en) 2017-06-15
JP7295153B2 (en) 2023-06-20
CN106470634A (en) 2017-03-01

Similar Documents

Publication Publication Date Title
CN110236693B (en) System and method for off-screen indication of instruments in teleoperated medical systems
US10905506B2 (en) Systems and methods for rendering onscreen identification of instruments in a teleoperational medical system
US20220047343A1 (en) Systems and methods for onscreen identification of instruments in a teleoperational medical system
US20090187288A1 (en) Remote control system
US11497569B2 (en) Touchscreen user interface for interacting with a virtual model
CN111132631A (en) System and method for interactive point display in a teleoperational assembly

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant