US20220000579A1 - Methods of using an angled endoscope for visualizing a body cavity with robotic surgical systems - Google Patents

Methods of using an angled endoscope for visualizing a body cavity with robotic surgical systems

Info

Publication number
US20220000579A1
US20220000579A1 (U.S. application Ser. No. 17/479,548)
Authority
US
United States
Prior art keywords
view
elongated body
body cavity
surgical
end effector
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
US17/479,548
Inventor
Dwight Meglan
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Covidien LP
Original Assignee
Covidien LP
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Covidien LP
Priority to US 17/479,548
Assigned to COVIDIEN LP (assignment of assignors interest; assignor: MEGLAN, DWIGHT; see document for details)
Publication of US20220000579A1
Legal status: Pending

Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 1/00: Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; illuminating arrangements therefor
    • A61B 1/00006: Operational features of endoscopes characterised by electronic signal processing of control signals
    • A61B 1/00009: Operational features of endoscopes characterised by electronic signal processing of image signals during a use of endoscope
    • A61B 1/00048: Constructional features of the display
    • A61B 1/0005: Display arrangement combining images, e.g. side-by-side, superimposed or tiled
    • A61B 1/00149: Holding or positioning arrangements using articulated arms
    • A61B 1/00174: Optical arrangements characterised by the viewing angles
    • A61B 1/00179: Optical arrangements characterised by the viewing angles for off-axis viewing
    • A61B 1/04: Instruments combined with photographic or television appliances
    • A61B 1/045: Control thereof
    • A61B 34/20: Surgical navigation systems; devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B 34/30: Surgical robots
    • A61B 34/37: Master-slave robots
    • A61B 90/37: Surgical systems with images on a monitor during operation
    • A61B 2034/2048: Tracking techniques using an accelerometer or inertia sensor
    • A61B 2090/367: Correlation of different images or relation of image positions in respect to the body, creating a 3D dataset from 2D images using position information
    • A61B 2090/372: Details of monitor hardware
    • A61B 2090/373: Surgical systems with images on a monitor during operation using light, e.g. by using optical scanners

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Surgery (AREA)
  • Engineering & Computer Science (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Veterinary Medicine (AREA)
  • Public Health (AREA)
  • General Health & Medical Sciences (AREA)
  • Animal Behavior & Ethology (AREA)
  • Molecular Biology (AREA)
  • Biomedical Technology (AREA)
  • Medical Informatics (AREA)
  • Radiology & Medical Imaging (AREA)
  • Pathology (AREA)
  • Physics & Mathematics (AREA)
  • Optics & Photonics (AREA)
  • Biophysics (AREA)
  • Robotics (AREA)
  • Signal Processing (AREA)
  • Gynecology & Obstetrics (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Endoscopes (AREA)
  • Instruments For Viewing The Inside Of Hollow Bodies (AREA)

Abstract

A method of visualizing a body cavity during a surgical procedure including positioning an elongated body of an angled endoscope in a first position within a body cavity of a patient, rotating the elongated body about a longitudinal axis in response to a command point, capturing a plurality of images with an image capture device positioned within the elongated body as the elongated body is rotated, and generating a panoramic view of the body cavity from the plurality of images. In the first position of the elongated body, a surgical site is within a field of view of the image capture device. The field of view of the image capture device captures a first volume of the body cavity, including the surgical site, when the angled endoscope is in the first position.

Description

    CROSS REFERENCE TO RELATED APPLICATIONS
  • This application is a continuation of U.S. patent application Ser. No. 15/765,866 filed Apr. 4, 2018, now U.S. Pat. No. 11,123,149, which is a U.S. National Stage Application filed under 35 U.S.C. § 371(a) claiming the benefit of and priority to International Patent Application No. PCT/US2016/055396 filed Oct. 5, 2016, which claims the benefit of and priority to U.S. Provisional Patent Application Ser. No. 62/239,412 filed Oct. 9, 2015. The entire contents of these cross-referenced applications are incorporated by reference herein.
  • BACKGROUND
  • Robotic surgical systems have been used in minimally invasive medical procedures. During such a medical procedure, the robotic surgical system is controlled by a surgeon interfacing with a user interface. The user interface allows the surgeon to see and manipulate a tool that acts on a patient. The user interface includes a display and an input controller or handle that is moveable by the surgeon to control the robotic surgical system.
  • Generally, robotic surgical systems include an endoscope that is inserted through an opening of a patient to provide visualization of a surgical site within a body cavity of the patient. Current endoscopes provide a limited field of view of the surgical site. Specifically, the endoscope is directed to the surgical site where the tools are acting on tissue. This leaves a majority of the body cavity unobserved. During a medical procedure, there may be contact with non-target tissue within the body cavity that is outside the field of view of the endoscope (e.g., when tools are exchanged). As a result, observation of the non-target tissue may be desired or necessary.
  • During some medical procedures, multiple and/or specialized endoscopes are used to provide increased visualization of the body cavity. The use of multiple and/or specialized endoscopes may increase the cost of medical procedures. Further, the use of multiple endoscopes may require the endoscopes to be changed or swapped during a medical procedure. Additionally or alternatively, the use of multiple endoscopes may increase the number of openings required in the body cavity of the patient to provide visualization of the body cavity.
  • There is thus a need for a robotic surgical system that is capable of increasing visualization of a body cavity during a medical procedure utilizing a single endoscope.
  • SUMMARY
  • In an aspect of the present disclosure, a method of visualizing a body cavity during a surgical procedure includes positioning an elongated body of an angled endoscope in a first position within a body cavity of a patient, rotating the elongated body about a longitudinal axis defined by the elongated body in response to a command point, capturing a plurality of images with an image capture device as the elongated body is rotated, and generating a panoramic view of the body cavity from the plurality of images. Positioning the elongated body of the angled endoscope includes positioning a surgical site within a field of view of the image capture device. The image capture device is positioned in a distal end portion of the elongated body. The field of view of the image capture device captures a first volume of the body cavity, which includes the surgical site, when the angled endoscope is in the first position.
  • In aspects, the method includes translating the elongated body, along the longitudinal axis defined by the elongated body, away from the surgical site to a second position in response to the command point. The field of view of the image capture device at the second position may capture a second volume of the body cavity that is larger than the first volume. Rotating the elongated body about the longitudinal axis may occur when the elongated body is in the second position. The method may include returning the elongated body to the first position after generating the panoramic view of the body cavity.
  • In some aspects, the method includes initiating the command point or the method may include moving a surgical instrument within the body cavity such that the command point is initiated in response to movement of the surgical instrument. Moving the surgical instrument may include translating the surgical instrument into the body cavity. Alternatively, moving the surgical instrument may include translating the surgical instrument such that an end effector of the surgical instrument is withdrawn beyond a threshold distance from the surgical site. Withdrawing the end effector of the surgical instrument beyond the threshold distance may withdraw the end effector from the first volume. Moving the surgical instrument may include swapping the surgical instrument for a second surgical instrument.
  • In certain aspects, the method may include detecting an attribute of a clinician interfacing with a user interface of a robotic surgical system to initiate the command point. Detecting the attribute of the clinician interfacing with the user interface may include detecting a gaze of the clinician with the user interface and initiating the command point when the gaze of the clinician is not directed to a display of the user interface. Additionally or alternatively, detecting the attribute of the clinician interfacing with the user interface may include detecting movement of a portable display and initiating the command point based on predetermined movement of the portable display.
  • In particular aspects, rotating the elongated body includes pivoting the elongated body about a pitch axis orthogonal to the longitudinal axis. Additionally, rotating the elongated body may include pivoting the elongated body about a yaw axis that is orthogonal to the pitch axis and the longitudinal axis. The pitch, longitudinal, and yaw axes may intersect at a common pivot point.
  • In aspects, the method includes displaying the panoramic view on a wearable display such that movement of the wearable display updates the clinician's view of the panoramic view. The method may include interacting with the panoramic view of the body cavity to adjust the panoramic view of the body cavity. Interacting with the panoramic view of the body cavity may include panning the panoramic view of the body cavity. Additionally or alternatively, interacting with the panoramic view of the body cavity may include zooming the panoramic view of the body cavity.
  • Further details and aspects of exemplary embodiments of the present disclosure are described in more detail below with reference to the appended figures.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • Various aspects of the present disclosure are described hereinbelow with reference to the drawings, which are incorporated in and constitute a part of this specification, wherein:
  • FIG. 1 is a schematic, side view of an endoscope provided in accordance with the present disclosure;
  • FIG. 2 is a cut-away view of a body cavity of a patient with the endoscope of FIG. 1 in a first position and a surgical instrument positioned at a surgical site within a field of view of the endoscope;
  • FIG. 3 is a cut-away view of the body cavity of the patient of FIG. 2 with the surgical instrument withdrawn from the field of view of the endoscope;
  • FIG. 4 is a cut-away view of the body cavity of the patient of FIG. 2 with the endoscope in a second position and with the surgical instrument recaptured in the field of view of the endoscope;
  • FIG. 5 is a schematic illustration of a user interface and a robotic system in accordance with the present disclosure; and
  • FIG. 6 is a flow diagram of a method of viewing a body cavity of a patient in accordance with the present disclosure.
  • DETAILED DESCRIPTION
  • Embodiments of the present disclosure are now described in detail with reference to the drawings in which like reference numerals designate identical or corresponding elements in each of the several views. As used herein, the term “clinician” refers to a doctor, a nurse, or any other care provider and may include support personnel. Throughout this description, the term “proximal” refers to the portion of the device or component thereof that is closest to the clinician and the term “distal” refers to the portion of the device or component thereof that is farthest from the clinician.
  • Generally, this disclosure relates to methods and devices for visualizing a body cavity of a patient during a minimally invasive surgical procedure with an endoscope. Specifically, this disclosure details the use of an angled endoscope with a robotic surgical system that can function in the role of a 0° endoscope and provide panoramic views of a surgical site within a body cavity of a patient. As detailed below, the angled endoscope may function as a 0° endoscope to view the surgical site where a tool is acting on tissue and may be repositionable to provide a panoramic view of the body cavity surrounding the surgical site. The angled endoscope may automatically provide such a panoramic view at a predetermined command point (e.g., at predetermined time intervals or at a particular step during the surgical procedure). Additionally or alternatively, the angled endoscope may provide such a panoramic view in response to a user generated command point (e.g., at the request of a clinician). The endoscope may be used as a standalone instrument or may be used as part of a robotic surgical system.
  • Referring now to FIG. 1, an angled endoscope 10 is provided in accordance with the present disclosure and includes an elongated body 12 that defines a longitudinal axis “A-A” of the angled endoscope 10. The elongated body 12 extends to a distal end portion 14 that includes an angled distal end 16 that defines an angle “θ” from a line or plane that extends perpendicular to the longitudinal axis “A-A”. As shown, the angle “θ” is about 30°; however, the angle “θ” may be in a range of about 0° to about 60°.
  • The angled endoscope 10 includes an image capture device 18 (e.g., a camera) that captures images of the body cavity “C” (FIG. 2) through the distal end 16 of the elongated body 12. The images may be in the form of still images or video. As shown, the image capture device 18 is positioned in the distal end portion 14 of the elongated body 12; however, it is contemplated that the image capture device 18 may be positioned anywhere within the elongated body 12. It is also contemplated that the image capture device 18 may be positioned outside of the elongated body 12 and may include a fiber optic cable (not shown) positioned within the elongated body 12 to capture images through the distal end 16 of the elongated body 12. The image capture device 18 has a conical field of view “FV” through the distal end 16 of the elongated body 12 such that one edge or a first edge of the field of view “FV” is approximately parallel to the longitudinal axis “A-A” and another, opposite, or second edge of the field of view “FV” extends at an angle “α” of approximately 30° with the longitudinal axis “A-A”. It will be appreciated that for different angles “θ” the angle of the second edge of the field of view “FV” will also change.
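  • As a minimal numeric illustration of this field-of-view geometry, assuming the cone's near edge sits at about 0° and its far edge at about 30° from the longitudinal axis “A-A” (the function and variable names below are invented for this sketch and are not part of the disclosure):

```python
def swept_coverage_deg(edge_near_axis_deg: float = 0.0, edge_far_deg: float = 30.0) -> dict:
    """Illustrative geometry for the conical field of view "FV" described above.

    Assumes one edge of the cone lies at `edge_near_axis_deg` from the longitudinal
    axis "A-A" (approximately parallel to the axis) and the opposite edge lies at
    `edge_far_deg` (approximately 30 degrees for a 30-degree scope).
    """
    half_fov = (edge_far_deg - edge_near_axis_deg) / 2.0   # half-angle of the view cone
    center_tilt = edge_near_axis_deg + half_fov            # tilt of the view centerline from "A-A"
    # Rolling the scope about "A-A" sweeps the view cone around the axis, so the
    # stitched panoramic field of view "PV" can reach out to the far edge all around.
    return {
        "half_fov_deg": half_fov,
        "view_centerline_tilt_deg": center_tilt,
        "panoramic_half_angle_deg": edge_far_deg,
    }

print(swept_coverage_deg())  # {'half_fov_deg': 15.0, 'view_centerline_tilt_deg': 15.0, 'panoramic_half_angle_deg': 30.0}
```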
  • With additional reference to FIG. 2, the angled endoscope 10 is inserted through an opening “O” in a body cavity “C” of a patient for visualizing the body cavity “C” during a surgical procedure. The opening “O” may be a naturally occurring orifice or an incision. The angled endoscope 10 may be inserted through a cannula 110 that is positioned within the opening “O”.
  • During a surgical procedure, the elongated body 12 is positioned such that the longitudinal axis “A-A” passes through the surgical site “S” and the entire surgical site “S” is within the conical field of view “FV”. With particular reference to FIG. 1, the angled endoscope 10 is moveable in at least four degrees of freedom to align the longitudinal axis “A-A” with the surgical site “S” and to view the body cavity with the image capture device 18. First, the elongated body 12 is translatable in and out along the longitudinal axis “A-A”. Second, the elongated body 12 is rollable or rotatable about the longitudinal axis “A-A”. Third, the elongated body 12 is pivotable about a pitch axis that intersects and is orthogonal to the longitudinal axis “A-A” at pivot point “P”. Finally, the elongated body 12 is pivotable about a yaw axis that intersects and is orthogonal to the longitudinal axis “A-A” and orthogonal to the pitch axis at the pivot point “P”. As shown in FIG. 2, the pivot point “P” is the point where the longitudinal axis “A-A” passes through a wall “W” defining the body cavity “C”.
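  • The four degrees of freedom about the pivot point “P” can be sketched as a simple forward-kinematics helper. This is only an illustration: the choice of which axis carries pitch versus yaw, the frame conventions, and all parameter names are assumptions, not the kinematics of any particular robotic arm.

```python
import numpy as np

def scope_tip_position(insertion_mm: float, roll_deg: float, pitch_deg: float, yaw_deg: float) -> np.ndarray:
    """Position of the distal end relative to the pivot point "P".

    The scope axis "A-A" starts along +z (pointing into the body cavity) and is
    re-oriented by yaw (assumed about x) then pitch (assumed about y); roll about
    the scope's own axis does not move the tip, it only spins the angled view.
    """
    yaw, pitch = np.radians([yaw_deg, pitch_deg])
    axis = np.array([0.0, 0.0, 1.0])
    Rx = np.array([[1, 0, 0],
                   [0, np.cos(yaw), -np.sin(yaw)],
                   [0, np.sin(yaw),  np.cos(yaw)]])       # yaw about x
    Ry = np.array([[ np.cos(pitch), 0, np.sin(pitch)],
                   [0, 1, 0],
                   [-np.sin(pitch), 0, np.cos(pitch)]])   # pitch about y
    direction = Ry @ Rx @ axis
    return insertion_mm * direction  # translation along the re-oriented axis "A-A"

# Example: 120 mm insertion, pivoted 10 degrees in pitch and 5 degrees in yaw.
print(scope_tip_position(120.0, roll_deg=90.0, pitch_deg=10.0, yaw_deg=5.0))
```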
  • Continuing to refer to FIG. 2, during a surgical procedure, a surgical instrument 200 is inserted through an opening “O” in the wall “W” defining the body cavity “C”. The surgical instrument 200 may be inserted through the same opening “O” as the angled endoscope 10 or through a different opening “O” as shown in FIG. 2. The surgical instrument 200 includes an end effector 210 to act on tissue of a patient at the surgical site “S”. The elongated body 12 of the angled endoscope 10 is positioned to visualize a volume of the body cavity “C” such that the surgical site “S” and the end effector 210 of the surgical instrument 200 are within the field of view “FV” of the image capture device 18. The angled endoscope 10 may operate as a 0° endoscope to view the surgical site “S” by manipulating the elongated body 12 (e.g., pivoting the angled endoscope 10 about its yaw and pitch axes). It will be appreciated that as the distance between the image capture device 18 and the surgical site “S” is reduced, the detail of images of the surgical site “S” captured by the image capture device 18 is increased.
  • Referring to FIG. 3, during the surgical procedure the surgical instrument 200 may be withdrawn from the surgical site “S” or out of the body cavity “C” such that the end effector 210 of the surgical instrument 200 is withdrawn from the surgical site “S” and the field of view “FV” of the image capture device 18. The surgical instrument 200 may be withdrawn from the surgical site “S” for various reasons including, but not limited to, swapping the surgical instrument 200 for another surgical instrument (not shown), reloading the end effector 210, and repositioning the end effector 210. As the surgical instrument 200 is withdrawn from the field of view “FV”, the surgical instrument 200 may contact tissue within the body cavity “C” outside of the field of view “FV” of the image capture device 18.
  • With reference to FIG. 4, the angled endoscope 10 may be translated out of the body cavity “C”, along the longitudinal axis “A-A”, such that the distal end 16 of the angled endoscope 10 is moved away from the surgical site “S”. As the distal end 16 is moved away from the surgical site “S”, the field of view “FV” of the image capture device 18 encompasses a larger volume of the body cavity “C”. With this larger field of view, when the end effector 210 is withdrawn from the surgical site “S” but remains within the body cavity “C”, the end effector 210 and the tissue surrounding it stay within the field of view “FV” of the image capture device 18. In addition, the angled endoscope 10 may be rolled or rotated about the longitudinal axis “A-A” to capture a plurality of images within a panoramic field of view “PV” that includes the original field of view “FV”. This panoramic field of view “PV” is a panoramic view of the surgical site “S” created by stitching together the plurality of images captured during the rolling or rotation of the angled endoscope 10. The panoramic field of view “PV” provides visualization of the body cavity “C” surrounding the surgical site “S”, giving a clinician greater visualization of the body cavity “C”. It is contemplated that during the rolling or rotation of the angled endoscope 10, the angled endoscope 10 may also be pivoted about the pitch axis and/or the yaw axis (FIG. 1) to adjust a focal point of the image capture device 18 and to increase the panoramic field of view “PV”.
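  • The disclosure describes stitching the captured images into the panoramic field of view “PV” but does not prescribe an algorithm. As one possible sketch, OpenCV's high-level Stitcher can be used, assuming consecutive frames captured during the roll overlap enough for feature matching; the use of cv2 and the function name here are illustrative choices, not part of the disclosure.

```python
import cv2

def build_panoramic_view(frames):
    """Stitch frames captured while the angled endoscope rolls about axis "A-A".

    `frames` is a list of BGR images (numpy arrays) in capture order. Returns the
    stitched panorama, or None if stitching fails (e.g. insufficient overlap).
    """
    stitcher = cv2.Stitcher_create(cv2.Stitcher_PANORAMA)
    status, panorama = stitcher.stitch(frames)
    return panorama if status == cv2.Stitcher_OK else None
```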
  • In embodiments, the rolling or rotation of the angled endoscope 10 about the longitudinal axis “A-A” is a full rotation of 360°; however, the rotation of the angled endoscope 10 may instead be a partial rotation of less than 360°. When the angled endoscope 10 is only partially rolled or rotated to generate a panoramic field of view “PV”, the angled endoscope 10 may be rolled or rotated back to its pre-rotated position. It is contemplated that the full or partial rolling or rotation of the angled endoscope 10 has a duration of approximately 1.0 second; however, the full or partial rotation of the angled endoscope 10 may have a duration of about 0.1 seconds to about 2.0 seconds.
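  • As a worked example of what these timings imply for image capture: only the roll duration and rotation amount come from the text; the 30 fps camera frame rate is an assumption.

```python
def frames_during_roll(rotation_deg: float = 360.0, duration_s: float = 1.0, fps: float = 30.0):
    """Number of frames and their angular spacing for a roll of `rotation_deg` degrees."""
    n_frames = int(duration_s * fps)              # e.g. 30 frames for a 1.0 s roll at 30 fps
    spacing_deg = rotation_deg / max(n_frames, 1)
    return n_frames, spacing_deg

print(frames_during_roll())                 # (30, 12.0): one frame every 12 degrees
print(frames_during_roll(duration_s=2.0))   # (60, 6.0): slower roll, denser coverage
```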
  • The generation of a panoramic field of view “PV” may be initiated automatically at a predetermined command point or when a command point is initiated by a clinician. Some examples of a predetermined command point include, but are not limited to, when the end effector 210 is withdrawn from the surgical site “S”, when a surgical instrument (e.g., surgical instrument 200) is inserted through an opening “O”, when a surgical instrument is withdrawn from an opening “O”, when a surgical instrument is withdrawn from the field of view “FV”, or at time intervals.
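  • The predetermined command points listed above can be read as a simple predicate over instrument and timer state. The function below is a hypothetical sketch: its names, inputs, and the 60-second interval are assumptions rather than values from the disclosure.

```python
import time

def should_generate_panorama(end_effector_in_fov: bool,
                             instrument_inserted: bool,
                             instrument_withdrawn: bool,
                             last_panorama_time_s: float,
                             interval_s: float = 60.0) -> bool:
    """Return True when a predetermined command point is reached."""
    if not end_effector_in_fov:                      # end effector 210 left the field of view "FV"
        return True
    if instrument_inserted or instrument_withdrawn:  # instrument passed through an opening "O"
        return True
    # fall back to a periodic refresh of the panoramic view
    return (time.monotonic() - last_panorama_time_s) >= interval_s
```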
  • Referring to FIG. 5, the angled endoscope 10 and the surgical instrument 200 may be part of a robotic surgical system 301 in accordance with the present disclosure. The robotic surgical system 301 includes a robotic system 310, a processing unit 330, and a user interface 340. The robotic system 310 generally includes linkages 312 and a robot base 318. The linkages 312 moveably support a tool (e.g., angled endoscope 10 or surgical instrument 200). The linkages 312 may be in the form of arms each having an end 314 that supports a tool. The user interface 340 is in communication with the robot base 318 through the processing unit 330.
  • The user interface 340 includes a display device 344 which is configured to display three-dimensional images. The display device 344 displays three-dimensional images of the surgical site “S” which may include data captured by imaging devices positioned on the ends 314 of the linkages 312 (e.g., angled endoscope 10) and/or include data captured by imaging devices that are positioned about the surgical theater (e.g., an imaging device positioned within the surgical site “S”, an imaging device positioned adjacent the patient, imaging device 356 positioned at a distal end of an imaging arm 352). The imaging devices may capture visual images, infra-red images, ultrasound images, X-ray images, thermal images, and/or any other known real-time images of the surgical site “S”. The imaging devices transmit captured imaging data to the processing unit 330 which creates three-dimensional images of the surgical site “S” in real-time from the imaging data and transmits the three-dimensional images to the display device 344 for display.
  • The user interface 340 also includes input arms or handles 342 which allow a clinician to manipulate the robotic system 310 (e.g., move the linkages 312, the ends 314 of the linkages 312, and/or the tools). Each of the input handles 342 is in communication with the processing unit 330 to transmit control signals thereto and to receive feedback signals therefrom. Each of the input handles 342 may include an input device which allows the surgeon to manipulate (e.g., clamp, grasp, fire, open, close, rotate, thrust, slice, etc.) the tools supported at the ends 314 of the linkages 312.
  • During a surgical procedure, the robotic system 310 may manipulate the linkages 312 to operate the angled endoscope 10 as a 0° endoscope while a clinician is engaged with the user interface 340. When a clinician disengages from the user interface 340 (e.g., when the clinician releases the input handles 342 or when the clinician looks away from the display 344), the user interface 340 may generate a command point such that the processing unit 330 sends a signal to the robotic system 310 to generate a panoramic view of the surgical site “S” with the angled endoscope 10, as detailed above. Additionally, the robotic system 310 may use kinematic tracking to generate a command point when a tool (e.g., surgical instrument 200) completes a particular movement (e.g., when the tool is withdrawn beyond a threshold distance, or when one tool is exchanged for another tool).
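  • As a sketch of the kinematic-tracking trigger, the check can be as simple as comparing the tool-tip position reported by the arm's forward kinematics against the stored surgical-site location; the 50 mm threshold and the function name below are assumptions for illustration.

```python
import numpy as np

def withdrawn_beyond_threshold(tool_tip_xyz, surgical_site_xyz, threshold_mm: float = 50.0) -> bool:
    """True when the end effector has been withdrawn beyond a threshold distance
    from the surgical site "S", which may be used to generate a command point."""
    offset = np.asarray(tool_tip_xyz, dtype=float) - np.asarray(surgical_site_xyz, dtype=float)
    return float(np.linalg.norm(offset)) > threshold_mm

# Example: a tool tip 80 mm from the site triggers the command point.
print(withdrawn_beyond_threshold([0, 0, 80.0], [0, 0, 0]))  # True
```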
  • The user interface 340 then displays the panoramic view of the surgical site “S” on the display 344. As detailed above, the display 344 may be a 3D display such that the panoramic view of the surgical site “S” is displayed in 3D. The display 344 may be an interactive display such that a clinician may pan, rotate, zoom in, and/or zoom out of areas of interest within the panoramic view of the surgical site. Additionally or alternatively, it is contemplated that the display 344 may include a display helmet 344a such that movement of the clinician's head allows the clinician to interact with the panoramic view of the surgical site “S”. The helmet 344a may use inertial tracking to detect movement of the clinician's head. Further, it is contemplated that the user interface 340 may include a portable display or monitor 344b that is moveable relative to the display 344. The portable display 344b displays a view of the surgical site “S” and may use inertial tracking to update the view of the surgical site “S” on the portable display 344b as the portable display 344b is moved relative to the display 344. In addition, a clinician may interact with the portable display 344b to update the view of the surgical site “S” on the portable display 344b. It is also contemplated that the portable display 344b may be used without the display 344.
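  • One way to realize the inertial-tracking behavior described above is to map the tracked yaw and pitch of the helmet 344a or portable display 344b to a window cropped from the stitched panorama. The equirectangular layout, angular spans, and all parameter names below are assumptions made for this sketch.

```python
import numpy as np

def view_window(panorama: np.ndarray, yaw_deg: float, pitch_deg: float,
                win_w: int = 640, win_h: int = 480) -> np.ndarray:
    """Crop a viewing window out of a panorama based on a tracked head/display pose.

    Assumes the panorama spans 360 degrees of yaw horizontally and 90 degrees of
    pitch vertically, so each degree maps to a fixed number of pixels.
    """
    h, w = panorama.shape[:2]
    cx = int((yaw_deg % 360.0) / 360.0 * w)                      # horizontal centre from yaw
    cy = int(np.clip(pitch_deg + 45.0, 0.0, 90.0) / 90.0 * h)    # vertical centre from pitch
    x0, y0 = cx - win_w // 2, cy - win_h // 2
    xs = np.arange(x0, x0 + win_w) % w                 # wrap horizontally
    ys = np.clip(np.arange(y0, y0 + win_h), 0, h - 1)  # clamp vertically
    return panorama[np.ix_(ys, xs)]
```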
  • Referring now to FIG. 6, a method 400 of visualizing a body cavity during a surgical procedure is described in accordance with the present disclosure utilizing an angled endoscope (e.g., angled endoscope 10). Initially, the angled endoscope is positioned within a body cavity “C” of a patient (Step 410). During the surgical procedure the angled endoscope may be operated as a 0° endoscope (Step 420). The angled endoscope may be operated as a 0° endoscope by manipulating the angled endoscope about its pitch, yaw, and longitudinal axes as detailed above.
  • During the surgical procedure, a command point is generated (Step 430). In response to the command point, the angled endoscope is rotated about its longitudinal axis (Step 440). The angled endoscope may be withdrawn within the body cavity “C” before the angled endoscope is rotated and, after the rotation, returned to the position it occupied before being withdrawn (Step 444). As the angled endoscope is rotated, an image capture device (e.g., image capture device 18) disposed within the angled endoscope captures a plurality of images (Step 450). A panoramic view of the body cavity “C” is generated from the plurality of images (Step 460). The panoramic view is displayed to a clinician as detailed above (Step 470). The clinician may interact with the panoramic view as detailed above.
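  • For reference, the flow of method 400 can be condensed into a short driver in which each step is a caller-supplied callable; this is only a structural sketch, and every function passed in stands in for a robot, vision, or display service that the disclosure leaves unspecified.

```python
def method_400(position_scope, operate_as_zero_degree, command_point_reached,
               rotate_and_capture, stitch, show):
    """Steps 410-470 as a single pass, with placeholder callables for each service."""
    position_scope()                   # Step 410: place the angled endoscope in the body cavity
    operate_as_zero_degree()           # Step 420: view the surgical site as a 0-degree scope
    if command_point_reached():        # Step 430
        frames = rotate_and_capture()  # Steps 440/444/450: roll about the axis and capture images
        panorama = stitch(frames)      # Step 460
        show(panorama)                 # Step 470

# Example wiring with trivial stand-ins:
method_400(lambda: None, lambda: None, lambda: True,
           lambda: ["frame1", "frame2"], lambda f: f, print)
```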
  • While several embodiments of the disclosure have been shown in the drawings, it is not intended that the disclosure be limited thereto, as it is intended that the disclosure be as broad in scope as the art will allow and that the specification be read likewise. Any combination of the above embodiments is also envisioned and is within the scope of the appended claims. Therefore, the above description should not be construed as limiting, but merely as exemplifications of particular embodiments. Those skilled in the art will envision other modifications within the scope of the claims appended hereto.

Claims (21)

1-20. (canceled)
21. A method of visualizing a body cavity during a surgical procedure, the method comprising:
positioning an elongated body of an angled endoscope in a first position within a body cavity of a patient such that a surgical site and an end effector of a surgical instrument are within a field of view of an image capture device positioned in a distal end portion of the elongated body;
capturing a first plurality of images with the image capture device in the first position;
translating the elongated body of the angled endoscope from the first position to a second position within the body cavity in response to a command point;
rotating the elongated body about a longitudinal axis thereof in the second position;
capturing a second plurality of images with the image capture device in the second position as the elongated body is rotated; and
generating a panoramic view of the body cavity from the second plurality of images.
22. The method according to claim 21, wherein the end effector is not within the field of view in the second position.
23. The method according to claim 21, further comprising:
generating the command point when the end effector is not within the field of view.
24. The method according to claim 23, further comprising swapping the end effector with a second end effector when the end effector is not within the field of view.
25. The method according to claim 23, further comprising introducing another surgical instrument to the surgical site when the surgical instrument is not within the field of view.
26. The method according to claim 21, further comprising:
returning the elongated body to the first position after the panoramic view of the body cavity has been generated from the second plurality of images.
27. The method according to claim 21, wherein the elongated body is pivoted at a pivot point and pivotable about a pitch axis orthogonal to the longitudinal axis.
28. The method according to claim 27, wherein the elongated body is pivotable about a yaw axis that is orthogonal to the pitch axis and the longitudinal axis, and
wherein the pitch, longitudinal, and yaw axes intersect at the pivot point.
29. The method according to claim 21, wherein the panoramic view of the body cavity is panned or zoomed.
30. A surgical system comprising:
a surgical instrument;
an end effector configured to act on tissue at a surgical site; and
an endoscope having an elongated body, configured to be positioned in a first position within a body cavity of a patient such that the surgical site and the end effector are within a field of view of an image capture device positioned in a distal end portion of the elongated body, and configured to capture a first plurality of images of the body cavity by the image capture device in the first position,
wherein the elongated body of the endoscope is translated from the first position to a second position and rotated about a longitudinal axis thereof in the second position within the body cavity in response to a command point,
wherein the image capture device captures a second plurality of images in the second position as the elongated body is rotated, and
wherein the surgical system generates a panoramic view from the second plurality of images.
31. The surgical system according to claim 30, wherein the end effector is not within the field of view in the second position.
32. The surgical system according to claim 30, wherein the command point is generated when the end effector is not within the field of view.
33. The surgical system according to claim 32, wherein the end effector is not within the field of view when the end effector is swapped with a second end effector.
34. The surgical system according to claim 32, wherein the surgical instrument is not within the field of view when another surgical instrument is introduced to the surgical site.
35. The surgical system according to claim 30, wherein the elongated body is returned to the first position after the panoramic view of the body cavity has been generated from the second plurality of images.
36. The surgical system according to claim 30, wherein the elongated body includes a pivot point and is pivotable about a pitch axis orthogonal to the longitudinal axis.
37. The surgical system according to claim 36, wherein the elongated body is pivotable about a yaw axis that is orthogonal to the pitch axis and the longitudinal axis, and
wherein the pitch, longitudinal, and yaw axes intersect at the pivot point.
38. The surgical system according to claim 30, further comprising a user interface including a wearable display configured to display the panoramic view such that movement of the wearable display updates a view of a clinician of the panoramic view.
39. The surgical system according to claim 30, wherein the panoramic view of the body cavity is panned or zoomed.
40. A robotic surgical system comprising:
a first linkage connected with an end effector;
a second linkage connected with an endoscope having an elongated body, the endoscope being configured to be positioned in a first position within a body cavity of a patient such that a surgical site and the end effector are within a field of view of an image capture device positioned in a distal end portion of the elongated body and configured to capture a first plurality of images of the body cavity by the image capture device in the first position; and
a robot base providing a base for the first and second linkages,
wherein the elongated body of the endoscope is translated from the first position to a second position and rotated about a longitudinal axis thereof in the second position within the body cavity in response to a command point,
wherein the image capture device captures a second plurality of images in the second position as the elongated body is rotated, and
wherein the system generates a panoramic view from the second plurality of images.
US17/479,548 2015-10-09 2021-09-20 Methods of using an angled endoscope for visualizing a body cavity with robotic surgical systems Pending US20220000579A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US17/479,548 US20220000579A1 (en) 2015-10-09 2021-09-20 Methods of using an angled endoscope for visualizing a body cavity with robotic surgical systems

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US201562239412P 2015-10-09 2015-10-09
PCT/US2016/055396 WO2017062393A2 (en) 2015-10-09 2016-10-05 Methods of using an angled endoscopic for visualizing a body cavity with robotic surgical systems
US201815765866A 2018-04-04 2018-04-04
US17/479,548 US20220000579A1 (en) 2015-10-09 2021-09-20 Methods of using an angled endoscope for visualizing a body cavity with robotic surgical systems

Related Parent Applications (2)

Application Number Title Priority Date Filing Date
US15/765,866 Continuation US11123149B2 (en) 2015-10-09 2016-10-05 Methods of using an angled endoscope for visualizing a body cavity with robotic surgical systems
PCT/US2016/055396 Continuation WO2017062393A2 (en) 2015-10-09 2016-10-05 Methods of using an angled endoscopic for visualizing a body cavity with robotic surgical systems

Publications (1)

Publication Number Publication Date
US20220000579A1 true US20220000579A1 (en) 2022-01-06

Family ID=58488392

Family Applications (2)

Application Number Title Priority Date Filing Date
US15/765,866 Active 2037-08-23 US11123149B2 (en) 2015-10-09 2016-10-05 Methods of using an angled endoscope for visualizing a body cavity with robotic surgical systems
US17/479,548 Pending US20220000579A1 (en) 2015-10-09 2021-09-20 Methods of using an angled endoscope for visualizing a body cavity with robotic surgical systems

Family Applications Before (1)

Application Number Title Priority Date Filing Date
US15/765,866 Active 2037-08-23 US11123149B2 (en) 2015-10-09 2016-10-05 Methods of using an angled endoscope for visualizing a body cavity with robotic surgical systems

Country Status (5)

Country Link
US (2) US11123149B2 (en)
EP (1) EP3359075B1 (en)
JP (1) JP6886968B2 (en)
CN (1) CN108882964B (en)
WO (1) WO2017062393A2 (en)

Families Citing this family (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2018075218A (en) * 2016-11-10 2018-05-17 ソニー株式会社 Medical support arm and medical system
WO2019036006A1 (en) * 2017-08-16 2019-02-21 Covidien Lp Synthesizing spatially-aware transitions between multiple camera viewpoints during minimally invasive surgery
US11607283B2 (en) * 2019-01-01 2023-03-21 Asensus Surgical Us, Inc. Dynamic control of surgical instruments in a surgical system using repulsion/attraction modes
JP2020162633A (en) * 2019-03-28 2020-10-08 ソニー株式会社 Imaging control device, imaging control method, program and imaging system
CN113014871B (en) * 2021-02-20 2023-11-10 青岛小鸟看看科技有限公司 Endoscopic image display method and device and endoscopic surgery auxiliary system
CN113855257B (en) * 2021-11-05 2024-03-12 佗道医疗科技有限公司 Self-adaptive adjusting method for endoscope pose
CN114565881B (en) * 2022-04-28 2022-07-12 成都与睿创新科技有限公司 Method and system for distinguishing different scenes inside and outside body cavity
CN117221177B (en) * 2023-11-08 2024-01-09 湖南省华芯医疗器械有限公司 Image transmission delay monitoring method and system

Family Cites Families (51)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4248213A (en) * 1979-08-13 1981-02-03 Syn-Optics Articulated optical coupler
JP3063784B2 (en) 1991-03-26 2000-07-12 オリンパス光学工業株式会社 Endoscope device
US5159446A (en) 1991-06-21 1992-10-27 Olympus Optical Co., Ltd. Electronic endoscope system provided with a separate camera controlling unit and motor controlling unit
JP3506809B2 (en) * 1995-06-08 2004-03-15 オリンパス株式会社 Body cavity observation device
US6081336A (en) * 1997-09-26 2000-06-27 Picker International, Inc. Microscope calibrator
FR2783610B1 (en) 1998-09-18 2000-11-24 Tokendo Sarl RIGID ROTARY ENDOSCOPE WITH DEVIED DISTAL VIEW AND ADJUSTABLE PROXIMAL FOCUS
US6659939B2 (en) * 1998-11-20 2003-12-09 Intuitive Surgical, Inc. Cooperative minimally invasive telesurgical system
US6522906B1 (en) * 1998-12-08 2003-02-18 Intuitive Surgical, Inc. Devices and methods for presenting and regulating auxiliary information on an image display of a telesurgical system to assist an operator in performing a surgical procedure
US6466815B1 (en) * 1999-03-30 2002-10-15 Olympus Optical Co., Ltd. Navigation apparatus and surgical operation image acquisition/display apparatus using the same
US8229549B2 (en) * 2004-07-09 2012-07-24 Tyco Healthcare Group Lp Surgical imaging device
US20060178556A1 (en) 2001-06-29 2006-08-10 Intuitive Surgical, Inc. Articulate and swapable endoscope for a surgical robot
US6537029B1 (en) 2001-10-08 2003-03-25 Huang Chen-Lung Electric fan capable to rotate for 360 degrees
FR2832516B1 (en) 2001-11-19 2004-01-23 Tokendo Sarl ROTARY ENDOSCOPES WITH A DEVIED DISTAL VIEW
DE50310846D1 (en) 2002-02-05 2009-01-15 Kersten Zaar Endoscope with side view optics
JP2003279862A (en) 2002-03-25 2003-10-02 Machida Endscope Co Ltd Omnidirectional endoscopic device
JP4009639B2 (en) * 2002-07-31 2007-11-21 オリンパス株式会社 Endoscope device, endoscope device navigation method, endoscope image display method, and endoscope image display program
US7559890B2 (en) 2003-02-26 2009-07-14 Ikona Medical Corporation Endoscopic imaging of an organ system
US7744528B2 (en) 2003-02-26 2010-06-29 Infinite Biomedical Technologies, Llc Methods and devices for endoscopic imaging
US7381183B2 (en) 2003-04-21 2008-06-03 Karl Storz Development Corp. Method for capturing and displaying endoscopic maps
US7232409B2 (en) * 2003-11-20 2007-06-19 Karl Storz Development Corp. Method and apparatus for displaying endoscopic images
US8277373B2 (en) 2004-04-14 2012-10-02 Usgi Medical, Inc. Methods and apparaus for off-axis visualization
JP2008512217A (en) 2004-09-10 2008-04-24 ジンテック メディカル、インク. Flexible video endoscope extension and method
CA2533549C (en) 2005-01-21 2009-12-15 Karl Storz Development Corp. Variable direction of view instrument with distal image sensor
US7967742B2 (en) * 2005-02-14 2011-06-28 Karl Storz Imaging, Inc. Method for using variable direction of view endoscopy in conjunction with image guided surgical systems
EP2037794B1 (en) * 2006-06-13 2021-10-27 Intuitive Surgical Operations, Inc. Minimally invasive surgical system
US20090138025A1 (en) * 2007-05-04 2009-05-28 Hansen Medical, Inc. Apparatus systems and methods for forming a working platform of a robotic instrument system by manipulation of components having controllably rigidity
FR2920084B1 (en) * 2007-08-24 2010-08-20 Endocontrol IMAGING SYSTEM FOR MONITORING A SURGICAL TOOL IN AN OPERATIVE FIELD
US8360964B2 (en) 2007-12-10 2013-01-29 Stryker Corporation Wide angle HDTV endoscope
US8808164B2 (en) * 2008-03-28 2014-08-19 Intuitive Surgical Operations, Inc. Controlling a robotic surgical tool with a display monitor
JP5175639B2 (en) * 2008-06-30 2013-04-03 富士フイルム株式会社 Endoscope and its assembly method
WO2010080991A2 (en) * 2009-01-09 2010-07-15 Washington University In St. Louis Miniaturized photoacoustic imaging apparatus including a rotatable reflector
US10004387B2 (en) * 2009-03-26 2018-06-26 Intuitive Surgical Operations, Inc. Method and system for assisting an operator in endoscopic navigation
EP2236104B1 (en) * 2009-03-31 2013-06-19 BrainLAB AG Medicinal navigation image output with virtual primary images and real secondary images
EP2470089B1 (en) * 2009-11-13 2018-08-01 Intuitive Surgical Operations, Inc. Curved cannula and robotic manipulator
JP5380348B2 (en) * 2010-03-31 2014-01-08 富士フイルム株式会社 System, method, apparatus, and program for supporting endoscopic observation
DE102010041857A1 (en) * 2010-10-01 2012-04-05 Olympus Winter & Ibe Gmbh stereo endoscope
US9814369B2 (en) * 2011-05-13 2017-11-14 Covidien Lp Pivoting three-dimensional video endoscope
US20130250081A1 (en) * 2012-03-21 2013-09-26 Covidien Lp System and method for determining camera angles by using virtual planes derived from actual images
WO2014061553A1 (en) * 2012-10-18 2014-04-24 オリンパスメディカルシステムズ株式会社 Image processing device, and image processing method
JP2014095953A (en) 2012-11-07 2014-05-22 Tokyo Institute Of Technology Operation system for operation object device and operation input device
JP6205125B2 (en) * 2012-12-11 2017-09-27 オリンパス株式会社 Endoscope device insertion support information detection system and endoscope device
KR101740168B1 (en) * 2012-12-25 2017-05-25 가와사끼 쥬고교 가부시끼 가이샤 Surgical robot
JP6076492B2 (en) * 2012-12-28 2017-02-08 オリンパス株式会社 Stereoscopic endoscope
WO2014121116A2 (en) 2013-02-01 2014-08-07 Deka Products Limited Partnership Endoscope with pannable camera
US9948852B2 (en) * 2013-03-15 2018-04-17 Intuitive Surgical Operations, Inc. Intelligent manual adjustment of an image control element
JP6091370B2 (en) 2013-07-26 2017-03-08 オリンパス株式会社 Medical system and medical instrument control method
US10945796B2 (en) * 2014-02-12 2021-03-16 Koninklijke Philips N.V. Robotic control of surgical instrument visibility
US10548459B2 (en) * 2014-03-17 2020-02-04 Intuitive Surgical Operations, Inc. Systems and methods for control of imaging instrument orientation
JP6644699B2 (en) * 2014-03-19 2020-02-12 インテュイティブ サージカル オペレーションズ, インコーポレイテッド Medical devices, systems and methods using gaze tracking
WO2015172021A1 (en) * 2014-05-09 2015-11-12 Nazareth Godfrey Portable surgical methods, systems, and apparatus
JP6626110B2 (en) * 2014-09-04 2019-12-25 メミック イノベーティブ サージェリー リミテッドMemic Innovative Surgery Ltd. Devices and systems including mechanical arms

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20060281971A1 (en) * 2005-06-14 2006-12-14 Siemens Corporate Research Inc Method and apparatus for minimally invasive surgery using endoscopes
US20070142824A1 (en) * 2005-06-30 2007-06-21 Intuitive Surgical Inc. Indicator for tool state and communication in multi-arm robotic telesurgery
US20090245600A1 (en) * 2008-03-28 2009-10-01 Intuitive Surgical, Inc. Automated panning and digital zooming for robotic surgical systems
WO2012001549A1 (en) * 2010-06-30 2012-01-05 Koninklijke Philips Electronics N.V. Robotic control of an oblique endoscope for fov images
US20180325604A1 (en) * 2014-07-10 2018-11-15 M.S.T. Medical Surgery Technologies Ltd Improved interface for laparoscopic surgeries - movement gestures

Also Published As

Publication number Publication date
US20180280110A1 (en) 2018-10-04
JP6886968B2 (en) 2021-06-16
EP3359075A2 (en) 2018-08-15
US11123149B2 (en) 2021-09-21
CN108882964B (en) 2021-10-22
JP2018534975A (en) 2018-11-29
CN108882964A (en) 2018-11-23
WO2017062393A2 (en) 2017-04-13
EP3359075A4 (en) 2019-09-18
EP3359075B1 (en) 2021-08-18

Similar Documents

Publication Publication Date Title
US20220000579A1 (en) Methods of using an angled endoscope for visualizing a body cavity with robotic surgical systems
JP7248554B2 (en) Systems and methods for controlling the orientation of an imaging instrument
US20230218356A1 (en) Systems and methods for projecting an endoscopic image to a three-dimensional volume
US20240108428A1 (en) Console overlay and methods of using same
JP5372225B2 (en) Tool position and identification indicator displayed in the border area of the computer display screen
JP2018538036A (en) Reconfigurable end effector architecture
US20130325033A1 (en) Multi-port surgical robotic system architecture
US20230172679A1 (en) Systems and methods for guided port placement selection
US20210186558A1 (en) Imaging cannula with a hinged tip
JPH0630896A (en) Surgical treatment method and apparatus
CN114364334A (en) Robot arm with extendable prismatic links
US20230277262A1 (en) Visual detection of electrocautery arcing
Abdurahiman et al. Human-computer interfacing for control of angulated scopes in robotic scope assistant systems
EP3920821A1 (en) Hand eye coordination system for robotic surgical system
Ko et al. Compact laparoscopic assistant robot using a bending mechanism
Ryu et al. An active endoscope with small sweep volume that preserves image orientation for arthroscopic surgery
US20200205931A1 (en) Hooked surgery camera

Legal Events

Date Code Title Description
AS Assignment

Owner name: COVIDIEN LP, MASSACHUSETTS

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:MEGLAN, DWIGHT;REEL/FRAME:057542/0081

Effective date: 20180403

STPP Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER

STPP Information on status: patent application and granting procedure in general

Free format text: FINAL REJECTION MAILED