US11000339B2 - System and apparatus for positioning an instrument in a body cavity for performing a surgical procedure - Google Patents
- Publication number
- US11000339B2 (application number US16/273,442)
- Authority
- US
- United States
- Prior art keywords
- instrument
- body cavity
- camera
- envelope
- display
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active, expires
Links
- 238000001356 surgical procedure Methods 0.000 title claims abstract description 18
- 239000002131 composite material Substances 0.000 claims abstract description 53
- 238000000034 method Methods 0.000 claims abstract description 52
- 238000002432 robotic surgery Methods 0.000 claims abstract description 34
- 230000003466 anticipated effect Effects 0.000 claims description 16
- 230000008569 process Effects 0.000 claims description 15
- 230000004044 response Effects 0.000 claims description 12
- 238000012545 processing Methods 0.000 claims description 8
- 230000002401 inhibitory effect Effects 0.000 claims description 2
- 238000003780 insertion Methods 0.000 abstract description 103
- 230000037431 insertion Effects 0.000 abstract description 103
- 239000012636 effector Substances 0.000 description 12
- 210000003484 anatomy Anatomy 0.000 description 9
- 230000006870 function Effects 0.000 description 8
- 210000001672 ovary Anatomy 0.000 description 8
- 239000011165 3D composite Substances 0.000 description 7
- 238000005286 illumination Methods 0.000 description 7
- 238000004891 communication Methods 0.000 description 6
- 210000001015 abdomen Anatomy 0.000 description 5
- 238000010586 diagram Methods 0.000 description 4
- 210000000056 organ Anatomy 0.000 description 4
- 210000003101 oviduct Anatomy 0.000 description 4
- 210000004291 uterus Anatomy 0.000 description 4
- 229910052799 carbon Inorganic materials 0.000 description 2
- 210000001072 colon Anatomy 0.000 description 2
- 230000001419 dependent effect Effects 0.000 description 2
- 235000004522 Pentaglottis sempervirens Nutrition 0.000 description 1
- 230000003187 abdominal effect Effects 0.000 description 1
- 238000012084 abdominal surgery Methods 0.000 description 1
- 230000001154 acute effect Effects 0.000 description 1
- 238000013528 artificial neural network Methods 0.000 description 1
- 230000009286 beneficial effect Effects 0.000 description 1
- 210000000746 body region Anatomy 0.000 description 1
- 229910002092 carbon dioxide Inorganic materials 0.000 description 1
- 239000001569 carbon dioxide Substances 0.000 description 1
- 239000003086 colorant Substances 0.000 description 1
- 230000000881 depressing effect Effects 0.000 description 1
- 239000012530 fluid Substances 0.000 description 1
- 210000004247 hand Anatomy 0.000 description 1
- 230000008676 import Effects 0.000 description 1
- 238000002955 isolation Methods 0.000 description 1
- 238000002357 laparoscopic surgery Methods 0.000 description 1
- 238000010801 machine learning Methods 0.000 description 1
- 230000007246 mechanism Effects 0.000 description 1
- 239000013307 optical fiber Substances 0.000 description 1
- 230000010287 polarization Effects 0.000 description 1
- 238000012552 review Methods 0.000 description 1
- 238000006467 substitution reaction Methods 0.000 description 1
- 238000012549 training Methods 0.000 description 1
- 230000009466 transformation Effects 0.000 description 1
- 230000000007 visual effect Effects 0.000 description 1
- 210000000707 wrist Anatomy 0.000 description 1
Images
Classifications
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/00002—Operational features of endoscopes
- A61B1/00004—Operational features of endoscopes characterised by electronic signal processing
- A61B1/00009—Operational features of endoscopes characterised by electronic signal processing of image signals during a use of endoscope
- A61B1/000094—Operational features of endoscopes characterised by electronic signal processing of image signals during a use of endoscope extracting biological structures
- A61B1/000096—Operational features of endoscopes characterised by electronic signal processing of image signals during a use of endoscope using artificial intelligence
- A61B1/00043—Operational features of endoscopes provided with output arrangements
- A61B1/00045—Display arrangement
- A61B1/0005—Display arrangement combining images e.g. side-by-side, superimposed or tiled
- A61B1/00147—Holding or positioning arrangements
- A61B1/00149—Holding or positioning arrangements using articulated arms
- A61B1/313—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor for introducing through surgical openings, e.g. laparoscopes
- A61B17/00—Surgical instruments, devices or methods, e.g. tourniquets
- A61B17/02—Surgical instruments, devices or methods, e.g. tourniquets for holding wounds open; Tractors
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/10—Computer-aided planning, simulation or modelling of surgical operations
- A61B34/20—Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
- A61B34/25—User interfaces for surgical systems
- A61B34/30—Surgical robots
- A61B34/37—Master-slave robots
- A61B34/70—Manipulators specially adapted for use in surgery
- A61B34/74—Manipulators with manual electric input means
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/03—Automatic limiting or abutting means, e.g. for safety
- A61B90/30—Devices for illuminating a surgical field, the devices having an interrelation with other surgical devices or with a surgical procedure
- A61B90/36—Image-producing devices or illumination devices not otherwise provided for
- A61B90/37—Surgical systems with images on a monitor during operation
- A61B2034/2046—Tracking techniques
- A61B2034/2055—Optical tracking systems
- A61B2034/2057—Details of tracking cameras
- A61B2034/2065—Tracking using image or pattern recognition
- A61B2034/302—Surgical robots specifically adapted for manipulations within body cavities, e.g. within abdominal or thoracic cavities
- A61B2090/306—Devices for illuminating a surgical field, the devices having an interrelation with other surgical devices or with a surgical procedure using optical fibres
- A61B2090/309—Devices for illuminating a surgical field, the devices having an interrelation with other surgical devices or with a surgical procedure using white LEDs
- A61B2090/371—Surgical systems with images on a monitor during operation with simultaneous use of two cameras
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/70—Determining position or orientation of objects or cameras
- G06T7/73—Determining position or orientation of objects or cameras using feature-based methods
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10016—Video; Image sequence
- G06T2207/10021—Stereoscopic video; Stereoscopic image sequence
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30004—Biomedical image processing
Definitions
- This disclosure relates generally to robotic surgery systems and more particularly to insertion of an instrument into a body cavity of a patient for performing a surgical procedure using the robotic surgery system.
- When performing surgery using a robotic surgical system, instruments are usually inserted into a body cavity of a patient.
- The insertion process carries some risk, since instruments may inadvertently damage organs and/or tissue while being inserted. Incorrect positioning of the instruments in the body cavity may also result in a limited range of motion within the body cavity.
- Typically, an incision is made in the patient's body wall, and a trocar or other access port may then be inserted through the incision.
- A camera is first inserted through the access port and used by the surgeon to capture and relay stereoscopic images of the surgical site. Instruments are usually inserted following the camera insertion. Views provided by the camera facilitate its positioning to focus on the surgical site; however, it may not be evident to the surgeon how far the instruments will extend into the body cavity when inserted. For example, scaling of the views provided by the camera may lead to the mistaken belief that there is more clearance from sensitive anatomy than is actually available.
- The surgeon may thus be able to determine the depth to which the instrument will extend into the body cavity only by an educated guess.
- As a result, the camera and/or instruments may be positioned such that the surgical workspace within which the instruments are capable of reaching is less than optimal.
- A method is disclosed for controlling insertion of an instrument into a body cavity of an animal for performing a surgical procedure using a robotic surgery system, the robotic surgery system being controlled by a processor circuit.
- The method involves, by the processor circuit, receiving body cavity image data, the body cavity image data being captured by a camera inserted into the body cavity and representing an interior view of the body cavity, and determining instrument parameters associated with physical extents of the instrument to be inserted into the body cavity.
- The method further involves determining, by the processor circuit, an instrument envelope based on the instrument parameters, the instrument envelope identifying a region through which the instrument is capable of moving when inserted into the body cavity.
- The method also involves generating, by the processor circuit, display signals operable to display a composite view of the interior of the body cavity on a display associated with the robotic surgery system, the composite view being based on the body cavity image data and including an envelope overlay image generated to represent the instrument envelope.
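The envelope determination described above can be sketched in a few lines: given the instrument's physical extents, approximate the reachable region as a cone with its apex at the anticipated insertion point. This is an illustrative simplification, not the patent's method; the parameter names and the cone model are assumptions.

```python
import math
from dataclasses import dataclass

@dataclass
class InstrumentParams:
    """Hypothetical instrument parameters; a real system would read these
    from the instrument's stored physical description."""
    reach_mm: float   # maximum longitudinal extent past the insertion point
    sweep_deg: float  # total angular range of the articulated wrist

def instrument_envelope(params: InstrumentParams):
    """Approximate the reachable region as a cone whose apex sits at the
    anticipated insertion point; returns (axial_depth_mm, base_radius_mm)."""
    half_angle = math.radians(params.sweep_deg / 2.0)
    base_radius = params.reach_mm * math.sin(half_angle)
    axial_depth = params.reach_mm * math.cos(half_angle)
    return axial_depth, base_radius
```

A renderer would then rasterize this cone into the envelope overlay image at the anticipated insertion location.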
- Determining the instrument envelope may involve determining an instrument envelope identifying a region through which the instrument is capable of physically moving when inserted into the body cavity.
- Determining the instrument envelope may involve determining a physical reach of the instrument in at least one degree of freedom associated with the instrument prior to insertion of the instrument into the body cavity.
- Receiving the body cavity image data may involve receiving body cavity image data including three-dimensional spatial information, and processing the body cavity image data may involve processing the body cavity image data to determine a three-dimensional instrument envelope.
- Generating display signals may involve generating three-dimensional display signals operable to display a three-dimensional composite view of the interior of the body cavity on a three-dimensional display device associated with the robotic surgery system.
- Receiving the body cavity image data may involve receiving body cavity image data from at least one of: a stereoscopic camera including a pair of image sensors, each image sensor being operably configured to capture an image of the interior of the body cavity from a different perspective to facilitate determination of the three-dimensional spatial information; a time of flight camera operably configured to generate image data including the three-dimensional spatial information; or a camera in combination with a structured light source for illuminating the interior of the body cavity to facilitate determination of the three-dimensional spatial information.
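Each of these sensing options yields three-dimensional spatial information. For the stereoscopic case, depth follows from the standard disparity relation Z = f·B/d; a minimal sketch (the focal length and baseline figures in the test are illustrative assumptions, not values from the patent):

```python
def depth_from_disparity(disparity_px: float, focal_px: float,
                         baseline_mm: float) -> float:
    """Pinhole stereo relation Z = f * B / d: structures nearer the camera
    produce a larger disparity between the two image sensors."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a visible point")
    return focal_px * baseline_mm / disparity_px
```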
- Processing the body cavity image to determine the instrument envelope may be based on determining an anticipated insertion location and orientation for the instrument relative to the camera.
- Determining the anticipated insertion location and orientation for the instrument may involve generating the location and orientation as an offset with respect to the camera based on the instrument parameters.
- The camera may be coupled to a first manipulator and the instrument may be coupled to a second manipulator, the first and second manipulators being associated with the robotic surgery system, and the method may further involve determining a spatial disposition of the camera and determining the anticipated insertion location and orientation for the instrument by receiving kinematic information associated with the first and second manipulators and determining the anticipated insertion location and orientation for the instrument as offsets from the respective orientations and locations of the first and second manipulators.
- The instrument parameters may provide information for determining a possible physical extent of the instrument into the body cavity.
- The camera may be initially detached from the robotic surgery system, facilitating positioning of the camera by hand to receive a desired interior view of the body cavity based on the composite view, and the method may further involve connecting the camera to the robotic surgery system once the positioning is completed to facilitate further positioning of the camera by the processor circuit.
- The method may involve discontinuing display of the envelope overlay image following actual insertion of the instrument into the body cavity.
- The method may involve processing, by the processor circuit, the body cavity image data to identify anatomical features in the image data, and the composite view may further include an anatomical overlay image generated to identify at least one anatomical feature within the body cavity.
- The anatomical overlay image may include a highlighted region within the body cavity image identifying at least one anatomical feature.
- The method may involve determining, by the processor circuit, whether there are any regions of potential encroachment between the instrument envelope and identified anatomical features and generating an alert signal in response to identifying a region of potential encroachment.
- Generating the alert signal may involve generating a warning overlay image for display as part of the composite image.
- The camera may be coupled to a drive unit operable to move the camera within the body cavity, and the method may further involve inhibiting further movement of the drive unit in response to the alert signal.
- The camera may be disposed in a longitudinally extended state when inserted into the body cavity and subsequently moved into a deployed state for performing the surgical procedure, and the composite view may be generated based on images captured by the camera in the longitudinally extended state including a first perspective, the composite image being further transformed for display to include a second perspective that generally corresponds to a perspective of the camera in the deployed state.
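Determining the anticipated insertion pose as an offset from the manipulator poses amounts to composing homogeneous transforms along the kinematic chain. A minimal sketch, where the frame names and offset values are illustrative assumptions:

```python
import numpy as np

def pose(t, R=None):
    """Build a 4x4 homogeneous transform from translation t (and rotation R)."""
    T = np.eye(4)
    if R is not None:
        T[:3, :3] = R
    T[:3, 3] = t
    return T

def anticipated_insertion_pose(T_base_camera, T_camera_instrument):
    """Compose the camera pose (from manipulator kinematics) with a fixed
    camera-to-instrument offset known from the insertion-tube geometry."""
    return T_base_camera @ T_camera_instrument
```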
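The encroachment test above can be sketched geometrically: model the instrument envelope as a capsule (a segment with a radius) and each identified anatomical feature as a bounding sphere, and raise the alert when they come within a safety margin. The shapes and the margin value are assumptions for illustration.

```python
import numpy as np

def point_to_segment_distance(a, b, p):
    """Shortest distance from point p to the segment from a to b."""
    a, b, p = (np.asarray(v, float) for v in (a, b, p))
    ab = b - a
    t = np.clip(np.dot(p - a, ab) / np.dot(ab, ab), 0.0, 1.0)
    return float(np.linalg.norm(p - (a + t * ab)))

def encroaches(axis_start, axis_end, envelope_radius,
               feature_center, feature_radius, margin_mm=5.0):
    """Flag potential encroachment when the feature's bounding sphere comes
    within margin_mm of the capsule-shaped instrument envelope."""
    d = point_to_segment_distance(axis_start, axis_end, feature_center)
    return d < envelope_radius + feature_radius + margin_mm
```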
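The perspective transformation between the extended-state view and the deployed-state view can be sketched as a rigid-body change of camera frame followed by a pinhole projection. This is a simplification with depth information assumed available; the frame names and focal length are illustrative assumptions.

```python
import numpy as np

def reproject(points_cam1, T_cam2_from_cam1, focal=1.0):
    """Map 3D points from the extended-state camera frame into the
    deployed-state frame, then project with a simple pinhole model."""
    pts = np.asarray(points_cam1, float)
    homo = np.hstack([pts, np.ones((len(pts), 1))])      # homogeneous coords
    in_cam2 = (T_cam2_from_cam1 @ homo.T).T[:, :3]       # rigid-body transform
    return focal * in_cam2[:, :2] / in_cam2[:, 2:3]      # perspective divide
```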
- Determining the instrument envelope may involve determining an instrument envelope identifying a workable volume within which the instrument is capable of being manipulated.
- A robotic surgery system includes a camera configured to be inserted into a body cavity of an animal to capture body cavity image data representing an interior view of the body cavity, and a processor circuit for controlling the robotic surgery system, the processor circuit being operably configured to receive the body cavity image data.
- The system further includes an instrument for performing a surgical procedure within the body cavity when received in the body cavity.
- The processor circuit is operably configured to determine instrument parameters associated with physical extents of the instrument to be inserted into the body cavity, and to determine an instrument envelope based on the instrument parameters, the instrument envelope identifying a region through which the instrument is capable of moving when inserted into the body cavity.
- The processor circuit is also operably configured to generate display signals operable to display a composite view of the interior of the body cavity, the composite view being based on the body cavity image data and including an envelope overlay image generated to represent the instrument envelope.
- The system further includes a display associated with the robotic surgery system operably configured to receive the display signals and to display the composite view.
- The processor circuit may be operably configured to determine the instrument envelope by determining an instrument envelope identifying a region through which the instrument is capable of physically moving when inserted into the body cavity.
- The processor circuit may be operably configured to determine the instrument envelope by determining a physical reach of the instrument in at least one degree of freedom associated with the instrument prior to insertion of the instrument into the body cavity.
- The processor circuit may be operably configured to receive the body cavity image data by receiving body cavity image data including three-dimensional spatial information, and the processor circuit may be operably configured to process the body cavity image data to determine a three-dimensional instrument envelope.
- The processor circuit may be operably configured to generate display signals by generating three-dimensional display signals operable to display a three-dimensional composite view of the interior of the body cavity on a three-dimensional display device associated with the robotic surgery system.
- The processor circuit may be operably configured to receive body cavity image data from at least one of: a stereoscopic camera including a pair of image sensors, each image sensor being operably configured to capture an image of the interior of the body cavity from a different perspective to facilitate determination of the three-dimensional spatial information; a time of flight camera operably configured to generate image data including the three-dimensional spatial information; or a camera in combination with a structured light source for illuminating the interior of the body cavity to facilitate determination of the three-dimensional spatial information.
- The processor circuit may be operably configured to process the body cavity image to determine the instrument envelope by determining the instrument envelope based on determining an anticipated insertion location and orientation for the instrument relative to the camera.
- The processor circuit may be operably configured to determine the anticipated insertion location and orientation for the instrument by generating the location and orientation as an offset with respect to the camera based on the instrument parameters.
- The camera may be coupled to a first manipulator and the instrument may be coupled to a second manipulator, the first and second manipulators being associated with the robotic surgery system, and the processor circuit may be operably configured to determine a spatial disposition of the camera and determine the anticipated insertion location and orientation for the instrument by receiving kinematic information associated with the first and second manipulators and to determine the anticipated insertion location and orientation for the instrument as offsets from the respective orientations and locations of the first and second manipulators.
- The instrument parameters may provide information for determining a possible physical extent of the instrument into the body cavity.
- The robotic surgery system may be operably configured to permit the camera to be initially detached from the system, facilitating positioning of the camera by hand to receive a desired interior view of the body cavity based on the composite view, and the system may be further operably configured to facilitate connecting the camera to the robotic surgery system once the positioning is completed to facilitate further positioning of the camera by the processor circuit.
- The processor circuit may be operably configured to discontinue display of the envelope overlay image following actual insertion of the instrument into the body cavity.
- The processor circuit may be operably configured to process the body cavity image data to identify anatomical features in the image data, and the composite view generated by the processor circuit may further include an anatomical overlay image generated to identify at least one anatomical feature within the body cavity.
- The anatomical overlay image may include a highlighted region within the body cavity image identifying at least one anatomical feature.
- The processor circuit may be operably configured to determine whether there are any regions of potential encroachment between the instrument envelope and identified anatomical features and to generate an alert in response to identifying a region of potential encroachment.
- The processor circuit may be operably configured to generate the alert by generating a warning overlay image for display as part of the composite image.
- The camera may be coupled to a drive unit operable to move the camera within the body cavity, and the processor circuit may be operably configured to inhibit further movement of the drive unit in response to the alert signal.
- The camera may be disposed in a longitudinally extended state when inserted into the body cavity and subsequently moved into a deployed state for performing the surgical procedure, and the processor circuit may be operably configured to generate the composite view based on images captured by the camera in the longitudinally extended state including a first perspective and to transform the composite image for display to include a second perspective that generally corresponds to a perspective of the camera in the deployed state.
- The processor circuit may be operably configured to determine the instrument envelope by determining an instrument envelope identifying a workable volume within which the instrument is capable of being manipulated.
- FIG. 1 is a perspective view of a robotic surgery system in accordance with some embodiments;
- FIG. 2 is a perspective view of a drive unit and insertion tube of the robotic surgery system shown in FIG. 1 ;
- FIG. 3A is a perspective view of a portion of the insertion tube shown in FIG. 2 and a camera disposed on an end of the insertion tube;
- FIG. 3B is a perspective view of the insertion tube and an instrument being inserted through the insertion tube;
- FIG. 3C is a perspective view of the insertion tube and instruments with the camera in a deployed state;
- FIG. 4 is a sectional view of a patient's abdomen with the insertion tube inserted into a body cavity;
- FIG. 5 is a block diagram of processor circuit elements of the system shown in FIG. 1 ;
- FIG. 6 is a flowchart depicting blocks of code for directing the processor circuit elements shown in FIG. 5 during insertion of an instrument into the body cavity of the patient's abdomen;
- FIG. 7A is an example of an image represented by body cavity image data captured by the camera shown in FIG. 3A ;
- FIG. 7B is an example of an overlay image generated by the processor circuit elements shown in FIG. 5 ;
- FIG. 7C is an example of a composite view generated by the processor circuit elements shown in FIG. 5 ;
- FIG. 8 is a further perspective view of the camera shown in FIG. 3A in relation to anatomy of the patient shown in FIG. 4 ;
- FIG. 9 is a perspective view of a deployed camera and instruments in accordance with some embodiments.
- FIG. 10 is a generated composite view corresponding to the deployed camera and instruments shown in FIG. 9 ;
- FIG. 11 is another generated composite view corresponding to the deployed camera and instruments shown in FIG. 9 ;
- FIG. 12 is a perspective view of a portion of an insertion tube in accordance with some embodiments.
- FIG. 13 is a further example of a composite view generated by the processor circuit elements shown in FIG. 5 ;
- FIG. 14A is a side view of the insertion tube and instruments in an insertion position
- FIG. 14B is a perspective view of the insertion tube and instruments with the camera in a deployed state
- FIG. 15A is an example of a view captured by the camera in the position shown in FIG. 14A ;
- FIG. 15B is an example of a view generated by the processor circuit for the camera position shown in FIG. 14A ;
- FIG. 16 is a block diagram of processor circuit elements involved in generation of a 3D composite view
- FIG. 17 is a perspective view of a camera and insertion tube and an overlay image generated in accordance with some embodiments.
- the workstation 102 further includes a master processor circuit 114 in communication with the input device 112 for receiving the input signals and generating control signals for controlling the robotic surgery system, which are transmitted to the instrument cart 104 via an interface cable 116 .
- the input device 112 includes right and left hand controllers 122 and 124 , which are grasped by the operator's hands and moved to produce input signals at the input device.
- the instrument cart 104 includes a slave processor circuit 118 that receives the control signals from the master processor circuit 114 and produces slave control signals operable to control the insertion tube 108 and the instrument 110 during a surgical procedure. While the embodiment shown includes both master and slave processor circuits, in other embodiments a single processor circuit may be used to perform both master and slave functions.
- the workstation 102 also includes a display 120 in communication with the master processor circuit 114 for displaying body cavity images and other information to an operator.
- a portion of the insertion tube 108 is shown in FIG. 3A and includes two adjacently located bores 300 and 302 extending through the insertion tube 108 for receiving a surgical instrument.
- the camera 204 is configured as a stereoscopic camera having a pair of spaced apart imagers 304 and 306 for producing stereoscopic views representing an interior view of the body cavity.
- the camera 204 also includes an illumination source 308 for illuminating the body cavity for capturing images.
- the illumination source 308 may be implemented locally on the camera using a light emitting diode or the illumination source may be remotely located and may deliver the illumination through an optical fiber running through the insertion tube 108 .
- the insertion tube 108 is shown isolated from the drive unit 106 in FIG. 3B .
- the instrument 110 is partially inserted through the bore 300 and includes a manipulator 310 and an end effector 312 at a distal end of the instrument.
- the manipulator 310 may include an articulated tool positioner as described in detail in commonly owned PCT patent publication WO2014/201538 entitled “ARTICULATED TOOL POSITIONER AND SYSTEM EMPLOYING SAME” filed on Dec. 20, 2013 and incorporated herein by reference in its entirety.
- the described manipulator in PCT patent publication WO2014/201538 provides for dexterous movement of the end effector 312 through a plurality of articulated segments.
- the interface of the drive unit 106 may have a track system (not shown) for advancing and retracting the actuator 314 through the bore 300 in response to the input signals received from the input device 112 to place the end effector 312 at a desired longitudinal offset with respect to the insertion tube 108 .
- When the insertion tube 108 is detached as shown in FIG. 3A , the camera 204 remains in a longitudinally extended or in-line state generally oriented along a longitudinal z-axis of the insertion tube 108 .
- the camera 204 is mounted on an articulated arm 322 moveable in response to drive forces delivered by the drive interface 202 of the drive unit 106 to the drive interface 200 of the insertion tube 108 .
- the articulated arm 322 may be covered by a sterile boot that encloses the articulating mechanism.
- the articulated arm 322 may be configured as disclosed in commonly owned PCT patent publication WO 2017/173524 entitled “Camera Positioning Method and Apparatus for Capturing Images during a Medical Procedure”, filed on Apr. 4, 2017 and incorporated herein by reference in its entirety.
- When the arm 322 is actuated to move the camera 204 , the actuation provided may be used to determine the relative positioning of the camera within the surgical workspace.
- the drive forces delivered by the drive unit 106 cause the camera 204 to move from the longitudinally extended insertion state shown in FIGS. 3A and 3B to a deployed state as shown in FIG. 3C .
- the manipulator 310 of the instrument 110 is actuated to perform dexterous movement to position the end effector 312 for performing various surgical tasks.
- a second instrument 316 is also inserted for performing surgical tasks.
- the second instrument 316 may be similarly configured to the instrument 110 , thus having a manipulator 318 and an end effector 320 .
- the camera 204 is able to generate images of the body cavity without obstructing movements of the manipulators 310 and 318 .
- the insertion tube 108 may be initially detached from the drive unit 106 for insertion into a body cavity 404 of the patient 400 , but coupled via an image signal line 502 to the slave processor circuit 118 such that images of the interior of the body cavity 404 can be displayed on the display 120 .
- neither of the instruments 110 or 316 (shown in FIG. 3C ) is initially received through the respective bores 300 and 302 .
- the end effectors 312 and 320 on the instruments 110 and 316 may have sharp cutting edges and are not initially introduced into the body cavity 404 due to the risk of causing damage to tissue prior to the camera 204 being able to capture and relay body cavity image data back to the master processor circuit 114 .
- the cap 408 has sufficient compliance to permit the camera 204 to be moved around within the body cavity 404 and aligned to cause the illumination source 308 (shown in FIG. 3A ) to illuminate a desired region of the body cavity 404 for capturing images of the patient's anatomy.
- the camera 204 thus captures body cavity image data, which is relayed back via the slave processor circuit 118 to the master processor circuit 114 for display on the display 120 of the workstation 102 .
- A block diagram of the processor circuit elements of the system 100 is shown in FIG. 5 .
- the master processor circuit 114 on the workstation 102 includes a microprocessor 500 , a memory 502 , a USB interface 504 , an input/output 506 and a motion control interface 508 , all of which are in communication with the microprocessor 500 .
- the input device 112 (shown in FIG. 1 ) communicates using a USB protocol and the USB interface 504 receives input signals produced by the input device in response to movements of the hand controllers 122 and 124 .
- the microprocessor 500 processes the input signals and causes the motion control interface 508 to transmit control signals to the instrument processor circuit 118 via the interface cable 116 .
- the memory 502 provides storage for program codes 520 for directing the microprocessor 500 to perform various functions.
- the memory 502 also includes storage for data, such as instrument parameters 522 .
- the slave processor circuit 118 on the instrument cart 104 includes a microprocessor 580 , a memory 582 , a communications interface 584 , and a drive control interface 586 , all of which are in communication with the microprocessor.
- the microprocessor 580 receives the control signals at the communications interface 584 over the interface cable 116 from the master processor circuit 114 of the workstation 102 .
- the microprocessor 580 then processes the control signals, which are output by the drive control interface 586 and cause the drive unit 106 to produce drive forces for moving the instruments 110 and 316 and the camera 204 .
- the memory 582 provides storage for program codes 590 and other control data 592 associated with operation of the instrument cart 104 .
- a flowchart depicting blocks of code for directing the master and slave processor circuits 114 and 118 during insertion of an instrument into the body cavity 404 of the patient 400 is shown generally at 600 .
- the blocks generally represent codes that may be read from the master processor memory 502 and slave processor memory 582 for directing the microprocessors 500 and 580 to perform various functions.
- the actual code to implement each block may be written in any suitable program language, such as C, C++, C#, Java, and/or assembly code, for example.
- the process begins at block 602 , which directs the microprocessor 580 of the slave processor circuit 118 to receive body cavity image data.
- the body cavity image data is captured by the camera 204 and represents an interior view of the body cavity 404 .
- the insertion tube 108 may be moved around within the body cavity 404 to position the camera 204 to view various anatomical features within the body cavity 404 .
- the camera 204 is oriented to provide images of an ovary 414 of the patient 400 .
- the body cavity image data is then relayed back via the interface cable 116 for receipt by the microprocessor 500 of the master processor circuit 114 .
- Block 604 then directs the microprocessor 500 of the master processor circuit 114 to determine instrument parameters associated with physical extents of the instrument to be inserted into the body cavity.
- each of the instruments 110 and 316 shown in FIG. 3 will have associated parameters that identify attributes of the instrument such as its physical length, the type of end effector attached, offset in relation to the bore 300 and camera, actuation limits, etc. These parameters will generally have already been loaded into the memory 502 of the master processor circuit 114 in the instrument parameters memory location 522 .
- Block 606 then directs the microprocessor 500 to read the instrument parameters from the memory location 522 for each instrument and to determine an instrument envelope based on the instrument parameters.
- the instrument envelope corresponding to the instrument 110 is depicted in broken lines at 416 in FIGS. 4 and 5 and identifies a region through which the instrument 110 is capable of physically moving when inserted into the body cavity 404 . Note that as shown in FIG. 4 , the instrument 110 is not yet inserted through the insertion tube 108 . Since the instrument 110 will be eventually inserted through the bore 300 of the insertion tube 108 , the orientation of the instrument is determined by the orientation of the bore and the insertion tube.
- the instrument envelope 416 of the instrument 110 may have a generally cylindrical shape extending outwardly with respect to a longitudinal axis of the bore 300 and illustrates the potential reach of the instrument.
- the physical reach of the instrument 110 may be calculated as a length of the instrument when fully inserted in the bore 300 , as defined in the instrument parameters memory location 522 .
- the instrument envelope 416 may represent a 3D volume, for example representing the total volume which may be occupied by the instrument when inserted through the insertion tube 108 .
- the envelope may represent a physical reach of the instrument in at least one degree of freedom (for example along a longitudinal axis of the bore 300 ) associated with the instrument prior to insertion of the instrument into the body cavity.
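- the determination of a cylindrical envelope from the instrument parameters can be sampled numerically. The following is a minimal illustrative sketch (not part of the disclosed system) using NumPy; the function name, the sampling density, and the millimetre values in the example are assumptions:

```python
import numpy as np

def cylinder_envelope(length, radius, offset=np.zeros(3),
                      n_theta=32, n_z=16):
    """Sample points on the surface of a cylindrical instrument
    envelope aligned with the bore's longitudinal (z) axis.

    length -- physical reach of the instrument along the bore axis
    radius -- lateral extent of the envelope about the bore axis
    offset -- translation of the bore opening in the camera frame
    """
    theta = np.linspace(0.0, 2.0 * np.pi, n_theta, endpoint=False)
    z = np.linspace(0.0, length, n_z)
    tt, zz = np.meshgrid(theta, z)              # (n_z, n_theta) grids
    pts = np.stack([radius * np.cos(tt),
                    radius * np.sin(tt),
                    zz], axis=-1).reshape(-1, 3)
    return pts + offset                         # points in the camera frame

# Example (assumed values): 120 mm reach, 15 mm lateral envelope,
# bore opening offset 10 mm below the camera origin.
env = cylinder_envelope(120.0, 15.0, offset=np.array([0.0, -10.0, 0.0]))
```

Such sampled points could then feed the overlay generation described below, in the same coordinate frame as the camera's point cloud.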
- Block 608 then directs the microprocessor 500 to generate a view based on the body cavity image data being captured in real-time from the camera 204 .
- the view thus represents current conditions within the body cavity 404 .
- An example of an image represented by the body cavity image data is depicted in FIG. 7A at 700 .
- the camera 204 captures an image within a field of view illuminated by the illumination source 308 , in this case including the left ovary 414 , a portion of the colon 702 , the uterus 704 , the left fallopian tube 706 , as well as the right ovary 708 and right fallopian tube 709 .
- the body cavity image data may be streamed from the camera 204 as a real-time sequence of high-definition (HD) video frames for viewing as either 2D or 3D video on the display 120 .
- the master processor circuit 114 would need to be capable of generating the 3D point cloud, determining the instrument envelope, and generating the necessary projections at a reasonable frame rate (for example at 15-30 frames per second for HD video data).
- Block 610 then directs the microprocessor 500 to generate a view including an envelope overlay image representing the determined instrument envelope 416 .
- the view generated at block 610 represents a prediction of the physical extents of the instrument 110 prior to the instrument actually being inserted into the body cavity.
- An example overlay image is shown in FIG. 7B at 710 .
- the instrument is represented by a cylindrical shape, shown as it would appear from the perspective of the camera 204 , with a distal end 712 representing the end effector.
- the camera 204 is shown longitudinally extended for initial positioning of the insertion tube 108 prior to inserting instruments through the insertion tube.
- the camera captures a view (indicated within the broken lines 1400 ) of the body cavity 404 from the perspective of the camera 204 when longitudinally extended and located at the position P 0 before being moved into the deployed state (as shown at P 1 in FIG. 14B ).
- the camera 204 is shown oriented in the deployed state, which would be used for viewing of the body cavity 404 during a surgical procedure.
- the camera 204 is moved upwardly and angled down to provide a view 1402 of the surgical workspace within which the instruments are capable of reaching (shown in broken outline at 1404 ) and to facilitate insertion of the instruments through the insertion tube 108 and subsequent manipulation thereof.
- the view 1400 in FIG. 14A thus differs in perspective from the view in FIG. 14B due to the different positioning P 0 and P 1 .
- the master processor circuit 114 is configured to distort the image captured by the camera 204 when in the position P 0 prior to display on the display 120 .
- the degree of distortion of the image is selected to make the displayed image appear as if it were captured by the camera 204 in the deployed state P 1 so that the system 100 provides the surgeon with a displayed perspective that corresponds to the perspective displayed during the remainder of the surgical procedure once the camera 204 is deployed.
- the coordinates of the positions P 0 and P 1 may be determined based on parameters associated with actuation of the articulated arm 322 .
- An example of the image generated for the position P 0 of the camera 204 is shown in FIG. 15A , and the boundaries of the display 120 or other display window are indicated as a rectangle ABCD.
- the image captured by the camera 204 at P 0 may be distorted to generate an acute trapezoid as shown in FIG. 15B at 1500 . Since the distorted image 1500 no longer corresponds with the boundaries of the display 120 (i.e. A, B, C, D), the distorted image would need to be scaled to fit within the display boundaries ABCD. This results in some truncation of the originally captured image as shown at 1502 in FIG. 15B .
- the distorted image may be formed using the image provided by one of the stereoscopic imagers of the camera 204 .
- a 2D representation of the envelope overlay images may then be displayed over the distorted image. Since only the perspective from a fixed and known position P 1 is required, generation of the distorted image may be based on a fixed distortion filter (i.e. a trapezoid distortion with parameters corresponding to the difference between the camera positions P 0 and P 1 ). The generation of the image would thus not be processor intensive since complex image transformation is not required. The estimation of the distorted view would thus produce a 2D image and a 2D representation of the envelope overlay images would accordingly be used when forming a composite image.
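- the fixed trapezoid distortion described above can be realized as a planar homography mapping the display rectangle ABCD onto a trapezoid. The sketch below is an illustration, not the disclosed implementation; it computes such a homography with a direct linear transform in NumPy, and the 1920x1080 display size and trapezoid corner coordinates are assumed values:

```python
import numpy as np

def homography(src, dst):
    """Direct linear transform: 3x3 homography mapping four src
    points onto four dst points (both given as (4, 2) arrays)."""
    rows = []
    for (x, y), (u, v) in zip(src, dst):
        rows.append([x, y, 1, 0, 0, 0, -u * x, -u * y, -u])
        rows.append([0, 0, 0, x, y, 1, -v * x, -v * y, -v])
    A = np.array(rows, dtype=float)
    # The null-space vector of A gives the homography up to scale.
    _, _, vt = np.linalg.svd(A)
    return vt[-1].reshape(3, 3)

def warp_point(H, p):
    q = H @ np.array([p[0], p[1], 1.0])
    return q[:2] / q[2]

# Display rectangle ABCD keystoned into a trapezoid whose top edge is
# pulled inward, approximating the deployed-camera perspective P1.
rect = np.array([[0, 0], [1920, 0], [1920, 1080], [0, 1080]], float)
trap = np.array([[300, 0], [1620, 0], [1920, 1080], [0, 1080]], float)
H = homography(rect, trap)
```

In practice the homography would be applied to every pixel by an image warping routine, and the warped image scaled to fit within ABCD, producing the truncation described for FIG. 15B.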
- although it may be beneficial to articulate the camera 204 from position P 0 to P 1 while the insertion tube 108 is being positioned, to provide a "birds-eye" (from the perspective of the camera in a deployed state during surgery) view of the surgical site, this may not be possible because the camera may only be moveable when the insertion tube 108 is coupled to the drive unit 106 , as described above. Estimation of the bird's eye view based on images captured while in the longitudinally extended state shown in FIG. 14A may thus be of use.
- the process 600 then continues at block 612 , which directs the microprocessor 500 to generate display signals operable to display a composite view of the interior of the body cavity on the display 120 associated with the workstation 102 .
- An example composite view is shown in FIG. 7C at 720 and is based on the body cavity image data captured by the camera 204 and includes the envelope overlay image 710 .
- the composite view 720 illustrates the relative proximity between the instrument 110 and anatomy such as the uterus 704 and ovary 414 for the current camera position.
- the body cavity image 700 and envelope overlay image 710 may be generated at video frame rates in near-real time so that the display on the display 120 is continually updated as the insertion tube 108 is moved within the body cavity 404 .
- the envelope overlay image 710 may be displayed as a shaded region or other representation of the instrument envelope. Since the envelope overlay image 710 is based on physical parameters of the actual instrument 110 , the composite view 720 thus provides a prior indication of an extent that the instrument will have, once inserted through the insertion tube 108 into the body cavity 404 .
- the display 120 may be capable of displaying the stereoscopic images for viewing through a pair of 3D viewing spectacles to provide a 3D view of the interior of the body cavity 404 .
- the captured 2D images produced by each of the pair of spaced apart imagers 304 and 306 may be presented in a format suitable for viewing with spectacles having colored filters, polarization filters, or actively shuttered spectacles.
- the display may be configured to display each image only to one of the operator's eyes using separate displays in a head mounted display.
- 3D image capture techniques may be used to generate data having 3D information.
- a time of flight camera may be used to generate image data including three-dimensional spatial information obtained on a pixel-by-pixel basis by determining the time that controlled illumination takes to be reflected back from anatomical features.
- a camera in combination with a structured light source for illuminating the interior of the body cavity may be used to facilitate determination of three-dimensional spatial information.
- the camera 204 includes the pair of spaced apart imagers 304 and 306 (shown in FIG. 3A ) and each image sensor is operably configured to capture an image of the interior of the body cavity 404 from a different perspective within a coordinate frame [x v , y v , z v ].
- the coordinate frame [x v , y v , z v ] has an origin o v located at a center of the camera 204 .
- the origin of the coordinate frame may be located elsewhere on the insertion tube 108 .
- the 2D images produced by the imagers 304 and 306 may be processed to generate a 3D point cloud using various open source or proprietary library functions.
- the point cloud represents a set of points P i , such as the points P 0 and P 1 shown in FIG. 8 .
- the point cloud points P i are generated from the 2D images by determining a distance between common points on the surface as represented by the separate images produced by the spaced apart imagers 304 and 306 .
- Generation of the point cloud thus provides 3D coordinates within the coordinate frame [x v , y v , z v ] of points P i representing surfaces of anatomical features such as the ovary 414 , uterus 704 , and fallopian tube 706 .
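- the stereo triangulation described above can be sketched as follows, assuming a pinhole model in which depth is recovered from disparity as z = f·B/d for focal length f (pixels) and imager baseline B; the function name and the numeric values in the example are illustrative assumptions, not part of the disclosure:

```python
import numpy as np

def disparity_to_points(disp, f, baseline, cx, cy):
    """Triangulate a dense disparity map (in pixels) from the left and
    right imagers into 3D points P_i in the camera frame [x_v, y_v, z_v].

    f        -- focal length in pixels
    baseline -- spacing between the two imagers
    cx, cy   -- principal point of the reference imager
    """
    v, u = np.indices(disp.shape)          # pixel row/column indices
    valid = disp > 0                       # zero disparity = no match
    z = f * baseline / disp[valid]         # depth from stereo geometry
    x = (u[valid] - cx) * z / f            # back-project to 3D
    y = (v[valid] - cy) * z / f
    return np.stack([x, y, z], axis=-1)    # (N, 3) point cloud

# Example: a constant 10-pixel disparity with f = 500 px and a 5 mm
# baseline places every surface point at z = 250 mm.
pts = disparity_to_points(np.full((2, 2), 10.0), 500.0, 5.0, cx=1.0, cy=1.0)
```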
- the instrument parameters stored in the memory location 522 of the master processor memory 502 are then used to determine the range of possible motion of the instrument 110 with respect to the bore 300 within the coordinate frame [x v , y v , z v ].
- the instrument parameters 522 may include a maximum distance between an opening of the bore 300 and the end effector 312 of the instrument 110 when the instrument is fully inserted and advanced through the bore.
- the instrument parameters 522 may further include an offset distance between the opening of the bore 300 and the origin of the coordinate frame [x v , y v , z v ] at the camera 204 .
- the extent of the cylindrical instrument envelope 802 may then be converted into a point cloud of points P j or may be represented using other 3D representation techniques, such as a polygon mesh (not shown) having vertices located on the surface of the cylindrical volume.
- the instrument parameters 522 may also include articulation information defining a manipulation region within which the manipulator 310 (shown in FIG. 3C ) is capable of moving the end effector 312 .
- the cylindrical instrument envelope 802 may be further expanded to include the manipulation region that the instrument 110 will have once inserted.
- the instrument envelope may thus be displayed as an outline of a volume 804 demarcating extents for manipulation of the instrument in response to movements of the input device 112 (shown in FIG. 1 ).
- the shape of the volume 804 is dependent on the type of manipulator 310 that is implemented to provide the dexterous movement of the instrument 110 .
- Generation of the envelope overlay image 710 may involve generating 3D image data in a similar format to the image data produced by the camera 204 to facilitate generation of a 3D composite view 720 by overlaying the instrument envelope image over the body cavity image data.
- the microprocessor 500 may be operably configured to generate projections of a 3D envelope overlay image 710 onto 2D planes corresponding to the body cavity images generated by the pair of spaced apart imagers 304 and 306 .
- the resulting 2D envelope projections may then be combined with the original body cavity images as an overlay image to produce a 3D view of the envelope overlay image 710 within a 3D composite view.
- the overlay image may be displayed as a shaded or a semi-transparent 3D region in the resulting 3D composite view. In other embodiments the overlay image may be displayed showing only an outline of the instrument envelope. Standard open source image overlay functions are generally available for performing a combination of two or more images and may be used to implement the process in near real time.
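- the combination of the body cavity image with a shaded, semi-transparent envelope overlay can be illustrated with a simple alpha blend; this is an illustrative sketch rather than the disclosed implementation, and the mask and overlay colour are assumed inputs (e.g. a mask produced by projecting the envelope onto the image plane):

```python
import numpy as np

def composite(body_img, overlay_rgb, overlay_mask, alpha=0.4):
    """Blend a semi-transparent envelope overlay into a body cavity
    image. overlay_mask is a boolean (H, W) array marking pixels
    covered by the projected instrument envelope."""
    out = body_img.astype(float).copy()
    # Alpha-blend only the masked pixels; the rest pass through unchanged.
    out[overlay_mask] = ((1.0 - alpha) * out[overlay_mask]
                         + alpha * np.asarray(overlay_rgb, float))
    return out.astype(np.uint8)

# Example: mark one pixel of a black frame with a 40% green overlay.
body = np.zeros((2, 2, 3), np.uint8)
mask = np.zeros((2, 2), bool)
mask[0, 0] = True
view = composite(body, (0, 255, 0), mask, alpha=0.4)
```

The same blend would be applied independently to the left and right images to preserve the stereoscopic 3D view.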
- a block diagram illustrating the processing and generation of signals by the microprocessor 500 for producing the 3D composite view is shown at 1600 in FIG. 16 .
- the microprocessor 500 of the master processor circuit 114 receives left and right images from the spaced apart left and right imagers 304 and 306 of the camera 204 and generates a 3D point cloud.
- An available software tool such as OpenGL may be implemented on the microprocessor 500 to render a 3D scene of the point cloud as well as the 3D envelope overlay.
- the 3D point cloud and 3D envelope are rendered together so that the 3D envelope is clipped where it intersects the point cloud.
- the known position of the camera in relation to the z-axis of the insertion tube 108 may then be used to create two views: one view for the left imager and another view for the right imager. Finally, the left and right views are rendered without the point cloud and output to the image processing block, which creates the left and right composite views.
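- the clipping of the 3D envelope where it intersects the point cloud amounts to a depth (z-buffer) test along each camera ray, which a renderer such as OpenGL performs per fragment. A simplified NumPy illustration is given below; the focal length, grid resolution, and function names are assumptions:

```python
import numpy as np

def clip_envelope(env_pts, cloud_pts, f=500.0, grid=64):
    """Z-buffer style clipping: drop envelope points that lie behind
    the anatomy point cloud along the same camera ray, so the rendered
    envelope is clipped where it intersects the scene.

    Points are (N, 3) arrays in the camera frame with +z away from the
    camera; f is an assumed focal length in pixels."""
    def project(p):
        # Perspective projection onto a coarse pixel grid.
        uv = (f * p[:, :2] / p[:, 2:3]).astype(int) + grid // 2
        return np.clip(uv, 0, grid - 1)

    # Depth buffer holding the nearest anatomy depth per pixel.
    zbuf = np.full((grid, grid), np.inf)
    cu = project(cloud_pts)
    np.minimum.at(zbuf, (cu[:, 1], cu[:, 0]), cloud_pts[:, 2])

    # Keep only envelope points in front of the anatomy on their ray.
    eu = project(env_pts)
    visible = env_pts[:, 2] <= zbuf[eu[:, 1], eu[:, 0]]
    return env_pts[visible]

# Example: anatomy at 100 mm hides an envelope point at 150 mm on the
# same ray, while one at 50 mm remains visible.
cloud = np.array([[0.0, 0.0, 100.0]])
env = np.array([[0.0, 0.0, 50.0], [0.0, 0.0, 150.0]])
vis = clip_envelope(env, cloud)
```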
- the insertion tube 108 and camera 204 are initially detached from the drive interface 202 of the drive unit 106 , facilitating positioning of the camera by hand to receive a desired interior view of the body cavity 404 based on the composite view.
- the operator ensures that the insertion tube 108 is located such that the instrument 110 can be safely inserted and once inserted will be disposed to provide an adequate range of manipulation for performing the desired surgical process.
- the insertion tube 108 may then be connected to the drive unit 106 and further positioning of the camera 204 , such as for example moving to the deployed state shown in FIG. 3C , may then be performed under control of the master processor circuit 114 and slave processor circuit 118 .
- the microprocessor 500 of the master processor circuit 114 may be directed to discontinue display of the envelope overlay image, since the instrument 110 would then move into the view in place of the instrument envelope.
- the insertion tube 108 , once coupled to the drive interface 202 of the drive unit 106 , may need to be repositioned for various reasons, including a need to adjust the available surgical workspace that is within reach of the manipulation range of the instruments 110 and 316 .
- repositioning of the insertion tube may be accomplished by repositioning the drive unit 106 manually (by a bedside nurse or a surgeon other than the surgeon operating the workstation 102 ).
- the surgeon operating the workstation 102 may generate actuation signals at the workstation 102 (for example by moving the hand controllers 122 ) to cause the insertion tube 108 to be repositioned.
- the system 100 may be first configured to decouple the instruments 110 and 316 and/or camera 204 from the input device 112 so that further movement within the body cavity 404 is inhibited.
- decoupling of the instruments 110 and 316 may be caused by the surgeon depressing a clutch pedal 126 that prevents further changes to the instrument actuation signals such that the instrument cart 104 inhibits further movements that could cause damage to the patient's tissues or organs.
- when in the decoupled state, instrument envelope overlay images 900 and 902 (similar to that shown at 804 in FIG. 8 ) may be generated and displayed for each of the instruments 110 and 316 to indicate the respective instrument manipulation regions while the insertion tube 108 is repositioned to a safe and desired position.
- a resulting composite view of the instrument envelope overlay images 900 and 902 as viewed on the display 120 (from the perspective of the camera 204 ) is shown at 1000 in FIG. 10 .
- the composite view 1000 depicted in FIG. 10 is a “birds-eye” view from the view point of the camera 204 in its deployed state as shown in FIG. 9 .
- the composite view 1000 thus depicts the possible range of manipulation 900 and 902 for each of the instruments 316 and 110 and provides information to aid in the repositioning.
- the surgeon may wish to perform surgical operations in a specific anatomical region such as the ovary 414 .
- a composite view 1100 may display an enlarged portion of the patient's anatomy along with the instrument envelope overlay images 1100 and 1102 depicting the associated possible range of manipulation of the instruments 110 and 316 .
- the camera 204 is integrally coupled to and part of the insertion tube 108 .
- the camera may be received as a separate unit in an insertion tube.
- an insertion tube 1200 is sized to permit first and second instruments 1202 and 1204 and a camera 1206 to be received.
- the camera 1206 is configured as a separate unit having its own sheath 1208 .
- the insertion tube 1200 may still be inserted through the opening 410 in the cap 408 ( FIG. 4 ) or other body orifice.
- the camera 1206 and first and second instruments 1202 and 1204 are all independently movable through the insertion tube 1200 .
- the insertion tube 1200 may have separated bores for accommodating the camera 1206 , and each of the first and second instruments 1202 and 1204 .
- the insertion tube 1200 may be inserted partway into the body cavity 404 through the opening 410 before the camera 1206 and instruments 1202 and 1204 are inserted. Subsequently, when the camera 1206 is inserted and reaches an end 1210 of the insertion tube 1200 , the composite images depicting instrument envelopes similar to those shown in FIGS. 9 to 11 may be generated to allow the insertion tube to be positioned appropriately, as generally disclosed above.
- the instruments 1202 and 1204 can be inserted and would move into view in place of the respective instrument envelopes.
- the camera and instruments are independently movable and may be inserted through separate incisions.
- a camera may be coupled to a first manipulator associated with a robotic surgery system and an instrument may be coupled to a second manipulator, both of which are independently moveable.
- the location of the instrument will not be constrained by an instrument bore (as in the case of the bore 300 of the insertion tube 108 ) but rather by the point at the incision through which the instrument is inserted into the body cavity.
- the instrument parameters 522 may then be used to determine the possible physical extent of the instrument within the body cavity 404 .
- the body cavity image data may be processed by the master processor circuit 114 to identify anatomical features within a composite view as shown generally at 1300 .
- the master processor circuit 114 may be operably configured to identify major organs such as the ovary 414 , uterus 704 , colon 702 , etc.
- the master processor circuit 114 may implement machine-learning techniques to train an image recognition algorithm to recognize anatomical features under various conditions.
- the image recognition algorithm may be neural network based and may be trained using a set of labeled training images in which features such as the fallopian tube 706 , and ovary 414 have been previously manually labeled by an operator.
- an additional overlay image identifying at least one anatomical feature within the body cavity 404 may be generated.
- the composite view 1300 in addition to the body cavity image 700 and envelope overlay image 710 (shown in FIG. 7 ) may also include highlighting or other overlay identifying some anatomical features. For example, relevant anatomical features in the composite view 1300 may be displayed together with shading or semi-transparent overlay colors to indicate different anatomical features.
- the master processor circuit 114 may determine whether there are any regions of potential encroachment between the instrument envelope 710 and an identified anatomical feature, in which case an alert signal may be generated.
- the alert signal may take the form of a further warning overlay image 1302 for display as part of the composite image.
- the warning overlay image 1302 may also be accompanied by an audible warning or by haptic feedback via the input device 112 .
- the alert signal may further be transmitted to the slave processor circuit 118 to cause the drive unit 106 to generate drive signals that limit further movement of the drive unit 106 and the insertion tube 108 to prevent or otherwise limit full movement of the camera and/or instruments into the region of potential encroachment.
- a similar alert signal may also be generated when repositioning the insertion tube 108 to highlight a potential encroachment between the instrument envelope and an identified anatomical feature. Repositioning of the insertion tube 108 may involve causing the insertion tube to pivot about the location of the incision or causing a portion of the insertion tube to articulate along its length to reposition a distal end of the insertion tube.
- Generation of the overlay image for highlighting the anatomical features may involve the master processor circuit 114 processing the body cavity image data using the image recognition algorithm to identify surface regions associated with the anatomy in the area of the surgical worksite. Each surface region may then be represented using a polygon mesh or another surface representation. A similar polygon mesh representation may also be generated for the envelope overlay image 710 , and points of proximity or intersection between the instrument envelope and the anatomy surface region may be determined. Standard 3D surface-intersection functions for this purpose are available in open-source libraries.
- the overlay image may be generated as a 3D volume reachable by manipulating the instruments 110 and 316 , as shown at 1700 in broken lines.
- the volume 1700 has a first portion 1702 within which the instruments 110 and 316 are able to move freely within the constraints provided by the manipulators 310 and 318 .
- the volume 1700 has a second portion 1704 proximate the insertion tube 108 that may be encumbered by the camera 204 , even when in the deployed state.
- a non-manipulatable portion of the instruments 110 and 316 proximate the insertion tube 108 may also have a role in determining the second portion 1704 .
- a remaining workable volume is thus enclosed within the volume 1700 , and the volume 1700 may be displayed on the display 120 to indicate this workable volume.
- the workable volume is the region the end-effectors can reach by manipulating the instruments 110 and 316 and by moving the instruments in the z-axis direction.
- the workable volume 1700 would thus be dependent on the deployed position of the camera 204 and the articulated arm 322 supporting the camera, and also on the structure of the instruments 110 and 316 .
- the workable volume 1700 is shown from a different perspective in FIG. 18 .
- the first portion 1702 and second portion 1704 may be otherwise shaped depending on the structural features of the insertion tube 108 , camera 204 , and the instruments 110 and 316 .
- the workable volume 1700 may be generated based on the instrument parameters as described above.
- the above disclosed embodiments provide a representation of the instrument prior to actual insertion into the patient and thus prevent inadvertent damage to tissue or organs of the patient.
- the surgeon or other operator is able to safely view the interior of the body cavity and position the camera to capture images of the surgical worksite.
- the generated composite view including the overlay image provides a visual cue to the surgeon of possible engagement between the instrument and sensitive anatomy, thus facilitating positioning of the camera prior to insertion of the instrument.
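The training-and-inference workflow described above (operator-labeled images in, per-pixel anatomical labels out) can be sketched with a deliberately simple stand-in for the neural network: a nearest-centroid classifier over pixel features. All names and the toy data here are hypothetical; a production recognizer for features such as the fallopian tube 706 or ovary 414 would be a trained segmentation network, not this toy.

```python
import numpy as np

class NearestCentroidSegmenter:
    """Toy stand-in for the trained recognition model: classifies each
    pixel by the nearest class centroid in feature (e.g. colour) space."""

    def fit(self, features: np.ndarray, labels: np.ndarray):
        # features: (N, C) per-pixel feature vectors from labeled images
        # labels:   (N,)  integer anatomical class ids assigned by an operator
        self.classes_ = np.unique(labels)
        self.centroids_ = np.stack(
            [features[labels == c].mean(axis=0) for c in self.classes_]
        )
        return self

    def predict(self, features: np.ndarray) -> np.ndarray:
        # Distance from every pixel feature to every class centroid: (N, K)
        d = np.linalg.norm(features[:, None, :] - self.centroids_[None], axis=-1)
        return self.classes_[d.argmin(axis=1)]

# Hypothetical 2-feature pixels: class 0 clusters near (0, 0), class 1 near (10, 10)
train_x = np.array([[0.0, 0.0], [0.0, 1.0], [10.0, 10.0], [10.0, 11.0]])
train_y = np.array([0, 0, 1, 1])
model = NearestCentroidSegmenter().fit(train_x, train_y)
print(model.predict(np.array([[0.0, 0.5], [10.0, 10.5]])))
```

The fit/predict split mirrors the patent's workflow: training happens offline on previously labeled images, and prediction runs on new body cavity image data to drive the anatomical overlay.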
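The encroachment check between the instrument envelope and an identified anatomical surface can be sketched as a minimum-distance test between point-sampled surfaces; a real system would use mesh-to-mesh proximity or intersection routines over the polygon mesh representations described above. The function names, the one-point surfaces, and the millimeter margins below are illustrative assumptions.

```python
import numpy as np

def min_surface_distance(env_pts: np.ndarray, organ_pts: np.ndarray) -> float:
    """Minimum distance between two point-sampled surfaces, (N,3) and (M,3)."""
    # Pairwise distances via broadcasting: result has shape (N, M)
    d = np.linalg.norm(env_pts[:, None, :] - organ_pts[None, :, :], axis=-1)
    return float(d.min())

def encroachment_alert(env_pts: np.ndarray, organ_pts: np.ndarray,
                       margin: float = 5.0) -> bool:
    """True when the envelope comes within `margin` mm of the organ surface,
    i.e. when the alert signal / warning overlay would be generated."""
    return min_surface_distance(env_pts, organ_pts) < margin

# Hypothetical samples: envelope at the origin, organ surface 10 mm away in x
envelope = np.zeros((1, 3))
organ = np.array([[10.0, 0.0, 0.0]])
print(encroachment_alert(envelope, organ, margin=5.0))   # clear of the organ
print(encroachment_alert(envelope, organ, margin=15.0))  # potential encroachment
```

The boolean result is where a system of this kind would branch into generating the warning overlay 1302, the audible/haptic feedback, and the motion-limiting drive signals.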
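One hedged way to compute a workable volume like 1700 is to sample candidate points and keep those inside the instruments' reach (the first portion 1702) while excluding the camera-encumbered zone near the insertion tube (the second portion 1704). The reach, radius, and depth values below are made-up placeholders; the real computation would derive these bounds from the instrument parameters and the deployed camera position described above.

```python
import numpy as np

def workable_volume_mask(points: np.ndarray, reach: float = 120.0,
                         camera_radius: float = 25.0,
                         camera_depth: float = 40.0) -> np.ndarray:
    """Boolean mask over (N, 3) sample points (mm): inside end-effector
    reach but outside the zone encumbered by the camera and the
    non-manipulatable instrument portions near the insertion tube.
    The insertion tube is assumed to enter at the origin along +z."""
    r = np.linalg.norm(points, axis=1)            # distance from entry point
    radial = np.linalg.norm(points[:, :2], axis=1)  # distance from tube axis
    reachable = r <= reach                         # candidate first portion
    encumbered = (radial <= camera_radius) & (points[:, 2] <= camera_depth)
    return reachable & ~encumbered

# Hypothetical samples: near the tube, mid-cavity, and beyond reach
pts = np.array([[0.0, 0.0, 10.0], [50.0, 0.0, 80.0], [0.0, 0.0, 200.0]])
print(workable_volume_mask(pts))
```

Rendering the points where the mask is true (or the surface bounding them) is one way the workable volume could be drawn on the display 120.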
Landscapes
- Health & Medical Sciences (AREA)
- Life Sciences & Earth Sciences (AREA)
- Engineering & Computer Science (AREA)
- Surgery (AREA)
- Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
- Medical Informatics (AREA)
- Biomedical Technology (AREA)
- Heart & Thoracic Surgery (AREA)
- Veterinary Medicine (AREA)
- Public Health (AREA)
- General Health & Medical Sciences (AREA)
- Animal Behavior & Ethology (AREA)
- Molecular Biology (AREA)
- Physics & Mathematics (AREA)
- Radiology & Medical Imaging (AREA)
- Pathology (AREA)
- Optics & Photonics (AREA)
- Biophysics (AREA)
- Robotics (AREA)
- Signal Processing (AREA)
- Computer Vision & Pattern Recognition (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Gynecology & Obstetrics (AREA)
- Oral & Maxillofacial Surgery (AREA)
- Evolutionary Computation (AREA)
- Artificial Intelligence (AREA)
- Endoscopes (AREA)
- Human Computer Interaction (AREA)
Abstract
Description
TABLE 1
Symbol | Description |
---|---|
qins | z-position of the insertion tube |
θ1 | Angle that represents how much the s-segment is bent |
δ1 | Angle that represents the plane in which the s-segment is bent |
θ2 | Angle that represents how much the distal segment is bent |
δ2 | Angle that represents the plane in which the distal segment is bent |
γ | Wrist roll |
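Under the piecewise-constant-curvature model that parameters such as θ1, δ1, θ2, δ2 suggest, the tip offset contributed by a single bent segment can be sketched as follows. The segment length and function names are assumptions, and chaining the s-segment and distal segment into a full tip pose would additionally require composing the segments' orientations.

```python
import numpy as np

def segment_tip(length: float, theta: float, delta: float) -> np.ndarray:
    """Tip offset of one constant-curvature segment of arc length `length`,
    bent by angle `theta` in the plane selected by `delta` (measured about
    the tube's z-axis, with qins advancing the base along z)."""
    if abs(theta) < 1e-9:
        radial, axial = 0.0, length           # straight segment
    else:
        r = length / theta                    # bend radius of the circular arc
        radial = r * (1.0 - np.cos(theta))    # in-plane sideways offset
        axial = r * np.sin(theta)             # advance along the tube axis
    # Rotate the in-plane offset into the bending plane delta
    return np.array([radial * np.cos(delta), radial * np.sin(delta), axial])

# A straight 50 mm segment (theta = 0) simply advances 50 mm along z
print(segment_tip(50.0, 0.0, 0.0))
```

With (θ1, δ1) for the s-segment and (θ2, δ2) for the distal segment, plus qins and the wrist roll γ, a forward-kinematics routine of this general shape is one way a processor could predict where the envelope and end-effectors will lie.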
Claims (24)
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US16/273,442 US11000339B2 (en) | 2018-04-24 | 2019-02-12 | System and apparatus for positioning an instrument in a body cavity for performing a surgical procedure |
US17/227,617 US11779418B2 (en) | 2018-04-24 | 2021-04-12 | System and apparatus for positioning an instrument in a body cavity for performing a surgical procedure |
US18/481,912 US20240099794A1 (en) | 2018-04-24 | 2023-10-05 | System and apparatus for insertion of an instrument into a body cavity for performing a surgical procedure |
Applications Claiming Priority (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US15/961,507 US10058396B1 (en) | 2018-04-24 | 2018-04-24 | System and apparatus for insertion of an instrument into a body cavity for performing a surgical procedure |
US16/053,232 US10245113B1 (en) | 2018-04-24 | 2018-08-02 | System and apparatus for positioning an instrument in a body cavity for performing a surgical procedure |
US16/273,442 US11000339B2 (en) | 2018-04-24 | 2019-02-12 | System and apparatus for positioning an instrument in a body cavity for performing a surgical procedure |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US16/053,232 Continuation US10245113B1 (en) | 2018-04-24 | 2018-08-02 | System and apparatus for positioning an instrument in a body cavity for performing a surgical procedure |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US17/227,617 Continuation US11779418B2 (en) | 2018-04-24 | 2021-04-12 | System and apparatus for positioning an instrument in a body cavity for performing a surgical procedure |
Publications (2)
Publication Number | Publication Date |
---|---|
US20190321118A1 US20190321118A1 (en) | 2019-10-24 |
US11000339B2 true US11000339B2 (en) | 2021-05-11 |
Family
ID=63208876
Family Applications (5)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/961,507 Active US10058396B1 (en) | 2018-04-24 | 2018-04-24 | System and apparatus for insertion of an instrument into a body cavity for performing a surgical procedure |
US16/053,232 Active US10245113B1 (en) | 2018-04-24 | 2018-08-02 | System and apparatus for positioning an instrument in a body cavity for performing a surgical procedure |
US16/273,442 Active 2038-11-20 US11000339B2 (en) | 2018-04-24 | 2019-02-12 | System and apparatus for positioning an instrument in a body cavity for performing a surgical procedure |
US17/227,617 Active 2038-04-25 US11779418B2 (en) | 2018-04-24 | 2021-04-12 | System and apparatus for positioning an instrument in a body cavity for performing a surgical procedure |
US18/481,912 Pending US20240099794A1 (en) | 2018-04-24 | 2023-10-05 | System and apparatus for insertion of an instrument into a body cavity for performing a surgical procedure |
Family Applications Before (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/961,507 Active US10058396B1 (en) | 2018-04-24 | 2018-04-24 | System and apparatus for insertion of an instrument into a body cavity for performing a surgical procedure |
US16/053,232 Active US10245113B1 (en) | 2018-04-24 | 2018-08-02 | System and apparatus for positioning an instrument in a body cavity for performing a surgical procedure |
Family Applications After (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US17/227,617 Active 2038-04-25 US11779418B2 (en) | 2018-04-24 | 2021-04-12 | System and apparatus for positioning an instrument in a body cavity for performing a surgical procedure |
US18/481,912 Pending US20240099794A1 (en) | 2018-04-24 | 2023-10-05 | System and apparatus for insertion of an instrument into a body cavity for performing a surgical procedure |
Country Status (3)
Country | Link |
---|---|
US (5) | US10058396B1 (en) |
EP (1) | EP3560411A3 (en) |
CA (1) | CA3034919A1 (en) |
Families Citing this family (25)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN105188590B (en) * | 2012-12-10 | 2019-04-26 | 直观外科手术操作公司 | Collision avoidance during controlled movement of an image capture device and manipulatable device arms |
US10058396B1 (en) | 2018-04-24 | 2018-08-28 | Titan Medical Inc. | System and apparatus for insertion of an instrument into a body cavity for performing a surgical procedure |
US11369366B2 (en) | 2018-07-16 | 2022-06-28 | Cilag Gmbh International | Surgical visualization and monitoring |
US11612438B2 (en) | 2018-09-05 | 2023-03-28 | Point Robotics Medtech Inc. | Navigation system and method for medical operation by a robotic system using a tool |
GB2577719B (en) * | 2018-10-03 | 2023-04-26 | Cmr Surgical Ltd | Navigational aid |
GB2577718B (en) * | 2018-10-03 | 2022-08-24 | Cmr Surgical Ltd | Feature identification |
US11147434B2 (en) | 2018-10-10 | 2021-10-19 | Titan Medical Inc. | Systems, methods, and apparatuses for capturing images during a medical procedure |
US20200154094A1 (en) * | 2018-11-09 | 2020-05-14 | Titan Medical Inc. | Stereoscopic imaging apparatus for use in confined spaces |
US11918294B2 (en) * | 2019-01-31 | 2024-03-05 | Brainlab Ag | Virtual trajectory planning |
JP6867654B2 (en) * | 2019-03-15 | 2021-05-12 | リバーフィールド株式会社 | Force display device and force display method for medical robot systems |
US11166774B2 (en) * | 2019-04-17 | 2021-11-09 | Cilag Gmbh International | Robotic procedure trocar placement visualization |
US10939970B2 (en) | 2019-05-22 | 2021-03-09 | Titan Medical Inc. | Robotic surgery system |
WO2020263630A1 (en) * | 2019-06-26 | 2020-12-30 | Titan Medical Inc. | Sterile barrier systems and methods for robotic surgery systems |
US11744667B2 (en) | 2019-12-30 | 2023-09-05 | Cilag Gmbh International | Adaptive visualization by a surgical system |
US11759283B2 (en) * | 2019-12-30 | 2023-09-19 | Cilag Gmbh International | Surgical systems for generating three dimensional constructs of anatomical organs and coupling identified anatomical structures thereto |
US11896442B2 (en) | 2019-12-30 | 2024-02-13 | Cilag Gmbh International | Surgical systems for proposing and corroborating organ portion removals |
US11648060B2 (en) | 2019-12-30 | 2023-05-16 | Cilag Gmbh International | Surgical system for overlaying surgical instrument data onto a virtual three dimensional construct of an organ |
US12002571B2 (en) | 2019-12-30 | 2024-06-04 | Cilag Gmbh International | Dynamic surgical visualization systems |
US11219501B2 (en) | 2019-12-30 | 2022-01-11 | Cilag Gmbh International | Visualization systems using structured light |
US11776144B2 (en) * | 2019-12-30 | 2023-10-03 | Cilag Gmbh International | System and method for determining, adjusting, and managing resection margin about a subject tissue |
US11832996B2 (en) | 2019-12-30 | 2023-12-05 | Cilag Gmbh International | Analyzing surgical trends by a surgical system |
US11284963B2 (en) | 2019-12-30 | 2022-03-29 | Cilag Gmbh International | Method of using imaging devices in surgery |
CN112641513B (en) * | 2020-12-15 | 2022-08-12 | 深圳市精锋医疗科技股份有限公司 | Surgical robot and control method and control device thereof |
JP1713324S (en) * | 2021-09-14 | 2022-04-22 | Medical robot | |
US20230156174A1 (en) * | 2021-11-17 | 2023-05-18 | 3Dintegrated Aps | Surgical visualization image enhancement |
Citations (15)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20040068173A1 (en) | 2002-08-06 | 2004-04-08 | Viswanathan Raju R. | Remote control of medical devices using a virtual device interface |
US20050182295A1 (en) | 2003-12-12 | 2005-08-18 | University Of Washington | Catheterscope 3D guidance and interface system |
US20060064010A1 (en) | 2004-09-17 | 2006-03-23 | Cannon Charles Jr | Probe guide for use with medical imaging systems |
US20090326318A1 (en) | 2008-06-27 | 2009-12-31 | Intuitive Surgical, Inc. | Medical robotic system providing an auxiliary view including range of motion limitations for articulatable instruments extending out of a distal end of an entry guide |
US20110295108A1 (en) | 2007-11-26 | 2011-12-01 | C.R. Bard, Inc. | Apparatus for use with needle insertion guidance system |
US20140081127A1 (en) | 2012-09-19 | 2014-03-20 | The Regents Of The University Of Michigan | Advanced Intraoperative Neural Targeting System and Method |
WO2015149046A1 (en) | 2014-03-28 | 2015-10-01 | Dorin Panescu | Quantitative three-dimensional imaging of surgical scenes from multiport perspectives |
US9486159B2 (en) | 2003-05-22 | 2016-11-08 | Intuitive Surgical Operations, Inc. | Device and method for superimposing patterns on images in real time, particularly for guidance by location |
US9492240B2 (en) | 2009-06-16 | 2016-11-15 | Intuitive Surgical Operations, Inc. | Virtual measurement tool for minimally invasive surgery |
US9516996B2 (en) | 2008-06-27 | 2016-12-13 | Intuitive Surgical Operations, Inc. | Medical robotic system providing computer generated auxiliary views of a camera instrument for controlling the position and orienting of its tip |
WO2017124177A1 (en) | 2016-01-19 | 2017-07-27 | Titan Medical Inc. | Graphical user interface for a robotic surgical system |
US20180132944A1 (en) * | 2015-05-18 | 2018-05-17 | Koninklijke Philips N.V. | Intra-procedural accuracy feedback for image-guided biopsy |
US20180161063A1 (en) | 2015-07-02 | 2018-06-14 | Olympus Corporation | Ultrasound observation apparatus, method of operating ultrasound observation apparatus, and computer readable recording medium |
US10058396B1 (en) | 2018-04-24 | 2018-08-28 | Titan Medical Inc. | System and apparatus for insertion of an instrument into a body cavity for performing a surgical procedure |
US20190192232A1 (en) * | 2017-12-26 | 2019-06-27 | Biosense Webster (Israel) Ltd. | Use of augmented reality to assist navigation during medical procedures |
Family Cites Families (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8182422B2 (en) * | 2005-12-13 | 2012-05-22 | Avantis Medical Systems, Inc. | Endoscope having detachable imaging device and method of using |
CA3004277C (en) | 2013-06-19 | 2020-10-20 | Titan Medical Inc. | Articulated tool positioner and system employing same |
WO2016090459A1 (en) | 2014-12-11 | 2016-06-16 | Titan Medical Inc. | Actuator and drive for manipulating a tool |
US11576562B2 (en) | 2016-04-07 | 2023-02-14 | Titan Medical Inc. | Camera positioning method and apparatus for capturing images during a medical procedure |
US11071593B2 (en) * | 2017-07-14 | 2021-07-27 | Synaptive Medical Inc. | Methods and systems for providing visuospatial information |
JP7016681B2 (en) * | 2017-12-01 | 2022-02-07 | ソニー・オリンパスメディカルソリューションズ株式会社 | Endoscope system |
JP7159577B2 (en) * | 2018-03-20 | 2022-10-25 | ソニーグループ株式会社 | Endoscope system, control method, information processing device, and program |
JP2019185002A (en) * | 2018-04-11 | 2019-10-24 | ソニー株式会社 | Microscope system and medical light source device |
- 2018-04-24 US US15/961,507 patent/US10058396B1/en active Active
- 2018-08-02 US US16/053,232 patent/US10245113B1/en active Active
- 2019-02-12 US US16/273,442 patent/US11000339B2/en active Active
- 2019-02-25 CA CA3034919A patent/CA3034919A1/en active Pending
- 2019-04-24 EP EP19170809.8A patent/EP3560411A3/en active Pending
- 2021-04-12 US US17/227,617 patent/US11779418B2/en active Active
- 2023-10-05 US US18/481,912 patent/US20240099794A1/en active Pending
Patent Citations (16)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20040068173A1 (en) | 2002-08-06 | 2004-04-08 | Viswanathan Raju R. | Remote control of medical devices using a virtual device interface |
US9486159B2 (en) | 2003-05-22 | 2016-11-08 | Intuitive Surgical Operations, Inc. | Device and method for superimposing patterns on images in real time, particularly for guidance by location |
US20050182295A1 (en) | 2003-12-12 | 2005-08-18 | University Of Washington | Catheterscope 3D guidance and interface system |
US20060064010A1 (en) | 2004-09-17 | 2006-03-23 | Cannon Charles Jr | Probe guide for use with medical imaging systems |
US20110295108A1 (en) | 2007-11-26 | 2011-12-01 | C.R. Bard, Inc. | Apparatus for use with needle insertion guidance system |
US20090326318A1 (en) | 2008-06-27 | 2009-12-31 | Intuitive Surgical, Inc. | Medical robotic system providing an auxiliary view including range of motion limitations for articulatable instruments extending out of a distal end of an entry guide |
US9516996B2 (en) | 2008-06-27 | 2016-12-13 | Intuitive Surgical Operations, Inc. | Medical robotic system providing computer generated auxiliary views of a camera instrument for controlling the position and orienting of its tip |
US9492240B2 (en) | 2009-06-16 | 2016-11-15 | Intuitive Surgical Operations, Inc. | Virtual measurement tool for minimally invasive surgery |
WO2011150376A1 (en) | 2010-05-28 | 2011-12-01 | C.R. Bard, Inc. | Apparatus for use with needle insertion guidance system |
US20140081127A1 (en) | 2012-09-19 | 2014-03-20 | The Regents Of The University Of Michigan | Advanced Intraoperative Neural Targeting System and Method |
WO2015149046A1 (en) | 2014-03-28 | 2015-10-01 | Dorin Panescu | Quantitative three-dimensional imaging of surgical scenes from multiport perspectives |
US20180132944A1 (en) * | 2015-05-18 | 2018-05-17 | Koninklijke Philips N.V. | Intra-procedural accuracy feedback for image-guided biopsy |
US20180161063A1 (en) | 2015-07-02 | 2018-06-14 | Olympus Corporation | Ultrasound observation apparatus, method of operating ultrasound observation apparatus, and computer readable recording medium |
WO2017124177A1 (en) | 2016-01-19 | 2017-07-27 | Titan Medical Inc. | Graphical user interface for a robotic surgical system |
US20190192232A1 (en) * | 2017-12-26 | 2019-06-27 | Biosense Webster (Israel) Ltd. | Use of augmented reality to assist navigation during medical procedures |
US10058396B1 (en) | 2018-04-24 | 2018-08-28 | Titan Medical Inc. | System and apparatus for insertion of an instrument into a body cavity for performing a surgical procedure |
Non-Patent Citations (1)
Title |
---|
U.S. Appl. No. 16/053,232, filed Aug. 2, 2018, Genova et al. |
Also Published As
Publication number | Publication date |
---|---|
EP3560411A2 (en) | 2019-10-30 |
US10245113B1 (en) | 2019-04-02 |
US20210228295A1 (en) | 2021-07-29 |
US20190321118A1 (en) | 2019-10-24 |
CA3034919A1 (en) | 2019-10-24 |
US20240099794A1 (en) | 2024-03-28 |
EP3560411A3 (en) | 2019-11-20 |
US10058396B1 (en) | 2018-08-28 |
US11779418B2 (en) | 2023-10-10 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US11779418B2 (en) | System and apparatus for positioning an instrument in a body cavity for performing a surgical procedure | |
WO2019116592A1 (en) | Device for adjusting display image of endoscope, and surgery system | |
US5572999A (en) | Robotic system for positioning a surgical instrument relative to a patient's body | |
JP5372225B2 (en) | Tool position and identification indicator displayed in the border area of the computer display screen | |
CN108697304B (en) | Medical information processing device, medical information processing method, and medical information processing system | |
JP7480477B2 (en) | Medical observation system, control device and control method | |
WO2021124716A1 (en) | Method, apparatus and system for controlling an image capture device during surgery | |
WO2017163407A1 (en) | Endoscope device, endoscope system, and surgery system provided with same | |
JP4027876B2 (en) | Body cavity observation system | |
JP3532660B2 (en) | Body cavity observation device | |
US20220280238A1 (en) | Robot-assisted setup for a surgical robotic system | |
US20230172438A1 (en) | Medical arm control system, medical arm control method, medical arm simulator, medical arm learning model, and associated programs | |
US20240046589A1 (en) | Remote surgical mentoring | |
US20220409326A1 (en) | Method, apparatus and system for controlling an image capture device during surgery | |
WO2022219878A1 (en) | Medical observation system, medical image processing method, and information processing device | |
Hayashibe et al. | Real-time 3D deformation imaging of abdominal organs in laparoscopy | |
EP3989858A1 (en) | System and method related to registration for a medical procedure |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
FEPP | Fee payment procedure |
Free format text: ENTITY STATUS SET TO UNDISCOUNTED (ORIGINAL EVENT CODE: BIG.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY |
FEPP | Fee payment procedure |
Free format text: ENTITY STATUS SET TO SMALL (ORIGINAL EVENT CODE: SMAL); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY |
AS | Assignment |
Owner name: CORPORATION SERVICE COMPANY C/O PROJECT TIME LLC, DELAWARE Free format text: SECURITY INTEREST;ASSIGNOR:TITAN MEDICAL INC.;REEL/FRAME:052643/0922 Effective date: 20200508 |
STPP | Information on status: patent application and granting procedure in general |
Free format text: NON FINAL ACTION MAILED |
STPP | Information on status: patent application and granting procedure in general |
Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
STPP | Information on status: patent application and granting procedure in general |
Free format text: NOTICE OF ALLOWANCE MAILED -- APPLICATION RECEIVED IN OFFICE OF PUBLICATIONS |
AS | Assignment |
Owner name: TITAN MEDICAL INC., CANADA Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:GENOVA, PERRY;MCNALLY, DAVID;SIGNING DATES FROM 20180418 TO 20180423;REEL/FRAME:055705/0109 |
FEPP | Fee payment procedure |
Free format text: ENTITY STATUS SET TO UNDISCOUNTED (ORIGINAL EVENT CODE: BIG.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY |
STPP | Information on status: patent application and granting procedure in general |
Free format text: PUBLICATIONS -- ISSUE FEE PAYMENT RECEIVED |
Free format text: AWAITING TC RESP, ISSUE FEE PAYMENT RECEIVED |
STPP | Information on status: patent application and granting procedure in general |
Free format text: AWAITING TC RESP, ISSUE FEE PAYMENT VERIFIED |
STPP | Information on status: patent application and granting procedure in general |
Free format text: PUBLICATIONS -- ISSUE FEE PAYMENT VERIFIED |
STCF | Information on status: patent grant |
Free format text: PATENTED CASE |
AS | Assignment |
Owner name: TITAN MEDICAL INC., CANADA Free format text: CHANGE OF ADDRESS;ASSIGNOR:TITAN MEDICAL INC.;REEL/FRAME:058549/0570 Effective date: 20211220 |
AS | Assignment |
Owner name: TITAN MEDICAL INC., CANADA Free format text: RELEASE BY SECURED PARTY;ASSIGNOR:PROJECT TIME LLC;REEL/FRAME:059174/0081 Effective date: 20220131 |