CN118019507A - Redundant communication channels and processing of imaging feeds


Info

Publication number
CN118019507A
CN118019507A (application No. CN202280063249.XA)
Authority
CN
China
Prior art keywords
surgical
video stream
surgical video
computing system
display
Prior art date
Legal status
Pending
Application number
CN202280063249.XA
Other languages
Chinese (zh)
Inventor
F. E. Shelton IV
B. A. Fuerst
Current Assignee
Cilag GmbH International
Original Assignee
Cilag GmbH International
Priority date
Filing date
Publication date
Priority claimed from U.S. application Ser. No. 17/384,265 (U.S. Patent No. 11,601,232 B2)
Application filed by Cilag GmbH International filed Critical Cilag GmbH International
Priority claimed from PCT/IB2022/056674 (published as WO 2023/002388 A1)
Publication of CN 118019507 A


Abstract

The computing system may use redundant communication paths to communicate surgical imaging feeds. The computing system may obtain a plurality of surgical video streams via a plurality of paths. The plurality of surgical video streams may include copies of the same video. The surgical video stream may be obtained, for example, from the same in-vivo imaging feed, such as an in-vivo visible light feed. For example, a first video stream may be obtained via a communication path and a second video stream may be obtained via another communication path. The computing system may display the surgical video stream or send the surgical video stream for display. The computing system may determine whether the video stream being displayed encounters any problems. Upon detecting that the video stream being displayed has a problem, the computing system may display another obtained surgical video stream or send another obtained surgical video stream for display.

Description

Redundant communication channels and processing of imaging feeds
Cross Reference to Related Applications
The present application claims the benefit of provisional U.S. patent application No. 63/224,813, filed on July 22, 2021, the disclosure of which is incorporated herein by reference in its entirety.
The present application relates to the following concurrently filed patent applications, the contents of each of which are incorporated herein by reference:
U.S. patent application entitled "METHOD OF SURGICAL SYSTEM POWER MANAGEMENT, COMMUNICATION, PROCESSING, STORAGE ANDDISPLAY" and attorney docket number END9340USNP 1; and
U.S. patent application entitled "COOPERATIVE COMPOSITE VIDEO STREAMS LAYERED ONTO THE SURGICAL SITE AND INSTRUMENTS" and attorney docket number END9340 USNP.
Background
Surgery is typically performed in a surgical theatre or operating room of a medical facility such as, for example, a hospital. Various surgical devices and systems are utilized in performing surgical procedures. In the digital and information age, medical systems and facilities are often slower to implement systems or procedures utilizing newer and improved techniques, due to patient safety and a general desire to maintain traditional practices. It is desirable to improve the delivery and processing of surgical video feeds, such as intra-operative video feeds captured by laparoscopes.
Disclosure of Invention
A computing system may include a processor configured to: obtain a plurality of surgical video streams via a plurality of communication paths during a surgical procedure; transmit a first surgical video stream of the plurality of surgical video streams for display, wherein the first surgical video stream is obtained via a first communication path; detect a problem associated with the first surgical video stream; and transmit a second surgical video stream of the plurality of surgical video streams for display, wherein the second surgical video stream is obtained via a second communication path, the second communication path being different from the first communication path.
The computing system may provide the following technical effect: a fail-safe is provided if the first surgical video stream experiences delay, freezing, distortion, or the like.
The first surgical video stream and the second surgical video stream may be obtained from the same intraoperative imaging feed.
The computing system may provide the following technical effect: a visual feed of the surgical site is maintained if the first surgical video stream experiences delay, freezing, distortion, or the like.
The processor may be configured to process the first surgical video stream using a first processing module prior to display, wherein the problem associated with the first surgical video stream is detected based on detecting a problem with the first processing module.
The computing system may provide the following technical effect: dedicated hardware processing is provided for the first surgical video stream without introducing delay, freezing, distortion, or the like into the first surgical video stream.
The processing module may include at least one of: a multispectral analysis module; a laser Doppler flowmetry analysis module; a plurality of field programmable gate arrays (FPGAs); or a content composition module.
The computing system may provide the following technical effect: dedicated hardware analysis is provided for the first surgical video stream without introducing delay, freezing, distortion, or the like into the first surgical video stream.
The processor may be configured to: process the first surgical video stream using a first processing module; detect a problem with the first processing module, wherein the problem associated with the first surgical video stream is detected based on detecting the problem with the first processing module; and process the second surgical video stream using a second processing module.
The computing system may provide the following technical effect: dedicated hardware parallel processing is provided for the first surgical video stream and the second surgical video stream without introducing delay, freezing, distortion, or the like into the first surgical video stream.
The first surgical video stream and the second surgical video stream may be associated with imaging data captured via the same light sensing element, and the first surgical video stream and the second surgical video stream are processed via different processing modules.
The computing system may provide the following technical effect: dedicated hardware parallel processing is provided for a first surgical video stream of a surgical site and a second surgical video stream of the surgical site without introducing delay, freezing, distortion, or the like into the first surgical video stream.
The plurality of surgical video streams may include a third surgical video stream, and the processor may be further configured to enhance the first surgical video stream using the third surgical video stream before sending the first surgical video stream for display.
The computing system may provide the following technical effect: the healthcare professional receives continuous, real-time guidance throughout the surgical procedure.
The processor may be further configured to: process the first surgical video stream using a first processing module; process the second surgical video stream using a second processing module; and combine the processed first surgical video stream and the processed second surgical video stream for display, wherein upon detecting a problem associated with the first surgical video stream, the processor is configured to pause the combining.
The computing system may provide the following technical effect: dedicated hardware parallel processing is provided for the first surgical video stream and the second surgical video stream without introducing delay, freezing, distortion, or the like into the first surgical video stream, thereby reducing display latency.
The plurality of surgical video streams may include a third surgical video stream, and the processor may be further configured to: extract surgical annotation data from the third surgical video stream; and insert the extracted surgical annotation data into the first surgical video stream before sending the first surgical video stream for display.
The computing system may provide the following technical effect: the healthcare professional receives continuous, real-time guidance throughout the surgical procedure.
The first surgical video stream and the second surgical video stream may be obtained from the same in-vivo visible light feed.
A method may include: obtaining a plurality of surgical video streams via a plurality of communication paths during a surgical procedure; transmitting a first surgical video stream of the plurality of surgical video streams for display, wherein the first surgical video stream is obtained via a first communication path; detecting a problem associated with the first surgical video stream; and transmitting a second surgical video stream of the plurality of surgical video streams for display, wherein the second surgical video stream is obtained via a second communication path, the second communication path being different from the first communication path.
The first surgical video stream and the second surgical video stream may be obtained from the same intraoperative imaging feed.
The method may further comprise processing the first surgical video stream using a first processing module prior to display, wherein the problem associated with the first surgical video stream is detected based on detecting a problem with the first processing module.
The processing module may include at least one of: a multispectral analysis module; a laser Doppler flowmetry analysis module; a plurality of field programmable gate arrays (FPGAs); or a content composition module.
The method may include: processing the first surgical video stream using a first processing module; detecting a problem with the first processing module, wherein a problem associated with the first surgical video stream is detected based on detecting the problem with the first processing module; and processing the second surgical video stream using the second processing module.
The first surgical video stream and the second surgical video stream may be associated with imaging data captured via the same light sensing element, and the first surgical video stream and the second surgical video stream are processed via different processing modules.
The plurality of surgical video streams may include a third surgical video stream, and the method may further include: the first surgical video stream is enhanced using the third surgical video stream before sending the first surgical video stream for display.
The method may further comprise: processing the first surgical video stream using a first processing module; processing the second surgical video stream using a second processing module; and merging the processed first surgical video stream and the processed second surgical video stream for display, wherein the merging is paused upon detecting a problem associated with the first surgical video stream.
The plurality of surgical video streams may include a third surgical video stream, and the method may further include: extracting surgical annotation data from the third surgical video stream; and inserting the extracted surgical annotation data into the first surgical video stream before sending the first surgical video stream for display.
The first surgical video stream and the second surgical video stream may be obtained from the same in-vivo visible light feed.
The methods described above correspond to the operation of the computing system described above. The technical effects and advantages described above in connection with the computing system also apply to these methods.
Any and/or all of the methods described above may be embodied as computer-implemented methods, including, but not limited to, methods implemented by a processor, an integrated circuit, a microcontroller, a Field Programmable Gate Array (FPGA), or the like. The implementing computing system may be a hardware device or may include a plurality of hardware devices configured to be operable as a distributed computing system. An implementing computing system may include a memory containing instructions for performing any and/or all of the methods described above. For example, the memory may contain instructions that, when executed by the computing system and/or its processor, cause the system or processor to perform one or more of the methods described above.
Any and/or all of the methods described above may be embodied in a computer-readable storage medium, such as a non-transitory computer-readable storage medium, which may be embodied as a computer program product, the computer-readable storage medium containing instructions that, when executed by a computer, cause the computer to perform any one or more of the methods described above.
The methods described above may not include methods of treating the human or animal body by surgery or therapy, or diagnostic methods performed on the human or animal body. Each of the methods described above may be a method that is not a surgical, therapeutic, or diagnostic method. For example, each of the methods described above has embodiments that do not include performing a surgical procedure or any surgical or therapeutic steps thereof.
The computing system may use redundant communication paths to communicate surgical imaging feeds. The computing system may obtain multiple surgical video streams via multiple paths. The multiple surgical video streams may include different video feeds and/or copies of the same video feed. The surgical video streams may be obtained, for example, from the same in-vivo imaging feed (such as an in-vivo visible light feed). For example, a first video stream may be obtained via one communication path and a second video stream may be obtained via another communication path. The computing system may display a surgical video stream or send the surgical video stream for display. The computing system may determine whether the video stream being displayed encounters any problems. Upon detecting a problem with the video stream being displayed, the computing system may display another obtained surgical video stream or send another obtained surgical video stream for display. For example, a primary video stream may be displayed initially. Upon detecting a problem associated with the primary video stream, a secondary video stream may be displayed.
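The failover behavior described in the preceding paragraph can be sketched in a few lines of code. The following Python example is purely illustrative and is not part of the disclosure; the frame-source interface, the staleness threshold, and the function and variable names are assumptions made for the example.

```python
import time
from typing import Callable, Optional, Sequence

# A communication path is modeled as a callable returning the next frame
# (bytes) or None when nothing has arrived / the path has failed.  This API
# is an assumption made purely for illustration.
FrameSource = Callable[[], Optional[bytes]]


def run_with_failover(
    paths: Sequence[FrameSource],
    show: Callable[[bytes], None],
    stale_after_s: float = 0.5,
    run_for_s: float = 1.0,
) -> None:
    """Display frames from the active path; fail over when it appears stalled."""
    active = 0
    last_frame_time = time.monotonic()
    deadline = time.monotonic() + run_for_s

    while time.monotonic() < deadline:
        frame = paths[active]()
        now = time.monotonic()
        if frame is not None:
            show(frame)
            last_frame_time = now
        elif now - last_frame_time > stale_after_s and active + 1 < len(paths):
            # Problem detected on the displayed stream (e.g., delay or freeze):
            # switch to the redundant communication path.
            active += 1
            last_frame_time = now
        time.sleep(0.01)


# Example: two copies of the same in-vivo feed; the primary path stalls after
# two frames and the display fails over to the secondary path.
primary = iter([b"frame1", b"frame2"])
secondary = iter([b"frame1", b"frame2", b"frame3", b"frame4", b"frame5"])
run_with_failover(
    [lambda: next(primary, None), lambda: next(secondary, None)],
    show=lambda f: print("display:", f.decode()),
    stale_after_s=0.1,
)
```

In this sketch the decision to switch is based only on frame staleness; the disclosure also contemplates other problem indicators (e.g., distortion or a fault in a processing module), which would simply replace the condition in the `elif` branch.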
In an example, the computing system may use redundant processing paths to process the surgical imaging feed. The computing system may obtain a source surgical imaging stream and may process the source imaging stream using a plurality of processing modules. For example, at least some of the processing modules may process the surgical imaging stream in parallel. The computing system may determine whether any problems have been encountered at the processing module. If no problem is found, the processed surgical imaging streams may be combined for display. Upon detecting a problem associated with the processing module, the computing system may select a surgical imaging stream for display that is unaffected by the detected problem. For example, a surgical imaging stream that has not been processed by a processing module associated with a detected problem may be selected for display.
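A minimal sketch of the redundant processing path is shown below; it is an assumption-laden illustration, not the disclosed implementation. Each processing module is modeled as a plain function, and a module "problem" is modeled as a raised exception or a timeout.

```python
from concurrent.futures import ThreadPoolExecutor
from typing import Callable, Dict

Frame = bytes
ProcessingModule = Callable[[Frame], Frame]


def process_with_redundancy(frame: Frame, modules: Dict[str, ProcessingModule]) -> Dict[str, Frame]:
    """Run a source frame through several processing modules in parallel.

    Outputs from modules where a problem is detected (an exception or a
    timeout) are dropped, so only streams unaffected by the problem are
    returned for merging and display.
    """
    results: Dict[str, Frame] = {}
    with ThreadPoolExecutor(max_workers=max(1, len(modules))) as pool:
        futures = {name: pool.submit(module, frame) for name, module in modules.items()}
        for name, future in futures.items():
            try:
                results[name] = future.result(timeout=1.0)
            except Exception:
                # Problem detected at this processing module: exclude its output.
                continue
    return results


# Example with two hypothetical modules: the visible-light module succeeds,
# the multispectral module faults, and only the unaffected output is kept.
def visible_light(frame: Frame) -> Frame:
    return frame


def multispectral(frame: Frame) -> Frame:
    raise RuntimeError("module fault")


print(process_with_redundancy(b"frame", {"visible": visible_light, "multispectral": multispectral}))
```

The merge-for-display step would consume the returned dictionary; when a module fault is detected mid-procedure, the computing system could likewise pause merging and fall back to a single unaffected stream.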
The present invention provides a computing system that can generate a composite video stream from a plurality of input feeds. The computing system may obtain a surgical video stream and superimpose content associated with the surgical procedure onto the surgical video stream. For example, the overlay content may be obtained from a secondary video feed or a portion of a secondary video feed. The computing system may determine a location, size, and/or orientation of an overlay region for overlaying the overlay content by analyzing the content of the surgical video stream. For example, based on the content of a frame of the surgical video stream, the computing system may determine the location of an overlay region in that frame for overlaying the overlay content. Based on the content of a subsequent frame of the surgical video stream, the computing system may determine another overlay region location in the subsequent frame for overlaying the overlay content. The composite video stream may be generated based on the overlay region locations determined for the different frames of the surgical video stream.
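One way to picture the per-frame overlay placement is the sketch below. The darkest-corner heuristic, the grayscale frames, and the alpha value are illustrative assumptions only; the disclosure does not prescribe a specific content-analysis method.

```python
import numpy as np


def choose_overlay_region(frame: np.ndarray, overlay_shape: tuple) -> tuple:
    """Pick a top-left corner for the overlay based on the frame's content.

    Illustrative heuristic: place the overlay in the darkest corner of the
    (grayscale) frame so it is less likely to cover bright, likely important,
    surgical content.  Re-evaluated for every frame, so the region can move.
    """
    h, w = overlay_shape
    corners = {
        (0, 0): float(frame[:h, :w].mean()),
        (0, frame.shape[1] - w): float(frame[:h, -w:].mean()),
        (frame.shape[0] - h, 0): float(frame[-h:, :w].mean()),
        (frame.shape[0] - h, frame.shape[1] - w): float(frame[-h:, -w:].mean()),
    }
    return min(corners, key=corners.get)


def composite_frame(frame: np.ndarray, overlay: np.ndarray, alpha: float = 0.6) -> np.ndarray:
    """Blend the overlay content into the region chosen for this frame."""
    y, x = choose_overlay_region(frame, overlay.shape)
    out = frame.astype(np.float32).copy()
    region = out[y:y + overlay.shape[0], x:x + overlay.shape[1]]
    out[y:y + overlay.shape[0], x:x + overlay.shape[1]] = (1 - alpha) * region + alpha * overlay
    return out.astype(frame.dtype)


# Example: a 480x640 frame with a bright lower-right quadrant pushes the
# 120x160 overlay toward a darker corner.
frame = np.zeros((480, 640), dtype=np.uint8)
frame[240:, 320:] = 200
overlay = np.full((120, 160), 255, dtype=np.uint8)
print(choose_overlay_region(frame, overlay.shape))  # e.g., (0, 0)
_ = composite_frame(frame, overlay)
```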
The computing system may determine a position, size, and/or orientation of an overlay region for overlaying the overlay content based on one or more fiducial markers captured in the surgical video stream. For example, the computing system may identify fiducial markers in video frames of the surgical video stream and determine respective positions, sizes, and/or orientations of the fiducial markers in the respective video frames. For a given video frame or set of video frames, the computing system may determine the size, position, and/or orientation of the overlay region based on the position, size, and/or orientation of the fiducial marker captured therein.
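The fiducial-marker variant can be sketched as follows. Here the marker pose is assumed to have already been detected in each frame; the dataclass fields, the scale factor, and the offset rule are illustrative assumptions rather than the disclosed method.

```python
import math
from dataclasses import dataclass


@dataclass
class FiducialMarker:
    """Pose of a fiducial marker detected in one video frame (illustrative)."""
    x: float          # marker center, pixels
    y: float
    size_px: float    # apparent side length of the marker, pixels
    angle_deg: float  # in-plane rotation of the marker


@dataclass
class OverlayRegion:
    x: float
    y: float
    width: float
    height: float
    angle_deg: float


def overlay_from_marker(marker: FiducialMarker, scale: float = 3.0, offset: float = 1.5) -> OverlayRegion:
    """Derive the overlay region's position, size, and orientation from the marker.

    The overlay scales with the marker's apparent size (so it shrinks as the
    camera backs away) and is offset along the marker's orientation so it sits
    beside, rather than on top of, the instrument carrying the marker.
    """
    dx = math.cos(math.radians(marker.angle_deg)) * offset * marker.size_px
    dy = math.sin(math.radians(marker.angle_deg)) * offset * marker.size_px
    side = scale * marker.size_px
    return OverlayRegion(marker.x + dx, marker.y + dy, side, side, marker.angle_deg)


# Example: as the marker moves and rotates between frames, the overlay follows.
print(overlay_from_marker(FiducialMarker(x=300, y=200, size_px=40, angle_deg=0)))
print(overlay_from_marker(FiducialMarker(x=320, y=210, size_px=36, angle_deg=15)))
```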
Drawings
FIG. 1A is a block diagram of a computer-implemented surgical system.
FIG. 1B is a block diagram of a computer-implemented multi-layer surgical system.
Fig. 1C is a logic diagram illustrating the control plane and data plane of the surgical system.
Fig. 2 illustrates an exemplary surgical system in a surgical operating room.
Fig. 3 illustrates an exemplary surgical hub paired with various systems.
Fig. 4 illustrates a surgical data network having a set of communicating surgical hubs configured to interface with a set of sensing systems, an environmental sensing system, a set of devices, etc.
FIG. 5 illustrates an exemplary computer-implemented interactive surgical system that may be part of a surgical system.
Fig. 6 shows a logic diagram of a control system for a surgical instrument.
Fig. 7 illustrates an exemplary surgical system including a handle having a controller and a motor, an adapter releasably coupled to the handle, and a loading unit releasably coupled to the adapter.
Fig. 8 illustrates an exemplary situational awareness surgical system.
Fig. 9A-9C illustrate an exemplary visualization system.
Fig. 10 shows an exemplary instrument for near infrared spectroscopy (NIRS).
FIG. 11 shows an example of an instrument for determining NIRS based on Fourier transform infrared imaging.
Fig. 12 shows an exemplary instrument that may be used to detect doppler shift of laser light scattered from a portion of tissue.
Fig. 13 shows an exemplary composite image comprising a surface image and a subsurface vessel image.
FIG. 14 illustrates an exemplary visualization system.
FIG. 15 illustrates an exemplary visualization system.
Fig. 16 illustrates an exemplary process for delivering surgical imaging feeds using redundant communication paths.
FIG. 17 illustrates an exemplary process for delivering surgical imaging feeds using redundant processing paths.
FIG. 18A illustrates an exemplary process for generating a composite surgical video stream from multiple input feeds.
FIG. 18B illustrates an exemplary process for generating a composite surgical video stream using fiducial markers.
Fig. 19A-19C illustrate exemplary frames of a composite surgical video stream having superimposed content that moves as surgical instruments in the surgical video stream move.
Detailed Description
The applicant of the present application owns the following U.S. patent applications, each of which is incorporated herein by reference in its entirety:
U.S. patent application Ser. No. 15/940,654 (attorney docket number END8501USNP), filed March 29, 2018, entitled "SURGICAL HUB SITUATIONAL AWARENESS";
U.S. patent application Ser. No. 15/940,742 (attorney docket number END8504USNP2), filed March 29, 2018, entitled "DUAL CMOS ARRAY IMAGING";
U.S. patent application Ser. No. 17/062,521 (attorney docket number END9287USNP2), filed October 2, 2020, entitled "TIERED-ACCESS SURGICAL VISUALIZATION SYSTEM";
U.S. patent application Ser. No. 17/062,530 (attorney docket number END9287USNP13), filed October 2, 2020, entitled "SURGICAL HUB HAVING VARIABLE INTERCONNECTIVITY CAPABILITIES";
U.S. patent application Ser. No. 17/062,512 (attorney docket number END9287USNP), filed October 2, 2020, entitled "TIERED SYSTEM DISPLAY CONTROL BASED ON CAPACITY AND USER OPERATION";
U.S. patent application Ser. No. 17/062,508 (attorney docket number END9287USNP), filed October 2, 2020, entitled "COOPERATIVE SURGICAL DISPLAYS";
U.S. patent application Ser. No. 17/062,509 (attorney docket number END9287USNP), filed October 2, 2020, entitled "INTERACTIVE INFORMATION OVERLAY ON MULTIPLE SURGICAL DISPLAYS";
U.S. patent application Ser. No. 17/062,507 (attorney docket number END9287USNP), filed October 2, 2020, entitled "COMMUNICATION CONTROL FOR A SURGEON CONTROLLED SECONDARY DISPLAY AND PRIMARY DISPLAY";
U.S. patent application Ser. No. 17/062,513 (attorney docket number END9288USNP), filed October 2, 2020, entitled "SITUATIONAL AWARENESS OF INSTRUMENTS LOCATION AND INDIVIDUALIZATION OF USERS TO CONTROL DISPLAYS";
U.S. patent application Ser. No. 17/062,517 (attorney docket number END9288USNP), filed October 2, 2020, entitled "SHARED SITUATIONAL AWARENESS OF THE DEVICE ACTUATOR ACTIVITY TO PRIORITIZE CERTAIN ASPECTS OF DISPLAYED INFORMATION";
U.S. patent application Ser. No. 17/062,520 (attorney docket number END9288USNP), filed October 2, 2020, entitled "MONITORING OF USER VISUAL GAZE TO CONTROL WHICH DISPLAY SYSTEM DISPLAYS THE PRIMARY INFORMATION";
U.S. patent application Ser. No. 17/062,519 (attorney docket number END9288USNP), filed October 2, 2020, entitled "RECONFIGURATION OF DISPLAY SHARING"; and
U.S. patent application Ser. No. 17/062,516 (attorney docket number END9288USNP5), filed October 2, 2020, entitled "CONTROL OF A DISPLAY OUTSIDE THE STERILE FIELD FROM A DEVICE WITHIN THE STERILE FIELD".
Fig. 1A is a block diagram of a computer-implemented surgical system 20000. Exemplary surgical systems, such as surgical system 20000, can include one or more surgical systems (e.g., surgical subsystems) 20002, 20003, and 20004. For example, surgical system 20002 can comprise a computer-implemented interactive surgical system. For example, the surgical system 20002 may include a surgical hub 20006 and/or a computing device 20016 in communication with a cloud computing system 20008, e.g., as described in fig. 2. Cloud computing system 20008 may comprise at least one remote cloud server 20009 and at least one remote cloud storage unit 20010. Exemplary surgical systems 20002, 20003, or 20004 can include wearable sensing system 20011, environmental sensing system 20015, robotic system 20013, one or more smart instruments 20014, human-machine interface system 20012, and the like. The human interface system is also referred to herein as a human interface device. The wearable sensing system 20011 may include one or more HCP sensing systems and/or one or more patient sensing systems. The environment sensing system 20015 may include, for example, one or more devices for measuring one or more environmental properties, e.g., as further described in fig. 2. The robotic system 20013 may include a plurality of devices for performing a surgical procedure, for example, as further described in fig. 2.
The surgical system 20002 may be in communication with a remote server 20009, which may be part of a cloud computing system 20008. In one example, the surgical system 20002 can communicate with the remote server 20009 via a cable/FIOS networking node of an internet service provider. In one example, the patient sensing system may communicate directly with the remote server 20009. The surgical system 20002 and/or components therein may communicate with the remote server 20009 via cellular transmission/reception points (TRPs) or base stations using one or more of the following cellular protocols: GSM/GPRS/EDGE (2G), UMTS/HSPA (3G), Long Term Evolution (LTE) or 4G, LTE-Advanced (LTE-A), New Radio (NR) or 5G.
The surgical hub 20006 can cooperatively interact with one of a plurality of devices that display images from the laparoscope and information from one or more other intelligent devices and one or more sensing systems 20011. The surgical hub 20006 can interact with one or more sensing systems 20011, one or more smart devices, and a plurality of displays. The surgical hub 20006 may be configured to collect measurement data from one or more sensing systems 20011 and send notification or control messages to the one or more sensing systems 20011. The surgical hub 20006 can send and/or receive information including notification information to and/or from the human interface system 20012. The human interface system 20012 may include one or more Human Interface Devices (HIDs). The surgical hub 20006 can send and/or receive notification or control information to convert to audio, display, and/or control information to various devices in communication with the surgical hub.
For example, the sensing system 20001 may include a wearable sensing system 20011 (the wearable sensing system may include one or more HCP sensing systems and one or more patient sensing systems) and an environmental sensing system 20015, as described in fig. 1A. The one or more sensing systems 20001 can measure data related to various biomarkers. The one or more sensing systems 20001 can use one or more sensors such as light sensors (e.g., photodiodes, photoresistors), mechanical sensors (e.g., motion sensors), acoustic sensors, electrical sensors, electrochemical sensors, pyroelectric sensors, infrared sensors, etc. to measure biomarkers. The one or more sensors may measure biomarkers as described herein using one or more of the following sensing techniques: photoplethysmography, electrocardiography, electroencephalography, colorimetry, impedance spectroscopy, potentiometry, amperometry, and the like.
Biomarkers measured by the one or more sensing systems 20001 may include, but are not limited to, sleep, core body temperature, maximum oxygen intake, physical activity, alcohol consumption, respiration rate, oxygen saturation, blood pressure, blood glucose, heart rate variability, blood pH, hydration status, heart rate, skin conductance, peripheral temperature, tissue perfusion pressure, coughing and sneezing, gastrointestinal motility, gastrointestinal imaging, respiratory bacteria, edema, mental aspects, sweat, circulating tumor cells, autonomic nerve tone, circadian rhythm, and/or menstrual cycle.
Biomarkers may relate to physiological systems, which may include, but are not limited to, behavioral and psychological, cardiovascular, renal, skin, nervous, gastrointestinal, respiratory, endocrine, immune, tumor, musculoskeletal, and/or reproductive systems. Information from the biomarkers may be determined and/or used by, for example, a computer-implemented patient and surgical system 20000. Information from the biomarkers may be determined and/or used by computer-implemented patient and surgical system 20000, for example, to improve the system and/or improve patient outcome. One or more sensing systems 20001, biomarkers 20005, and physiological systems are described in more detail in U.S. application 17/156,287 (attorney docket number END9290USNP 1), filed on 1, 22, 2021, the disclosure of which is incorporated herein by reference in its entirety.
FIG. 1B is a block diagram of a computer-implemented multi-layer surgical system. As shown in fig. 1B, the computer-implemented multi-layer surgical system 40050 may include multi-layer systems, such as a surgical private sub-network layer system 40052, an edge layer system 40054 associated with the surgical private sub-network layer system 40052, and a cloud layer system 40056.
The surgical private sub-network layer system 40052 may comprise a plurality of interconnected surgical sub-systems. For example, the surgical subsystems may be grouped according to the type of surgery and/or other departments in a medical facility or hospital. For example, a medical facility or hospital may include a plurality of surgery-specific departments, such as an emergency room (ER) department 40070, a colorectal department 40078, a bariatric department 40072, a thoracic department 40066, and a billing department 40068. Each of the surgery-specific departments may include one or more surgical subsystems associated with an Operating Room (OR) and/or a Health Care Professional (HCP). For example, the colorectal department 40078 may include a set of surgical hubs (e.g., surgical hub 20006 as depicted in fig. 1A). A surgical hub may be designated for use by a corresponding HCP, such as HCP A 40082 and HCP B 40080. In one example, the colorectal department may include a set of surgical hubs located in respective ORs (such as OR 1 40074 and OR 2 40076). The medical facility or hospital may also include a billing department subsystem 40068. The billing department subsystem 40068 may store and/or manage billing data associated with the respective departments (such as the ER department 40070, the colorectal department 40078, the bariatric department 40072, and/or the thoracic department 40066).
For example, the edge layer system 40054 may be associated with a medical facility or hospital, and may include one or more edge computing systems 40064. Edge computing system 40064 may include a storage subsystem and a server subsystem. In one example, an edge computing system including an edge server and/or storage unit may provide additional processing and/or storage services to a surgical hub that is part of one of the department ORs (e.g., OR 1 and OR 2 of the colorectal department).
The surgical private sub-network layer system 40052 and the edge layer system 40054 may be within the Health Insurance Portability and Accountability Act (HIPAA) boundary 40062. The surgical private sub-network system 40052 and the edge layer system 40054 may be connected to the same local data network. The local data network may be a local data network of the medical facility or hospital. The local data network may be within the HIPAA boundary. Because the surgical private sub-network layer system 40052 and the edge layer system 40054 are located within the HIPAA boundary 40062, patient data may flow between the edge computing system 40064 and devices located within one of the entities of the surgical private sub-network layer system 40052 without redaction and/or encryption. For example, patient data may flow between the edge computing system 40064 and a surgical hub located in OR 1 40074 of the colorectal department 40078 without redaction and/or encryption.
The cloud layer system 40056 may include an enterprise cloud system 40060 and a public cloud system 40058. For example, the enterprise cloud system 40060 may be the cloud computing system 20008 including a remote cloud server subsystem and/or a remote cloud storage subsystem, as depicted in fig. 1A. The enterprise cloud system 40060 may be managed by an organization, such as a private company. The enterprise cloud system 40060 can communicate with one or more entities located within the HIPAA boundary 40062 (e.g., the edge computing system 40064 and the surgical hubs in the ORs (e.g., OR 1 40074) of the various departments (e.g., the colorectal department 40078)).
Public cloud system 40058 may be operated by a cloud computing service provider. For example, a cloud computing service provider may provide storage services and/or computing services to a plurality of enterprise cloud systems (e.g., enterprise cloud system 40060).
Fig. 1C is a logical block diagram 40000 illustrating various communication planes in a surgical system.
As shown in fig. 1C, a control plane 40008 and a data plane 40010 may be used as communication planes between the controller 40002 and the management applications 40014 and 40016 on one side, and between the controller 40002 and the system modules and/or modular devices 40012a to 40012n on the other side. In one example, in addition to the control plane 40008, a data plane may also exist between the system modules and/or modular devices 40012a to 40012n and the surgical hub. The data plane 40010 can provide a data plane path (e.g., a redundant data plane path) between system modules and/or modular devices 40012a to 40012n associated with one or more surgical hubs. One of the surgical hubs (e.g., where there are multiple surgical hubs in the operating room) may act as the controller 40002. In one example, the controller 40002 can be an edge computing system that is within the Health Insurance Portability and Accountability Act (HIPAA) boundary of the surgical system, for example, as shown in fig. 1B. The controller 40002 may be in communication with an enterprise cloud system 40020. As shown in fig. 1C, the enterprise cloud system 40020 may be located outside of the HIPAA boundary 40018. Accordingly, patient data flowing to and/or from the enterprise cloud system 40020 may be redacted and/or encrypted.
The controller 40002 can be configured to provide a northbound interface 40004 and a southbound interface 40006. The northbound interface 40004 may be used to provide the control plane 40008. The control plane 40008 can include one or more management applications 40014 and 40016, which can enable a user to configure and/or manage the system modules and/or modular devices 40012a to 40012n associated with the surgical system. The management applications 40014 and 40016 may be used to obtain the status of the various system modules and/or modular devices 40012a to 40012n.
The management applications 40014 and 40016 using the control plane may interact with the controller 40002 using, for example, a set of Application Programming Interface (API) calls. The management applications 40014 and 40016 may interact with the controller 40002 via a management protocol or an application layer protocol to configure and/or monitor the status of the system modules and/or modular devices. The management protocols or application layer protocols used to monitor the status of and/or configure the system modules or modular devices associated with the surgical system may include the Simple Network Management Protocol (SNMP), the TELNET protocol, the Secure Shell (SSH) protocol, the Network Configuration Protocol (NETCONF), etc.
SNMP or a similar protocol may be used to collect status information and/or send configuration-related data (e.g., configuration-related control programs) associated with the system modules and/or modular devices to the controller. SNMP or a similar protocol can collect information by polling devices associated with the surgical system from a central network management console using messages (e.g., SNMP messages). Messages may be sent and/or received at regular or random intervals. These messages may include Get messages and Set messages. A Get message, or a message similar to a Get message, may be used to obtain information from a system module or a modular device associated with the surgical system. A Set message, or a message similar to a Set message, may be used to change a configuration associated with a system module or a modular device associated with the surgical system.
For example, get messages or similar messages may include SNMP messages GetRequest, getNextRequest or GetBulkRequest. The Set message may include an SNMP SetRequest message. GetRequest, getNextRequest, getBulkRequest messages or similar messages may be used by a configuration manager (e.g., SNMP manager) running on the controller 40002. The configuration manager may communicate with a communication agent (e.g., SNMP agent) that may be part of a system module and/or modular device in the surgical system. The communication manager on controller 40002 can use SNMP message SetRequest messages or the like to set values of parameters or object instances in the system modules of the surgical system and/or communication agents on the modular device. In one example, for example, an SNMP module can be used to establish a communication path between a system module and/or a modular device associated with a surgical system.
Based on the query or configuration-related message received from the management applications, such as management applications 40014 and 40016, controller 40002 can generate configuration queries and/or configuration data for querying or configuring system modules and/or modular devices associated with the surgical hub or surgical system. A surgical hub (e.g., surgical hub 20006 shown in fig. 1A) or an edge computing system (e.g., edge computing system 40064 shown in fig. 1B) can manage and/or control various system modules and/or modular devices 40012 a-40012 n associated with the surgical system. For example, the northbound interface 40004 of the controller 40002 can be used to alter control interactions between one or more modules and/or devices associated with the surgical system. In one example, the controller 40002 can be used to establish one or more communication data paths between a plurality of modules and/or devices associated with the surgical system. The controller 40002 can use its southbound interface 40006 to send control programs including queries and/or configuration changes to system modules and/or modular devices of the surgical system.
The system module and/or modular device 40012 a-40012 n of the surgical system, or a communication agent that may be part of the system module and/or modular device, may send a notification message or trap to the controller 40002. The controller may forward the notification message or trap to the management application 40014 and the management application 40016 via its northbound interface 40004 for display on a display. In one example, the controller 40002 can send notifications to other system modules and/or modular devices 40012a through 40012n that are part of the surgical system.
The system module and/or modular device 40012 a-40012 n of the surgical system or a communication agent that is part of the system module and/or modular device may send a response to a query received from the controller 40002. For example, a communication agent, which may be part of a system module or modular device, may send a response message in response to a Get or Set message or a message similar to a Get or Set message received from controller 40002. In one example, responsive messages from system modules or modular devices 40012a through 40012n may include requested data in response to Get messages or similar messages received from controller 40002. In one example, in response to a Set message or similar message received from a system module or modular device 40012 a-40012 n, the response message from controller 40002 may include the newly Set value as an acknowledgement that the value has been Set.
The system modules or modular devices 40012a through 40012n may use trap or notification messages or messages similar to trap or notification messages to provide information about events associated with the system modules or modular devices. For example, a trap or notification message may be sent from the system module or modular device 40012 a-40012 n to the controller 40002 to indicate the status of the communication interface (e.g., whether the communication interface is available for communication). The controller 40002 can send the receipt of the trap message back to the system module or modular device 40012a through 40012n (e.g., back to a proxy on the system module or modular device).
In one example, the TELNET protocol can be used to provide a two-way interactive text-oriented communication facility between the system modules and/or modular devices 40012a to 40012n and the controller 40002. The TELNET protocol may be used to collect status information from the controller 40002 and/or send configuration data (e.g., control programs) to the controller. One of the management applications 40014 or 40016 can use TELNET to establish a connection with the controller 40002 using Transmission Control Protocol (TCP) port 23.
In one example, SSH (a cryptographic protocol) may be used to allow remote login in order to collect status information from the controller 40002 and/or send configuration data to the controller regarding the system modules and/or modular devices 40012a to 40012n. One of the management applications 40014 or 40016 may use SSH to establish an encrypted connection with the controller 40002 using Transmission Control Protocol (TCP) port 22.
In one example, NETCONF can be used to perform management functions by invoking remote procedure calls using, for example, the <rpc>, <rpc-reply>, or <edit-config> operations. The <rpc> and <rpc-reply> remote procedure calls, or similar calls, may be used to exchange information with the system modules and/or modular devices associated with the surgical system. The NETCONF <edit-config> operation, or a similar operation, may be used to configure the system modules and/or modular devices associated with the surgical system.
The controller 40002 can configure the system modules and/or modular devices 40012a through 40012n to establish the data plane 40010. The data plane 40010 (e.g., also referred to as a user plane or forwarding plane) may enable communication data paths between multiple system modules and/or modular devices 40012 a-40012 n. The data plane 40010 can be used by system modules and/or modular devices 40012a through 40012n for communicating data streams of data between system modules and/or modular devices associated with a surgical system. The data stream may be established using one or more dedicated communication interfaces between system modules and/or modular devices associated with one or more surgical hubs of the surgical system. In one example, the data flow may be established over one or more Local Area Networks (LANs) and one or more Wide Area Networks (WANs), such as the internet.
In one example, the data plane 40010 can provide support for establishing first and second independent, disjoint, concurrent, and redundant communication paths for data flows between the system modules and/or modular devices 40012b and 40012n. As shown in fig. 1C, a redundant communication path may be established between the system modules/modular devices 40012b and 40012n. The redundant communication paths may carry the same (redundant) data streams between the system modules and/or modular devices. In one example, if some data packets are dropped on one of the redundant communication paths due to a problem with one of the communication interfaces on the system modules/modular devices 40012b and 40012n, the system module and/or modular device may still receive at least one copy of each dropped data packet over the second communication path.
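The effect of the redundant data-plane paths can be illustrated with a short deduplication sketch. The sequence-numbered packet model below is an assumption for the example; the disclosure does not specify a packet format.

```python
from typing import Dict, Iterable, List, Tuple

Packet = Tuple[int, bytes]  # (sequence number, payload) -- assumed format


def merge_redundant_paths(path_a: Iterable[Packet], path_b: Iterable[Packet]) -> List[bytes]:
    """Reassemble a data flow carried redundantly over two disjoint paths.

    Each payload is accepted the first time its sequence number is seen, so a
    packet dropped on one path is still delivered via its copy on the other.
    """
    received: Dict[int, bytes] = {}
    for seq, payload in list(path_a) + list(path_b):
        received.setdefault(seq, payload)
    return [received[seq] for seq in sorted(received)]


# Packet 2 is lost on path A and packet 3 is lost on path B, yet the merged
# flow delivered to the system module is complete.
path_a = [(1, b"p1"), (3, b"p3")]
path_b = [(1, b"p1"), (2, b"p2")]
print(merge_redundant_paths(path_a, path_b))  # [b'p1', b'p2', b'p3']
```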
Fig. 2 shows an example of a surgical system 20002 in a surgical room. As shown in fig. 2, the patient is operated on by one or more healthcare professionals (HCPs). The HCP is monitored by one or more HCP sensing systems 20020 worn by the HCP. The HCP and the environment surrounding the HCP may also be monitored by one or more environmental sensing systems including, for example, a set of cameras 20021, a set of microphones 20022, and other sensors that may be deployed in an operating room. The HCP sensing system 20020 and the environmental sensing system can communicate with a surgical hub 20006, which in turn can communicate with one or more cloud servers 20009 of a cloud computing system 20008, as shown in fig. 1A. The environmental sensing system may be used to measure one or more environmental properties, such as the location of an HCP in an operating room, HCP movement, environmental noise in an operating room, temperature/humidity in an operating room, and the like.
As shown in fig. 2, a main display 20023 and one or more audio output devices (e.g., speakers 20019) are positioned in the sterile field to be visible to an operator at the operating table 20024. In addition, the visualization/notification tower 20026 is positioned outside the sterile field. The visualization/notification tower 20026 may include a first non-sterile Human Interface Device (HID) 20027 and a second non-sterile HID 20029 facing away from each other. The HID may be a display or a display with a touch screen that allows a person to interface directly with the HID. The human-machine interface system guided by the surgical hub 20006 may be configured to coordinate the flow of information to operators inside and outside the sterile field using HIDs 20027, 20029, and 20023. In one example, the surgical hub 20006 may cause the HID (e.g., the main HID 20023) to display notifications and/or information about the patient and/or surgical procedure. In one example, the surgical hub 20006 can prompt and/or receive inputs from personnel in the sterile or non-sterile area. In one example, the surgical hub 20006 may cause the HID to display a snapshot of the surgical site recorded by the imaging device 20030 on the non-sterile HID 20027 or 20029, while maintaining a real-time feed of the surgical site on the main HID 20023. For example, a snapshot on non-sterile display 20027 or 20029 may allow a non-sterile operator to perform diagnostic steps related to a surgical procedure.
In one aspect, the surgical hub 20006 can be configured to route diagnostic inputs or feedback entered by a non-sterile operator at the visualization tower 20026 to the main display 20023 within the sterile field, which can be viewed by the sterile operator at the operating table. In one example, the input may be a modification to a snapshot displayed on the non-sterile display 20027 or 20029, which may be routed through the surgical hub 20006 to the main display 20023.
Referring to fig. 2, a surgical instrument 20031 is used in the surgical procedure as part of the surgical system 20002. The hub 20006 may be configured to coordinate the flow of information to the display of the surgical instrument 20031, for example, as described in U.S. patent application publication No. US 2019-0200844 A1 (U.S. patent application Ser. No. 16/209,385), entitled "METHOD OF HUB COMMUNICATION, PROCESSING, STORAGE AND DISPLAY," filed December 4, 2018, the disclosure of which is incorporated herein by reference in its entirety. Diagnostic inputs or feedback entered by a non-sterile operator at the visualization tower 20026 may be routed by the hub 20006 to a surgical instrument display within the sterile field, where it may be viewed by the operator of the surgical instrument 20031. Exemplary surgical instruments suitable for use with the surgical system 20002 are described under the heading "Surgical Instrument Hardware" in U.S. patent application publication No. US 2019-0200844 A1 (U.S. patent application Ser. No. 16/209,385), entitled "METHOD OF HUB COMMUNICATION, PROCESSING, STORAGE AND DISPLAY," filed December 4, 2018, the disclosure of which is incorporated herein by reference in its entirety.
Fig. 2 shows an example of a surgical system 20002 for performing a surgical operation on a patient lying on an operating table 20024 in a surgical room 20035. The robotic system 20034 may be used in surgery as part of a surgical system 20002. The robotic system 20034 may include a surgeon's console 20036, a patient side cart 20032 (surgical robot), and a surgical robot hub 20033. When the surgeon views the surgical site through the surgeon's console 20036, the patient-side cart 20032 can manipulate the at least one removably coupled surgical tool 20037 through a minimally invasive incision in the patient. An image of the surgical site may be obtained by a medical imaging device 20030 that is steerable by a patient side cart 20032 to orient the imaging device 20030. The robotic hub 20033 may be used to process images of the surgical site for subsequent display to the surgeon via the surgeon's console 20036.
Other types of robotic systems may be readily adapted for use with the surgical system 20002. Various examples of robotic systems and surgical tools suitable for use with the present disclosure are described in U.S. patent application publication No. US 2019-0201137 A1 (U.S. patent application Ser. No. 16/209,407), entitled "METHOD OF ROBOTIC HUB COMMUNICATION, DETECTION, AND CONTROL," filed December 4, 2018, the disclosure of which is incorporated herein by reference in its entirety.
Various examples of cloud-based analysis performed by the cloud computing system 20008 and suitable for use with the present disclosure are described in U.S. patent application publication No. US 2019-0206569 A1 (U.S. patent application Ser. No. 16/209,403), entitled "METHOD OF CLOUD BASED DATA ANALYTICS FOR USE WITH THE HUB," filed December 4, 2018, the disclosure of which is incorporated herein by reference in its entirety.
In various aspects, the imaging device 20030 can include at least one image sensor and one or more optical components. Suitable image sensors may include, but are not limited to, charge Coupled Device (CCD) sensors and Complementary Metal Oxide Semiconductor (CMOS) sensors.
The optical components of the imaging device 20030 can include one or more illumination sources and/or one or more lenses. One or more illumination sources may be directed to illuminate multiple portions of the surgical field. The one or more image sensors may receive light reflected or refracted from the surgical field, including light reflected or refracted from tissue and/or surgical instruments.
The one or more illumination sources may be configured to radiate electromagnetic energy in the visible spectrum as well as the invisible spectrum. The visible spectrum (sometimes referred to as the optical spectrum or the luminous spectrum) is that portion of the electromagnetic spectrum that is visible to (i.e., detectable by) the human eye, and may be referred to as visible light or simply light. A typical human eye will respond to wavelengths in air in the range of about 380 nm to about 750 nm.
The invisible spectrum (i.e., the non-luminous spectrum) is the portion of the electromagnetic spectrum that lies below and above the visible spectrum (i.e., wavelengths below about 380 nm and above about 750 nm). The human eye cannot detect the invisible spectrum. Wavelengths greater than about 750 nm are longer than the red visible spectrum, and they become invisible infrared (IR), microwave, and radio electromagnetic radiation. Wavelengths less than about 380 nm are shorter than the violet spectrum, and they become invisible ultraviolet, x-ray, and gamma-ray electromagnetic radiation.
In various aspects, the imaging device 20030 is configured for use in minimally invasive surgery. Examples of imaging devices suitable for use with the present disclosure include, but are not limited to, arthroscopes, angioscopes, bronchoscopes, choledochoscopes, colonoscopes, cystoscopes, duodenoscopes, enteroscopes, esophagogastro-duodenoscopes (gastroscopes), endoscopes, laryngoscopes, nasopharyngoscopes, nephroscopes, sigmoidoscopes, thoracoscopes, and ureteroscopes.
The imaging device may employ multispectral monitoring to distinguish between topography and underlying structures. A multispectral image is one that captures image data within specific wavelength ranges across the electromagnetic spectrum. The wavelengths may be separated by filters or by using instruments that are sensitive to particular wavelengths, including light from frequencies beyond the visible light range, such as IR and ultraviolet. Spectral imaging may allow extraction of additional information that the human eye fails to capture with its red, green, and blue receptors. The use of multispectral imaging is described in more detail under the heading "Advanced Imaging Acquisition Module" of U.S. patent application publication No. US 2019-0200844 A1 (U.S. patent application Ser. No. 16/209,385), entitled "METHOD OF HUB COMMUNICATION, PROCESSING, STORAGE AND DISPLAY," filed December 4, 2018, the disclosure of which is incorporated herein by reference in its entirety. Multispectral monitoring may be a useful tool for relocating the surgical site after a surgical task is completed in order to perform one or more of the previously described tests on the treated tissue.

Needless to say, the operating room and surgical equipment need to be strictly sterilized during any surgical procedure. The strict hygiene and sterilization conditions required in a "surgery room" (i.e., an operating or treatment room) necessitate the highest possible sterility of all medical devices and equipment. Part of that sterilization process is the need to sterilize anything that comes into contact with the patient or penetrates the sterile field, including the imaging device 20030 and its attachments and components. It should be understood that the sterile field may be considered a designated area that is considered free of microorganisms, such as within a tray or on a sterile towel, or the sterile field may be considered the area immediately surrounding a patient who has been prepared for surgery. The sterile field may include the scrubbed team members, who are properly attired, as well as all equipment and fixtures in the area.
The wearable sensing system 20011 shown in fig. 1A may include one or more sensing systems, such as the HCP sensing system 20020 shown in fig. 2. The HCP sensing system 20020 may include a sensing system for monitoring and detecting a set of physical states and/or a set of physiological states of a health care professional (HCP). The HCP may typically be a surgeon or one or more healthcare workers or other healthcare providers assisting the surgeon. In one example, the sensing system 20020 can measure a set of biomarkers to monitor the heart rate of the HCP. In one example, a sensing system 20020 (e.g., a watch or wristband) worn on the surgeon's wrist may use an accelerometer to detect hand movement and/or tremor and determine the magnitude and frequency of the tremor. The sensing system 20020 can send the measurement data associated with the set of biomarkers to the surgical hub 20006 for further processing. One or more environmental sensing devices may send environmental information to the surgical hub 20006. For example, the environmental sensing devices may include a camera 20021 for detecting hand/body positions of the HCP. The environmental sensing devices may include a microphone 20022 for measuring ambient noise in the operating room. Other environmental sensing devices may include devices such as a thermometer for measuring the temperature and a hygrometer for measuring the humidity of the environment in the operating room. The surgical hub 20006, alone or in communication with the cloud computing system, may use the surgeon biomarker measurement data and/or the environmental sensing information to modify the control algorithm of a handheld instrument or the average delay of a robotic interface, for example, to minimize tremors. In one example, the HCP sensing system 20020 may measure one or more surgeon biomarkers associated with the HCP and send the measurement data associated with the surgeon biomarkers to the surgical hub 20006. The HCP sensing system 20020 may use one or more of the following RF protocols to communicate with the surgical hub 20006: Bluetooth, Bluetooth Low-Energy (BLE), Bluetooth Smart, Zigbee, Z-Wave, IPv6 Low-power Wireless Personal Area Network (6LoWPAN), and Wi-Fi. The surgeon biomarkers may include one or more of the following: stress, heart rate, and the like. The environmental measurements from the operating room may include ambient noise levels associated with the surgeon or the patient, surgeon and/or staff movements, surgeon and/or staff attention levels, and the like.
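The tremor measurement mentioned above (estimating magnitude and frequency from a wrist-worn accelerometer) can be sketched with a simple spectral estimate. The sampling rate, the synthetic 9 Hz signal, and the function name below are assumptions for illustration and are not taken from the disclosure.

```python
import numpy as np


def tremor_from_accelerometer(samples: np.ndarray, sample_rate_hz: float):
    """Estimate the dominant tremor frequency (Hz) and its magnitude.

    `samples` is a 1-D array of acceleration along one axis.  The mean (DC)
    component is removed, the spectrum is computed with an FFT, and the
    strongest remaining peak is reported.
    """
    detrended = samples - samples.mean()
    spectrum = np.abs(np.fft.rfft(detrended))
    freqs = np.fft.rfftfreq(len(detrended), d=1.0 / sample_rate_hz)
    peak = int(np.argmax(spectrum[1:])) + 1               # skip the DC bin
    magnitude = 2.0 * spectrum[peak] / len(detrended)     # approximate peak amplitude
    return float(freqs[peak]), float(magnitude)


# Example: a synthetic 9 Hz tremor of amplitude 0.05 g sampled at 100 Hz for 2 s.
t = np.arange(0, 2, 0.01)
signal = 0.05 * np.sin(2 * np.pi * 9 * t)
print(tremor_from_accelerometer(signal, 100.0))  # approximately (9.0, 0.05)
```

In a fuller treatment this estimate would run over a sliding window, but the sketch shows the kind of magnitude and frequency values that could be forwarded to the surgical hub alongside the other biomarker measurements.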
The surgical hub 20006 may use the surgeon biomarker measurement data associated with the HCP to adaptively control one or more surgical instruments 20031. For example, the surgical hub 20006 may send control programs to the surgical instrument 20031 to control its actuators to limit or compensate for fatigue and use of fine motor skills. The surgical hub 20006 may send control programs based on situational awareness and/or context regarding importance or criticality of the task. When control is needed, the control program may instruct the instrument to change operation to provide more control.
Fig. 3 shows an exemplary surgical system 20002 having a surgical hub 20006 paired with a wearable sensing system 20011, an environmental sensing system 20015, a human interface system 20012, a robotic system 20013, and a smart instrument 20014. The hub 20006 includes a display 20048, an imaging module 20049, a generator module 20050, a communication module 20056, a processor module 20057, a storage array 20058, and an operating room mapping module 20059. In certain aspects, as shown in fig. 3, the hub 20006 further includes a smoke evacuation module 20054 and/or a suction/irrigation module 20055.

During surgery, energy application to tissue for sealing and/or cutting is typically associated with smoke evacuation, aspiration of excess fluid, and/or irrigation of tissue. Fluid lines, power lines, and/or data lines from different sources are often entangled during surgery. Solving this problem during surgery can waste valuable time. Disconnecting the lines may require disconnecting them from their respective modules, which may require resetting the modules. The hub modular housing 20060 provides a unified environment for managing power, data, and fluid lines, which reduces the frequency of entanglement between such lines.

Aspects of the present disclosure provide a surgical hub 20006 for use in a surgical procedure involving the application of energy to tissue at a surgical site. The surgical hub 20006 includes a hub housing 20060 and a combined generator module slidably received in a docking cradle of the hub housing 20060. The docking cradle includes data and power contacts. The combined generator module includes two or more of an ultrasonic energy generator component, a bipolar RF energy generator component, and a monopolar RF energy generator component that are housed in a single unit. In one aspect, the combined generator module further comprises a smoke evacuation component, at least one energy delivery cable for connecting the combined generator module to a surgical instrument, at least one smoke evacuation component configured to evacuate smoke, fluid, and/or particulates generated by the application of therapeutic energy to tissue, and a fluid line extending from the remote surgical site to the smoke evacuation component. In one aspect, the fluid line may be a first fluid line, and a second fluid line may extend from the remote surgical site to an aspiration and irrigation module 20055 slidably housed in the hub housing 20060. In one aspect, the hub housing 20060 can include a fluid interface.

Certain surgical procedures may require more than one type of energy to be applied to tissue. One energy type may be more advantageous for cutting tissue, while a different energy type may be more advantageous for sealing tissue. For example, a bipolar generator may be used to seal tissue, while an ultrasonic generator may be used to cut the sealed tissue. Aspects of the present disclosure provide a solution in which the hub modular housing 20060 is configured to be able to house different generators and facilitate interactive communication therebetween. One of the advantages of the hub modular housing 20060 is that it enables quick removal and/or replacement of various modules. Aspects of the present disclosure provide a modular surgical housing for use in a surgical procedure involving the application of energy to tissue.
The modular surgical housing includes a first energy generator module configured to generate a first energy for application to tissue, and a first docking station comprising a first docking port that includes first data and power contacts, wherein the first energy generator module is slidably movable into electrical engagement with the first power and data contacts, and wherein the first energy generator module is slidably movable out of electrical engagement with the first power and data contacts. Further to the above, the modular surgical housing also comprises a second energy generator module configured to generate a second energy, different from the first energy, for application to tissue, and a second docking station comprising a second docking port that includes second data and power contacts, wherein the second energy generator module is slidably movable into electrical engagement with the second power and data contacts, and wherein the second energy generator module is slidably movable out of electrical engagement with the second power and data contacts. In addition, the modular surgical housing also includes a communication bus between the first docking port and the second docking port, configured to facilitate communication between the first energy generator module and the second energy generator module.

Referring to fig. 3, aspects of the present disclosure are presented as a hub modular housing 20060 that allows the modular integration of a generator module 20050, a smoke evacuation module 20054, and a suction/irrigation module 20055. The hub modular housing 20060 also facilitates interactive communication between the modules 20050, 20054, 20055. The generator module 20050 can have integrated monopolar, bipolar, and ultrasonic components supported in a single housing unit slidably insertable into the hub modular housing 20060. The generator module 20050 may be configured to be connectable to a monopolar device 20051, a bipolar device 20052, and an ultrasonic device 20053. Alternatively, the generator module 20050 can include a series of monopolar generator modules, bipolar generator modules, and/or ultrasonic generator modules that interact through the hub modular housing 20060. The hub modular housing 20060 can be configured to facilitate the insertion of multiple generators and the interactive communication between the generators docked into the hub modular housing 20060 such that the generators act as a single generator.
Fig. 4 illustrates a surgical data network having a set of communication hubs configured to connect to the cloud a set of sensing systems, environmental sensing systems, and a set of other modular devices located in one or more operating rooms of a medical facility, a patient recovery room, or a room in a medical facility specially equipped for surgical procedures, in accordance with at least one aspect of the present disclosure.
As shown in fig. 4, the surgical hub system 20060 may include a modular communication hub 20065 configured to enable connection of modular devices located in a medical facility to a cloud-based system (e.g., a cloud computing system 20064, which may include a remote server 20067 coupled to a remote storage device 20068). The modular communication hub 20065 and devices may be connected in a room in a medical facility specifically equipped for surgical procedures. In one aspect, the modular communication hub 20065 may include a network hub 20061 and/or a network switch 20062 in communication with a network router 20066. The modular communication hub 20065 may be coupled to a local computer system 20063 to provide local computer processing and data manipulation.
Computer system 20063 may include a processor and a network interface 20100. The processor may be coupled to a communication module, a storage device, a memory, a non-volatile memory, and an input/output (I/O) interface via a system bus. The system bus may be any of several types of bus structures, including a memory bus or memory controller, a peripheral bus or external bus, and/or a local bus using any of a variety of available bus architectures, including, but not limited to, a 9-bit bus, Industry Standard Architecture (ISA), Micro Channel Architecture (MCA), Extended ISA (EISA), Intelligent Drive Electronics (IDE), VESA Local Bus (VLB), Peripheral Component Interconnect (PCI), USB, Advanced Graphics Port (AGP), Personal Computer Memory Card International Association bus (PCMCIA), Small Computer System Interface (SCSI), or any other peripheral bus.
The processor may be any single-core or multi-core processor, such as those provided by Texas Instruments under the trade name ARM Cortex. In one aspect, the processor may be, for example, an LM4F230H5QR ARM Cortex-M4F processor core available from Texas Instruments, which includes on-chip memory of 256 KB of single-cycle flash memory or other non-volatile memory (up to 40 MHz), a prefetch buffer for improving performance above 40 MHz, 32 KB of single-cycle SRAM, an internal read-only memory (ROM) preloaded with software, 2 KB of electrically erasable programmable read-only memory (EEPROM), and/or one or more pulse width modulation (PWM) modules, one or more quadrature encoder input (QEI) analogs, and one or more 12-bit analog-to-digital converters (ADCs) with 12 analog input channels, details of which can be seen in the product data sheet.
In one example, the processor may include a safety controller comprising two controller-based families, such as TMS570 and RM4x, known as Hercules ARM Cortex-R safety microcontrollers manufactured by Texas Instruments. The safety controller may be configured specifically for IEC 61508 and ISO 26262 safety-critical applications, among others, to provide advanced integrated safety features while delivering scalable performance, connectivity, and memory options.
It is to be appreciated that computer system 20063 may include software that acts as an intermediary between users and the basic computer resources described in suitable operating environment. Such software may include an operating system. An operating system, which may be stored on disk storage, may be used to control and allocate resources of the computer system. System applications may utilize an operating system to manage resources through program modules and program data stored either in system memory or on disk storage. It is to be appreciated that the various components described herein can be implemented with various operating systems or combinations of operating systems.
A user may enter commands or information into the computer system 20063 through input devices coupled to the I/O interface. Input devices may include, but are not limited to, a pointing device such as a mouse, trackball, stylus, or touch pad, a keyboard, a microphone, a joystick, a game pad, a satellite dish, a scanner, a television tuner card, a digital camera, a digital video camera, a web camera, and the like. These and other input devices are connected to the processor 20102 through the system bus via interface ports. Interface ports include, for example, a serial port, a parallel port, a game port, and a USB port. Output devices use some of the same types of ports as input devices. Thus, for example, a USB port may be used to provide input to the computer system 20063 and to output information from the computer system 20063 to an output device. Output adapters are provided to illustrate that there are some output devices, such as monitors, displays, speakers, and printers, among other output devices, that require special adapters. The output adapters may include, by way of illustration and not limitation, video and sound cards that provide a means of connection between the output device and the system bus. It should be noted that other devices or systems of devices, such as remote computers, may provide both input and output capabilities.
The computer system 20063 may operate in a networked environment using logical connections to one or more remote computers, such as a cloud computer, or local computers. The remote cloud computer may be a personal computer, a server, a router, a network PC, a workstation, a microprocessor-based appliance, a peer device, or another common network node, and typically includes many or all of the elements described relative to the computer system. For simplicity, only a memory storage device is shown with the remote computer. The remote computer may be logically connected to the computer system through a network interface and then physically connected via a communication connection. The network interface may encompass communication networks such as local area networks (LANs) and wide area networks (WANs). LAN technologies may include Fiber Distributed Data Interface (FDDI), Copper Distributed Data Interface (CDDI), Ethernet/IEEE 802.3, Token Ring/IEEE 802.5, and the like. WAN technologies may include, but are not limited to, point-to-point links, circuit-switched networks such as Integrated Services Digital Networks (ISDN) and variants thereof, packet-switched networks, and Digital Subscriber Lines (DSL).
In various examples, computer system 20063 may include an image processor, an image processing engine, a media processor, or any special purpose Digital Signal Processor (DSP) for processing digital images. The image processor may employ parallel computation with single instruction, multiple data (SIMD) or multiple instruction, multiple data (MIMD) techniques to increase speed and efficiency. The digital image processing engine may perform a series of tasks. The image processor may be a system on a chip having a multi-core processor architecture.
A communication connection may refer to the hardware/software employed to connect the network interface to the bus. Although the communication connection is shown for illustrative clarity inside the computer system 20063, it can also be external to the computer system 20063. The hardware/software necessary for connection to the network interface may include, for exemplary purposes only, internal and external technologies such as modems, including regular telephone-grade modems, cable modems, fiber optic modems, and DSL modems, ISDN adapters, and Ethernet cards. In some examples, the network interface may also be provided using an RF interface.
The surgical data network associated with the surgical hub system 20060 can be configured to be passive, intelligent, or switched. The passive surgical data network acts as a conduit for data, enabling it to be transferred from one device (or segment) to another device (or segment) as well as cloud computing resources. The intelligent surgical data network includes additional features to enable monitoring of traffic through the surgical data network and configuring each port in the hub 20061 or the network switch 20062. The intelligent surgical data network may be referred to as a manageable hub or switch. The switching hub reads the destination address of each packet and then forwards the packet to the correct port.
The modular devices 1a-1n located in the operating room may be coupled to a modular communication hub 20065. The network hub 20061 and/or the network switch 20062 may be coupled to a network router 20066 to connect the devices 1a-1n to the cloud computing system 20064 or the local computer system 20063. The data associated with the devices 1a-1n may be transmitted via routers to cloud-based computers for remote data processing and manipulation. The data associated with the devices 1a-1n may also be transferred to the local computer system 20063 for local data processing and manipulation. Modular devices 2a-2m located in the same operating room may also be coupled to network switch 20062. The network switch 20062 may be coupled to a network hub 20061 and/or a network router 20066 to connect the devices 2a-2m to the cloud 20064. Data associated with the devices 2a-2m may be transmitted to the cloud computing system 20064 via the network router 20066 for data processing and manipulation. The data associated with the devices 2a-2m may also be transferred to the local computer system 20063 for local data processing and manipulation.
The wearable sensing system 20011 can include one or more sensing systems 20069. The sensing system 20069 may include a HCP sensing system and/or a patient sensing system. The one or more sensing systems 20069 can communicate with the computer system 20063 or cloud server 20067 of the surgical hub system 20060 directly via one of the network routers 20066 or via a network hub 20061 or network switch 20062 in communication with the network router 20066.
The sensing system 20069 may be coupled to the network router 20066 to connect the sensing system 20069 to the local computer system 20063 and/or the cloud computing system 20064. Data associated with the sensing system 20069 may be transmitted to the cloud computing system 20064 via the network router 20066 for data processing and manipulation. Data associated with the sensing system 20069 may also be transmitted to the local computer system 20063 for local data processing and manipulation.
As shown in fig. 4, the surgical hub system 20060 may be expanded by interconnecting a plurality of network hubs 20061 and/or a plurality of network switches 20062 with a plurality of network routers 20066. The modular communication hub 20065 may be included in a modular control tower configured to be capable of housing a plurality of devices 1a-1n/2a-2m. Local computer system 20063 may also be contained in a modular control tower. The modular communication hub 20065 may be connected to the display 20068 to display images obtained by some of the devices 1a-1n/2a-2m, for example, during a surgical procedure. In various aspects, the devices 1a-1n/2a-2m may include, for example, various modules such as non-contact sensor modules in an imaging module coupled to an endoscope, a generator module coupled to an energy-based surgical device, a smoke evacuation module, an aspiration/irrigation module, a communication module, a processor module, a memory array, a surgical device connected to a display, and/or other modular devices of the modular communication hub 20065 connectable to a surgical data network.
In one aspect, the surgical hub system 20060 shown in FIG. 4 may include a combination of network hubs, network switches, and network routers that connect the devices 1a-1n/2a-2m or the sensing systems 20069 to the cloud-based system 20064. One or more of the devices 1a-1n/2a-2m or sensing systems 20069 coupled to the network hub 20061 or the network switch 20062 may collect data in real time and transmit the data to the cloud computers for data processing and manipulation. It should be appreciated that cloud computing relies on shared computing resources rather than using local servers or personal devices to process software applications. The term "cloud" may be used as a metaphor for "the Internet," although the term is not so limited. Thus, the term "cloud computing" may be used herein to refer to a type of Internet-based computing in which different services, such as servers, storage devices, and applications, are delivered over the Internet to the modular communication hub 20065 and/or the computer system 20063 located in an operating room (e.g., a stationary, mobile, temporary, or field operating room or space) and to devices connected to the modular communication hub 20065 and/or the computer system 20063. The cloud infrastructure may be maintained by a cloud service provider. In this context, the cloud service provider may be an entity that coordinates the usage and control of the devices 1a-1n/2a-2m located in one or more operating rooms. The cloud computing services can perform a large number of calculations based on the data gathered by smart surgical instruments, robots, sensing systems, and other computerized devices located in the operating room. The hub hardware enables multiple devices, sensing systems, and/or connections to be connected to a computer that communicates with the cloud computing resources and storage devices.
Applying cloud computer data processing techniques to the data collected by the devices 1a-1n/2a-2m, the surgical data network can provide improved surgical outcomes, reduced costs, and improved patient satisfaction. At least some of the devices 1a-1n/2a-2m may be employed to observe tissue conditions to assess leakage or perfusion of sealed tissue after a tissue sealing and cutting procedure. At least some of the devices 1a-1n/2a-2m may be employed to identify pathologies, such as the effects of disease, and data including images of samples of body tissue may be examined using cloud-based computing for diagnostic purposes. This may include localization and margin confirmation of tissue and phenotypes. At least some of the devices 1a-1n/2a-2m may be employed to identify anatomical structures of the body using various sensors integrated with imaging devices and techniques, such as overlaying images captured by multiple imaging devices. The data (including image data) collected by the devices 1a-1n/2a-2m may be transmitted to the cloud computing system 20064 or the local computer system 20063, or both, for data processing and manipulation, including image processing and manipulation. Such data analysis may further employ outcome analytics processing and, using standardized approaches, may provide beneficial feedback to either confirm surgical treatments and surgeon behavior or suggest modifications thereto.
Applying cloud computer data processing techniques to the measurement data collected by sensing system 20069, the surgical data network may provide improved surgical results, improved recovery results, reduced costs, and improved patient satisfaction. At least some of the sensing systems 20069 may be used to assess the physiological condition of a surgeon operating on a patient or a patient being prepared for surgery or a patient recovered after surgery. The cloud-based computing system 20064 may be used to monitor biomarkers associated with a surgeon or patient in real-time and may be used to generate a surgical plan based at least on measurement data collected prior to a surgical procedure, provide control signals to surgical instruments during the surgical procedure, and notify the patient of complications during the post-surgical procedure.
The operating room devices 1a-1n may be connected to the modular communication hub 20065 via a wired channel or a wireless channel, depending on the configuration of the devices 1a-1n to the network hub 20061. In one aspect, the network hub 20061 may be implemented as a local network broadcaster operating on the physical layer of the Open Systems Interconnection (OSI) model. The network hub may provide connectivity to the devices 1a-1n located in the same operating room network. The network hub 20061 may collect data in the form of packets and send it to the router in half-duplex mode. The network hub 20061 may not store any media access control/Internet Protocol (MAC/IP) addresses for transferring the device data. Only one of the devices 1a-1n can send data at a time through the network hub 20061. The network hub 20061 may have no routing tables or intelligence regarding where to send information, and may broadcast all network data across each connection as well as to the remote server 20067 of the cloud computing system 20064. The network hub 20061 can detect basic network errors, such as collisions, but broadcasting all information to multiple ports can pose a security risk and cause bottlenecks.
The operating room devices 2a-2m may be connected to the network switch 20062 via a wired channel or a wireless channel. The network switch 20062 operates in the data link layer of the OSI model. The network switch 20062 may be a multicast device for connecting devices 2a-2m located in the same operating room to a network. The network switch 20062 may send data in frames to the network router 20066 and may operate in full duplex mode. Multiple devices 2a-2m may transmit data simultaneously through network switch 20062. The network switch 20062 stores and uses the MAC addresses of the devices 2a-2m to transfer data.
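As a purely illustrative sketch of the store-and-forward behavior attributed to the network switch above (learning source MAC addresses and forwarding frames to the known port, or flooding when the destination is unknown), the following Python example may be helpful. The class name, port identifiers, and frame fields are assumptions made for the example and do not come from this disclosure.

```python
# Minimal sketch (illustrative only) of a learning switch: record the source
# MAC address of each frame, then forward to the known port or flood when the
# destination is unknown. Names and port numbers are assumptions.
class LearningSwitch:
    def __init__(self, ports):
        self.ports = set(ports)
        self.mac_table = {}          # MAC address -> port

    def receive(self, in_port, src_mac, dst_mac, payload):
        self.mac_table[src_mac] = in_port             # learn the sender's port
        if dst_mac in self.mac_table:                  # known destination: one port
            return [(self.mac_table[dst_mac], payload)]
        # unknown destination: flood every port except the one it arrived on
        return [(p, payload) for p in self.ports if p != in_port]

if __name__ == "__main__":
    sw = LearningSwitch(ports=[1, 2, 3])
    print(sw.receive(1, "dev-2a", "dev-2b", b"frame"))   # flooded to ports 2 and 3
    print(sw.receive(2, "dev-2b", "dev-2a", b"frame"))   # forwarded to port 1 only
```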
The network hub 20061 and/or network switch 20062 may be coupled to a network router 20066 to connect to the cloud computing system 20064. The network router 20066 operates in the network layer of the OSI model. The network router 20066 generates routes for transmitting data packets received from the network hub 20061 and/or network switch 20062 to cloud-based computer resources to further process and manipulate data collected by any or all of the devices 1a-1n/2a-2m and the wearable sensing system 20011. Network router 20066 may be employed to connect two or more different networks located at different locations, such as, for example, different operating rooms at the same medical facility or different networks located at different operating rooms at different medical facilities. The network router 20066 may send data in packets to the cloud computing system 20064 and operate in full duplex mode. Multiple devices may transmit data simultaneously. Network router 20066 may use the IP address to transmit data.
In one example, hub 20061 may be implemented as a USB hub that allows multiple USB devices to connect to a host. USB hubs can extend a single USB port to multiple tiers so that more ports are available to connect devices to a host system computer. Hub 20061 may include wired or wireless capabilities for receiving information over wired or wireless channels. In one aspect, a wireless USB short-range, high-bandwidth wireless radio communication protocol may be used for communication between devices 1a-1n and devices 2a-2m located in an operating room.
In an example, the operating room devices 1a-1n/2a-2m and/or the sensing systems 20069 may communicate with the modular communication hub 20065 via the Bluetooth wireless technology standard for exchanging data between fixed and mobile devices and building personal area networks (PANs) over short distances (using short-wavelength UHF radio waves in the ISM band from 2.4 GHz to 2.485 GHz). The operating room devices 1a-1n/2a-2m and/or the sensing systems 20069 may communicate with the modular communication hub 20065 via a variety of wireless or wired communication standards or protocols, including, but not limited to, Bluetooth, Bluetooth Low-Energy (BLE), Near Field Communication (NFC), Wi-Fi (IEEE 802.11 family), WiMAX (IEEE 802.16 family), IEEE 802.20, New Radio (NR), Long-Term Evolution (LTE) and Ev-DO, HSPA+, HSDPA+, HSUPA+, EDGE, GSM, GPRS, CDMA, TDMA, DECT, and Ethernet derivatives thereof, as well as any other wireless and wired protocols designated as 3G, 4G, 5G, and beyond. The computing module may include a plurality of communication modules. For example, a first communication module may be dedicated to shorter-range wireless communications, such as Wi-Fi, Bluetooth Low-Energy (BLE), and Bluetooth Smart, while a second communication module may be dedicated to longer-range wireless communications, such as GPS, EDGE, GPRS, CDMA, WiMAX, LTE, Ev-DO, HSPA+, HSDPA+, HSUPA+, GSM, TDMA, and so on.
The modular communication hub 20065 may serve as a central connection for one or more of the operating room devices 1a-1n/2a-2m and/or the sensing system 20069 and may process a type of data known as a frame. The frames may carry data generated by the devices 1a-1n/2a-2m and/or the sensing system 20069. When a frame is received by modular communication hub 20065, the frame may be amplified and/or sent to network router 20066, which may transmit data to cloud computing system 20064 or local computer system 20063 using a plurality of wireless or wired communication standards or protocols, as described herein.
The modular communication hub 20065 may be used as a stand-alone device or connected to a compatible network hub 20061 and network switch 20062 to form a larger network. The modular communication hub 20065 may generally be easy to install, configure, and maintain, making it a good option to network the operating room devices 1a-1n/2a-2m.
Fig. 5 shows a computer-implemented interactive surgical system 20070, which may be part of a surgical system 20002. The computer-implemented interactive surgical system 20070 is similar in many respects to the surgical system 20002. For example, the computer-implemented interactive surgical system 20070 can include one or more surgical subsystems 20072, which are similar in many respects to the surgical system 20002. Each surgical subsystem 20072 may include at least one surgical hub 20076 in communication with a cloud computing system 20064, which may include a remote server 20077 and a remote storage device 20078. In one aspect, the computer-implemented interactive surgical system 20070 can include a modular control 20085 connected to multiple operating room devices, such as sensing systems 20001, intelligent surgical instruments, robots, and other computerized devices located in the operating room.
As shown in the example of fig. 5, the modular control 20085 can be coupled to an imaging module 20088 (which can be coupled to an endoscope 20087), a generator module 20090 that can be coupled to an energy device 20089, a smoke evacuator module 20091, a suction/irrigation module 20092, a communication module 20097, a processor module 20093, a storage array 20094, a smart device/instrument 20095, and a non-contact sensor module 20096, which may optionally be coupled to a display 20086 and a display 20084, respectively. The non-contact sensor module 20096 may measure the dimensions of the operating room and generate a map of the operating room using ultrasonic, laser-type, and/or similar non-contact measurement devices. Other distance sensors may be employed to determine the boundaries of the operating room. An ultrasound-based non-contact sensor module may scan the operating room by transmitting a burst of ultrasound and receiving the echoes as they bounce off the perimeter walls of the operating room, as described under the heading "Surgical Hub Spatial Awareness Within an Operating Room" in U.S. provisional patent application Serial No. 62/611,341, filed December 28, 2017, which provisional patent application is incorporated herein by reference in its entirety. The sensor module may be configured to be able to determine the size of the operating room and adjust the Bluetooth pairing distance limit. A laser-based non-contact sensor module may scan the operating room by transmitting laser pulses, receiving laser pulses that bounce off the perimeter walls of the operating room, and comparing the phase of the transmitted pulses with that of the received pulses to determine the size of the operating room and adjust the Bluetooth pairing distance limit.
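The following Python sketch illustrates, under stated assumptions, how an ultrasonic time-of-flight echo could be converted into a wall distance and how a Bluetooth pairing distance limit could be derived from the measured room dimensions. The speed-of-sound constant, the margin, and the echo times are illustrative values, not values taken from this disclosure.

```python
# Minimal sketch (illustrative assumptions throughout) of turning ultrasonic
# echo round-trip times into room dimensions and a pairing distance limit.
SPEED_OF_SOUND_M_PER_S = 343.0   # dry air at roughly 20 degrees C

def distance_from_echo(round_trip_time_s):
    """One-way distance to the wall that returned the echo."""
    return SPEED_OF_SOUND_M_PER_S * round_trip_time_s / 2.0

def pairing_distance_limit(wall_distances_m, margin=1.1):
    """Limit pairing to devices inside the measured room, plus a small margin."""
    return max(wall_distances_m) * margin

if __name__ == "__main__":
    echoes_s = [0.023, 0.018, 0.029, 0.021]          # hypothetical round trips
    walls = [distance_from_echo(t) for t in echoes_s]
    print([round(w, 2) for w in walls])               # distances to each wall (m)
    print(round(pairing_distance_limit(walls), 2), "m")
```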
The modular control 20085 can also be in communication with one or more sensing systems 20069 and environmental sensing systems 20015. The sensing system 20069 can be connected to the modular control 20085 directly via a router or via a communication module 20097. The operating room device may be coupled to the cloud computing resources and the data storage device via modular controls 20085. Robotic surgical hub 20082 can also be connected to modular control 20085 and cloud computing resources. The devices/instruments 20095 or 20084, the human-machine interface system 20080, etc. can be coupled to the modular control 20085 via a wired or wireless communication standard or protocol, as described herein. The human interface system 20080 can include a display subsystem and a notification subsystem. Modular controls 20085 can be coupled to a hub display 20081 (e.g., monitor, screen) to display and overlay images received from imaging modules 20088, device/instrument displays 20086, and/or other human-machine interface systems 20080. The hub display 20081 can also display data received from devices connected to the modular control 20085 in conjunction with the image and the overlay image.
Fig. 6 illustrates a logic diagram of a control system 20220 of a surgical instrument or tool, in accordance with one or more aspects of the present disclosure. The surgical instrument or tool may be configurable. The surgical instrument may include surgical devices specific to the procedure at hand, such as imaging devices, surgical staplers, energy devices, endocutter devices, and the like. For example, the surgical instrument may include any of a powered stapler, a powered stapler generator, an energy device, an advanced energy jaw device, an endocutter clamp, an energy device generator, an operating room imaging system, a smoke evacuator, a suction-irrigation device, an insufflation system, and the like. The system 20220 may include control circuitry. The control circuitry may include a microcontroller 20221 comprising a processor 20222 and a memory 20223. For example, one or more of the sensors 20225, 20226, 20227 provide real-time feedback to the processor 20222. A motor 20230, driven by a motor driver 20229, is operably coupled to a longitudinally movable displacement member to drive the I-beam knife element. A tracking system 20228 may be configured to determine the position of the longitudinally movable displacement member. The position information may be provided to the processor 20222, which may be programmed or configured to determine the position of the longitudinally movable drive member as well as the positions of the firing member, firing bar, and I-beam knife element. Additional motors may be provided at the tool driver interface to control I-beam firing, closure tube travel, shaft rotation, and articulation. The display 20224 may display various operating conditions of the instrument and may include touch screen functionality for data entry. The information displayed on the display 20224 may be overlaid with images acquired via the endoscopic imaging module.
The microcontroller 20221 may be any single-core or multi-core processor, such as those provided by Texas Instruments under the trade name ARM Cortex. In one aspect, the main microcontroller 20221 may be, for example, an LM4F230H5QR ARM Cortex-M4F processor core available from Texas Instruments, which includes on-chip memory of 256 KB of single-cycle flash memory or other non-volatile memory (up to 40 MHz), a prefetch buffer for improving performance above 40 MHz, 32 KB of single-cycle SRAM, an internal ROM preloaded with software, 2 KB of EEPROM, one or more PWM modules, one or more QEI analogs, and/or one or more 12-bit ADCs with 12 analog input channels, details of which can be seen in the product data sheet.
The microcontroller 20221 may include a safety controller comprising two controller-based families, such as TMS570 and RM4x, known as Hercules ARM Cortex-R safety microcontrollers manufactured by Texas Instruments. The safety controller may be configured specifically for IEC 61508 and ISO 26262 safety-critical applications, among others, to provide advanced integrated safety features while delivering scalable performance, connectivity, and memory options.
The microcontroller 20221 can be programmed to perform various functions, such as precise control over the speed and position of the knife and articulation systems. In one aspect, the microcontroller 20221 may include a processor 20222 and a memory 20223. The electric motor 20230 may be a brushed direct current (DC) motor with a gearbox and mechanical links to an articulation or knife system. In one aspect, the motor driver 20229 may be an A3941 available from Allegro Microsystems, Inc. Other motor drivers may be readily substituted for use in the tracking system 20228, which comprises an absolute positioning system. A detailed description of an absolute positioning system is provided in U.S. patent application publication No. 2017/0296213, entitled "SYSTEMS AND METHODS FOR CONTROLLING A SURGICAL STAPLING AND CUTTING INSTRUMENT," published October 19, 2017, which is incorporated herein by reference in its entirety.
The microcontroller 20221 can be programmed to provide precise control over the speed and position of the displacement member and the articulation system. The microcontroller 20221 may be configured to compute a response in the software of the microcontroller 20221. The computed response may be compared with the measured response of the actual system to obtain an "observed" response, which is used in the actual feedback decision. The observed response may be an advantageous, tuned value that balances the smooth, continuous nature of the simulated response with the measured response, which can detect external influences on the system.
The motor 20230 may be controlled by a motor driver 20229 and may be employed by a firing system of the surgical instrument or tool. In various forms, the motor 20230 may be a brushed DC drive motor having a maximum rotational speed of about 25,000 rpm. In some examples, the motor 20230 may include a brushless motor, a cordless motor, a synchronous motor, a stepper motor, or any other suitable electric motor. The motor driver 20229 may include, for example, an H-bridge driver including Field Effect Transistors (FETs). The motor 20230 may be powered by a power assembly releasably mounted to the handle assembly or tool housing for supplying control power to the surgical instrument or tool. The power assembly may include a battery that may include a plurality of battery cells connected in series that may be used as a power source to provide power to a surgical instrument or tool. In some cases, the battery cells of the power assembly may be replaceable and/or rechargeable. In at least one example, the battery cell may be a lithium ion battery, which may be coupled to and separable from the power component.
The motor driver 20229 may be an A3941 available from Allegro Microsystems, Inc. The A3941 may be a full-bridge controller for use with external N-channel power metal-oxide-semiconductor field-effect transistors (MOSFETs) specifically designed for inductive loads, such as brushed DC motors. The driver 20229 may include a unique charge pump regulator that can provide full (>10 V) gate drive for battery voltages as low as 7 V and allow the A3941 to operate with a reduced gate drive as low as 5.5 V. A bootstrap capacitor may be employed to provide the voltage above the battery supply that is required by the N-channel MOSFETs. An internal charge pump for the high-side drive may allow direct current (100% duty cycle) operation. The full bridge may be driven in either a fast decay mode or a slow decay mode using diode or synchronous rectification. In slow decay mode, current recirculation may pass through either the high-side or the low-side FETs. Resistor-adjustable dead time protects against shoot-through of the power FETs. Integrated diagnostics provide indications of brown-out, over-temperature, and power bridge faults, and may be configured to protect the power MOSFETs under most short-circuit conditions. Other motor drivers may be readily substituted for use in the tracking system 20228, which comprises an absolute positioning system.
The tracking system 20228 may include a controlled motor drive circuit arrangement including a position sensor 20225 in accordance with an aspect of the present disclosure. The position sensor 20225 for the absolute positioning system may provide a unique position signal corresponding to the position of the displacement member. In some examples, the displacement member may represent a longitudinally movable drive member comprising a rack of drive teeth for meshing engagement with a corresponding drive gear of the gear reducer assembly. In some examples, the displacement member may represent a firing member that may be adapted and configured as a rack that may include drive teeth. In some examples, the displacement member may represent a firing bar or an I-beam, each of which may be adapted and configured as a rack that can include drive teeth. Thus, as used herein, the term displacement member may be used generally to refer to any movable member of a surgical instrument or tool, such as a drive member, firing bar, I-beam, or any element that may be displaced. In one aspect, a longitudinally movable drive member may be coupled to the firing member, the firing bar, and the I-beam. Thus, the absolute positioning system may actually track the linear displacement of the I-beam by tracking the linear displacement of the longitudinally movable drive member. In various aspects, the displacement member may be coupled to any position sensor 20225 adapted to measure linear displacement. Thus, a longitudinally movable drive member, firing bar, or I-beam, or combination thereof, may be coupled to any suitable linear displacement sensor. The linear displacement sensor may comprise a contact type displacement sensor or a non-contact type displacement sensor. The linear displacement sensor may comprise a Linear Variable Differential Transformer (LVDT), a Differential Variable Reluctance Transducer (DVRT), a sliding potentiometer, a magnetic sensing system comprising a movable magnet and a series of linearly arranged hall effect sensors, a magnetic sensing system comprising a fixed magnet and a series of movable linearly arranged hall effect sensors, an optical sensing system comprising a movable light source and a series of linearly arranged photodiodes or photodetectors, an optical sensing system comprising a fixed light source and a series of movable linearly arranged photodiodes or photodetectors, or any combination thereof.
The electric motor 20230 may include a rotatable shaft operably interfacing with a gear assembly mounted to the displacement member in meshing engagement with a set of drive teeth or racks of drive teeth. The sensor element may be operably coupled to the gear assembly such that a single rotation of the position sensor 20225 element corresponds to certain linear longitudinal translations of the displacement member. The gearing and sensor arrangement may be connected to the linear actuator via a rack and pinion arrangement, or to the rotary actuator via a spur gear or other connection. The power source may supply power to the absolute positioning system and the output indicator may display an output of the absolute positioning system. The displacement member may represent a longitudinally movable drive member including racks of drive teeth formed thereon for meshing engagement with corresponding drive gears of the gear reducer assembly. The displacement member may represent a longitudinally movable firing member, a firing bar, an I-beam, or a combination thereof.
A single rotation of the sensor element associated with the position sensor 20225 may be equivalent to a longitudinal linear displacement d1 of the displacement member, where d1 is the longitudinal linear distance that the displacement member moves from point "a" to point "b" after a single rotation of the sensor element coupled to the displacement member. The sensor arrangement may be connected via a gear reduction that allows the position sensor 20225 to complete only one rotation for the full stroke of the displacement member. Alternatively, the position sensor 20225 may complete multiple rotations for the full stroke of the displacement member.
A series of n switches, where n is an integer greater than one, may be employed alone or in combination with the gear reduction to provide a unique position signal for more than one revolution of the position sensor 20225. The state of the switches may be fed back to the microcontroller 20221, which applies logic to determine a unique position signal corresponding to the longitudinal linear displacement d1 + d2 + … + dn of the displacement member. The output of the position sensor 20225 is provided to the microcontroller 20221. The position sensor 20225 of the sensor arrangement may comprise a magnetic sensor, an analog rotary sensor (e.g., a potentiometer), or an array of analog Hall-effect elements that output a unique combination of position signals or values.
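By way of a hedged illustration of the relationship described above between sensor rotations and linear displacement, the following Python sketch reconstructs an absolute linear position from a revolution count (the role played by the switches and/or gear reduction) and a fractional rotation reported by the rotary sensor. The counts-per-revolution and travel-per-revolution values are assumptions for the example.

```python
# Minimal sketch, with assumed parameters, of reconstructing an absolute linear
# position from a rotary sensor that wraps every revolution plus a revolution
# counter. Both constants below are invented for the example.
COUNTS_PER_REV = 4096            # e.g. a 12-bit rotary sensor (assumption)
MM_PER_REV = 3.0                 # linear travel per sensor revolution (assumption)

def linear_displacement_mm(revolution_count, angle_counts):
    """Absolute displacement = whole revolutions + fractional revolution."""
    fraction = angle_counts / COUNTS_PER_REV
    return (revolution_count + fraction) * MM_PER_REV

if __name__ == "__main__":
    # Two full revolutions plus a quarter turn of the sensor element.
    print(linear_displacement_mm(2, 1024))   # -> 6.75 mm
```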
The position sensor 20225 may include any number of magnetic sensing elements, such as magnetic sensors classified according to whether they measure the total magnetic field or the vector components of the magnetic field. The techniques used to produce both types of magnetic sensors encompass many aspects of physics and electronics. The technologies used for magnetic field sensing may include search coils, fluxgates, optically pumped sensors, nuclear precession sensors, SQUIDs, Hall-effect sensors, anisotropic magnetoresistance, giant magnetoresistance, magnetic tunnel junctions, giant magnetoimpedance, magnetostrictive/piezoelectric composites, magnetodiodes, magnetotransistors, fiber-optic sensors, magneto-optical sensors, microelectromechanical systems (MEMS)-based magnetic sensors, and the like.
The position sensor 20225 for the tracking system 20228, which comprises an absolute positioning system, may comprise a magnetic rotary absolute positioning system. The position sensor 20225 may be implemented as an AS5055EQFT single-chip magnetic rotary position sensor, commercially available from Austria Microsystems AG. The position sensor 20225 is interfaced with the microcontroller 20221 to provide an absolute positioning system. The position sensor 20225 may be a low-voltage and low-power component and may include four Hall-effect elements located in an area of the position sensor 20225 above a magnet. A high-resolution ADC and a smart power management controller may also be provided on the chip. A coordinate rotation digital computer (CORDIC) processor, also known as the digit-by-digit method or Volder's algorithm, may be provided to implement a simple and efficient algorithm for calculating hyperbolic and trigonometric functions that requires only addition, subtraction, bit-shift, and table lookup operations. The angle position, alarm bits, and magnetic field information may be transmitted to the microcontroller 20221 over a standard serial communication interface, such as a Serial Peripheral Interface (SPI) interface. The position sensor 20225 may provide 12 or 14 bits of resolution. The position sensor 20225 may be an AS5055 chip provided in a small QFN 16-pin, 4×4×0.85 mm package.
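For illustration of the CORDIC technique mentioned above, the following Python sketch computes sine and cosine using only additions, subtractions, bit shifts, and a small arctangent lookup table. It is a didactic floating-point sketch, not the on-chip implementation; a hardware CORDIC would typically use fixed-point arithmetic.

```python
# Illustrative CORDIC sketch: sine and cosine from shift-add iterations and a
# small arctangent lookup table, as the text describes for the CORDIC processor.
import math

ITERATIONS = 16
ANGLES = [math.atan(2.0 ** -i) for i in range(ITERATIONS)]   # lookup table
GAIN = 1.0
for i in range(ITERATIONS):
    GAIN *= math.cos(ANGLES[i])                               # CORDIC gain factor

def cordic_sin_cos(theta):
    """Return (sin(theta), cos(theta)) for theta roughly in [-pi/2, pi/2]."""
    x, y, z = 1.0, 0.0, theta
    for i in range(ITERATIONS):
        d = 1.0 if z >= 0 else -1.0                           # rotation direction
        x, y = x - d * y * (2.0 ** -i), y + d * x * (2.0 ** -i)  # shift-and-add step
        z -= d * ANGLES[i]
    return y * GAIN, x * GAIN

if __name__ == "__main__":
    s, c = cordic_sin_cos(math.pi / 6)
    print(round(s, 4), round(c, 4))    # approximately 0.5 and 0.8660
```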
The tracking system 20228, comprising an absolute positioning system, may comprise and/or be programmed to implement a feedback controller, such as a PID, state feedback, or adaptive controller. A power source converts the signal from the feedback controller into a physical input to the system: in this case, a voltage. Other examples include pulse-width modulation (PWM) of the voltage, current, and force. In addition to the position measured by the position sensor 20225, other sensors may be provided to measure physical parameters of the physical system. In some aspects, the one or more other sensors may include sensor arrangements such as those described in U.S. patent 9,345,481, entitled "STAPLE CARTRIDGE TISSUE THICKNESS SENSOR SYSTEM," issued May 24, 2016, the entire disclosure of which is incorporated herein by reference; U.S. patent application publication No. 2014/0263552, entitled "STAPLE CARTRIDGE TISSUE THICKNESS SENSOR SYSTEM," published September 18, 2014, which is incorporated herein by reference in its entirety; and U.S. patent application Ser. No. 15/628,175, entitled "TECHNIQUES FOR ADAPTIVE CONTROL OF MOTOR VELOCITY OF A SURGICAL STAPLING AND CUTTING INSTRUMENT," filed June 20, 2017, which is incorporated herein by reference in its entirety. In a digital signal processing system, the absolute positioning system is coupled to a digital data acquisition system, where the output of the absolute positioning system will have a finite resolution and sampling frequency. The absolute positioning system may comprise a compare-and-combine circuit to combine the computed response with the measured response using algorithms, such as a weighted average and a theoretical control loop, that drive the computed response toward the measured response. The computed response of the physical system may take into account properties such as mass, inertia, viscous friction, inductance, and resistance to predict the states and outputs of the physical system by knowing the inputs.
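As a minimal sketch of the weighted-average style of combining a computed response with a measured response described above, the following Python example blends a simple model prediction with each new measurement so that the estimate is driven toward the measured response. The weight, the model, and the measurement values are illustrative assumptions, not the disclosed algorithm.

```python
# Illustrative sketch of driving a modelled (computed) response toward a
# measured response with a weighted average, keeping the smoothness of the model.
def observed_response(calculated, measured, weight=0.2):
    """weight = 0 trusts the model only; weight = 1 trusts the sensor only."""
    return (1.0 - weight) * calculated + weight * measured

def track(model_step, measurements, initial=0.0, weight=0.2):
    """Blend a simple model prediction with each new measurement."""
    estimate = initial
    estimates = []
    for z in measurements:
        predicted = model_step(estimate)     # model-based prediction
        estimate = observed_response(predicted, z, weight)
        estimates.append(estimate)
    return estimates

if __name__ == "__main__":
    # Hypothetical displacement-member model: advances 1.0 unit per cycle.
    noisy = [1.2, 1.9, 3.1, 3.8, 5.2]
    print(track(lambda x: x + 1.0, noisy))
```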
Thus, the absolute positioning system can provide an absolute position of the displacement member upon power-up of the instrument, and does not retract or advance the displacement member to a reset (clear or home) position as may be required by conventional rotary encoders that merely count the number of forward or backward steps taken by the motor 20230 to infer the position of the device actuator, drive rod, knife, and the like.
The sensor 20226 (such as, for example, a strain gauge or a micro-strain gauge) may be configured to measure one or more parameters of the end effector, such as, for example, the magnitude of the strain exerted on the anvil during a clamping operation, which may be indicative of the closing force applied to the anvil. The measured strain may be converted to a digital signal and provided to the processor 20222. Alternatively, or in addition to the sensor 20226, a sensor 20227 (such as a load sensor) may measure the closing force applied to the anvil by the closure drive system. The sensor 20227, such as a load sensor, may measure the firing force applied to the I-beam during the firing stroke of the surgical instrument or tool. The I-beam is configured to engage a wedge sled, which is configured to cam the staple drivers upward to push staples out into deforming contact with the anvil. The I-beam may also include a sharpened cutting edge that may be used to sever tissue as the I-beam is advanced distally by the firing bar. Alternatively, a current sensor 20231 may be employed to measure the current drawn by the motor 20230. For example, the force required to advance the firing member may correspond to the current drawn by the motor 20230. The measured force may be converted to a digital signal and provided to the processor 20222.
For example, the strain gauge sensor 20226 may be used to measure the force applied to tissue by the end effector. A strain gauge may be coupled to the end effector to measure the force on the tissue being treated by the end effector. A system for measuring the force applied to tissue grasped by the end effector may include a strain gauge sensor 20226, such as a micro-strain gauge, which may be configured to measure one or more parameters of the end effector, for example. In one aspect, the strain gauge sensor 20226 can measure the amplitude or magnitude of the strain applied to the jaw members of the end effector during a clamping operation, which can be indicative of tissue compression. The measured strain may be converted to a digital signal and provided to the processor 20222 of the microcontroller 20221. The load sensor 20227 may measure the force used to operate the knife element, for example, to cut tissue captured between the anvil and the staple cartridge. A magnetic field sensor may be employed to measure the thickness of the captured tissue. The measurements of the magnetic field sensor may also be converted into digital signals and provided to the processor 20222.
The microcontroller 20221 can use measurements of tissue compression, tissue thickness, and/or force required to close the end effector on tissue measured by the sensors 20226, 20227, respectively, to characterize corresponding values of the selected position of the firing member and/or the speed of the firing member. In one case, the memory 20223 may store techniques, formulas, and/or look-up tables that may be employed by the microcontroller 20221 in the evaluation.
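The following Python sketch illustrates the kind of look-up-table evaluation described above, selecting a firing-member speed from a measured tissue thickness and closure force. The thickness bands, force threshold, and speeds are invented for the example and are not clinical or disclosed values.

```python
# Minimal sketch (assumed values throughout) of a look-up table mapping
# measured tissue thickness and closure force to a firing-member speed.
THICKNESS_BANDS_MM = [(0.0, 1.0), (1.0, 2.0), (2.0, 4.0)]
SPEED_TABLE_MM_PER_S = {           # (thickness band index, high force?) -> speed
    (0, False): 12.0, (0, True): 8.0,
    (1, False): 10.0, (1, True): 6.0,
    (2, False): 7.0,  (2, True): 4.0,
}

def firing_speed(thickness_mm, closure_force_n, force_threshold_n=60.0):
    for idx, (lo, hi) in enumerate(THICKNESS_BANDS_MM):
        if lo <= thickness_mm < hi:
            return SPEED_TABLE_MM_PER_S[(idx, closure_force_n >= force_threshold_n)]
    return min(SPEED_TABLE_MM_PER_S.values())   # out of range: slowest speed

if __name__ == "__main__":
    print(firing_speed(1.4, 45.0))   # thinner tissue, moderate force -> 10.0
    print(firing_speed(2.8, 75.0))   # thicker tissue, high closure force -> 4.0
```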
The control system 20220 of the surgical instrument or tool may also include wired or wireless communication circuitry to communicate with the modular communication hub 20065, as shown in fig. 5.
Fig. 7 illustrates an exemplary surgical system 20280 according to the present disclosure, and may include a surgical instrument 20282 that communicates with a console 20294 or portable device 20296 over a local area network 20292 and/or cloud network 20293 via a wired and/or wireless connection. The console 20294 and portable device 20296 may be any suitable computing device. The surgical instrument 20282 may include a handle 20297, an adapter 20285, and a loading unit 20287. Adapter 20285 is releasably coupled to handle 20297 and loading unit 20287 is releasably coupled to adapter 20285 such that adapter 20285 transmits force from the drive shaft to loading unit 20287. The adapter 20285 or the loading unit 20287 may include a load cell (not explicitly shown) disposed therein to measure the force exerted on the loading unit 20287. The loading unit 20287 can include an end effector 20289 having a first jaw 20291 and a second jaw 20290. The loading unit 20287 may be an in situ loading or Multiple Firing Loading Unit (MFLU) that allows the clinician to fire multiple fasteners multiple times without removing the loading unit 20287 from the surgical site to reload the loading unit 20287.
The first and second jaws 20291, 20290 can be configured to clamp tissue therebetween, fire fasteners through the clamped tissue, and sever the clamped tissue. The first jaw 20291 can be configured to fire at least one fastener multiple times or can be configured to include a replaceable multiple fire fastener cartridge that includes a plurality of fasteners (e.g., staples, clips, etc.) that can be fired more than once before being replaced. The second jaw 20290 may comprise an anvil that deforms or otherwise secures the fasteners as they are ejected from the multi-fire fastener cartridge.
The handle 20297 may include a motor coupled to the drive shaft to effect rotation of the drive shaft. The handle 20297 may include a control interface for selectively activating the motor. The control interface may include buttons, switches, levers, sliders, touch screens, and any other suitable input mechanisms or user interfaces that can be engaged by the clinician to activate the motor.
The control interface of the handle 20297 may be in communication with the controller 20298 of the handle 20297 to selectively activate the motor to effect rotation of the drive shaft. The controller 20298 may be disposed within the handle 20297 and configured to receive input from the control interface, adapter data from the adapter 20285, or loading unit data from the loading unit 20287. The controller 20298 may analyze the input from the control interface and the data received from the adapter 20285 and/or the loading unit 20287 to selectively activate the motor. The handle 20297 may also include a display that a clinician may view during use of the handle 20297. The display may be configured to display portions of the adapter or loading unit data before, during, or after firing of the instrument 20282.
The adapter 20285 may include an adapter identification device 20284 disposed therein and the load unit 20287 may include a load unit identification device 20288 disposed therein. The adapter identifying means 20284 may be in communication with the controller 20298 and the loading unit identifying means 20288 may be in communication with the controller 20298. It should be appreciated that the load unit identification device 20288 may communicate with the adapter identification device 20284, which relays or communicates the communication from the load unit identification device 20288 to the controller 20298.
Adapter 20285 may also include a plurality of sensors 20286 (one shown) disposed thereabout to detect various conditions of adapter 20285 or the environment (e.g., whether adapter 20285 is connected to a loading unit, whether adapter 20285 is connected to a handle, whether a drive shaft is rotating, torque of a drive shaft, strain of a drive shaft, temperature within adapter 20285, number of firings of adapter 20285, peak force of adapter 20285 during firings, total amount of force applied to adapter 20285, peak retraction force of adapter 20285, number of pauses of adapter 20285 during firings, etc.). The plurality of sensors 20286 may provide input to the adapter identification arrangement 20284 in the form of data signals. The data signals of the plurality of sensors 20286 may be stored within the adapter identification means 20284 or may be used to update the adapter data stored within the adapter identification means. The data signals of the plurality of sensors 20286 may be analog or digital. The plurality of sensors 20286 may include a load cell to measure the force exerted on the loading unit 20287 during firing.
The handle 20297 and adapter 20285 may be configured to interconnect the adapter identification means 20284 and the loading unit identification means 20288 with the controller 20298 via an electrical interface. The electrical interface may be a direct electrical interface (i.e., including electrical contacts that engage one another to transfer energy and signals therebetween). Additionally or alternatively, the electrical interface may be a contactless electrical interface to wirelessly transfer energy and signals therebetween (e.g., inductive transfer). It is also contemplated that the adapter identifying means 20284 and the controller 20298 may communicate wirelessly with each other via a wireless connection separate from the electrical interface.
The handle 20297 may include a transceiver 20283 configured to enable transmission of instrument data from the controller 20298 to other components of the system 20280 (e.g., the LAN 20292, the cloud 20293, the console 20294, or the portable device 20296). The controller 20298 may also transmit instrument data and/or measurement data associated with the one or more sensors 20286 to the surgical hub. The transceiver 20283 may receive data (e.g., cartridge data, loading unit data, adapter data, or other notification) from the surgical hub 20270. The transceiver 20283 may also receive data (e.g., bin data, load unit data, or adapter data) from other components of the system 20280. For example, the controller 20298 can transmit instrument data to the console 20294 that includes a serial number of an attachment adapter (e.g., adapter 20285) attached to the handle 20297, a serial number of a loading unit (e.g., loading unit 20287) attached to the adapter 20285, and a serial number of multiple firing fastener cartridges loaded to the loading unit. Thereafter, the console 20294 may transmit data (e.g., bin data, load unit data, or adapter data) associated with the attached bin, load unit, and adapter, respectively, back to the controller 20298. The controller 20298 may display the message on the local instrument display or transmit the message to the console 20294 or portable device 20296 via the transceiver 20283 to display the message on the display 20295 or portable device screen, respectively.
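As a hedged illustration of the instrument-to-console exchange described above, the following Python sketch builds an instrument report containing the serial numbers of the attached adapter, loading unit, and cartridge, and returns the records a console might hold for those components. The message fields, serial numbers, and record contents are invented for the example and are not part of this disclosure.

```python
# Minimal sketch (invented field names and records) of the serial-number report
# an instrument handle might transmit and the data a console might return.
import json

CONSOLE_RECORDS = {                       # stand-in for the console's database
    "ADP-001": {"firings": 12, "max_force_n": 180},
    "MFLU-77": {"staple_size_mm": 3.5},
}

def build_instrument_report(adapter_sn, loading_unit_sn, cartridge_sn):
    return json.dumps({"adapter": adapter_sn,
                       "loading_unit": loading_unit_sn,
                       "cartridge": cartridge_sn})

def console_reply(report_json):
    """Return whatever record the console holds for each reported component."""
    report = json.loads(report_json)
    return {name: CONSOLE_RECORDS.get(sn, {}) for name, sn in report.items()}

if __name__ == "__main__":
    msg = build_instrument_report("ADP-001", "MFLU-77", "CART-0042")
    print(console_reply(msg))
```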
Fig. 8 illustrates a diagram of a situational awareness surgical system 5100 in accordance with at least one aspect of the present disclosure. The data sources 5126 can include, for example, a modular device 5102 (which can include sensors configured to detect parameters associated with the patient, HCP, and environment, and/or the modular device itself), a database 5122 (e.g., an EMR database containing patient records), and a patient monitoring device 5124 (e.g., a Blood Pressure (BP) monitor and an Electrocardiogram (EKG) monitor), a HCP monitoring device 35510, and/or an environment monitoring device 35512. The surgical hub 5104 may be configured to be able to derive surgical-related context information from the data, e.g., based on a particular combination of received data or a particular sequence of received data from the data source 5126. The context information inferred from the received data may include, for example, the type of surgical procedure being performed, the particular step of the surgical procedure being performed by the surgeon, the type of tissue being operated on, or the body cavity being the subject of the procedure. Some aspects of the surgical hub 5104 may be referred to as "situational awareness" of this ability to derive or infer information about the surgical procedure from the received data. For example, the surgical hub 5104 may incorporate a situation awareness system, which is hardware and/or programming associated with the surgical hub 5104 to derive context information related to the surgical procedure from the received data and/or surgical planning information received from the edge computing system 35514 or enterprise cloud server 35516.
The situational awareness system of the surgical hub 5104 may be configured to derive contextual information from data received from the data sources 5126 in a number of different ways. For example, the situational awareness system may include a pattern recognition system or a machine learning system (e.g., an artificial neural network) that has been trained on training data to correlate various inputs (e.g., data from database 5122, patient monitoring device 5124, modular device 5102, HCP monitoring device 35510, and/or environmental monitoring device 35512) with corresponding contextual information about the surgical procedure. The machine learning system may be trained to accurately derive contextual information about the surgical procedure from the provided inputs. In an example, the situational awareness system may include a look-up table storing pre-characterized contextual information about the surgery associated with one or more inputs (or input ranges) corresponding to the contextual information. In response to a query with one or more inputs, the lookup table may return corresponding contextual information that the situational awareness system uses to control the modular device 5102. In an example, the contextual information received by the situational awareness system of the surgical hub 5104 can be associated with a particular control adjustment or set of control adjustments for one or more modular devices 5102. In an example, the situational awareness system may include an additional machine learning system, look-up table, or other such system that generates or retrieves one or more control adjustments for the one or more modular devices 5102 when contextual information is provided as input.
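By way of illustration, the look-up-table approach described above can be sketched in a few lines of code. The inputs, inferred contexts, and control adjustments below are hypothetical placeholders, not values defined in this disclosure.

```python
# Minimal sketch of a look-up-table situational awareness system (hypothetical values).
from typing import Optional

CONTEXT_TABLE = {
    # (insufflation_active, imaging_device): inferred contextual information
    (True, "laparoscope"): "abdominal procedure",
    (True, "thoracoscope"): "thoracic procedure",
}

CONTROL_ADJUSTMENTS = {
    "abdominal procedure": {"stapler_tissue_profile": "gastric"},
    "thoracic procedure": {"stapler_tissue_profile": "lung"},
}

def infer_context(insufflation_active: bool, imaging_device: str) -> Optional[str]:
    """Return pre-characterized contextual information for the given inputs, if any."""
    return CONTEXT_TABLE.get((insufflation_active, imaging_device))

def control_adjustments(context: Optional[str]) -> dict:
    """Retrieve the control adjustments associated with the inferred context."""
    return CONTROL_ADJUSTMENTS.get(context, {})

context = infer_context(True, "laparoscope")
print(context, control_adjustments(context))
```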
The surgical hub 5104, in combination with the situational awareness system, can provide a number of benefits to the surgical system 5100. One benefit may include improved interpretation of sensed and collected data, which in turn may improve the accuracy of processing and/or the use of data during a surgical procedure. Returning to the previous example, the situational awareness surgical hub 5104 may determine the type of tissue being operated on; thus, upon detection of an unexpectedly high force for closing the end effector of the surgical instrument, the situation aware surgical hub 5104 can properly ramp the motor speed of the surgical instrument up or down for the tissue type.
The type of tissue being operated on may affect the adjustment of the compression rate and load threshold of the surgical stapling and severing instrument for a particular tissue gap measurement. The situational awareness surgical hub 5104 can infer whether the surgical procedure being performed is a thoracic or abdominal procedure, allowing the surgical hub 5104 to determine whether tissue held by the end effector of the surgical stapling and severing instrument is pulmonary tissue (for thoracic procedures) or gastric tissue (for abdominal procedures). The surgical hub 5104 can then appropriately adjust the compression rate and load threshold of the surgical stapling and severing instrument for the type of tissue.
The type of body cavity that is operated during an insufflation procedure can affect the function of the smoke extractor. The situation-aware surgical hub 5104 can determine whether the surgical site is under pressure (by determining that the surgical procedure is utilizing insufflation) and determine the type of procedure. Since one type of procedure may typically be performed within a particular body cavity, the surgical hub 5104 may then appropriately control the motor rate of the smoke extractor for the body cavity in which it is operated. Thus, the situational awareness surgical hub 5104 can provide consistent smoke evacuation for both thoracic and abdominal procedures.
The type of procedure being performed may affect the optimal energy level for the operation of the ultrasonic surgical instrument or the Radio Frequency (RF) electrosurgical instrument. For example, arthroscopic surgery may require higher energy levels because the end effector of the ultrasonic surgical instrument or the RF electrosurgical instrument is submerged in a fluid. The situational awareness surgical hub 5104 may determine whether the surgical procedure is an arthroscopic procedure. The surgical hub 5104 can then adjust the RF power level or ultrasonic amplitude (e.g., "energy level") of the generator to compensate for the fluid-filled environment. Relatedly, the type of tissue being operated on can affect the optimal energy level at which the ultrasonic surgical instrument or RF electrosurgical instrument is operated. The situation aware surgical hub 5104 can determine the type of surgical procedure being performed and then tailor the energy level of the ultrasonic surgical instrument or the RF electrosurgical instrument, respectively, according to the expected tissue profile of the surgical procedure. Further, the situation aware surgical hub 5104 may be configured to be able to adjust the energy level of the ultrasonic surgical instrument or the RF electrosurgical instrument throughout the surgical procedure rather than on a procedure-by-procedure basis only. The situation aware surgical hub 5104 may determine the step of the surgical procedure being performed or to be performed subsequently and then update the control algorithms of the generator and/or the ultrasonic surgical instrument or the RF electrosurgical instrument to set the energy level at a value appropriate for the desired tissue type in accordance with the surgical step.
In an example, data can be extracted from additional data sources 5126 to improve the conclusions drawn by the surgical hub 5104 from one of the data sources 5126. The situation aware surgical hub 5104 may augment the data it receives from the modular device 5102 with contextual information about the surgical procedure that has been constructed from other data sources 5126. For example, the situation-aware surgical hub 5104 may be configured to determine from video or image data received from a medical imaging device whether hemostasis has occurred (e.g., whether bleeding at a surgical site has ceased). The surgical hub 5104 may be further configured to be able to compare physiological measurements (e.g., blood pressure sensed by a BP monitor communicatively connected to the surgical hub 5104) with visual or image data of hemostasis (e.g., from a medical imaging device communicatively coupled to the surgical hub 5104) to determine the integrity of a staple line or tissue weld. The situational awareness system of the surgical hub 5104 can consider the physiological measurement data to provide additional context in analyzing the visualization data. The additional context may be useful when the visual data itself may be ambiguous or incomplete.
For example, if the situation awareness surgical hub 5104 determines that the subsequent step of the procedure requires the use of an RF electrosurgical instrument, it may actively activate a generator connected to the instrument. Actively activating the energy source may allow the instrument to be ready for use upon completion of a prior step of the procedure.
The situational awareness surgical hub 5104 may determine whether the current or subsequent steps of the surgical procedure require different views or magnification on the display based on the feature(s) that the surgeon expects to view at the surgical site. The surgical hub 5104 can actively change the displayed view accordingly (e.g., as provided by a medical imaging device for a visualization system) such that the display is automatically adjusted throughout the surgical procedure.
The situation aware surgical hub 5104 may determine which step of the surgical procedure is being performed or will be performed subsequently and whether specific data or comparisons between data are required for that step of the surgical procedure. The surgical hub 5104 can be configured to automatically invoke a data screen based on the steps of the surgical procedure being performed without waiting for the surgeon to request that particular information.
Errors may be checked during setup of the surgery or during the course of the surgery. For example, the situational awareness surgical hub 5104 may determine whether the operating room is properly or optimally set up for the surgical procedure to be performed. The surgical hub 5104 may be configured to determine the type of surgical procedure being performed, retrieve (e.g., from memory) the corresponding manifest, product location, or setup requirements, and then compare the current operating room layout to the standard layout determined by the surgical hub 5104 for the type of surgical procedure being performed. In some examples, the surgical hub 5104 can compare the list of items for the procedure and/or the list of devices paired with the surgical hub 5104 to a suggested or expected list of items and/or devices for a given surgical procedure. If there are any discrepancies between the lists, the surgical hub 5104 may provide an alert indicating that a particular modular device 5102, patient monitoring device 5124, HCP monitoring device 35510, environmental monitoring device 35512, and/or other surgical item is missing. In some examples, the surgical hub 5104 may determine a relative distance or location of the modular device 5102 and the patient monitoring device 5124, e.g., via a proximity sensor. The surgical hub 5104 can compare the relative position of the device to suggested or expected layouts for a particular surgical procedure. If there are any discrepancies between the layouts, the surgical hub 5104 can be configured to provide an alert indicating that the current layout for the surgical procedure deviates from the suggested layout.
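As a minimal sketch of the manifest comparison described above (with hypothetical item names), a missing-item check might look like the following.

```python
# Compare the expected items for a procedure type against the items currently paired or scanned.
def missing_items(expected: set, present: set) -> set:
    """Items expected for the procedure type that are not currently present/paired."""
    return expected - present

expected_for_procedure = {"stapler", "smoke evacuator", "laparoscope", "energy generator"}
currently_paired = {"stapler", "laparoscope"}

gaps = missing_items(expected_for_procedure, currently_paired)
if gaps:
    print(f"Alert: missing surgical items for this procedure: {sorted(gaps)}")
```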
The situational awareness surgical hub 5104 may determine whether the surgeon (or other HCP) is making an error or otherwise deviating from the intended course of action during the surgical procedure. For example, the surgical hub 5104 may be configured to be able to determine the type of surgical procedure being performed, retrieve (e.g., from memory) a corresponding list of steps or order of device use, and then compare the steps being performed or the devices being used during the surgical procedure with the expected steps or devices determined by the surgical hub 5104 for that type of surgical procedure being performed. The surgical hub 5104 can provide an alert indicating that an unexpected action is being performed or an unexpected device is being utilized at a particular step in the surgical procedure.
The surgical instrument (and other modular devices 5102) may be adjusted for the specific context of each surgical procedure (such as adjusting to different tissue types), and actions may be verified during the surgical procedure. The next steps, data, and display adjustments may be provided to the surgical instrument (and other modular devices 5102) in the operating room depending on the particular context of the procedure.
The computing system may use redundant communication paths to communicate surgical imaging feeds. For example, surgical video feeds may be sent via multiple video stream paths to improve the resilience of the feeds.
Fig. 9A-9C illustrate an exemplary visualization system 2108 that may be incorporated into a surgical system. The visualization system 2108 may include an imaging control unit 2002 and a handheld unit 2020. The imaging control unit 2002 may include one or more illumination sources, a power source for the one or more illumination sources, one or more types of data communication interfaces (including USB, ethernet, or wireless interfaces 2004), and one or more video outputs 2006. The imaging control unit 2002 may also include an interface, such as the USB interface 2010, configured to enable transmission of integrated video and image capture data to a USB-enabled device. The imaging control unit 2002 may also include one or more computing components, including but not limited to a processor unit, a transitory memory unit, a non-transitory memory unit, an image processing unit, a bus structure for forming a data link among the computing components, and any interface (e.g., input and/or output) devices necessary to receive information from and transmit information to components not included in the imaging control unit. The non-transitory memory may also contain instructions that, when executed by the processor unit, may perform any number of manipulations of data that may be received from the handheld unit 2020 and/or a computing device not included in the imaging control unit.
The illumination source may include a white light source 2012 and one or more laser sources. The imaging control unit 2002 may include one or more optical and/or electrical interfaces for optically and/or electrically communicating with the handheld unit 2020. As non-limiting examples, the one or more laser sources may include any one or more of a red laser source, a green laser source, a blue laser source, an infrared laser source, and an ultraviolet laser source. In some non-limiting examples, the red laser source may provide illumination in a range having a peak wavelength that may be between 635nm and 660nm, inclusive. Non-limiting examples of peak wavelengths of the red laser light may include about 635nm, about 640nm, about 645nm, about 650nm, about 655nm, about 660nm, or any value or range of values therebetween. In some non-limiting examples, the green laser source may provide illumination having a peak wavelength that may be in a range between 520nm and 532nm, inclusive. Non-limiting examples of peak wavelengths of the green laser light may include about 520nm, about 522nm, about 524nm, about 526nm, about 528nm, about 530nm, about 532nm, or any value or range of values therebetween. In some non-limiting examples, the blue laser source may provide illumination having a peak wavelength that may be in a range between 405nm and 445nm, inclusive. Non-limiting examples of peak wavelengths of the blue laser may include about 405nm, about 410nm, about 415nm, about 420nm, about 425nm, about 430nm, about 435nm, about 440nm, about 445nm, or any value or range of values therebetween. In some non-limiting examples, the infrared laser source may provide illumination with a peak wavelength that may be in a range between 750nm and 3000nm (inclusive). Non-limiting examples of peak wavelengths of the infrared laser light may include about 750nm, about 1000nm, about 1250nm, about 1500nm, about 1750nm, about 2000nm, about 2250nm, about 2500nm, about 2750nm, 3000nm, or any value or range of values therebetween. In some non-limiting examples, the ultraviolet laser source may provide illumination having a peak wavelength that may be in a range between 200nm and 360nm, inclusive. Non-limiting examples of peak wavelengths of the ultraviolet laser light may include about 200nm, about 220nm, about 240nm, about 260nm, about 280nm, about 300nm, about 320nm, about 340nm, about 360nm, or any value or range of values therebetween.
The handheld unit 2020 may include a body 2021, a camera mirror cable 2015 attached to the body 2021, and an elongated camera probe 2024. The body 2021 of the handheld unit 2020 may include handheld unit control buttons 2022 or other controls to allow a health professional to use the handheld unit 2020 to control operation of the handheld unit 2020 or other components of the imaging control unit 2002 (including, for example, light sources). The camera mirror cable 2015 may include one or more electrical conductors and one or more optical fibers. The camera mirror cable 2015 may terminate with a camera head connector 2008 at a proximal end in which the camera head connector 2008 is configured to mate with the one or more optical and/or electrical interfaces of the imaging control unit 2002. The electrical conductors may provide power to the handheld unit 2020 (including the body 2021 and the elongate camera probe 2024) and/or any electronic components inside the handheld unit 2020 (including the body 2021 and/or the elongate camera probe 2024). The electrical conductors may also be used to provide two-way data communication between the handheld unit 2020 and any one or more components of the imaging control unit 2002. The one or more optical fibers may conduct illumination from the one or more illumination sources in the imaging control unit 2002 through the handheld unit body 2021 and to the distal end of the elongate camera probe 2024. In some non-limiting aspects, the one or more optical fibers may also conduct light reflected or refracted from the surgical site to one or more optical sensors disposed in the elongate camera probe 2024, the handheld unit body 2021, and/or the imaging control unit 2002.
Fig. 9B (top plan view) depicts some aspects of the handheld unit 2020 of the visualization system 2108 in more detail. The handheld unit body 2021 may be constructed of a plastic material. The handheld unit control buttons 2022 or other controls may have rubber overmolding to protect the controls while allowing the surgeon to manipulate the controls. The camera mirror cable 2015 may have optical fibers integrated with electrical conductors, and the camera mirror cable 2015 may have a protective and flexible outer coating, such as PVC. In some non-limiting examples, the camera mirror cable 2015 may be about 10 feet long to allow for ease of use during surgery. The length of the camera mirror cable 2015 may be in the range of about 5 feet to about 15 feet. Non-limiting examples of the length of the camera mirror cable 2015 may be about 5 feet, about 6 feet, about 7 feet, about 8 feet, about 9 feet, about 10 feet, about 11 feet, about 12 feet, about 13 feet, about 14 feet, about 15 feet, or any length or range of lengths therebetween. The elongate camera probe 2024 may be made of a rigid material such as stainless steel. The elongate camera probe 2024 may be engaged with the handheld unit body 2021 via a rotatable collar 2026. The rotatable collar 2026 may allow the elongate camera probe 2024 to rotate relative to the handheld unit body 2021. The elongate camera probe 2024 may terminate at a distal end with a plastic window 2028 sealed with epoxy.
The side plan view of the handheld unit depicted in fig. 9C shows that a light or image sensor 2030 may be disposed at the distal end 2032a of the elongate camera probe or within the handheld unit body 2032 b. The light or image sensor 2030 may be provided with additional optical elements in the imaging control unit 2002. Fig. 9C depicts an example of a light sensor 2030 comprising a CMOS image sensor 2034 disposed within a mounting frame 2036 having a radius of about 4 mm. Although the CMOS image sensor in fig. 9C is depicted as being disposed within a mount 2036 having a radius of about 4mm, it is recognized that such sensor and mount combination may have any useful size to be disposed within the elongate camera probe 2024, the handheld unit body 2021, or within the image control unit 2002. Some non-limiting examples of such alternative mounts may include 5.5mm mount 2136, 4mm mount 2136, 2.7mm mount 2136, and 2mm mount 2136. It is recognized that the image sensor may also include a CCD image sensor. CMOS or CCD sensors may include an array of individual light sensing elements (pixels).
During surgery, a surgeon may be required to manipulate tissue to achieve a desired medical result. The surgeon's actions are limited by what can be visually observed at the surgical site. Thus, the surgeon may not be aware, for example, of the placement of vascular structures located under the tissue being manipulated during surgery.
Because the surgeon cannot visualize the vasculature below the surgical site, the surgeon may accidentally sever one or more critical blood vessels during the procedure.
It is therefore desirable to have a surgical visualization system that can acquire imaging data of a surgical site for presentation to a surgeon, wherein the presentation can include information related to the presence of vascular structures located beneath the surface of the surgical site.
Some aspects of the present disclosure also provide control circuitry configured to control illumination of a surgical site using one or more illumination sources, such as laser sources, and to receive imaging data from one or more image sensors. In some aspects, the present disclosure provides a non-transitory computer-readable medium storing computer-readable instructions that, when executed, cause an apparatus to detect a blood vessel in tissue and determine its depth below a tissue surface.
In some aspects, a surgical image acquisition system may include: a plurality of illumination sources, wherein each illumination source is configured to be capable of emitting light having a specified center wavelength; a light sensor configured to receive a portion of light reflected from the tissue sample when the tissue sample is illuminated by one or more of the plurality of illumination sources; and a computing system. The computing system may be configured to be capable of: receiving data from the light sensor when the tissue sample is illuminated by each of the plurality of illumination sources; determining a depth position of a structure within the tissue sample based on data received by the light sensor when the tissue sample is illuminated by each of the plurality of illumination sources; and calculating visualization data regarding the structure and the depth position of the structure. In some aspects, the visualization data may have a data format that may be used by a display system, and the structure may include one or more vascular tissues.
In one aspect, a surgical image acquisition system can include an independent color cascade of illumination sources including visible light and light outside of the visible range to image one or more tissues within a surgical site at different times and different depths. The surgical image acquisition system may also detect or calculate characteristics of light reflected and/or refracted from the surgical site. The characteristics of the light can be used to provide a composite image of tissue within the surgical site, as well as to provide an analysis of underlying tissue that is not directly visible at the surface of the surgical site. The surgical image acquisition system can determine tissue depth locations without the need for a separate measurement device.
In one aspect, the characteristic of the light reflected and/or refracted from the surgical site may be an amount of absorbance of the light at one or more wavelengths. The various chemical components of the individual tissues may result in specific light absorption patterns that are wavelength dependent.
In one aspect, the illumination source may include a red laser source and a near infrared laser source, wherein the one or more tissues to be imaged may include vascular tissue, such as veins or arteries. In some aspects, a red laser source (in the visible range) may be used to image some aspects of underlying vascular tissue based on spectroscopy in the visible red range. In some non-limiting examples, the red laser source may provide illumination having a peak wavelength that may be in a range between 635nm and 660nm, inclusive. Non-limiting examples of peak wavelengths of the red laser light may include about 635nm, about 640nm, about 645nm, about 650nm, about 655nm, about 660nm, or any value or range of values therebetween. In some other aspects, a near infrared laser source may be used to image underlying vascular tissue based on near infrared spectroscopy. In some non-limiting examples, the near infrared laser source may emit illumination at a wavelength that may range between 750 and 3000nm, inclusive. Non-limiting examples of peak wavelengths of the infrared laser light may include about 750nm, about 1000nm, about 1250nm, about 1500nm, about 1750nm, about 2000nm, about 2250nm, about 2500nm, about 2750nm, 3000nm, or any value or range of values therebetween. It will be appreciated that a combination of red and infrared spectra may be used to detect underlying vascular tissue. In some examples, vascular tissue may be detected using a red laser source with a peak wavelength at about 660nm and a near-IR laser source with a peak wavelength at about 750nm or at about 850 nm.
Near infrared spectroscopy (NIRS) is a non-invasive technique that allows tissue oxygenation to be determined based on spectrophotometric quantification of oxygenated and deoxygenated hemoglobin within the tissue. In some aspects, NIRS may be used to directly image vascular tissue based on the difference in absorbance of illumination between vascular tissue and non-vascular tissue. Alternatively, vascular tissue may be visualized indirectly based on differences in the absorbance of illumination of blood flow in the tissue before and after a physiological intervention is applied, such as arterial occlusion methods and venous occlusion methods.
The instrument for Near Infrared (NIR) spectroscopy may be similar to instruments for the UV-visible and mid-IR ranges. Such a spectroscopy apparatus may include an illumination source, a detector, and a dispersive element to select a particular near-IR wavelength for illuminating the tissue sample. In some aspects, the source may comprise an incandescent light source or a quartz halogen light source. In some aspects, the detector may comprise a semiconductor (e.g., InGaAs) photodiode or photodiode array. In some aspects, the dispersive element may comprise a prism or, more commonly, a diffraction grating. Fourier transform NIR instruments using interferometers are also common, especially for wavelengths greater than about 1000 nm. Depending on the sample, the spectrum can be measured in either reflection or transmission mode.
Fig. 10 schematically depicts one example of an instrument 2400 for NIR spectroscopy similar to those used for the UV-visible and mid-IR ranges. The light source 2402 may emit a broad spectral range of illumination 2404 that may be projected onto a dispersive element 2406, such as a prism or diffraction grating. The dispersive element 2406 is operable to select a narrow wavelength portion 2408 of the light emitted by the broad spectrum light source 2402, and the selected light portion 2408 may illuminate the tissue 2410. Light 2412 reflected from the tissue may be directed to detector 2416 (e.g., using dichroic mirror 2414), and the intensity of the reflected light 2412 may be recorded. The wavelength of light illuminating the tissue 2410 may be selected by the dispersive element 2406. In some aspects, the tissue 2410 may be irradiated with only a single narrow wavelength portion 2408 selected by the dispersive element 2406 from the light source 2402. In other aspects, the tissue 2410 may be scanned with various narrow wavelength portions 2408 selected by the dispersive element 2406. In this way, spectroscopic analysis of the tissue 2410 can be obtained in the NIR wavelength range.
Fig. 11 schematically depicts one example of an instrument 2430 for determining NIRS based on Fourier transform infrared imaging. In fig. 11, a laser source 2432 that emits light 2434 in the near IR range illuminates tissue sample 2440. Light 2436 reflected by tissue 2440 is reflected by a mirror (such as dichroic mirror 2444) to beam splitter 2446. The beam splitter 2446 directs a portion of the light 2448 reflected by the tissue 2440 to the stationary mirror 2450 and directs a portion of the light 2452 reflected by the tissue 2440 to the moving mirror 2454. The moving mirror 2454 can oscillate in place based on an attached piezoelectric transducer activated by a sinusoidal voltage having a characteristic frequency. The position of the moving mirror 2454 in space corresponds to the frequency of the sinusoidal activation voltage of the piezoelectric transducer. Light reflected from the moving mirror and the stationary mirror may be recombined 2458 at beam splitter 2446 and directed to detector 2456. The computing component may receive the signal output of detector 2456 and perform a Fourier transform (in time) of the received signal. Because the wavelength of light received from moving mirror 2454 varies in time relative to the wavelength of light received from stationary mirror 2450, the time-based Fourier transform of the recombined light corresponds to a wavelength-based Fourier transform of the recombined light 2458. In this way, a wavelength-based spectrum of light reflected from tissue 2440 may be determined, and spectral characteristics of the light 2436 reflected from tissue 2440 may be acquired. Thus, a change in absorbance of illumination from a spectral component of light reflected from tissue 2440 may indicate the presence or absence of tissue having a particular light absorption characteristic (such as hemoglobin).
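By way of illustration, the time-to-frequency step described above can be sketched with a synthetic, uniformly sampled detector signal; the sampling rate and signal components are assumed for the example, and the mapping from temporal frequencies to optical wavelengths (which depends on the mirror modulation) is omitted.

```python
# Sketch: recover spectral content from a sampled interferogram-like detector signal.
import numpy as np

fs = 10_000.0                      # detector sampling rate, Hz (assumed)
t = np.arange(0, 0.1, 1.0 / fs)    # 100 ms record
# Synthetic detector signal: two interference components plus a little noise.
signal = 1.0 * np.cos(2 * np.pi * 440 * t) + 0.3 * np.cos(2 * np.pi * 1250 * t)
signal += 0.05 * np.random.default_rng(0).normal(size=t.size)

spectrum = np.abs(np.fft.rfft(signal))           # Fourier transform in time
freqs = np.fft.rfftfreq(t.size, d=1.0 / fs)
peak_freqs = freqs[np.argsort(spectrum)[-2:]]    # the two dominant components
print("Dominant interference frequencies (Hz):", np.sort(peak_freqs))
```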
An alternative to near-infrared light for determining hemoglobin oxygenation would be to use monochromatic red light and determine the red-light absorbance characteristics of hemoglobin. The absorbance characteristic of hemoglobin for red light having a center wavelength of about 660nm may indicate whether the hemoglobin is oxygenated (arterial blood) or deoxygenated (venous blood).
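A minimal sketch of deriving such an absorbance value from detector intensities, using the Beer-Lambert form A = -log10(I/I0), is shown below; the intensities and the threshold separating oxygenated from deoxygenated hemoglobin are illustrative assumptions only.

```python
import math

def absorbance(reflected_intensity: float, incident_intensity: float) -> float:
    """Beer-Lambert absorbance A = -log10(I / I0)."""
    return -math.log10(reflected_intensity / incident_intensity)

# Deoxygenated hemoglobin absorbs more strongly than oxygenated hemoglobin near 660 nm,
# so a higher absorbance suggests venous (deoxygenated) blood.  Threshold is illustrative.
ILLUSTRATIVE_THRESHOLD = 0.5

a = absorbance(reflected_intensity=0.22, incident_intensity=1.0)
label = "oxygenated (arterial)" if a < ILLUSTRATIVE_THRESHOLD else "deoxygenated (venous)"
print(label, f"A={a:.2f}")
```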
In some alternative surgical procedures, contrast agents may be used to improve the data collected regarding blood flow and tissue oxygenation. In one non-limiting example, NIRS techniques may be used in conjunction with a bolus injection of a near IR contrast agent, such as indocyanine green (ICG), having a peak absorbance at about 800 nm. ICG has been used in some medical procedures to measure cerebral blood flow.
In one aspect, the characteristic of light reflected and/or refracted from the surgical site may be a Doppler shift of the wavelength of light from its illumination source.
Laser Doppler flowmetry can be used to visualize and characterize particle flow moving against an effectively stationary background. Thus, laser light scattered by moving particles (such as blood cells) may have a different wavelength than that of the original illuminating laser source. In contrast, laser light scattered by the effectively stationary background (e.g., vascular tissue) may have the same wavelength as that of the original illuminating laser source. The change in wavelength of the light scattered from the blood cells may reflect both the direction of flow of the blood cells relative to the laser source and the blood cell velocity.
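As a minimal sketch, assuming a simple backscatter geometry in which the Doppler shift is approximately 2·v·cos(θ)/λ, a measured frequency shift can be related to a flow velocity as follows; the numbers are illustrative only.

```python
# Relate a measured Doppler frequency shift to particle (blood cell) velocity
# for backscattered laser light: delta_f ≈ 2 * v * cos(theta) / wavelength.
import math

def velocity_from_doppler(delta_f_hz: float, wavelength_m: float, angle_rad: float = 0.0) -> float:
    """Velocity component along the beam that would explain the observed shift."""
    return delta_f_hz * wavelength_m / (2.0 * math.cos(angle_rad))

shift = 6.0e3            # 6 kHz shift extracted from the interference fringes (example)
wavelength = 660e-9      # red laser, 660 nm
print(f"estimated speed: {velocity_from_doppler(shift, wavelength) * 1e3:.2f} mm/s")
```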
Fig. 12 depicts aspects of an instrument 2530 that can be used to detect the Doppler shift of laser light scattered from a portion of tissue 2540. Light 2534 from laser 2532 may pass through beam splitter 2544. A portion of the laser light 2536 may be transmitted by the beam splitter 2544 and may illuminate the tissue 2540. Another portion of the laser light may be reflected 2546 by beam splitter 2544 to impinge on detector 2550. Light backscattered 2542 by tissue 2540 may be directed by beam splitter 2544 and also projected on detector 2550. The combination of light 2534 from laser 2532 with light 2542 backscattered by tissue 2540 may result in an interference pattern detected by detector 2550. The interference pattern received by detector 2550 may include interference fringes produced by the combination of light 2534 from laser 2532 and the Doppler shifted (and thus wavelength shifted) light 2542 backscattered from tissue 2540.
It can be appreciated that the backscattered light 2542 from the tissue 2540 can also include backscattered light from boundary layers within the tissue 2540 and/or be affected by absorption of wavelength-specific light by materials within the tissue 2540. Thus, the interference pattern observed at detector 2550 may incorporate interference fringe features from these additional optical effects and thus may confound the calculation of the Doppler shift unless properly analyzed.
Fig. 13 depicts aspects of a composite visual display 2800 that may be presented to a surgeon during a surgical procedure. The composite visual display 2800 may be constructed by overlaying a white light image 2830 of the surgical site with a Doppler analysis image 2850.
The white light image 2830 may depict the surgical site 2832, one or more surgical incisions 2834, and the tissue 2836 that may be visible within the surgical incision 2834. The white light image 2830 may be generated by illuminating 2840 the surgical site 2832 with a white light source 2838 and receiving the reflected white light 2842 with an optical detector. While a white light source 2838 may be used to illuminate the surface of the surgical site, in one aspect, an appropriate combination of red, green, and blue lasers 2854, 2856, 2858 may be used to visualize the surface of the surgical site.
The Doppler analysis image 2850 may include blood vessel depth information along with blood flow information 2852 (from speckle analysis). The blood vessel depth and blood flow velocity may be obtained by irradiating the surgical site with laser light at a plurality of wavelengths and determining the blood vessel depth and blood flow based on the known penetration depth of light at each particular wavelength. In general, the surgical site 2832 may be illuminated by light emitted by one or more lasers (such as red laser 2854, green laser 2856, and blue laser 2858). The CMOS detector 2872 may receive light (2862, 2866, 2870) reflected back from the surgical site 2832 and its surrounding tissue. The Doppler analysis image 2850 may be constructed based on an analysis 2874 of the plurality of pixel data from the CMOS detector 2872.
For example, the red laser 2854 may emit red laser illumination 2860 on the surgical site 2832, and the reflected light 2862 may reveal surface or minimally subsurface structures. In one aspect, the green laser 2856 may emit green laser illumination 2864 on the surgical site 2832, and the reflected light 2866 may reveal deeper subsurface characteristics. In another aspect, the blue laser 2858 may emit blue laser illumination 2868 on the surgical site 2832, and the reflected light 2870 may show blood flow, for example, deeper within the vascular structure. The speckle contrast analysis may present information to the surgeon regarding blood flow and velocity through deeper vascular structures.
Although not shown in fig. 13, it should be appreciated that the imaging system may also illuminate the surgical site with light outside the visible range. Such light may include infrared light and ultraviolet light. In some aspects, the source of infrared light or ultraviolet light may include a broadband wavelength source (such as a tungsten source, a tungsten halogen source, or a deuterium source). In some other aspects, the source of infrared or ultraviolet light may comprise a narrowband wavelength source (IR diode laser, UV gas laser, or dye laser).
The depth of features below the tissue surface in a portion of tissue may be determined. The image acquisition system may illuminate the tissue with a first light beam having a first center wavelength and receive a first reflected light from the tissue illuminated by the first light beam. The image acquisition system may then calculate a first Doppler shift based on the first light beam and the first reflected light. The image acquisition system may then illuminate the tissue with a second light beam having a second center wavelength and receive a second reflected light from the tissue illuminated by the second light beam. The image acquisition system may then calculate a second Doppler shift based on the second light beam and the second reflected light. The image acquisition system may then calculate a depth of the tissue feature based at least in part on the first center wavelength, the first Doppler shift, the second center wavelength, and the second Doppler shift. Tissue features may include the presence of moving particles (such as blood cells moving within a blood vessel), as well as the direction and speed of flow of the moving particles. It should be appreciated that the method may be extended to include illuminating the tissue with any one or more additional beams. Further, the system may calculate an image comprising a combination of an image of the tissue surface and an image of a structure disposed within the tissue.
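As a rough sketch of one way such a depth estimate could be framed, and assuming only that longer wavelengths penetrate more deeply (the penetration depths below are illustrative, not values from this disclosure), the depth of a flow feature can be bracketed by noting which wavelengths return a Doppler shift and which do not.

```python
# Bracket the depth of a moving-particle (flow) feature using wavelength-dependent
# penetration depths.  The depths below are illustrative assumptions only.
PENETRATION_MM = {"blue_405nm": 0.3, "green_525nm": 1.0, "red_660nm": 2.5, "nir_850nm": 5.0}

def bracket_depth(doppler_detected: dict) -> tuple:
    """Return (min_mm, max_mm) bounding the shallowest depth at which flow is seen."""
    seen = [d for w, d in PENETRATION_MM.items() if doppler_detected.get(w)]
    not_seen = [d for w, d in PENETRATION_MM.items() if not doppler_detected.get(w)]
    if not seen:
        return (None, None)
    lower = max([d for d in not_seen if d < min(seen)], default=0.0)
    return (lower, min(seen))

# Example: no shift at blue/green, shift detected at red and NIR -> vessel roughly 1.0-2.5 mm deep.
print(bracket_depth({"blue_405nm": False, "green_525nm": False, "red_660nm": True, "nir_850nm": True}))
```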
Multiple visual displays may be used. For example, a 3D display may provide a composite image that combines the white light image (or a suitable combination of red, green, and blue laser imaging) with the laser Doppler image. An additional display may provide only the white light display, or a composite of the white light display and the NIRS display, to visualize only the blood oxygenation response of the tissue. The NIRS display, however, may not be required on every cycle while the tissue response is being monitored.
Surgical visualization systems using the imaging techniques disclosed herein may benefit from ultra-high sampling and display frequencies. The sampling rate may be associated with the capabilities of the underlying device performing the sampling. A general purpose computing system with software may be associated with a first range of achievable sampling rates. A pure hardware implementation (e.g., an application-specific integrated circuit (ASIC)) may be associated with a second range of achievable sampling rates. The second range associated with a pure hardware implementation will typically be higher than (e.g., much higher than) the first range associated with a general purpose computing software implementation.
Surgical visualization systems using the imaging techniques disclosed herein may benefit from a solution that balances the higher sampling rate associated with hardware-based implementations with both the adaptability and/or updatability of software systems. Such a surgical visualization system may employ a mix of hardware and software solutions. For example, the surgical visualization system may employ various hardware-implemented transformations with software selectors. The surgical visualization system may also employ a Field Programmable Gate Array (FPGA). An FPGA may include hardware devices that may include one or more logic elements. These logic elements may be configured by a bitstream to perform various functions. For example, logic elements may be configured to perform certain individual logic functions and to perform these functions in accordance with certain orders and interconnections. Once configured, the FPGA can perform its functions using hardware logic elements without further configuration. Moreover, once configured, the FPGA can be reconfigured with different bit streams to implement different functions. Similarly, once reconfigured, the FPGA may perform this different function using hardware logic elements.
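A purely illustrative sketch of the "hardware-implemented transforms with a software selector" idea follows; the bitstream names and the load_bitstream() loader are hypothetical stand-ins, not a real driver interface.

```python
# Hypothetical software selector for hardware-implemented transforms on an FPGA.
# The bitstream file names and the load_bitstream() callable are placeholders.
AVAILABLE_TRANSFORMS = {
    "doppler_shift": "doppler_shift.bit",
    "refractive_index": "refractive_index.bit",
    "multi_depth": "multi_depth.bit",
}

def select_transform(name: str, load_bitstream) -> None:
    """Reconfigure the FPGA with the bitstream implementing the requested transform."""
    if name not in AVAILABLE_TRANSFORMS:
        raise ValueError(f"unknown transform: {name}")
    load_bitstream(AVAILABLE_TRANSFORMS[name])   # hardware logic does the work after this point

# Example with a stand-in loader:
select_transform("doppler_shift", load_bitstream=lambda path: print("loading", path))
```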
Fig. 14 illustrates an exemplary surgical visualization system 10000. The surgical visualization system 10000 can be used to analyze at least a portion of a surgical field. For example, the surgical visualization system 10000 can be used to analyze tissue 10002 within at least a portion of a surgical site. The surgical visualization system 10000 can include an FPGA 10004, a processor (e.g., processor 10006 local to the FPGA 10004), a memory 10008, a laser illumination source 10010, a light sensor 10012, a display 10014, and/or a processor 10016 that is remotely disposed from the FPGA. The surgical visualization system 10000 can include components and functions described in connection with, for example, fig. 9A-9C.
System 10000 can use the FPGA 10004 to transform the reflected laser light by frequency conversion to identify, for example, the Doppler shift of the light and thereby determine moving particles. The transformed data may be displayed (e.g., in real-time). For example, it may be displayed as a graph and/or metric 10020 representing the number of moving particles per second. The system 10000 can include communication between the processor 10006 local to the FPGA 10004 and the processor 10016 remotely located from the FPGA. For example, the processor 10016, which is remotely located from the FPGA 10004, may aggregate data (e.g., a few seconds of data). The system may be capable of displaying the aggregated data. For example, it may be displayed as a graph and/or metric representing a movement trend 10026. The graph and/or metric 10026 may be superimposed over the real-time data. Such trend information can be used to identify occlusion, instrument vessel sealing/clamping efficiency, vessel tree overview, and even oscillating motion amplitude over time. The FPGA 10004 can be configured to be capable of updating on the fly, e.g., capable of being updated with different (e.g., more complex) transforms. These updates may come from local or remote communication servers. These updates may change the analysis of the transformation, for example, from refractive index (e.g., analysis of cell irregularities) to blood flow, to multiple simultaneous depth analyses, etc.
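As a minimal sketch of the aggregation step described above, a remote processor might maintain a short windowed trend of the per-frame moving-particle counts before the trend is superimposed over the real-time metric; the window length and counts are illustrative.

```python
# Aggregate per-frame moving-particle counts (from the FPGA output) into a short-window trend.
from collections import deque

class MovingParticleTrend:
    def __init__(self, window_frames: int = 300):      # e.g. ~5 s at 60 frames/s (assumed)
        self.window = deque(maxlen=window_frames)

    def update(self, particles_this_frame: int) -> float:
        """Add the latest real-time count and return the windowed average (the trend value)."""
        self.window.append(particles_this_frame)
        return sum(self.window) / len(self.window)

trend = MovingParticleTrend()
for count in (120, 118, 131, 90, 42, 38):   # illustrative counts; a sharp drop may suggest occlusion
    print(count, round(trend.update(count), 1))
```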
FPGA updates may include transforms that implement a variety of imaging options for the user. These imaging options may include standard combinations of visible light, tissue refractive index, Doppler shift, motion artifact correction, improved dynamic range, improved local definition, super resolution, NIR fluorescence, multispectral imaging, confocal laser endomicroscopy, optical coherence tomography, Raman spectroscopy, photoacoustic imaging, or any combination thereof. The imaging options may include any of the options presented in any of the following: U.S. patent application Ser. No. 15/940,742, entitled "DUAL CMOS ARRAY IMAGING," filed on March 29, 2018; U.S. patent application Ser. No. 13/952,564, entitled "WIDE DYNAMIC RANGE USING MONOCHROMATIC SENSOR," filed on 7/26/2013; U.S. patent application Ser. No. 14/214,311, entitled "SUPER RESOLUTION AND COLOR MOTION ARTIFACT CORRECTION IN A PULSED COLOR IMAGING SYSTEM," filed on March 14, 2014; U.S. patent application No. 13/952,550, entitled "CAMERA SYSTEM WITH MINIMAL AREA MONOLITHIC CMOS IMAGE SENSOR," filed on 7/26/2013, each of which is incorporated herein by reference in its entirety. For example, Doppler wavelength shifting may be used to identify the number, size, velocity, and/or directionality of moving particles. For example, Doppler wavelength shifting can be used with multiple laser wavelengths to correlate tissue depth and moving particles. For example, tissue refractive index may be used to identify irregularities or variability in shallow and subsurface aspects of the tissue. In surgical practice, it may be beneficial to identify tumor margins, infected or ruptured surface tissue, adhesions, changes in tissue composition, and the like. NIR fluorescence may include techniques in which a drug injected into the patient is preferentially absorbed by the target tissue. When illuminated with light of the appropriate wavelength, the target tissue fluoresces and can be imaged by a viewer/camera with NIR capability. Hyperspectral imaging and/or multispectral imaging can include illuminating and evaluating tissue at a number of wavelengths throughout the electromagnetic spectrum to provide real-time images. It can be used to distinguish target tissue and can achieve imaging depths of, for example, 0 mm-10 mm. Confocal laser endomicroscopy (CLE) can use light to capture high-resolution, cell-level images without penetrating into the tissue. It can provide real-time histopathology of tissue. Optical Coherence Tomography (OCT) is a technique for capturing micron-resolution 3D images from within tissue using light and can employ NIR light. For example, OCT may enable imaging of tissue to a depth of 1 mm-2 mm. Raman spectroscopy may include techniques that measure photon shifts caused by monochromatic laser illumination of tissue. It can be used to recognize certain molecules. Photoacoustic imaging may include subjecting tissue to laser pulses such that a portion of the energy causes thermoelastic expansion and ultrasound emission. These generated ultrasonic waves can be detected and analyzed to form an image.
The laser illumination source 10010 may comprise any laser illumination source suitable for analyzing human tissue. For example, the laser illumination source 10010 may include a device such as a laser emitter. The laser illumination source 10010 may use one or more wavelengths of laser light to illuminate the tissue 10002. For example, the laser illumination source 10010 may use a red-blue-green-ultraviolet 1-ultraviolet 2-infrared combination. This combination, with, for example, a 360 Hz-480 Hz sampling and actuation rate, may allow each light source to contribute multiple frames at the end-user combined frame rate of 60 Hz. Laser wavelength combinations with independent sources can improve the resolution produced by a single array and can achieve various depth penetrations.
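As a minimal sketch of how captures from several sources might be scheduled within each combined output frame, using the 360 Hz-480 Hz and 60 Hz figures mentioned above; the scheduling itself is purely illustrative.

```python
# Interleave several laser sources at a high per-source capture rate so that multiple captures
# fall within every 60 Hz combined output frame (rates and source list are illustrative).
SOURCES = ["red", "blue", "green", "uv1", "uv2", "ir"]
CAPTURE_HZ = 360          # within the 360 Hz-480 Hz sampling/actuation range mentioned above
OUTPUT_HZ = 60

captures_per_output = CAPTURE_HZ // OUTPUT_HZ               # 6 captures per displayed frame
schedule = [SOURCES[i % len(SOURCES)] for i in range(captures_per_output)]
print(f"each {1000 / OUTPUT_HZ:.1f} ms output frame is built from captures lit by: {schedule}")
```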
For example, the tissue 10002 may be human tissue within a portion of a surgical site. The laser light may reflect from the tissue 10002, thereby producing reflected laser light. The reflected laser light may be received by the light sensor 10012. The light sensor 10012 can be configured to receive reflected laser light from at least a portion of the surgical field. The light sensor 10012 can be configured to receive laser light from the entire surgical field. The light sensor may be configured to receive reflected laser light from a selectable portion of the surgical field. For example, a user, such as a surgeon, may direct the light sensor and/or the laser illumination source to analyze a particular portion of the surgical field.
The light sensor 10012 may be any device suitable for sensing reflected laser light and outputting corresponding information. For example, the light sensor 10012 may detect one or more characteristics of the reflected laser light, such as amplitude, frequency, wavelength, Doppler shift, and/or other time or frequency domain characteristics. The light sensor 10012 may comprise a device that incorporates a light sensor such as those disclosed in connection with figs. 9A-9C.
The light sensor 10012 may include one or more sensor modules 10013. The sensor module 10013 may be configured to be able to measure a wide range of wavelengths. For example, the sensor module 10013 may be tuned and/or filtered to measure a particular wavelength. For example, the sensor module 10013 may include a discrete sensor, a set of sensors, a sensor array, a combination of sensor arrays, or the like. For example, the sensor module 10013 may include a semiconductor component such as a photodiode, a CMOS (complementary metal oxide semiconductor) image sensor, a CCD (charge coupled device) image sensor, or the like. The light sensor 10012 may comprise a dual CMOS array. Details regarding the use of FPGAs in imaging systems can be found in U.S. patent application serial No. 17/062,521 (attorney docket No. END9287USNP 2), filed on even date 10/2020, entitled "TIERED-ACCESS SURGICAL VISUALIZATION SYSTEM," the entire contents of which are incorporated herein by reference.
FIG. 15 illustrates exemplary aspects of the visualization system described herein. The example surgical visualization system 45500 may be used to analyze tissue 10002 within at least a portion of a surgical site. The surgical visualization system 45500 may include a laser illumination source 10010, a light sensor 10012, a display 10014, and/or one or more processing units 45502 and 45504. The surgical visualization system 45500 may include, for example, the components and functions described in connection with fig. 14, such as the laser illumination source 10010, the light sensor 10012 including the sensor module 10013, and/or the display 10014. The surgical visualization system 45500 may include components and functions described in connection with, for example, fig. 9A-9C.
As shown in fig. 15, one or more surgical video streams generated from light sensor 10012 can be transmitted to display 10014 via path 45506, path 45508, path 45510, and/or other paths. The surgical video stream may include various video feeds described herein with reference to fig. 9-14. The surgical video stream may include a primary visual video feed, such as an in-vivo camera feed. The surgical video stream may include one or more ancillary video feeds, such as a video stream associated with multispectral analysis, a video stream associated with a doppler flow meter, video streams of different spectral ranges, video streams captured using visible light and light outside the visible range, video streams captured at different time intervals, and/or video streams for overlaying onto another video stream.
The surgical video stream may be processed via one or more processing modules. For example, video stream 45514 may be processed via processing module 45502 and video stream 45516 may be processed via processing module 45504. The video streams may be multiple feeds of the same source stream that are separately transmitted and processed. As shown, video stream 45512 may be transmitted via a dedicated communication pipeline bypassing processing modules such as processing modules 45502 and 45504. For example, video stream 45512 may be an HD video feed with no processing or intermediate steps between the scope and the display. Video streams 45514 and 45516 may be sent through post-capture processing to extract or convert video or images to provide additional visualization capabilities. Video stream 45512 may be identical to video streams 45514 and/or 45516 prior to processing via processing modules 45502 and/or 45504. The processing performed by modules 45502 and 45504 may be the same or different.
As shown in fig. 15, the redundant surgical imaging communication pipelines and processing can provide a fail-safe for the primary visible light feed. The primary video stream may be transmitted via multiple paths. For example, a video stream may be split into different portions for transmission via multiple paths. A portion of the video stream (e.g., every other picture frame) may be transmitted via one path and the remaining portion of the video stream, or the remaining picture frames, may be transmitted via another path. The two portions may be combined or merged prior to display. Accordingly, the transmission speed of the video feed can be increased and the data storage and transmission limitations of the system architecture can be overcome. For example, the two video stream portions may be encoded such that they may be independently decoded and displayed without being recombined. Combining the two portions can produce a higher quality video feed; however, displaying a single video stream portion may still provide the surgeon with a sufficient view of the surgical site. If a problem associated with a path used to transmit a portion of the video stream is detected, the combining or merging of the video stream portions may be paused and the video stream portion that has been successfully transmitted to the display system may be displayed.
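As a minimal sketch of the frame-splitting redundancy described above, even-indexed frames could travel on one path and odd-indexed frames on another; either half is displayable alone, and the halves merge when both paths are healthy. The path-health check below is a stand-in for the real monitoring.

```python
# Split a frame sequence across two communication paths; merge for display, or fall back
# to the surviving half-rate feed if one path reports a problem.
def split_frames(frames: list) -> tuple:
    return frames[0::2], frames[1::2]         # path A gets even-indexed frames, path B the rest

def merge_frames(path_a: list, path_b: list) -> list:
    merged = []
    for a, b in zip(path_a, path_b):
        merged.extend([a, b])
    merged.extend(path_a[len(path_b):] or path_b[len(path_a):])   # any leftover tail
    return merged

frames = [f"frame{i}" for i in range(8)]
a, b = split_frames(frames)
path_b_ok = True                               # toggled by the path-health check (stand-in)
display = merge_frames(a, b) if path_b_ok else a
print(display)
```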
The main video stream may be processed via multiple instances of the same processing module over different paths. If an instance of the processing module experiences a delay or failure, the video stream may be processed via another instance of the processing module and may be displayed. If the processing causes undue delay or experiences failure, a video stream that bypasses the processing may be displayed. Utilizing multiple video stream paths may improve processing speed and reliability because individual paths may be associated with separate processing elements. By performing processing tasks in parallel, failure of one video feed may not result in loss of all feeds. Performing different processing tasks in parallel and combining the processed video streams for display may increase processing throughput and reduce latency.
For example, the computing system may control the capturing, processing, and/or transmitting of the visualization feed to prioritize latency over reliability, or to prioritize reliability over latency. The prioritization may be dynamically controlled or updated. For example, the computing system may prioritize processing within the FPGA so that latency may be reduced. The visualization feeds may be prioritized within the FPGA so that the reliability of the video stream may be improved. For example, the computing system may adjust communication paths such as communication paths 45506, 45508, and 45510 to prioritize latency over reliability, or reliability over latency. For example, allocating more communication paths to transmit the same video stream may improve reliability, and allocating communication paths for processing in parallel may reduce latency.
Thus, redundancy allows advanced computational imaging processing, reduces latency, and/or tolerates a communication failure while still providing an assured visual feed from the camera.
Fig. 16 illustrates an exemplary process for delivering surgical imaging feeds using redundant pipelines. At 45520, a plurality of surgical video streams may be obtained via a plurality of paths. The multiple video or imaging feeds may be copies of the same feed with different paths to the user display. For example, a first video stream may be obtained via a communication path and a second video stream may be obtained via another communication path. The multiple surgical video streams may be bifurcated, allowing a portion of the feed to pass down one path and a different portion to pass through another path. For example, one surgical video stream may be an HD video feed, and another surgical video stream may be a higher quality video stream or a video stream with added visualization capabilities. The plurality of surgical video streams may be obtained from the same in-vivo visible light feed, or may be obtained from different in-vivo visible light feeds. At 45522, a video stream may be displayed. At 45524, it may be determined whether the video stream being displayed encounters at least one problem. Upon detecting a problem with the video stream being displayed, another video stream may be displayed at 45526. For example, the primary video stream may be initially displayed. The secondary video stream may be displayed upon detection of a problem associated with the primary video stream. Redundant communication paths may be used in parallel to improve reliability, communication speed, and throughput, and/or to reduce the delay of surgical imaging/video feeds for display.
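As a minimal sketch of the selection step in this flow, the computing system might simply fall through to the first stream copy that is not reporting a problem; has_problem() is a stand-in for the actual checks (e.g., dropped frames, a stalled frame counter, or path loss).

```python
# Choose which redundant copy of the surgical video feed to display.
def choose_stream(streams: list, has_problem) -> str:
    for s in streams:
        if not has_problem(s):
            return s
    return streams[-1]          # nothing healthy: keep the last option rather than go blank

streams = ["primary_path_feed", "secondary_path_feed"]
# Simulate a problem on the primary path; the secondary copy is selected for display.
print(choose_stream(streams, has_problem=lambda s: s == "primary_path_feed"))
```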
FIG. 17 illustrates an exemplary process for processing a surgical imaging feed using redundant processing paths. At 45530, a source surgical imaging stream may be obtained. For example, the source surgical imaging stream may be obtained as described herein with reference to fig. 9-15. At 45532, the source stream may be processed using a plurality of processing modules. For example, at least some of the processing modules may be used to process the surgical imaging stream in parallel. At 45534, it may be determined whether a problem has been encountered at the processing module. If no problems are found, the processed video streams may be combined for display at 45536. Upon detecting a problem associated with the processing module, a video stream that is unaffected by the detected problem may be displayed at 45540. For example, a video stream that has not been processed by a processing module associated with the detected problem may be selected for display.
For example, a surgical imaging stream generated from the light sensor 10012 as shown in fig. 15 can be obtained. The source surgical imaging stream may be processed in parallel using processing module 45502 and processing module 45504. Processing module 45502 and processing module 45504 may be different instances of the same type of processing module. The processed video stream 45514 and the processed video stream 45516 may be combined for display.
Upon detecting a problem associated with processing module 45504, processed video stream 45516 may become unavailable and the merging of processed video streams 45514 and 45516 may cease. The processed video stream 45514 may be identified as a video stream that is not affected by the detected problem and may be displayed. The video stream 45512, which bypasses the processing modules shown in fig. 15, may be identified as a video stream that is not affected by the detected problem and may be displayed. For example, when both processing modules 45502 and 45504 encounter a problem, an unprocessed video stream may be selected for display. For example, the computing system may detect a problem associated with a processing module and provide an indication of the processing interruption. The computing system may provide options to the HCP to enable the HCP to manually control the visualization system. The computing system may allow the user to choose to bypass processing aspects of the visualization system. Thus, the computing system may ensure that video can still be displayed for the user even if the hardware processing fails. Processing module 45502 and processing module 45504 may be used in parallel to improve processing throughput, reduce latency, and/or ensure availability of surgical imaging streams for display.
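A minimal Python sketch of this selection logic is shown below, assuming each processed stream is paired with a health flag for its processing module; the function name and the returned labels are illustrative assumptions.

```python
def select_stream_for_display(processed, bypass_stream):
    """processed: list of (stream, module_ok) pairs; bypass_stream: the unprocessed feed."""
    healthy = [stream for stream, module_ok in processed if module_ok]
    if healthy and len(healthy) == len(processed):
        return ("merged", healthy)        # no faults detected: merge all processed streams
    if healthy:
        return ("single", healthy[0])     # fault detected: show a stream that is unaffected
    return ("bypass", bypass_stream)      # all processing modules faulted: show the raw feed
```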
For example, a source video stream (such as a video stream generated from the light sensor 10012 as shown in fig. 15) may be split into different portions to be processed separately. A first processing module, such as processing module 45502, may be used to process a portion of the video stream (e.g., every other picture frame), and a second processing module, such as processing module 45504, may be used to process the remaining portion of the video stream or the remaining picture frames. The two processed video portions (such as processed video streams 45514 and 45516) may be combined or merged prior to display. Thus, the processing speed of the video feed can be increased and the processing limitations of the system architecture can be overcome. Combining the two portions can produce a higher-quality video feed; however, displaying a single video stream portion may still provide the surgeon with a sufficient view of the surgical site. If a problem associated with a processing module is detected, the combining or merging of the video stream portions may be paused and the video stream portion that has been successfully processed may be displayed.
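The frame-interleaving approach described above might look like the following Python sketch; the process_a and process_b callables stand in for processing modules 45502 and 45504 and are assumptions made for illustration.

```python
def process_interleaved(frames, process_a, process_b):
    """frames: ordered list of picture frames; returns the merged, processed sequence."""
    even = [f for i, f in enumerate(frames) if i % 2 == 0]   # every other picture frame
    odd = [f for i, f in enumerate(frames) if i % 2 == 1]    # the remaining picture frames
    processed_even = [process_a(f) for f in even]            # first processing module
    processed_odd = [process_b(f) for f in odd]              # second processing module
    merged = []
    for i in range(len(frames)):                             # re-interleave in display order
        source = processed_even if i % 2 == 0 else processed_odd
        merged.append(source[i // 2])
    return merged
```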
For example, multiple source video streams may be generated from the light sensor 10012 as shown in fig. 15. The source video streams may contain images associated with different temporal aspects. For example, after processing, the video streams may be combined for display. Alternatively, the video streams may be displayed separately without merging. For example, a video stream may be used as a redundant backup to ensure that the user has a display of the source regardless of problems with a processing module or communication path.
Exemplary processing modules may include, but are not limited to, a multispectral analysis module as described herein with reference to fig. 14, a laser Doppler blood flow measurement analysis module as described herein with reference to fig. 12 and 13, a field programmable gate array, a content compounding module, and the like.
The exemplary processing module may be configured to enable the surgical video stream to be enhanced using another video stream. For example, the computing system may derive contextual information associated with the surgical procedure by analyzing the video streams described herein. The context information may be used to enhance another surgical video stream. For example, the computing system may extract one or more portions (e.g., portions including a region of interest) from the surgical video stream and superimpose the extracted portions onto another surgical video stream, as described herein.
The exemplary processing module may be configured to be capable of annotating the surgical video. As will be appreciated by those skilled in the art, video may be annotated with metadata. Visual annotations may include indicating the location of objects or people of interest in a video, describing objects in a video, describing the context of a scene, and/or providing other information. For example, the processing module may be configured to be capable of annotating microsurgical results. The processing module may be configured to be able to analyze the video feed and identify a point in time at which a tag should be added. The processing module may receive data from the surgical device and may be configured to annotate data captured via the device sensor, such as raw sensor data. The processing module may be configured to be capable of annotating scale (scaling). The processing module may be configured to be capable of converting the video stream for use with a 3D environment visualization tool. The annotation may be inserted at a timestamp or onto the video itself. The processing module may receive input from the HCP, such as a resident diary. When annotated into the surgical video, the raw sensor data may be coupled with the resident diary. The video stream used with the 3D environment visualization tool may be annotated with the resident diary. The processing module may be configured to be able to track an object of interest and identify a contour of the object when the object is occluded or partially occluded. The processing module may annotate the contour of the object of interest in the video. The processing module may receive surgical information, such as context information, from a situational awareness surgical hub as described herein with reference to fig. 8, and may insert the context information into the video. The processing module may be configured to be able to annotate the course of multiple events over time.
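As a minimal illustration of annotating video with metadata, the Python sketch below appends timestamped annotation entries; the field names and the example payloads are assumptions, not a format specified by the disclosure.

```python
import json

def annotate(annotations, timestamp_s, label, payload=None):
    """Append one annotation (e.g., an identified object, a sensor reading, or a context note)."""
    annotations.append({"timestamp_s": timestamp_s, "label": label, "payload": payload or {}})
    return annotations

annotations = []
annotate(annotations, 12.4, "object_of_interest", {"outline": [[10, 20], [40, 22], [38, 60]]})
annotate(annotations, 13.0, "raw_sensor_data", {"jaw_force_n": 4.2})
print(json.dumps(annotations, indent=2))   # metadata kept alongside the video
```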
The exemplary processing module may be configured to be able to derive contextual information associated with the surgical procedure by analyzing the video streams described herein. The derived context information may be indicated (e.g., inserted) in another video stream. For example, as described herein, the derived context information may be inserted into the overlay region described herein.
Objects may be confirmed and tracked via video processing of video captured by one or more imaging systems. The video may be analyzed to identify objects. For example, the processing module may identify and highlight known objects via annotations such that the identified objects may be confirmed with frame-to-frame outlining. The processing module may locate one or more objects of interest in the video. For example, the processing module may predict which objects within the image lie within a boundary. The video feed may be analyzed using various known image or video processing techniques (e.g., keypoint detection and analysis, bounding box annotation, polygonal mesh processing, image segmentation, face recognition, gesture recognition, point clouds, lines and splines, etc.). Those skilled in the art will appreciate that various object detection and tracking techniques used in autonomous vehicles may be used in surgical video processing.
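A minimal Python sketch of frame-to-frame association is shown below, using intersection-over-union matching of bounding boxes; the detector that produces the boxes is assumed to exist and is not shown.

```python
def iou(a, b):
    """a, b: boxes as (x1, y1, x2, y2); returns intersection-over-union."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / float(area_a + area_b - inter) if inter else 0.0

def match_tracks(previous_boxes, current_boxes, threshold=0.3):
    """Associate each current detection with the best-overlapping previous detection."""
    matches = {}
    for i, cur in enumerate(current_boxes):
        scores = [(iou(prev, cur), j) for j, prev in enumerate(previous_boxes)]
        best_score, best_j = max(scores, default=(0.0, None))
        matches[i] = best_j if best_score >= threshold else None   # None marks a new object
    return matches
```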
The video may be processed to identify movement of the HCP and/or the object. Based on the movement information, surgical activity and/or actions of the HCP may be determined. For example, HCPs or objects may be tracked and classified based on motion, function, and/or manipulation. A repeatable event pattern may be identified.
The video may be stored temporarily during processing as part of the video display. The computing system may be configured to be able to erase the video after it has been processed and transmitted, in order to address privacy concerns and the in-process retention of the video.
The present invention provides a computing system that can generate a composite video stream from a plurality of input feeds. The computing system may obtain a surgical video stream and overlay content associated with a surgical procedure. The computing system may determine a location of an overlay region for overlaying the overlay content by analyzing the content of the surgical video stream. For example, based on the content of the frame of the surgical video stream, the computing system may determine a location of an overlay region in the frame for overlaying the overlay content; based on the content of a subsequent frame of the surgical video stream, the computing system may determine another overlay region location in the subsequent frame for overlaying the overlay content. The composite video stream may be generated based on the overlay region locations determined for different frames of the surgical video stream.
For example, the surgical video stream may be a video feed from a laparoscopic surgical site, and the composite video stream may be generated by overlaying the overlay content onto the video of the surgical site at the determined overlay region location. As the laparoscope moves, the position, orientation, and/or size of the superimposed content can be adjusted over the surgical site in the video stream. The surgical video stream may include frames with surgical instruments and the composite video stream may be generated by overlaying the overlay content onto the surgical instruments at the determined overlay region locations. As the surgical instrument moves, the position, orientation, and/or size of the overlay content may be adjusted in the video stream.
For example, the primary imaging of the surgical site may be supplemented by overlaying or inserting a secondary video feed and/or data overlay. The overlay content is adjustable as the primary scope image moves, and is orientable relative to the surgical site and surgical instrument. The overlay content may cover the entire primary imaging or may cover a portion of the primary imaging feed. The overlay region may coincide with a predetermined location on the instrument based on one or more fiducial markers on the instrument.
FIG. 18 illustrates an exemplary process for generating a composite surgical video stream from multiple input feeds. At 45550, a surgical video stream may be obtained. For example, the computing system may obtain the surgical video stream via an imaging device, such as imaging device 20030 described herein with respect to fig. 2. For example, the computing system may obtain a surgical video stream as described herein with reference to fig. 9-15. The surgical video stream may be or may include a video feed from a laparoscopic surgical site. For example, the computing system may obtain the surgical video stream via one OR more cameras in the OR (such as camera 20021 as described herein with reference to fig. 2).
At 45552, overlay content may be obtained. The overlay content may include information associated with a device such as a surgical instrument. For example, overlay content for overlaying onto an energy device in an image or video may include an indication of energy blade temperature, an indication of power-on status (e.g., due to capacitive coupling), and/or other information associated with the energy device. For example, the overlay content may include use step instructions associated with the surgical instrument.
For example, overlay content for overlaying onto a surgical stapling and severing device in an image or video may include an indication of the loading state of the surgical instrument (e.g., whether a cartridge is loaded). The loading condition of the surgical instrument may be sensed, and overlay content may be generated based on the sensed loading condition. This may help prevent the blade from being advanced when a cartridge is not loaded or is improperly loaded.
The overlay content may include labels for the anatomical segment and the peripheral edge of at least a portion of the anatomical segment. The peripheral edge may be configured to guide a surgeon to a cutting position relative to the anatomical segment. The overlay content may include a supplemental image of an organ associated with the surgical procedure. The overlay content may include alternate imaging. The overlay content may include one or more of data obtained via pre-operative tumor MRI, CT imaging, related pre-operative data, ICG data, real-time doppler monitoring, surgical procedures, device status, and/or other overlays customizable by the user. The overlay content may include an indication associated with a previous surgical step of the surgical procedure, such as a previous suture and/or a previous weld.
The overlay content may include the secondary video feed or a portion of the secondary video feed. For example, the secondary video feed may be a video feed that has been processed via one or more processing modules as described herein with respect to fig. 15. The secondary video feed may be a video associated with the current surgical procedure. The secondary video feed may be an instructional video showing how the surgical procedure is performed. The secondary video feed may be a surgical simulation video. The secondary video feed may be video of a previous surgical step. For example, the secondary video feed may be video of a previous suture and/or a previous weld. This may enable the HCP to compare previous work with the current surgical step, for example, to more easily identify a transection site. The computing system may extract a portion of the secondary surgical video feed as overlay content. For example, a portion of the secondary surgical video that includes a region of interest may be identified and extracted.
For example, the overlay content may be obtained from another video feed, from a surgical hub described herein, from one or more surgical devices described herein, and/or from one or more sensors described herein. The overlay content may be updated in real time in response to changes in the position and/or orientation of the surgical instrument within the video frame. Various overlays and ways of obtaining overlay content are further described in U.S. patent application Ser. No. 17/062,509 (attorney docket number: END9287USNP, entitled "INTERACTIVE INFORMATION OVERLAY ON MULTIPLE SURGICAL DISPLAYS," filed on October 2, 2020), which is incorporated herein by reference in its entirety.
At 45554, a position, size, and/or orientation of the overlay region may be determined. The computing system may determine a location of an overlay region for overlaying the overlay content by analyzing the content of the surgical video stream. In an example, intra-image markers may be used to insert data, images, and alternative imaging streams. By tracking and reading the markers on the instrument, the computing system can overlay information at consistent locations relative to the instrument such that the overlay content can move as the instrument moves in the video feed.
One or more overlay regions may be determined based on one or more fiducial markers. For example, a surgical instrument (such as the surgical instrument described herein with reference to fig. 7) may include one or more fiducial markers. Fiducial markers may be placed at predetermined locations on the surgical instrument. The locations for placement of the fiducial markers may include an area on the surgical instrument suitable for overlaying content. The locations for placement of the fiducial markers may include an area suitable for determining the orientation of the surgical instrument and/or the distance of the instrument from the imaging system (e.g., camera). The fiducial markers may be visible or invisible to the human eye but identifiable via video processing. The fiducial markers may have a particular shape such that the computing system may identify the surgical instrument based on the fiducial markers.
The fiducial markers may be a predetermined pattern such that the computing system may identify the surgical instrument based on the fiducial markers. For example, the fiducial marker may include an electronically readable code, such as a QR code. The computing system may obtain a video feed that captures the fiducial markers and identify the surgical instrument based on an electronically readable code (e.g., captured in the video feed or the imaging feed). Based on the electronically readable code, the computing system may retrieve information associated with the surgical instrument, such as a model of the surgical instrument. For example, using electronically readable code, the computing system may obtain spatial attribute information associated with fiducial markers on the surgical instrument. The computing system may scale the overlay content based on the obtained spatial properties of the markers in the video/imaging feed.
The size of the overlay region may be determined based on the size of the region on the surgical instrument suitable for content overlay, the size of the fiducial markers in real life, and the size of the fiducial markers in the video frames of the surgical video stream (e.g., the ratio between real life size and imaging size). For example, the computing system may identify fiducial markers in video frames of the surgical video stream and determine respective positions, sizes, and/or orientations of the fiducial markers in the respective video frames. For a given video frame or set of video frames, the computing system may determine the size, position, and/or orientation of the overlay region based on the position, size, and/or orientation of the fiducial marker captured therein.
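For example, the scaling step might be computed as in the following Python sketch; the millimetre and pixel values are illustrative assumptions only.

```python
def overlay_region_width_px(marker_real_mm, marker_px, overlay_real_mm):
    """Scale the overlay region using the marker's apparent pixels-per-millimetre."""
    pixels_per_mm = marker_px / marker_real_mm
    return overlay_real_mm * pixels_per_mm

# A 10 mm wide marker appearing 80 px wide implies 8 px/mm, so a 35 mm wide
# instrument area suitable for overlay maps to a 280 px wide overlay region.
print(overlay_region_width_px(marker_real_mm=10.0, marker_px=80.0, overlay_real_mm=35.0))
```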
Details regarding fiducial marks and spatial awareness of surgical products and instruments can be found in a concurrently filed application entitled "HUB IDENTIFICATION AND TRACKING OF OBJECTS AND PERSONNEL WITHIN THE OR TO OVERLAY DATA THAT IS CUSTOM TO THE USER'S NEED" (attorney docket number END9340 USNP), the contents of which are incorporated herein by reference.
As described herein, the overlay content may include an indication of a previous stitch or a previous weld. The overlay content may include alternative imaging, such as CT images indicative of a tumor. The overlay region may be determined such that the overlay content may be inserted near the current jaw placement but not interfere with the current jaw placement. This may enable the user to compare a previous work to a current work and/or to view alternative images within the jaws to more easily identify a tumor and/or a transection site.
At 45556, a composite video stream may be generated by overlaying the overlay content onto the surgical video stream. For example, the composite video stream may be generated by inserting the overlay content into a surgical video stream (such as a main surgical video feed). For example, the overlay content may be scaled, oriented, and inserted onto the shaft of the surgical instrument based on the fiducial markers captured in the primary video. For example, the position, size, shape, and/or orientation of the overlay region may be dynamically adjusted as the surgical device of interest moves in the main video. For example, when the overlay content includes a secondary video feed or a portion of a secondary video feed, a "picture-in-picture" technique may be used to generate the composite video. The computing system may determine a first overlay region size for overlaying the overlay content onto a first surgical video frame based on the content of the first frame and determine a second overlay region size for overlaying the overlay content onto a second surgical video frame based on the content of the second frame. The computing system may scale the secondary surgical video stream, or a portion of the secondary video stream containing the region of interest, based on the determined first overlay region size and the determined second overlay region size.
FIG. 18B illustrates an exemplary process for generating a composite surgical video stream using fiducial markers. At 45553, a surgical video stream may be obtained, e.g., as described herein. At 45555, a fiducial marker may be identified in a video frame of the surgical video stream. At 45557, an overlay region position, orientation, and/or size may be identified for inserting overlay content into the video frame. At 45559, the overlay content may be inserted into the overlay region. This process may be repeated periodically or aperiodically, for example for each frame, every other frame, or every n frames. For example, the overlay region position, orientation, and/or size may be updated upon determining that the content of the main stream changes (e.g., the position or orientation of the surgical instrument changes significantly). For example, fiducial markers may be placed on the surgical instrument. By identifying the fiducial markers in the video frames and using the identified fiducial markers to determine the overlay region, the overlay content may move in the composite video stream as the surgical instrument moves in the surgical video stream.
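A minimal Python sketch of this loop is shown below; the find_marker, region_from_marker, and draw_overlay callables are assumed helpers standing in for the marker detection, region derivation, and compositing steps.

```python
def composite_stream(frames, overlay, find_marker, region_from_marker, draw_overlay, n=2):
    """Yield composite frames, refreshing the overlay region every n-th frame."""
    region = None
    for index, frame in enumerate(frames):
        if index % n == 0 or region is None:          # periodically re-evaluate the region
            marker = find_marker(frame)               # position/size/orientation of the marker
            if marker is not None:
                region = region_from_marker(marker)   # overlay position, size, and orientation
        yield draw_overlay(frame, overlay, region) if region is not None else frame
```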
Fig. 19A-19C illustrate exemplary video frames of a composite surgical video stream having superimposed content that moves as surgical instruments in the surgical video stream move. As described herein, a composite surgical video stream may be generated by inserting overlay content into a surgical video feed at one or more determined overlay regions. The composite surgical video stream may include video frames having superimposed content that may be adjusted based on content from corresponding frames of the surgical video feed. Fig. 19A illustrates an exemplary video frame 45560 of an exemplary composite surgical video stream. As shown, the video frame 45560 may include an image of a surgical instrument 45562, such as the surgical instrument 20282 described herein with respect to fig. 7. The overlay 45564 may indicate a loading condition of the surgical instrument 45562, such as "unloaded," as shown in fig. 19A. The overlay region for inserting the overlay content 45564 may be determined based on the fiducial markers 45566 on the surgical instrument 45562.
For example, the computing system may identify the location, size, and/or orientation of the fiducial markers 45566 in the main video frame of the main video feed. Based on the position, size, and/or orientation of fiducial markers 45566 in the main video frame, the computing system may determine the position, size, and/or orientation of the overlay region, as described herein. As shown in fig. 19A, the overlay area may be identified such that the overlay is easily readable and proportional to the surgical instrument.
Fig. 19B illustrates an exemplary video frame 45570 of an exemplary composite surgical video stream. As shown in fig. 19B, the overlay content 45574 may reflect an updated loading status "load" of the surgical instrument 45562. The surgical instrument 45562 and fiducial marker 45566 shown in video frame 45570 have moved compared to the surgical instrument 45562 and fiducial marker 45566 shown in video frame 45560. Based on the location of the fiducial markers 45566, the computing device may determine the location of the overlay content region of the overlay content 45574. Surgical instrument 45562 and fiducial marker 45566 shown in video frame 45570 are smaller than surgical instrument 45562 and fiducial marker 45566 shown in video frame 45560. Based on the size of the fiducial markers 45566, the computing device may determine the size of the overlay content region of the overlay content 45574. As shown in fig. 19B, the superimposed content 45574 is scaled down from the superimposed content 45564 shown in fig. 19A. Surgical instrument 45562 and fiducial marker 45566 shown in video frame 45570 are in an "inverted" orientation as compared to surgical instrument 45562 shown in video frame 45560. Based on the orientation of the fiducial markers 45566, the computing device may determine an orientation of the overlay content region of the overlay content 45574. As shown in fig. 19B, the overlay content 45574 is also inverted. Fig. 19C illustrates an exemplary video frame 45580 of an exemplary composite surgical video stream. As shown, overlay content 45584 may be moved and scaled based on fiducial markers 45566, but may remain in an upright or substantially upright orientation to improve readability.
For example, de-identification may be performed on the video feeds described herein. The de-identification process may be performed at an edge computing system, as described herein with reference to fig. 1B. For example, the computing system may detect and blur faces captured in the video feed. Facial, silhouette, gait, and/or other characteristics may be obscured. Other person-identifying content, such as an ID badge, may be detected, obscured, and/or removed from the video. In some cases, laparoscopic video, which typically includes in-vivo imaging, may accidentally include images captured outside the body. Such images may be detected by the computing system and may be removed from the video.
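A minimal de-identification sketch is shown below, assuming OpenCV and its bundled Haar face detector are available; a production system would use a more robust detector and would also handle badges, silhouettes, and extracorporeal frames.

```python
import cv2

face_detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

def deidentify_frame(frame_bgr):
    """Blur every detected face region in an OR video frame."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    for (x, y, w, h) in face_detector.detectMultiScale(gray, 1.1, 5):
        frame_bgr[y:y + h, x:x + w] = cv2.GaussianBlur(
            frame_bgr[y:y + h, x:x + w], (51, 51), 0)
    return frame_bgr
```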
For example, video obtained via an in-room camera may be correlated with video obtained via an in-vivo camera to identify surgical progress information. The computing system may determine the surgical task being performed based on video frames obtained via the endoscope in combination with video frames obtained via the OR camera. The determination may be further based on one or more monitoring sensors described herein with reference to fig. 1-8. The computing system may identify HCPs (e.g., roles of HCPs, current and/or pending tasks, and/or professions) based on video frames obtained via the endoscope and video frames obtained via the OR camera. Based on the determined surgical progress information and the identification of the HCP, the computing system may generate customized overlay content for the HCP. For example, the customized overlay content may be displayed via Augmented Reality (AR) or mixed reality overlays on the user interface. The AR device may provide AR content to the HCP. For example, a visual AR device, such as safety glasses with an AR display, AR goggles, or a Head Mounted Display (HMD), may include a graphics processor for rendering 2D or 3D video and/or imaging for display. Based on the determined surgical progress information and the identification of the HCP, the computing system may generate a customized control signal for the apparatus or surgical device. For example, based on determining that the HCP carrying the energy device is a nurse, the computing system may generate control signals to prevent the energy device from entering an energized state. For example, based on determining that the HCP carrying the energy device is a surgeon, the computing system may generate control signals to allow the energy device to enter an energized state.
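The role-based control decision described above can be reduced to a very small sketch such as the following; the role names and signal labels are illustrative assumptions.

```python
def energy_device_control(carrier_role: str) -> str:
    """Return a control signal for the energy device based on the carrier's identified role."""
    if carrier_role == "surgeon":
        return "ALLOW_ENERGIZE"    # a surgeon may place the device in an energized state
    return "BLOCK_ENERGIZE"        # any other role (e.g., a nurse): keep the device locked out

print(energy_device_control("nurse"))    # BLOCK_ENERGIZE
print(energy_device_control("surgeon"))  # ALLOW_ENERGIZE
```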
For example, 3D modeling may be performed by combining imaging videos, pre-operative imaging, and/or intra-operative imaging. For example, the computing system may analyze video frames of the primary surgical imaging or video feed to identify missing information for the primary surgical imaging or video feed. The computing system may identify areas of poor quality (e.g., artifacts, blurring, obstructing the field of view, etc.) and may provide an indication to request additional imaging to supplement the primary surgical video feed.
The computing system may identify regions associated with incomplete or missing information in the surgical imaging or video feed based on the sharpness of the regions. The computing system may generate a notification indicating the identified region associated with the incomplete or missing information. For example, the notification may include an indication of the surgical imaging or video frame in which the identified region is highlighted with a rendered shape and color. The computing system may interpolate the possible locations and shapes of the organ in the imaging or video frames.
The computing system may identify regions associated with potentially misleading data in the surgical imaging or video feed based on the sharpness of the regions. For example, areas with artifacts or defects exceeding a threshold may be identified. The computing system may generate a notification indicating that the surgical imaging or video feed contains potentially misleading information. For example, the notification may include an indication of a surgical imaging or video frame in which the identified region is marked as containing potentially misleading information.
The computing system may generate an indication with instructional information, such as usage steps for regions of the patient that have uncertain or insufficient imaging. The computing system may indicate information associated with the access port to be used for the additional scan. The supplemental scan may be a different type of imaging/scanning than the primary image source. For example, a CT scan of the abdomen may have insufficient or uncertain data for the interior of the liver, pancreas, or other solid organ. The computing system may provide an indication or notification to scan the region with an ultrasound imaging system. The computing system may obtain imaging data from the supplemental scan and may combine the imaging data obtained from the primary imaging source with the imaging data obtained from the supplemental scan.
The computing system may fuse imaging data or video obtained from different sources based on the common anatomical landmarks. For example, the computing system may use the primary imaging as a source map for the supplemental image. By analyzing the primary imaging, the computing system may identify the location of content boundaries or bounds, where imaging data obtained from different sources may be consolidated. For example, the computing system may use a third imaging source, such as imaging obtained via laparoscopy, to identify linkable aspects for fusing the primary imaging data and the secondary imaging data. For example, the computing system may fuse the primary imaging data and the secondary imaging data based on preset imaging fiducial markers.
After performing the imaging data fusion, the computing system may determine a level of integrity of the organ imaging for one or more portions of the fused imaging. The level of integrity of a portion of the fused image may be indicative of the estimated integrity and/or accuracy of the portion of the fused image. In an example, imaging data (e.g., still and/or video imaging data) may be generated via different energy imaging techniques. The integrity level of the fused organ imaging may be determined based on different energy imaging techniques.
The computing system may generate composite imaging data based on portions of imaging data from the plurality of imaging feeds. Multiple imaging feeds may be received simultaneously. The composite imaging data may be generated in real-time. For example, one or more portions of the composite imaging data may include visible light imaging, while other portions of the composite imaging may include alternative source imaging as described herein. The computing system may generate composite imaging data by replacing a portion of the imaging data from one imaging feed with a corresponding portion of the imaging data from another imaging feed. The computing system may superimpose portions of the imaging data from one imaging feed onto corresponding portions of the imaging data from another imaging feed. The computing system may highlight (e.g., transparently highlight) a portion of the imaging feed from one imaging data based on a corresponding portion of the imaging data from another imaging feed.
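A minimal Python sketch of replacing one region of a visible-light frame with the corresponding region from an alternative-source frame is shown below; the frames are assumed to be pre-registered, same-sized arrays, and the region coordinates are illustrative.

```python
import numpy as np

def replace_region(primary, secondary, region):
    """region: (row_start, row_end, col_start, col_end) in pixel coordinates."""
    r0, r1, c0, c1 = region
    composite = primary.copy()
    composite[r0:r1, c0:c1] = secondary[r0:r1, c0:c1]   # swap in the alternative source
    return composite

primary = np.zeros((480, 640, 3), dtype=np.uint8)        # stand-in visible-light frame
secondary = np.full((480, 640, 3), 255, dtype=np.uint8)  # stand-in alternative imaging
composite = replace_region(primary, secondary, (100, 200, 150, 300))
```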
For example, one or more portions of the surgical video stream may be defined for extended imaging overlays. The portion for the extended imaging overlay may be predefined or determined based on the surgeon's input during the surgical procedure. The portion for the extended imaging overlay may include, but is not limited to, a transection site or a dissection site. The extended imaging may include visualization of blood supply around critical structures, the transection site, or the dissection site. For example, CT imaging may be superimposed on the surgical video stream (e.g., in a semi-transparent manner). The extended imaging overlay may enable the surgeon to choose a transection or dissection path without damaging critical structures of the patient's organ. The extended imaging overlay may enable the surgeon to distinguish between the tree of blood vessels supplying a tumor and blood vessels not supplying the tumor and to identify the blood vessels to be severed. Details regarding the overlaying of content on video feeds of surgical sites are further described in U.S. patent application Ser. No. 17/062,509 (attorney docket number: END9287USNP, titled "INTERACTIVE INFORMATION OVERLAY ON MULTIPLE SURGICAL DISPLAYS," filed on October 2, 2020), which is incorporated herein by reference in its entirety.
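For the semi-transparent overlay mentioned above, a simple alpha blend is one possible realization, as in the following Python sketch; registration of the CT-derived image to the video frame is assumed to have been performed already, and the alpha value is illustrative.

```python
import numpy as np

def blend_overlay(video_frame, ct_overlay, alpha=0.35):
    """Return video_frame with ct_overlay blended on top at the given opacity."""
    blended = (1.0 - alpha) * video_frame.astype(np.float32) \
              + alpha * ct_overlay.astype(np.float32)
    return np.clip(blended, 0, 255).astype(np.uint8)
```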
The following is a non-exhaustive list of embodiments that form part of the present disclosure:
Embodiment 1. A computing system, comprising:
a processor configured to enable:
obtaining a plurality of surgical video streams via a plurality of communication paths during a surgical procedure;
Transmitting a first surgical video stream of the plurality of surgical video streams for display, wherein the first surgical video stream is obtained via a first communication path;
detecting a problem associated with the first surgical video stream; and
Transmitting a second surgical video stream of the plurality of surgical video streams for display, wherein the second surgical video stream is obtained via a second communication path, the second communication path being different from the first communication path.
Embodiment 1 can provide the following technical effects: fault protection is provided if delay, freezing, distortion, or similar conditions occur in the first surgical video stream.
Embodiment 2. The computing system of embodiment 1 wherein the first surgical video stream and the second surgical video stream are obtained from the same intraoperative imaging feed.
Embodiment 2 can provide the following technical effects: a visual feed of the surgical site is maintained if delay, freezing, distortion, or similar conditions occur in the first surgical video stream.
Embodiment 3. The computing system of embodiment 1 or embodiment 2, wherein the processor is configured to: the first surgical video stream is processed prior to display using a first processing module, wherein the problem associated with the first surgical video stream is detected based on detecting the problem with the first processing module.
Embodiment 3 can provide the following technical effects: dedicated hardware processing is provided for the first surgical video stream without delay, freezing, or distortion of the first surgical video stream.
Embodiment 4. The computing system of any of embodiments 1 to 3, wherein the processing module comprises at least one of:
A multispectral analysis module;
A laser Doppler blood flow measurement analysis module;
a plurality of field programmable arrays; or (b)
And a content compounding module.
Embodiment 4 can provide the following technical effects: dedicated hardware analysis is provided for the first surgical video stream without delay, freezing, or distortion of the first surgical video stream.
Embodiment 5. The computing system of any of embodiments 1 to 4, wherein the processor is configured to:
Processing the first surgical video stream using a first processing module;
Detecting a problem with the first processing module, wherein the problem associated with the first surgical video stream is detected based on detecting the problem with the first processing module; and
The second surgical video stream is processed using a second processing module.
Embodiment 5 can provide the following technical effects: dedicated hardware parallel processing is provided for the first surgical video stream and the second surgical video stream without delay, freezing, or distortion of the first surgical video stream.
Embodiment 6. The computing system of any of embodiments 1 to 4, wherein the first surgical video stream and the second surgical video stream are associated with imaging data captured via a same light sensing element, and the first surgical video stream and the second surgical video stream are processed via different processing modules.
Embodiment 6 can provide the following technical effects: dedicated hardware parallel processing is provided for a first surgical video stream of a surgical site and a second surgical video stream of the surgical site without delay, freezing, or distortion of the first surgical video stream.
Embodiment 7. The computing system of any of embodiments 1 to 6, wherein the plurality of surgical video streams includes a third surgical video stream, and the processor is further configured to:
the first surgical video stream is enhanced using the third surgical video stream before the first surgical video stream is sent for display.
Embodiment 7 can provide the following technical effects: the healthcare professional is provided with continuous, real-time guidance throughout the surgical procedure.
Embodiment 8 the computing system of any of embodiments 1-4, wherein the processor is further configured to:
Processing the first surgical video stream using a first processing module;
Processing the second surgical video stream using a second processing module;
Combining the processed first surgical video stream and the processed second surgical video stream for display, wherein upon detecting the problem associated with the first surgical video stream, the processor is configured to pause the combining.
Embodiment 8 can provide the following technical effects: dedicated hardware parallel processing is provided for the first surgical video stream and the second surgical video stream without delay, freezing, or distortion of the first surgical video stream, thereby reducing display delay.
Embodiment 9. The computing system of any of embodiments 1 to 6 and 8, wherein the plurality of surgical video streams includes a third surgical video stream, and the processor is further configured to:
extracting surgical annotation data from the third surgical video stream; and
The extracted surgical annotation data is inserted into the first surgical video stream before the first surgical video stream is sent for display.
Embodiment 9 can provide the following technical effects: the healthcare professional is provided with continuous, real-time guidance throughout the surgical procedure.
Embodiment 10. The computing system of any of embodiments 1 to 9, wherein the first surgical video stream and the second surgical video stream are obtained from the same in-vivo visible light feed.
Embodiment 11. A method comprising:
obtaining a plurality of surgical video streams via a plurality of communication paths during a surgical procedure;
Transmitting a first surgical video stream of the plurality of surgical video streams for display, wherein the first surgical video stream is obtained via a first communication path;
detecting a problem associated with the first surgical video stream; and
Transmitting a second surgical video stream of the plurality of surgical video streams for display, wherein the second surgical video stream is obtained via a second communication path, the second communication path being different from the first communication path.
Embodiment 12. The method of embodiment 11 wherein the first surgical video stream and the second surgical video stream are obtained from the same intraoperative imaging feed.
Embodiment 13. The method of embodiment 11 or embodiment 12, further comprising:
The first surgical video stream is processed prior to display using a first processing module, wherein the problem associated with the first surgical video stream is detected based on detecting the problem with the first processing module.
Embodiment 14. The method of any of embodiments 11 to 13, wherein the processing module comprises at least one of:
A multispectral analysis module;
A laser Doppler blood flow measurement analysis module;
a plurality of field programmable arrays; or (b)
And a content compounding module.
Embodiment 15. The method of any of embodiments 11 to 14, further comprising:
Processing the first surgical video stream using a first processing module;
Detecting a problem with the first processing module, wherein the problem associated with the first surgical video stream is detected based on detecting the problem with the first processing module; and
The second surgical video stream is processed using a second processing module.
Embodiment 16. The method of embodiment 11 wherein the first surgical video stream and the second surgical video stream are associated with imaging data captured via a same light sensing element and the first surgical video stream and the second surgical video stream are processed via different processing modules.
Embodiment 17. The method of any of embodiments 11 to 16, wherein the plurality of surgical video streams includes a third surgical video stream, and the method further comprises:
the first surgical video stream is enhanced using the third surgical video stream before the first surgical video stream is sent for display.
Embodiment 18. The method of any of embodiments 11 to 14, further comprising:
Processing the first surgical video stream using a first processing module;
processing the second surgical video stream using a second processing module; and
Combining the processed first surgical video stream and the processed second surgical video stream for display, wherein upon detecting the problem associated with the first surgical video stream, the processor is configured to pause the combining.
Embodiment 19 the method of any one of embodiments 11-16 and 18, wherein the plurality of surgical video streams includes a third surgical video stream, and the method further comprises:
extracting surgical annotation data from the third surgical video stream; and
The extracted surgical annotation data is inserted into the first surgical video stream before the first surgical video stream is sent for display.
Embodiment 20. The method of any of embodiments 11-19, wherein the first surgical video stream and the second surgical video stream are obtained from the same in-vivo visible light feed.
Embodiments 11 through 20 provide methods corresponding to the operation of the computing systems described in embodiments 1 through 10, respectively. The technical effects and advantages described above in connection with embodiments 1 to 10 also apply to the methods according to embodiments 11 to 20.
Any and/or all of the above-described embodiments 11-20 may be embodied as a computer-implemented method, including, but not limited to, a method implemented by a processor, an integrated circuit, a microcontroller, a Field Programmable Gate Array (FPGA), or the like. The implementing computing system may be a hardware device or may include a plurality of hardware devices configured to be operable as a distributed computing system. The implementing computing system may include a memory containing instructions for performing any and/or all of the methods described above. For example, the memory may contain instructions that, when executed by the computing system and/or its processor, cause the system or the processor to perform one or more of embodiments 11-20.
Any and/or all of the embodiments 11-20 described above may be embodied in the form of a computer-readable storage medium, such as a non-transitory computer-readable storage medium, containing instructions that, when executed by a computer, cause the computer to perform one or more of the embodiments 11-20. Any and/or all of the above-described embodiments 11-20 may be embodied as a computer program product.
Embodiments 11 through 20 may exclude methods of treating the human or animal body by surgery or therapy, or diagnostic methods performed on the human or animal body. Each of embodiments 11 to 20 may be a method that is not a surgical, therapeutic, or diagnostic method. For example, each of embodiments 11-20 has an embodiment that does not include performing the surgical procedure or any surgical or therapeutic steps thereof.
The following is a non-exhaustive list of the various aspects that form part of this disclosure:
aspect 1. A computing system, comprising:
a processor configured to enable:
obtaining a plurality of surgical video streams via a plurality of communication paths during a surgical procedure;
Transmitting a first surgical video stream of the plurality of surgical video streams for display, wherein the first surgical video stream is obtained via a first communication path;
detecting a problem associated with the first surgical video stream; and
Transmitting a second surgical video stream of the plurality of surgical video streams for display, wherein the second surgical video stream is obtained via a second communication path, the second communication path being different from the first communication path.
Aspect 2 the computing system of aspect 1, wherein the first surgical video stream and the second surgical video stream are obtained from a same intraoperative imaging feed.
Aspect 3 the computing system of aspect 1, wherein the processor is configured to:
The first surgical video stream is processed prior to display using a first processing module, wherein the problem associated with the first surgical video stream is detected based on detecting the problem with the first processing module.
Aspect 4 the computing system of aspect 3, wherein the processing module comprises at least one of:
A multispectral analysis module;
A laser Doppler blood flow measurement analysis module;
a plurality of field programmable arrays; or (b)
And a content compounding module.
Aspect 5 the computing system of aspect 1, wherein the processor is configured to:
Processing the first surgical video stream using a first processing module;
Detecting a problem with the first processing module, wherein the problem associated with the first surgical video stream is detected based on detecting the problem with the first processing module; and
The second surgical video stream is processed using a second processing module.
Aspect 6. The computing system of aspect 1, wherein the first surgical video stream and the second surgical video stream are associated with imaging data captured via a same light sensing element, and the first surgical video stream and the second surgical video stream are processed via different processing modules.
Aspect 7 the computing system of aspect 1, wherein the plurality of surgical video streams includes a third surgical video stream, and the processor is further configured to:
the first surgical video stream is enhanced using the third surgical video stream before the first surgical video stream is sent for display.
Aspect 8 the computing system of aspect 1, wherein the processor is further configured to:
Processing the first surgical video stream using a first processing module;
processing the second surgical video stream using a second processing module; and
Combining the processed first surgical video stream and the processed second surgical video stream for display, wherein upon detecting the problem associated with the first surgical video stream, the processor is configured to pause the combining.
Aspect 9 the computing system of aspect 1, wherein the plurality of surgical video streams includes a third surgical video stream, and the processor is further configured to:
extracting surgical annotation data from the third surgical video stream; and
The extracted surgical annotation data is inserted into the first surgical video stream before the first surgical video stream is sent for display.
Aspect 10. The computing system of aspect 1, wherein the first surgical video stream and the second surgical video stream are obtained from the same in-vivo visible light feed.
Aspect 11. A method comprising:
obtaining a plurality of surgical video streams via a plurality of communication paths during a surgical procedure;
Transmitting a first surgical video stream of the plurality of surgical video streams for display, wherein the first surgical video stream is obtained via a first communication path;
detecting a problem associated with the first surgical video stream; and
Transmitting a second surgical video stream of the plurality of surgical video streams for display, wherein the second surgical video stream is obtained via a second communication path, the second communication path being different from the first communication path.
Aspect 12 the method of aspect 11, wherein the first surgical video stream and the second surgical video stream are obtained from the same intraoperative imaging feed.
Aspect 13 the method of aspect 11, further comprising:
The first surgical video stream is processed prior to display using a first processing module, wherein the problem associated with the first surgical video stream is detected based on detecting the problem with the first processing module.
Aspect 14 the method of aspect 13, wherein the processing module comprises at least one of:
A multispectral analysis module;
A laser Doppler blood flow measurement analysis module;
a plurality of field programmable arrays; or (b)
And a content compounding module.
Aspect 15. The method of aspect 11, further comprising:
Processing the first surgical video stream using a first processing module;
Detecting a problem with the first processing module, wherein the problem associated with the first surgical video stream is detected based on detecting the problem with the first processing module; and
The second surgical video stream is processed using a second processing module.
Aspect 16. The method of aspect 11, wherein the first surgical video stream and the second surgical video stream are associated with imaging data captured via a same light sensing element, and the first surgical video stream and the second surgical video stream are processed via different processing modules.
Aspect 17 the method of aspect 11, wherein the plurality of surgical video streams includes a third surgical video stream, and the method further comprises:
the first surgical video stream is enhanced using the third surgical video stream before the first surgical video stream is sent for display.
Aspect 18 the method of aspect 11, further comprising:
Processing the first surgical video stream using a first processing module;
processing the second surgical video stream using a second processing module; and
Combining the processed first surgical video stream and the processed second surgical video stream for display, wherein upon detecting the problem associated with the first surgical video stream, the processor is configured to pause the combining.
Aspect 19 the method of aspect 11, wherein the plurality of surgical video streams includes a third surgical video stream, and the method further comprises:
extracting surgical annotation data from the third surgical video stream; and
The extracted surgical annotation data is inserted into the first surgical video stream before the first surgical video stream is sent for display.
Aspect 20. The method of aspect 11, wherein the first surgical video stream and the second surgical video stream are obtained from the same in-vivo visible light feed.

Claims (20)

1. A computing system, comprising:
a processor configured to enable:
obtaining a plurality of surgical video streams via a plurality of communication paths during a surgical procedure;
Transmitting a first surgical video stream of the plurality of surgical video streams for display, wherein the first surgical video stream is obtained via a first communication path;
detecting a problem associated with the first surgical video stream; and
Transmitting a second surgical video stream of the plurality of surgical video streams for display, wherein the second surgical video stream is obtained via a second communication path, the second communication path being different from the first communication path.
2. The computing system of claim 1, wherein the first surgical video stream and the second surgical video stream are obtained from a same intraoperative imaging feed.
3. The computing system of claim 1 or claim 2, wherein the processor is configured to be capable of:
The first surgical video stream is processed prior to display using a first processing module, wherein the problem associated with the first surgical video stream is detected based on detecting the problem with the first processing module.
4. The computing system of any of claims 1 to 3, wherein the processing module comprises at least one of:
A multispectral analysis module;
A laser Doppler blood flow measurement analysis module;
a plurality of field programmable arrays; or (b)
And a content compounding module.
5. The computing system of any of claims 1 to 4, wherein the processor is configured to be capable of:
Processing the first surgical video stream using a first processing module;
Detecting a problem with the first processing module, wherein the problem associated with the first surgical video stream is detected based on detecting the problem with the first processing module; and
The second surgical video stream is processed using a second processing module.
6. The computing system of any of claims 1 to 4, wherein the first and second surgical video streams are associated with imaging data captured via a same light sensing element, and the first and second surgical video streams are processed via different processing modules.
7. The computing system of any of claims 1 to 6, wherein the plurality of surgical video streams includes a third surgical video stream, and the processor is further configured to:
the first surgical video stream is enhanced using the third surgical video stream before the first surgical video stream is sent for display.
8. The computing system of any of claims 1 to 4, wherein the processor is further configured to:
Processing the first surgical video stream using a first processing module;
Processing the second surgical video stream using a second processing module;
Combining the processed first surgical video stream and the processed second surgical video stream for display, wherein upon detecting the problem associated with the first surgical video stream, the processor is configured to pause the combining.
9. The computing system of any of claims 1 to 6 and 8, wherein the plurality of surgical video streams includes a third surgical video stream, and the processor is further configured to:
extracting surgical annotation data from the third surgical video stream; and
The extracted surgical annotation data is inserted into the first surgical video stream before the first surgical video stream is sent for display.
10. The computing system of any of claims 1 to 9, wherein the first surgical video stream and the second surgical video stream are obtained from a same in-vivo visible light feed.
11. A method, comprising:
obtaining a plurality of surgical video streams via a plurality of communication paths during a surgical procedure;
Transmitting a first surgical video stream of the plurality of surgical video streams for display, wherein the first surgical video stream is obtained via a first communication path;
detecting a problem associated with the first surgical video stream; and
Transmitting a second surgical video stream of the plurality of surgical video streams for display, wherein the second surgical video stream is obtained via a second communication path, the second communication path being different from the first communication path.
12. The method of claim 11, wherein the first surgical video stream and the second surgical video stream are obtained from a same intraoperative imaging feed.
13. The method of claim 11 or claim 12, further comprising:
The first surgical video stream is processed prior to display using a first processing module, wherein the problem associated with the first surgical video stream is detected based on detecting the problem with the first processing module.
14. The method of any one of claims 11 to 13, wherein the processing module comprises at least one of:
A multispectral analysis module;
A laser Doppler blood flow measurement analysis module;
a plurality of field programmable arrays; or (b)
And a content compounding module.
15. The method of any of claims 11 to 14, further comprising:
processing the first surgical video stream using a first processing module;
detecting a problem with the first processing module, wherein the problem associated with the first surgical video stream is detected based on detecting the problem with the first processing module; and
processing the second surgical video stream using a second processing module.
16. The method of claim 11, wherein the first and second surgical video streams are associated with imaging data captured via the same light sensing element, and the first and second surgical video streams are processed via different processing modules.
17. The method of any of claims 11-16, wherein the plurality of surgical video streams includes a third surgical video stream, and the method further comprises:
enhancing the first surgical video stream using the third surgical video stream before the first surgical video stream is sent for display.
18. The method of any of claims 11 to 14, further comprising:
processing the first surgical video stream using a first processing module;
processing the second surgical video stream using a second processing module; and
combining the processed first surgical video stream and the processed second surgical video stream for display, wherein upon detecting the problem associated with the first surgical video stream, the combining is paused.
19. The method of any of claims 11-16 and 18, wherein the plurality of surgical video streams includes a third surgical video stream, and the method further comprises:
extracting surgical annotation data from the third surgical video stream; and
inserting the extracted surgical annotation data into the first surgical video stream before the first surgical video stream is sent for display.
20. The method of any of claims 11-19, wherein the first surgical video stream and the second surgical video stream are obtained from the same in-vivo visible light feed.
CN202280063249.XA 2021-07-22 2022-07-20 Redundant communication channels and processing of imaging feeds Pending CN118019507A (en)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US63/224,813 2021-07-22
US17/384,265 2021-07-23
US17/384,265 US11601232B2 (en) 2021-07-22 2021-07-23 Redundant communication channels and processing of imaging feeds
PCT/IB2022/056674 WO2023002388A1 (en) 2021-07-22 2022-07-20 Redundant communication channels and processing of imaging feeds

Publications (1)

Publication Number Publication Date
CN118019507A 2024-05-10

Family

ID=90959833

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202280063249.XA Pending CN118019507A (en) 2021-07-22 2022-07-20 Redundant communication channels and processing of imaging feeds

Country Status (1)

Country Link
CN (1) CN118019507A (en)

Similar Documents

Publication Publication Date Title
US11601232B2 (en) Redundant communication channels and processing of imaging feeds
US20240156326A1 (en) Characterization of tissue irregularities through the use of mono-chromatic light refractivity
US20210212602A1 (en) Dual cmos array imaging
US11100631B2 (en) Use of laser light and red-green-blue coloration to determine properties of back scattered light
US11963683B2 (en) Method for operating tiered operation modes in a surgical system
US11748924B2 (en) Tiered system display control based on capacity and user operation
US20220104713A1 (en) Tiered-access surgical visualization system
CN116724358A (en) Interactive information overlay on multiple surgical displays
CN116635946A (en) Cooperative surgical display
US20220104765A1 (en) Surgical visualization and particle trend analysis system
US20220104908A1 (en) Field programmable surgical visualization system
CN118019507A (en) Redundant communication channels and processing of imaging feeds
WO2023002388A1 (en) Redundant communication channels and processing of imaging feeds
WO2023002384A1 (en) Cooperative composite video streams layered onto the surgical site and instruments

Legal Events

Date Code Title Description
PB01 Publication