WO2024094838A1 - Operating room dashboard
- Publication number: WO2024094838A1 (PCT/EP2023/080643)
- Authority: WIPO (PCT)
- Prior art keywords: user, surgical, insights, operating room, selection
Classifications
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H40/00—ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices
- G16H40/20—ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the management or administration of healthcare resources or facilities, e.g. managing hospital staff or surgery rooms
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H50/00—ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics
- G16H50/70—ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics for mining of medical data, e.g. analysing previous cases of other patients
Definitions
- the present disclosure is generally related to computing technology, and particularly to improvements to systems that facilitate provision of guidance about surgical procedures in a hospital setting where video streams of the surgical procedures are being captured.
- Medical centers (e.g., hospitals, clinics, outpatient surgery centers, etc.) have to manage demands on services, such as performing surgical procedures, based on limited health care resources, such as operating rooms, surgeons, staff, instruments, etc.
- Inefficiency in the process of patient care involving surgical services could be extremely costly to medical centers, patients, and society in general.
- a system includes a memory device and one or more processors coupled with the memory device, the one or more processors configured to perform a plurality of operations.
- the operations include accessing surgical data captured from a plurality of surgical procedures on an operating room basis, the surgical data identifying one or more surgeons and an operating room associated with each of the surgical procedures.
- One or more insights about the surgical data are autonomously determined, and a report indicative of the one or more insights is displayed on an operating room dashboard that includes a plurality of user-informative elements.
- One or more filter criteria are received, and the surgical data is filtered based on the one or more filter criteria to adjust an underlying dataset of the user-informative elements.
- the one or more insights about the surgical data are updated based on the adjustment to the underlying dataset. Updates to the user-informative elements and the one or more insights are displayed based on the adjustment to the underlying dataset.
- a method includes accessing surgical data captured from a plurality of surgical procedures, the surgical data identifying one or more surgeons and an operating room associated with each of the surgical procedures. The method also includes analyzing the surgical data to determine one or more insights about use of the operating rooms and actions performed by the one or more surgeons and displaying the one or more insights on an operating room dashboard. The method further includes filtering the surgical data based on the one or more filter criteria, updating the one or more insights about the surgical data based on the filtering, and displaying updates to the one or more insights on the operating room dashboard.
- a computer program product includes a memory device with computer readable instructions stored thereon, where executing the computer readable instructions by one or more processing units causes the one or more processing units to perform a plurality of operations that include accessing surgical data captured from a plurality of surgical procedures, the surgical data identifying one or more surgeons and an operating room associated with each of the surgical procedures.
- the operations also include analyzing the surgical data to determine one or more insights about the surgical procedures and displaying the one or more insights.
- the operations further include filtering the surgical data based on the one or more filter criteria, updating the one or more insights about the surgical data based on the filtering, and displaying updates to the one or more insights.
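- As a non-authoritative illustration only, the summarized flow (access the surgical data, determine insights, display a report, receive filter criteria, filter, update, and redisplay) can be sketched in Python as below; the class and function names (SurgicalCase, determine_insights, run_dashboard_cycle) are hypothetical and not taken from the disclosure:

```python
from dataclasses import dataclass
from typing import Callable, List

@dataclass
class SurgicalCase:
    surgeon: str
    operating_room: str
    procedure_type: str
    duration_min: float

def determine_insights(cases: List[SurgicalCase]) -> List[str]:
    """Autonomously derive simple insights from the current dataset."""
    if not cases:
        return ["No cases match the current filters."]
    mean = sum(c.duration_min for c in cases) / len(cases)
    longest = max(cases, key=lambda c: c.duration_min)
    return [
        f"{len(cases)} cases, average duration {mean:.0f} min.",
        f"Longest case: {longest.procedure_type} in {longest.operating_room} "
        f"({longest.duration_min:.0f} min).",
    ]

def run_dashboard_cycle(cases, filter_fn: Callable[[SurgicalCase], bool]):
    # 1) access surgical data, 2) determine insights, 3) display the report
    initial = determine_insights(cases)
    # 4) receive filter criteria and adjust the underlying dataset
    filtered = [c for c in cases if filter_fn(c)]
    # 5) update the insights and redisplay based on the adjusted dataset
    updated = determine_insights(filtered)
    return initial, updated

cases = [
    SurgicalCase("Dr. A", "OR-1", "cholecystectomy", 52),
    SurgicalCase("Dr. B", "OR-2", "cataract", 25),
]
before, after = run_dashboard_cycle(cases, lambda c: c.operating_room == "OR-1")
print(before)
print(after)
```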
- FIG. 1 shows a computer-assisted surgical system according to one or more aspects;
- FIG. 2 depicts an example user-interactive report of surgical procedures performed using operating rooms at a medical center according to one or more aspects;
- FIGS. 3A-3B depict example views of surgical procedures analyzed according to one or more aspects;
- FIG. 4 depicts example user-interactive views of surgical procedures analyzed per surgeon according to one or more aspects;
- FIGS. 5A-5C depict example user interactivity of a report according to one or more aspects;
- FIG. 6 depicts filtering a report according to one or more aspects;
- FIGS. 7A-7B depict user-interactive views to analyze selected surgical procedures according to one or more aspects;
- FIG. 8 depicts an example of a display on a mobile device according to one or more aspects;
- FIG. 9 depicts a flowchart of a method according to one or more aspects;
- FIG. 10 depicts a surgical procedure system according to one or more aspects; and
- FIG. 11 depicts a block diagram of a computer system according to one or more aspects.
- Exemplary aspects of technical solutions described herein relate to, among other things, devices, systems, methods, computer-readable media, techniques, and methodologies for using machine learning and computer vision to improve a computerized management system of a medical center. Particularly, aspects of the technical solutions herein improve a computerized management system of operating rooms in the medical center.
- An operating room dashboard can provide insights into the utilization of a medical center's resources, such as operating rooms, and other resources (human and material) used during surgical procedures.
- the dashboard can provide such insights based on surgical video(s) captured of the surgical procedures being performed.
- the video(s) can include intracorporeal video from within a patient's body and extracorporeal video captured from within the operating room(s).
- the insights provided can be automatically determined to facilitate operators of the medical center, such as staff, administrators, surgeons, etc., in taking one or more responsive actions.
- one or more structures are predicted dynamically and substantially in real-time as the surgical data is being captured and analyzed by the technical solutions described herein.
- a predicted structure can be an anatomical structure, a surgical instrument, etc.
- Exemplary aspects of technical solutions described herein further facilitate generating augmented views of surgical sites using semantic surgical representations based on the predictions of the one or more structures in the surgical data.
- technical solutions herein can determine phases, actions, and other specific aspects of a surgical procedure being performed. Based on the determination of such information about the ongoing surgical procedures, aspects of the technical solutions herein facilitate providing insights (e.g., graphs, notifications, highlights, etc.) for an administrator to manage one or more resources.
- managing resources can include scheduling resources, assigning resources, replacing resources, etc. As noted elsewhere, resources can include human resources (e.g., surgeons, staff, etc.) and/or material resources (e.g., operating rooms, tools, equipment, pharmaceuticals, dyes, etc.).
- technical effects can include management of computer memory resources and network traffic by allowing user selection and configuration of the user-informative elements on an operating room dashboard. For example, allowing a user to add or remove user-informative elements to only show the content of interest reduces the memory resource, processing resource, and network resource usage for content that is not relevant or useful for a user.
- the ability to rearrange the user-informative elements on the operating room dashboard by an order preference of the user can allow a user to group more relevant or related views of interest in close physical proximity on a display to reduce scrolling and searching actions that may otherwise result in more resource consumption. For instance, as user-informative elements and insights are dynamically populated when shifted into a viewable position on a display, excessive scrolling and searching for a desired user-informative element can result in additional resource utilization in populating content that is not of interest if selection of the user-informative elements was not configurable. Further, insights can be dynamically determined based on data set filters. Dynamic determination of insights can reduce data storage requirements as compared to prepopulating a database with possible insights that may not be accessed or deemed useful.
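- One way (an assumption, not stated in the disclosure) to realize the described saving is to compute insights lazily per requested filter combination and cache the result, rather than prepopulating a database with every possible insight; the names below are illustrative:

```python
from functools import lru_cache
from typing import Optional, Tuple

# Toy dataset of (operating_room, duration_min) pairs standing in for the
# filtered surgical data underlying the dashboard views.
CASES = [("OR-1", 95), ("OR-1", 120), ("OR-2", 60), ("OR-3", 210)]

@lru_cache(maxsize=128)
def insights_for(room_filter: Optional[str]) -> Tuple[str, ...]:
    """Compute insights only when a filter combination is first requested."""
    cases = [c for c in CASES if room_filter is None or c[0] == room_filter]
    if not cases:
        return ("No matching cases.",)
    avg = sum(d for _, d in cases) / len(cases)
    return (f"{len(cases)} cases, average duration {avg:.0f} min.",)

print(insights_for("OR-1"))
print(insights_for("OR-1"))  # repeated view served from the cache
```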
- FIG. 1 depicts an example system according to one or more aspects.
- the system 100 includes at least a surgical data capture system 102, a video recording system 104, and a surgical instrumentation system 106, in each operating room (OR) 101.
- actor 112 can be medical personnel that uses the system 100 to perform a surgical procedure on a patient 110.
- Medical personnel can be a surgeon, assistant, nurse, administrator, or any other actor that interacts with the system 100 in a surgical environment.
- the surgical procedure can be any type of surgery such as but not limited to cataract surgery, laparoscopic cholecystectomy, endoscopic endonasal transsphenoidal approach (eTSA) to resection of pituitary adenomas, or any other surgical procedure.
- the actor 112 can be a technician, an administrator, an engineer, or any other such personnel that interacts with the system 100.
- a surgical procedure can include multiple phases, and each phase can include one or more surgical actions.
- a “surgical action” can include an incision, a compression, a stapling, a clipping, a suturing, a cauterization, a sealing, or any other such actions performed to complete a phase in the surgical procedure.
- a “phase” represents a surgical event that is composed of a series of steps (e.g., closure).
- a “step” refers to the completion of a named surgical objective (e.g., hemostasis).
- certain surgical instruments 108 (e.g., forceps) can be used to perform one or more of the surgical actions.
- the surgical instrumentation system 106 provides electrical energy to operate one or more surgical instruments 108 to perform the surgical actions.
- the electrical energy triggers an activation in the surgical instrument 108.
- the electrical energy can be provided in the form of an electrical current or an electrical voltage.
- the activation can cause a surgical action to be performed.
- the surgical instrumentation system 106 can further include electrical energy sensors, electrical impedance sensors, force sensors, bubble and occlusion sensors, and various other types of sensors.
- the electrical energy sensors can measure and indicate an amount of electrical energy applied to one or more surgical instruments 108 being used for the surgical procedure.
- the impedance sensors can indicate an amount of impedance measured by the surgical instruments 108, for example, from the tissue being operated upon.
- the force sensors can indicate an amount of force being applied by the surgical instruments 108. Measurements from various other sensors, such as position sensors, pressure sensors, flow meters, can also be input.
- the video recording system 104 includes one or more cameras, such as operating room cameras, endoscopic cameras, etc.
- the cameras capture video data of the surgical procedure being performed.
- the video recording system 104 includes one or more video capture devices that can include cameras placed in the surgical room to capture events surrounding (i.e., outside) the patient being operated upon.
- the video recording system 104 further includes cameras that are passed inside (e.g., endoscopic cameras) the patient to capture endoscopic data.
- the endoscopic data provides video and/or images of the surgical procedure (e.g., FIG. 4).
- the surgical data capture system 102 includes one or more memory devices, one or more processors, a user interface device, among other components.
- the surgical data capture system 102 can execute one or more computer executable instructions. The execution of the instructions facilitates the surgical data capture system 102 to perform one or more methods, including those described herein.
- the surgical data capture system 102 can communicate with other computing systems via a wired and/or a wireless network.
- the surgical data capture system 102 includes one or more trained machine learning models that can detect and/or predict features of/from the surgical procedure that is being performed, or has been performed earlier.
- Features can include structures, such as anatomical structures and surgical instruments 108, in the surgical procedure.
- Features can further include events, such as phases and actions, in the surgical procedure.
- Features that are detected can further include the actor 112 and the patient 110.
- the surgical data capture system 102, in one or more examples, can provide recommendations for subsequent actions to be taken by the actor 112. Alternatively, or in addition, the surgical data capture system 102 can provide one or more reports based on the detections.
- the detections by the machine learning models can be performed in an autonomous or semi-autonomous manner.
- the machine learning models can include artificial neural networks, such as deep neural networks, convolutional neural networks, recurrent neural networks, encoders, decoders, or any other type of machine learning models.
- the machine learning models can be trained in a supervised, unsupervised, or hybrid manner.
- the machine learning models can be trained to perform detection and/or prediction using one or more types of data acquired by the system 100.
- the machine learning models can use the video data captured via the video recording system 104.
- the machine learning models use the surgical instrumentation data from the surgical instrumentation system 106.
- the machine learning models use a combination of the video and the surgical instrumentation data.
- the machine learning models can also use audio data captured during the surgical procedure.
- the audio data can include sounds emitted by the surgical instrumentation system 106 while activating one or more surgical instruments 108.
- the audio data can include voice commands, snippets, or dialog from one or more actors 112.
- the audio data can further include sounds made by the surgical instruments 108 during their use.
- the machine learning models can detect surgical actions, surgical phases, anatomical structures, surgical instruments, and various other features from the data associated with a surgical procedure. The detection can be performed in real-time in some examples.
- the surgical data capture system 102 analyzes the surgical data, i.e., the various types of data captured during the surgical procedure, in an offline manner (e.g., post-surgery).
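- As a generic sketch only (the disclosure does not prescribe a particular model or post-processing step), per-frame outputs of a trained video model could be grouped into phase segments as follows; the labels and sampling interval are assumed:

```python
from itertools import groupby

# Hypothetical per-frame phase labels, standing in for the output of a
# trained video model (one label per sampled frame, e.g., every 30 s).
FRAME_LABELS = ["preparation", "dissection", "dissection", "dissection",
                "sealing", "sealing", "closure", "closure"]

def segment_phases(labels, seconds_per_frame):
    """Group consecutive identical labels into (phase, start_s, end_s) spans."""
    segments, t = [], 0.0
    for phase, run in groupby(labels):
        n = len(list(run))
        segments.append((phase, t, t + n * seconds_per_frame))
        t += n * seconds_per_frame
    return segments

for phase, start, end in segment_phases(FRAME_LABELS, 30.0):
    print(f"{phase}: {start:.0f}s to {end:.0f}s")
```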
- the surgical data capture system 102 from each operating room 101 can be in communication with an OR dashboard 120.
- the OR dashboard 120 can be (or be a part of) a computing system.
- the OR dashboard 120 can be a part of a hospital management system (HMS), or any other such enterprise level system.
- the OR dashboard 120 can include the machine learning capabilities that are described herein.
- the OR dashboard receives information and commands from the one or more surgical data capture systems 102. Further, in some aspects, the OR dashboard 120 provides data and commands to the surgical data capture system 102.
- FIG. 2 depicts an example report of an analysis performed by the surgical data capture system 102 using the surgical data.
- the report 200 is user-interactive and displayed on the OR dashboard 120.
- the report 200 can be displayed via the user interface of the OR dashboard 120. In some aspects, the report can also be displayed by the surgical data capture system 102.
- the report 200 can include a user-informative element 202 (e.g., a total case and case forecast view as a user-informative element) that indicates a number of surgeries performed during a selected duration, for example, a fiscal year, a month, or any other selected duration.
- the user-informative element 202 includes a forecast of how many surgical procedures are expected to be performed in an upcoming duration, for example, a quarter, a month, etc.
- the user-informative element 202 can include one or more interactive elements that the user can configure. In an example, the number of surgical procedures performed and predicted can be shown in the form of a graph. The user can configure type of the graph, e.g., line graph, bar graph, pie graph, etc.
- the user can also configure one or more visual attributes, e.g., color, shape, etc., associated with the user-informative element 202.
- the user can provide, or the OR dashboard 120 may otherwise receive, a target number of surgical procedures.
- the user-informative element 202 may show a comparison between the number of surgical procedures performed, predicted, and targeted.
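- The forecast/target comparison could, for example, be a simple run-rate extrapolation; the disclosure does not specify a forecasting method, and the figures below are invented for illustration:

```python
# Hypothetical run-rate forecast: cases performed so far in the selected
# duration are extrapolated to the full duration and compared to a target.
cases_to_date = 412     # assumed figure
days_elapsed = 120
days_in_period = 365
target_cases = 1300     # target number of surgical procedures (assumed)

forecast = round(cases_to_date / days_elapsed * days_in_period)
print(f"Performed: {cases_to_date}")
print(f"Forecast:  {forecast}")
print(f"Target:    {target_cases} ({forecast - target_cases:+d} forecast vs. target)")
```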
- the report 200 can include user-informative elements 204 that provide further details about the surgical procedures performed so far in the selected duration.
- the user-informative element 204 can list the performed surgical procedures according to a type of the surgical procedure. For example, a predetermined list of types of surgical procedures with volume of procedures for each type is provided. The information can be provided in a textual, graphical, or any other manner or a combination thereof.
- the user-informative elements 204 can categorize the surgical procedures performed in the selected duration (in 202), according to the duration of each surgical procedure. The information can be provided in a textual, graphical, or any other manner or a combination thereof.
- the report 200 includes an insight view 206 that provides automatically detected anomalies or patterns in the surgical data associated with the surgical procedures selected (in 202), where the insight view 206 displays one or more insights as a user-informative element.
- the anomalies or patterns associated with the surgical data can identify surgical procedures that took longer (or shorter) than a typical time for such types of surgical procedures.
- the insights can be based on anomalies or patterns associated with types of surgical procedures performed within a certain number of days in the past.
- the anomalies or patterns that are detected can be configured by a user, for example, by specifying factors to use to detect such variation from normal. For example, the factors can include duration, type, surgeon, staff, equipment, etc., and a combination thereof.
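- A simple way to flag such duration anomalies (one possible approach, not the patent's specified method) is a per-procedure-type z-score test; the data and threshold below are assumptions:

```python
from statistics import mean, stdev

# Toy (procedure_type, duration_min) records.
CASES = [
    ("cholecystectomy", 45), ("cholecystectomy", 50), ("cholecystectomy", 48),
    ("cholecystectomy", 52), ("cholecystectomy", 47), ("cholecystectomy", 49),
    ("cholecystectomy", 110),  # unusually long case
]

def flag_duration_anomalies(cases, threshold=2.0):
    """Flag cases whose duration deviates from the mean for their procedure
    type by more than `threshold` standard deviations."""
    by_type = {}
    for proc, dur in cases:
        by_type.setdefault(proc, []).append(dur)
    flagged = []
    for proc, dur in cases:
        durations = by_type[proc]
        if len(durations) < 3 or stdev(durations) == 0:
            continue
        z = (dur - mean(durations)) / stdev(durations)
        if abs(z) > threshold:
            flagged.append((proc, dur, round(z, 1)))
    return flagged

print(flag_duration_anomalies(CASES))  # flags the 110-minute case
```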
- each of the user-informative elements 202, 204, 206 that represents a surgical procedure provides, in response to a first interaction, such as a hover, a click, a right-click, a touch, a voice command, etc., detailed information about that surgical procedure via another user-informative element (not shown).
- the other user-informative element can identify the procedure being performed, phases of the procedure, a duration of the procedure, the personnel involved, equipment used, patient information, etc.
- FIG. 3A depicts an example view 300.
- the view 300 includes a user-informative element 302 that provides detailed information about surgical procedures per operating room 101.
- the user-informative element 302 indicates a number of procedures performed per operating room 101, average procedures per operating room 101, and amount of video recording performed in each operating room 101. Additional parameters can also be configured to be displayed in some aspects.
- the view 300 includes a user-informative element 304 that provides further details about one or more parameters associated with the operating rooms 101.
- the user-informative element 304 is displayed in response to a user-interaction with one or more parameters depicted in the user-informative element 302. For example, upon selecting the average number of procedures per operating room in the user-informative element 302, the user-informative element 304 displays a timeline of the selected duration (in 202) and a graph with the average number of procedures performed per operating room 101 for all of the operating rooms 101.
- the average daily procedures, or any other selected parameter is color encoded for each operating room, with the color encoding displayed as an index. The color encoding may be performed dynamically or based on a preselected list of colors for each operating room. In other aspects, in addition to or alternative to color, other visual attributes may be used to represent the selected parameter.
- FIG. 3B depicts another example of the user-informative element 304 in which the surgical procedures performed are displayed using a color encoded line graph for each operating room 101, each line graph depicting durations of surgical procedures performed over the timeline.
- the user can interact with each line graph, which results in display of another view (not shown) that provides a breakdown of the surgical procedures performed in the operating room 101 corresponding to the line graph selected (or interacted with).
- the user-informative element 304 shows an indication of the operating room that was used as an emergency room only for operating on emergency cases.
- Such an operating room may have specific/special attributes associated with it, which may result in exclusion of data from that operating room when generating particular insights.
- specific insights may be generated only based on the operating room 101 used for the emergency cases.
- FIG. 4 depicts an example of another view 400.
- the view 400 can be displayed in response to a user-interaction with one or more of the user-informative elements in other figures herein.
- the view 400 includes a user-informative element 402, which depicts the surgical procedures performed during the selected duration (in 202) based on one or more of the personnel involved, for example, based on the surgeon.
- the details shown per surgeon can include the types of procedures, total number of procedures, average procedures (per day), average duration per case, average duration per day, etc.
- the information can further include complexity ratings associated with the surgical procedures performed by each surgeon.
- the complexity ratings can be based on automated identification of features of the surgical procedures, where the ratings are assigned based on the presence (or absence) of one or more surgical features. Alternatively, or in addition, the complexity ratings can be based on one or more annotations provided by the surgeon during the surgical procedures. The complexity ratings can be depicted using one or more visual attributes, such as color, in some aspects.
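- As a hedged sketch of how such a rating might be derived from automatically identified features (the weights, feature names, and categories are illustrative assumptions, not the patent's scheme):

```python
def complexity_rating(detected_features: dict) -> str:
    """Map automatically detected surgical features to a coarse rating.

    The weights and thresholds below are illustrative only.
    """
    score = (
        2.0 * detected_features.get("adverse_events", 0)
        + 1.0 * detected_features.get("instrument_changes", 0)
        + 0.5 * max(0, detected_features.get("duration_min", 0) - 60) / 30
    )
    if score < 2:
        return "low"
    if score < 5:
        return "medium"
    return "high"

print(complexity_rating({"adverse_events": 1, "instrument_changes": 2,
                         "duration_min": 150}))  # -> "high"
```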
- the view 400 can further include a user-informative element 404 that displays the information of the surgical procedures based on the surgical data capture systems 102 in each operating room 101.
- the surgical data capture systems 102 can be moved from one operating room 101 to another, and accordingly, the number of surgical procedures recorded by a surgical data capture system 102 can vary based on the operating room 101 in which it is located.
- the depicted information in the user-informative element 404 includes a number of procedures recorded on each of the surgical data capture systems 102 connected to the OR dashboard 120, and details for each of such surgical procedures.
- FIG. 5A depicts example user interactivity of the report 200 according to one or more aspects.
- the report 200 includes a user interactive element 502 that facilitates adding more views (300, 400, etc.) to the report 200.
- Interaction (e.g., click, hover, double click, touch, etc.) with the user interactive element 502 causes a menu bar 504 to be displayed.
- the menu bar 504 includes several types of views that a user can add to the report 200.
- the types of views can be represented by labels, buttons, or any other selectable and interactive element in the menu bar 504.
- Interaction with the one or more types of views in the menu bar 504 causes a corresponding view to be added to the report 200.
- FIG. 5B depicts the report 200 with a corresponding view 506, a case complexity rating view, added per the interaction depicted in FIG. 5A.
- FIG. 5C depicts additional example interactivity with the report 200.
- one or more views of the report 200 can be moved from one position to another.
- the positions of the view 506 and the view 404 are swapped.
- each of the views (200, 300, 400, etc.) includes a user interactive element 508.
- Interaction with the user interactive element 508 displays a menu bar that is context specific to that view.
- the menu bar includes one or more selectable options.
- the options can be selectable by being represented as buttons, icons, images, labels, etc.
- the options can facilitate removing the view, sending feedback about the view to a developer of the OR dashboard 120, switching between a graphical/list view, etc.
- the options can also include removing the view from the report 200.
- FIG. 6 depicts a view 600 that is displayed in response to an interaction with a user interactive element 208 (see FIG. 2).
- the user interactive element 208 facilitates filtering the surgical data being used to generate the report 200 including the one or more views (300, 400, etc.) in the report.
- interaction with the user interactive element 208 displays the view 600.
- the view 600 includes a menu bar 602, with one or more selectable filters.
- the filters are selectable by representing the filters as menu items, such as labels, icons, buttons, etc.
- Each filter can include one or more parameters that the user can select to filter the surgical data accordingly.
- the filters can include filtering the surgical data according to date, time, day of the week, or any other temporal aspect.
- the filters can include filtering the surgical data according to specifics of each surgical procedure, for example, case duration, personnel (e.g., surgeon), procedure type, specialty, operating room, complexity rating, patient information, etc., or a combination thereof. Once the filters are configured and applied, all the views in the report are updated correspondingly.
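- A minimal sketch of applying such filter criteria before the views are refreshed (field and parameter names are hypothetical):

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class Case:
    day: date
    surgeon: str
    operating_room: str
    procedure_type: str
    duration_min: float

def apply_filters(cases, *, date_range=None, surgeon=None,
                  operating_room=None, min_duration=None):
    """Return only the cases matching every supplied criterion."""
    selected = []
    for c in cases:
        if date_range and not (date_range[0] <= c.day <= date_range[1]):
            continue
        if surgeon and c.surgeon != surgeon:
            continue
        if operating_room and c.operating_room != operating_room:
            continue
        if min_duration and c.duration_min < min_duration:
            continue
        selected.append(c)
    return selected

cases = [Case(date(2023, 5, 2), "Dr. A", "OR-1", "cholecystectomy", 55),
         Case(date(2023, 6, 9), "Dr. B", "OR-2", "cataract", 25)]
# Every view in the report would then be rebuilt from the filtered subset.
print(apply_filters(cases, surgeon="Dr. A"))
```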
- the report 200 includes a menu bar 210.
- the menu bar 210 includes one or more selectable menu items. Each menu item can be a button, a label, an icon, etc., which can be interacted with to cause one or more responsive actions.
- the menu bar includes an item that causes the report 200 to be saved or downloaded.
- the report 200 can be saved in a digital form using one or more formats, such as portable document format (PDF), extensible markup language (XML), or any other file format.
- the menu bar 210 further includes an item that facilitates printing the report 200. The printing can be performed using any printer driver accessible by the OR dashboard 120.
- the menu bar 210 includes an item that facilitates opening a manual related to the OR dashboard 120.
- a menu item can facilitate opening a settings menu that facilitates configuring one or more settings of the OR dashboard and/or the report 200.
- the settings can include visual attributes, default values, etc.
- the menu bar 210 can include a menu item that facilitates sharing the report 200.
- the sharing can send a link to the report 200 to another user of the OR dashboard 120, with which the other user can access the report 200.
- the sharer can specify a level of accessibility of the other user, for example, view only, edit, etc.
- the report 200 further includes user interactive elements 212, which allow an operator to display additional views to further analyze the surgical data.
- FIG. 7A depicts an example view 700 that is displayed in response to an interaction with the user interactive element 212 of the view 202.
- the view 700 facilitates identifying different durations in the timeline of the performed surgical procedures where particular attributes of the surgical procedures are commonly found.
- the view 700 includes characteristics or categories of surgical procedures listed as user interactive elements 702. Each of the user interactive elements 702 can be displayed using different visual attributes, for example, distinct colors.
- the surgical procedures being analyzed are from a particular duration (e.g., a fiscal year, a quarter, one month, within specific dates, etc.), and the view 700 includes a timeline 704 spanning the particular duration.
- the surgical procedures that meet a characteristic are marked with the same visual attributes that are associated with the user interactive element 702 of that characteristic.
- the marking is performed by generating and displaying, in the view 700, a visual element 706 (e.g., a line) that is orthogonal to the timeline 704 and that intersects the position of the surgical procedure meeting the characteristic.
- the visual element 706 has the same visual attributes as the user interactive element 702 of that characteristic.
- the user interactive elements 702 representing the characteristics can be selected/interacted with by the operator.
- FIG. 7B depicts an example view generated upon interaction with a user interactive element 702 representing the characteristics.
- the first user interactive element 702 selected by the operator is highlighted (or other user interactive elements 702 are faded) to depict selection of the first user interactive element 702.
- the first visual element 706 corresponding to the first user interactive element 702 is highlighted (or other visual elements 706 corresponding to the other user interactive elements 702 are faded).
- the operator can select the first visual element 706 (instead of the first user interactive element 702) and reach the same view 700 as shown in FIG. 7B.
- the view 700 is updated to show the updated number of surgical procedures that are now selected (compare top left in FIGS. 7A and 7B).
- the insights view 206 is also updated to show anomalies and patterns that are detected from the newly selected surgical procedures that match the selection of the user interactive element 702 (or visual element 706).
- the operator can add an annotation 710 to the view 700 as a multi-user comment box. For example, s/he can interact with a certain position (e.g., right click, click, etc.) and add an annotation 710.
- the annotation can be input by typing, voice command, etc.
- the input annotation 710 can be shared/directed to a particular user of the OR dashboard 120 in some aspects.
- the annotation is stored as a thread of communication to which other users of the OR dashboard 120 can add/edit their respective comments/annotations.
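- A sketch of one possible data structure for such a threaded, shareable annotation (the class names, the notification print, and the user identifiers are assumptions):

```python
from dataclasses import dataclass, field
from datetime import datetime
from typing import List, Optional, Tuple

@dataclass
class Comment:
    author: str
    text: str
    created: datetime = field(default_factory=datetime.now)

@dataclass
class AnnotationThread:
    """Annotation anchored to a position in a view and shared as a thread."""
    view_id: str
    position: Tuple[float, float]          # normalized x/y on the view
    target_user: Optional[str] = None
    comments: List[Comment] = field(default_factory=list)

    def add_comment(self, author: str, text: str) -> None:
        self.comments.append(Comment(author, text))
        if self.target_user and author != self.target_user:
            # Stand-in for sending a message requesting a response.
            print(f"notify {self.target_user}: new comment on {self.view_id}")

thread = AnnotationThread("view-700", (0.42, 0.10), target_user="or.manager")
thread.add_comment("dr.a", "This cluster of long cases needs review.")
thread.add_comment("or.manager", "Acknowledged; staffing was reduced that week.")
```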
- Although the depicted reports and views in the drawings are generally shown in a landscape view, the reports and views can be configured to be displayed in portrait mode, for example, if the OR dashboard 120 is viewed and interacted with on a device such as a smartphone, tablet computer, or other such electronic device.
- the positioning of the views, user interactive elements, visual elements, and other items that are shown in the figures herein are configurable and can be adjusted dynamically based on an orientation, resolution, and other characteristics of the device that is being used to operate the OR dashboard 120.
- Examples described herein facilitate providing a user-interactive system to visualize and analyze substantial amounts of data associated with the system 100. Generating such user-interactive reports of the substantial amounts of data is not practical for a human, and hence, the technical solutions described herein provide a practical application to address technical challenges and provide improvements to computing systems, such as operating room management systems, hospital management systems, etc. For example, the technical solutions described herein facilitate service providers in reviewing surgical procedures performed at a medical center over a certain period of time (e.g., month, quarter, etc.) and providing feedback to the hospital, actors, or any other stakeholder.
- the technical solutions described herein facilitate improvement in the performance of a surgical action, such as by identifying the actors and cases where particular surgical actions have been performed in the past.
- Technical solutions herein can also identify to an actor, such as a first surgeon, all instances of a surgical action (e.g., sealing) that s/he performed in a surgical procedure and a comparison of the number of the same surgical actions performed by other surgeons.
- the first surgeon can interactively see the surgical actions being performed by himself/herself and the other surgeons, and determine improvements. For example, the first surgeon can observe ranges of durations for various procedures and phases by other surgeons, and emulate protocols used in such other procedures.
- FIG. 8 depicts an example of a display 800 on a mobile device according to one or more aspects.
- the display 800 can provide a version of the OR dashboard 120 scaled for a mobile device, such as a cellular phone or tablet computer.
- the display 800 can be configured to receive touch-sensitive input, allowing a user to tap, scroll/swipe, and use other such gestures to navigate the mobile version of the OR dashboard 120.
- a user-informative element 802 is depicted similar to the user-informative element 202 of FIG. 2.
- One or more insights 804 can also be output on display 800.
- the OR dashboard 120 can be configured to display other user-informative elements responsive to a first user input 806 scrolling in a first direction and to display the one or more insights 804 responsive to a second user input 808 scrolling in a second direction that is orthogonal to the first direction. For instance, an up/down scrolling action can reveal and populate other user-informative elements, while a left/right scrolling action can reveal additional insights 804.
- a user interactive element 810 can be provided on display 800 to allow filtering of underlying data sets that can result in updates to the user-informative elements and insights 804. Further, selection and ordering of the user-informative elements can also be configurable through the display 800.
- FIG. 9 depicts a flowchart of a method 900 for OR dashboard generation and display in accordance with one or more aspects. All or a portion of method 900 can be implemented, for example, by all or a portion of system 100 of FIG. 1, the system 1000 of FIG. 10, and/or computer system 1500 of FIG. 11.
- a system can access surgical data captured from a plurality of surgical procedures on an operating room basis, the surgical data identifying one or more surgeons and an operating room associated with each of the surgical procedures.
- the system can determine, autonomously, one or more insights about the surgical data.
- the system can display a report (e.g., report 200) indicative of the one or more insights on an operating room dashboard (e.g., OR dashboard 120) including a plurality of user-informative elements.
- the system can receive one or more filter criteria, for instance, through user interactive element 208.
- the one or more filter criteria can include, but are not limited to, one or more of: a date range selection, a case duration selection, a surgeon selection, a procedure selection, a specialty selection, an operating room selection, a complexity score selection, and a patient selection.
- the system can filter the surgical data based on the one or more filter criteria to adjust an underlying dataset of the user-informative elements.
- the system can update the one or more insights about the surgical data based on the adjustment to the underlying dataset.
- the system can display updates to the user-informative elements and the one or more insights based on the adjustment to the underlying dataset.
- the one or more insights can be determined based on video streams of the surgical procedures.
- the one or more insights can be viewed, for example, through insight view 206 and/or as insights 804.
- a plurality of user-informative elements can include, but is not limited to one or more of: a total case and case forecast view (e.g., user-informative element 202), a per-operating-room view (e.g., user-informative element 302), an average daily case and usage view (e.g., user-informative element 304), a procedures-by-duration view (e.g., user-informative element 204), a procedure-by-volume view (e.g., user-informative element 204), a case complexity view (e.g., user-informative element 506), a per-capture-computer view (e.g., user-informative element 404), and a surgeon summary view (e.g., user-informative element 402).
- the total case and case forecast view (e.g., user-informative element 202) can include user interactive elements 702 that display filtering options to highlight one or more of: long case durations, short case durations, high volume complex cases, and reduced staff cases, for example.
- the one or more insights can be autonomously updated based on detection of a user selection of one of the display filtering options from the user interactive elements 702.
- the OR dashboard 120 can be configured to add a multi-user comment box (e.g., annotation 710) to a user-selected portion (e.g., on visual element 706) of the user-informative elements based on a user-initiated comment command.
- the multi-user comment box can trigger sending of a message to a targeted user to request a response to a comment entered in the multi-user comment box.
- the OR dashboard 120 can be configured to add, delete, or reposition user-informative elements based on a user command, e.g., through user interactive elements such as user interactive element 502. Adding to the OR dashboard 120 can be performed through a menu, such as menu bar 504. In some aspects, the OR dashboard 120 can be configured to display a menu of items available to display in the user-informative elements and add a new user-informative element to the OR dashboard 120 based on a user selection from the menu.
- the menu of items can include two or more of: an approach, a complexity score, an instrument score, an engagement level, specialties, trainees, an average case timing, daily usage, computer identifiers, insights, operating rooms, surgeons, total number of cases, and types of procedures.
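- A sketch of the add/remove/reposition behaviour as an ordered list of element identifiers (the DashboardLayout class and element names are hypothetical):

```python
class DashboardLayout:
    """Ordered collection of user-informative elements on a dashboard."""

    def __init__(self, elements):
        self.elements = list(elements)

    def add(self, element):
        # e.g., an item picked from the menu of available items
        if element not in self.elements:
            self.elements.append(element)

    def remove(self, element):
        self.elements.remove(element)

    def move(self, element, new_index):
        self.elements.remove(element)
        self.elements.insert(new_index, element)

layout = DashboardLayout(["total_case_forecast", "per_operating_room",
                          "surgeon_summary"])
layout.add("case_complexity")
layout.move("case_complexity", 1)   # reposition by user preference
print(layout.elements)
```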
- the OR dashboard 120 is configured for display on a mobile device by displaying the user-informative elements responsive to a first user input 806 scrolling in a first direction and displaying the one or more insights responsive to a second user input 808 scrolling in a second direction that is orthogonal to the first direction.
- FIG. 9 is not intended to indicate that the operations are to be executed in any particular order or that all of the operations shown in FIG. 9 are to be included in every case. Additionally, the processing shown in FIG. 9 can include any suitable number of additional operations.
- Referring to FIG. 10, a surgical procedure system 1000 is generally shown according to one or more aspects.
- the example of FIG. 10 depicts a surgical procedure support system 1002 that can include or may be coupled to the system 100 of FIG. 1.
- the surgical procedure support system 1002 can acquire image or video data using one or more cameras 1004.
- the surgical procedure support system 1002 can also interface with one or more sensors 1006 and/or one or more effectors 1008.
- the sensors 1006 may be associated with surgical support equipment and/or patient monitoring.
- the effectors 1008 can be robotic components or other equipment controllable through the surgical procedure support system 1002.
- the surgical procedure support system 1002 can also interact with one or more user interfaces 1010, such as various input and/or output devices.
- the surgical procedure support system 1002 can store, access, and/or update surgical data 1014 associated with a training dataset and/or live data as a surgical procedure is being performed on patient 110 of FIG. 1.
- the surgical procedure support system 1002 can store, access, and/or update surgical objectives 1016 to assist in training and guidance for one or more surgical procedures.
- User configurations 1018 can track and store user preferences.
- the surgical procedure support system 1002 can also communicate with other systems through a network 1030.
- the surgical procedure support system 1002 can communicate with a surgical data dashboard viewer 1040 and a surgical data post-processing system 1050 through the network 1030.
- Other types of devices such as a computing device 1034 (e.g., a mobile phone, laptop, personal computer, or tablet computer), can communicate directly with the surgical procedure support system 1002 or through the network 1030.
- user interfaces 1010 may be connected to or integrated with the surgical procedure support system 1002 by a wired connection while the computing device 1034 connects to the surgical procedure support system 1002 via a wireless connection.
- the computing device 1034 can execute or link to another computer system that executes a surgical data management system to access various data sources through the network 1030.
- the surgical data post-processing system 1050 can receive surgical data and associated data generated by the surgical procedure support system 1002, which may be separately stored and secured through other data storage. Access to specific data or portions of data through the surgical data post-processing system 1050 may be limited by associated permissions.
- the surgical data post-processing system 1050 may include features such as video viewing, video sharing, data analytics, and selective data extraction.
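- A minimal sketch of permission-limited access to post-processed data (the roles, resource names, and check itself are assumptions rather than part of the disclosure):

```python
# Hypothetical role-based check limiting access to post-processed data.
PERMISSIONS = {
    "surgeon":       {"own_videos", "own_metrics"},
    "administrator": {"own_videos", "own_metrics", "site_metrics"},
}

def can_access(role: str, resource: str) -> bool:
    return resource in PERMISSIONS.get(role, set())

print(can_access("surgeon", "site_metrics"))        # False
print(can_access("administrator", "site_metrics"))  # True
```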
- the surgical data dashboard viewer 1040 can provide viewing access to various data sources, such as surgical data 1014, data collected in the one or more storage devices, and post-processed data generated by the surgical data post-processing system 1050.
- surgical instrument data, video data, and artificial intelligence generated data can be accessed for viewing through the surgical data dashboard viewer 1040.
- the surgical data post-processing system 1050 may generate surgical performance metrics and comparison data across data sets collected at multiple locations, making analytics data available to the surgical data dashboard viewer 1040.
- the surgical data dashboard viewer 1040 can access one or more surgical dashboards, such as the OR dashboard 120 of FIG. 1.
- the surgical data dashboard viewer 1040 and/or surgical data post-processing system 1050 can be components of a surgical data management system.
- One or more computing devices 1064 can execute a surgical data management system to access various data sources through a network 1060.
- the network 1030 may be within a facility or multiple facilities maintained within a private network.
- the network 1060 may be a wider area network, such as the internet. Accordingly, the networks 1030 and 1060 may have access to different files and data sets along with shared access to select files and data sets. In some aspects, networks 1030 and 1060 can be combined.
- Turning now to FIG. 11, the computer system 1500 can be an electronic computer framework comprising and/or employing any number and combination of computing devices and networks utilizing various communication technologies, as described herein.
- the computer system 1500 can be easily scalable, extensible, and modular, with the ability to change to different services or reconfigure some features independently of others.
- the computer system 1500 may be, for example, a server, desktop computer, laptop computer, tablet computer, or smartphone.
- computer system 1500 may be a cloud computing node.
- Computer system 1500 may be described in the general context of computer-executable instructions, such as program modules, being executed by a computer system.
- program modules may include routines, programs, objects, components, logic, data structures, and so on that perform particular tasks or implement particular abstract data types.
- Computer system 1500 may be practiced in distributed cloud computing environments where tasks are performed by remote processing devices that are linked through a communications network.
- program modules may be located in both local and remote computer system storage media, including memory storage devices.
- the computer system 1500 has one or more central processing units (CPU(s)) 1501a, 1501b, 1501c, etc. (collectively or generically referred to as processor(s) 1501).
- the processors 1501 can be a single-core processor, multi-core processor, computing cluster, or any number of other configurations.
- the processors 1501 can be any type of circuitry capable of executing instructions.
- the processors 1501, also referred to as processing circuits are coupled via a system bus 1502 to a system memory 1503 and various other components.
- the system memory 1503 can include one or more memory devices, such as read-only memory (ROM) 1504 and a random-access memory (RAM) 1505.
- the ROM 1504 is coupled to the system bus 1502 and may include a basic input/output system (BIOS), which controls certain basic functions of the computer system 1500.
- the RAM is read-write memory coupled to the system bus 1502 for use by the processors 1501.
- the system memory 1503 provides temporary memory space for operations of said instructions during operation.
- the system memory 1503 can include random access memory (RAM), read-only memory, flash memory, or any other suitable memory systems.
- the computer system 1500 comprises an input/output (I/O) adapter 1506 and a communications adapter 1507 coupled to the system bus 1502.
- the I/O adapter 1506 may be a small computer system interface (SCSI) adapter that communicates with a hard disk 1508 and/or any other similar component.
- the I/O adapter 1506 and the hard disk 1508 are collectively referred to herein as a mass storage 1510.
- Software 1511 for execution on the computer system 1500 may be stored in the mass storage 1510.
- the mass storage 1510 is an example of a tangible storage medium readable by the processors 1501, where the software 1511 is stored as instructions for execution by the processors 1501 to cause the computer system 1500 to operate, such as is described hereinbelow with respect to the various Figures. Examples of computer program product and the execution of such instruction is discussed herein in more detail.
- the communications adapter 1507 interconnects the system bus 1502 with a network 1512, which may be an outside network, enabling the computer system 1500 to communicate with other such systems.
- a portion of the system memory 1503 and the mass storage 1510 collectively store an operating system, which may be any appropriate operating system to coordinate the functions of the various components shown in FIG. 11.
- Additional input/output devices are shown as connected to the system bus 1502 via a display adapter 1515 and an interface adapter 1516.
- the adapters 1506, 1507, 1515, and 1516 may be connected to one or more I/O buses that are connected to the system bus 1502 via an intermediate bus bridge (not shown).
- a display 1519 (e.g., a screen or a display monitor) is connected to the system bus 1502 by the display adapter 1515, which may include a graphics controller to improve the performance of graphics-intensive applications and a video controller.
- a keyboard, a mouse, a touchscreen, one or more buttons, a speaker, etc. can be interconnected to the system bus 1502 via the interface adapter 1516, which may include, for example, a Super I/O chip integrating multiple device adapters into a single integrated circuit.
- Suitable I/O buses for connecting peripheral devices such as hard disk controllers, network adapters, and graphics adapters typically include common protocols, such as the Peripheral Component Interconnect (PCI).
- the computer system 1500 includes processing capability in the form of the processors 1501, and storage capability including the system memory 1503 and the mass storage 1510, input means such as the buttons, touchscreen, and output capability including the speaker 1523 and the display 1519.
- the communications adapter 1507 can transmit data using any suitable interface or protocol, such as the internet small computer system interface, among others.
- the network 1512 may be a cellular network, a radio network, a wide area network (WAN), a local area network (LAN), or the Internet, among others.
- An external computing device may connect to the computer system 1500 through the network 1512.
- an external computing device may be an external web server or a cloud computing node.
- The block diagram of FIG. 11 is not intended to indicate that the computer system 1500 is to include all of the components shown in FIG. 11. Rather, the computer system 1500 can include any appropriate fewer or additional components not illustrated in FIG. 11 (e.g., additional memory components, embedded controllers, modules, additional network interfaces, etc.). Further, the aspects described herein with respect to computer system 1500 may be implemented with any appropriate logic, wherein the logic, as referred to herein, can include any suitable hardware (e.g., a processor, an embedded controller, or an application-specific integrated circuit, among others), software (e.g., an application, among others), firmware, or any suitable combination of hardware, software, and firmware, in various aspects. Various aspects can be combined to include two or more of the aspects described herein.
- aspects disclosed herein may be a system, a method, and/or a computer program product at any possible technical detail level of integration
- the computer program product may include a computer-readable storage medium (or media) having computer-readable program instructions thereon for causing a processor to carry out aspects of the present invention
- a method includes accessing surgical data captured from a plurality of surgical procedures, the surgical data identifying one or more surgeons and an operating room associated with each of the surgical procedures. The method also includes analyzing the surgical data to determine one or more insights about use of the operating rooms and actions performed by the one or more surgeons and displaying the one or more insights on an operating room dashboard. The method further includes filtering the surgical data based on the one or more filter criteria, updating the one or more insights about the surgical data based on the filtering, and displaying updates to the one or more insights on the operating room dashboard.
- the method can include where the one or more insights are determined based on video streams of the surgical procedures.
- the method can include displaying a plurality of user-informative elements on the operating room dashboard, adjusting an underlying dataset of the user-informative elements based on the filtering, and displaying updates to the user-informative elements based on the adjustment to the underlying dataset.
- the method can include displaying a menu of items available to display in the user-informative elements, adding a new user-informative element to the operating room dashboard based on a user selection from the menu, and deleting or repositioning the user-informative elements on the operating room dashboard based on a user command.
- the menu of items can include two or more of: an approach, a complexity score, an instrument score, an engagement level, specialties, trainees, an average case timing, daily usage, computer identifiers, insights, operating rooms, surgeons, total number of cases, and types of procedures
- the one or more filter criteria can include one or more of: a date range selection, a case duration selection, a surgeon selection, a procedure selection, a specialty selection, an operating room selection, a complexity score selection, and a patient selection.
- a computer program product includes a memory device with computer readable instructions stored thereon (e.g., a computer-readable storage medium), where executing the computer readable instructions by one or more processing units causes the one or more processing units to perform a plurality of operations that include accessing surgical data captured from a plurality of surgical procedures, the surgical data identifying one or more surgeons and an operating room associated with each of the surgical procedures.
- the operations also include analyzing the surgical data to determine one or more insights about the surgical procedures and displaying the one or more insights.
- the operations further include filtering the surgical data based on the one or more filter criteria, updating the one or more insights about the surgical data based on the filtering, and displaying updates to the one or more insights.
- the one or more insights can be determined based on videos of the surgical procedures.
- the operations can include displaying a plurality of user-informative elements on an operating room dashboard, adjusting an underlying dataset of the user-informative elements based on the filtering, and displaying updates to the user-informative elements based on the adjustment to the underlying dataset, where the user-informative elements can include one or more of: a total case and case forecast view, a per-operating-room view, an average daily case and usage view, a procedures-by-duration view, a procedure-by-volume view, a case complexity view, a per-capture-computer view, and a surgeon summary view.
- the operating room dashboard can be configured for display on a mobile device by displaying the user-informative elements responsive to a first user input scrolling in a first direction and displaying the one or more insights responsive to a second user input scrolling in a second direction that is orthogonal to the first direction.
- the computer-readable storage medium can be a tangible device that can retain and store instructions for use by an instruction execution device.
- the computer-readable storage medium may be, for example, but is not limited to, an electronic storage device, a magnetic storage device, an optical storage device, an electromagnetic storage device, a semiconductor storage device, or any suitable combination of the foregoing.
- a non-exhaustive list of more specific examples of the computer-readable storage medium includes the following: a portable computer diskette, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or Flash memory), a static random access memory (SRAM), a portable compact disc read-only memory (CD-ROM), a digital versatile disk (DVD), a memory stick, a floppy disk, a mechanically encoded device such as punch-cards or raised structures in a groove having instructions recorded thereon, and any suitable combination of the foregoing.
- a computer-readable storage medium is not to be construed as being transitory signals per se, such as radio waves or other freely propagating electromagnetic waves, electromagnetic waves propagating through a waveguide or other transmission media (e.g., light pulses passing through a fiber-optic cable), or electrical signals transmitted through a wire.
- Computer-readable program instructions described herein can be downloaded to respective computing/processing devices from a computer-readable storage medium or to an external computer or external storage device via a network, for example, the Internet, a local area network, a wide area network, and/or a wireless network.
- the network may comprise copper transmission cables, optical transmission fibers, wireless transmission, routers, firewalls, switches, gateway computers, and/or edge servers.
- a network adapter card or network interface in each computing/processing device receives computer-readable program instructions from the network and forwards the computer-readable program instructions for storage in a computer- readable storage medium within the respective computing/processing device.
- Computer-readable program instructions for carrying out operations of the present invention may be assembler instructions, instruction- set- architecture (ISA) instructions, machine instructions, machine-dependent instructions, microcode, firmware instructions, state-setting data, configuration data for integrated circuitry, or either source-code or object code written in any combination of one or more programming languages, including an object-oriented programming language such as Smalltalk, C++, or the like, and procedural programming languages, such as the “C” programming language or similar programming languages.
- the computer-readable program instructions may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or server.
- the remote computer may be connected to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet Service Provider).
- electronic circuitry including, for example, programmable logic circuitry, field-programmable gate arrays (FPGA), or programmable logic arrays (PLA) may execute the computer-readable program instruction by utilizing state information of the computer-readable program instructions to personalize the electronic circuitry, in order to perform aspects of the present invention.
- These computer-readable program instructions may be provided to a processor of a general-purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks.
- These computer- readable program instructions may also be stored in a computer-readable storage medium that can direct a computer, a programmable data processing apparatus, and/or other devices to function in a particular manner, such that the computer-readable storage medium having instructions stored therein comprises an article of manufacture including instructions which implement aspects of the function/act specified in the flowchart and/or block diagram block or blocks.
- the computer-readable program instructions may also be loaded onto a computer, other programmable data processing apparatus, or other device to cause a series of operational steps to be performed on the computer, other programmable apparatus or other devices to produce a computer-implemented process, such that the instructions which execute on the computer, other programmable apparatus, or other device implement the functions/acts specified in the flowchart and/or block diagram block or blocks.
- each block in the flowchart or block diagrams may represent a module, segment, or portion of instructions, which comprises one or more executable instructions for implementing the specified logical function(s).
- the functions noted in the blocks may occur out of the order noted in the Figures.
- two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved.
- exemplary is used herein to mean “serving as an example, instance or illustration.” Any aspect or design described herein as “exemplary” is not necessarily to be construed as preferred or advantageous over other aspects or designs.
- the terms “at least one” and “one or more” may be understood to include any integer number greater than or equal to one, i.e., one, two, three, four, etc.
- the terms “a plurality” may be understood to include any integer number greater than or equal to two, i.e., two, three, four, five, etc.
- connection may include both an indirect “connection” and a direct “connection.”
- the described techniques may be implemented in hardware, software, firmware, or any combination thereof. If implemented in software, the functions may be stored as one or more instructions or code on a computer-readable medium and executed by a hardware-based processing unit.
- Computer-readable media may include non-transitory computer-readable media, which corresponds to a tangible medium such as data storage media (e.g., RAM, ROM, EEPROM, flash memory, or any other medium that can be used to store desired program code in the form of instructions or data structures and that can be accessed by a computer).
- processors such as one or more digital signal processors (DSPs), general purpose microprocessors, application specific integrated circuits (ASICs), field programmable gate arrays (FPGAs), or other equivalent integrated or discrete logic circuitry.
- processors may refer to any of the foregoing structure or any other physical structure suitable for implementation of the described techniques. Also, the techniques could be fully implemented in one or more circuits or logic elements.
Landscapes
- Engineering & Computer Science (AREA)
- Health & Medical Sciences (AREA)
- Public Health (AREA)
- Medical Informatics (AREA)
- General Business, Economics & Management (AREA)
- Biomedical Technology (AREA)
- Epidemiology (AREA)
- General Health & Medical Sciences (AREA)
- Primary Health Care (AREA)
- Business, Economics & Management (AREA)
- Data Mining & Analysis (AREA)
- Databases & Information Systems (AREA)
- Pathology (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
Techniques are described for providing a user interactive dashboard that provides insights into use of operating rooms at a medical center based on surgical data captured by surgical data capture systems in the operating rooms. User-informative elements and the insights can be displayed and adjusted based on applying one or more filters to underlying datasets of surgical data. An operating room dashboard can display the user-informative elements and the insights and allow for user customizations of displayed content.
Description
OPERATING ROOM DASHBOARD
BACKGROUND
[0001] The present disclosure is generally related to computing technology, and particularly to improvements to systems that facilitate provision of guidance about surgical procedures in a hospital setting where video streams of the surgical procedures are being captured.
[0002] Medical centers (e.g., hospitals, clinics, outpatient surgery centers, etc.) have to manage demands on services, such as performing surgical procedures, based on limited health care resources, such as operating rooms, surgeons, staff, instruments, etc. With increases in overall population and in demand for health care services, improving the efficiency with which such increased demand is met using the limited resources is critical. Inefficiency in the process of patient care involving surgical services can be extremely costly to medical centers, patients, and society in general.
SUMMARY
[0003] According to an aspect, a system includes a memory device and one or more processors coupled with the memory device, the one or more processors configured to perform a plurality of operations. The operations include accessing surgical data captured from a plurality of surgical procedures on an operating room basis, the surgical data identifying one or more surgeons and an operating room associated with each of the surgical procedures. One or more insights about the surgical data are autonomously determined. One or more filter criteria are received, and the surgical data is filtered based on the one or more filter criteria to adjust an underlying dataset of the user-informative elements. The one or more insights about the surgical data are updated based on the adjustment to the underlying dataset. Updates to the user- informative elements and the one or more insights are displayed based on the adjustment to the underlying dataset.
[0004] According to another aspect, a method includes accessing surgical data captured from a plurality of surgical procedures, the surgical data identifying one or more surgeons and an
operating room associated with each of the surgical procedures. The method also includes analyzing the surgical data to determine one or more insights about use of the operating rooms and actions performed by the one or more surgeons and displaying the one or more insights on an operating room dashboard. The method further includes filtering the surgical data based on the one or more filter criteria, updating the one or more insights about the surgical data based on the filtering, and displaying updates to the one or more insights on the operating room dashboard.
[0005] According to a further aspect, a computer program product includes a memory device with computer readable instructions stored thereon, where executing the computer readable instructions by one or more processing units causes the one or more processing units to perform a plurality of operations that include accessing surgical data captured from a plurality of surgical procedures, the surgical data identifying one or more surgeons and an operating room associated with each of the surgical procedures. The operations also include analyzing the surgical data to determine one or more insights about the surgical procedures and displaying the one or more insights. The operations further include filtering the surgical data based on the one or more filter criteria, updating the one or more insights about the surgical data based on the filtering, and displaying updates to the one or more insights.
[0006] The above features and advantages, and other features and advantages, of the disclosure are readily apparent from the following detailed description when taken in connection with the accompanying drawings.
BRIEF DESCRIPTION OF THE DRAWINGS
[0007] The specifics of the exclusive rights described herein are particularly pointed out and distinctly claimed in the claims at the conclusion of the specification. The foregoing and other features and advantages of the aspects of the invention are apparent from the following detailed description taken in conjunction with the accompanying drawings in which:
[0008] FIG. 1 shows a computer assisted surgical system according to one or more aspects;
[0009] FIG. 2 depicts an example user-interactive report of surgical procedures performed using operating rooms at a medical center according to one or more aspects;
[0010] FIGS. 3A-3B depict example views of surgical procedures analyzed according to one or more aspects;
[0011] FIG. 4 depicts example user-interactive views of surgical procedures analyzed per surgeon according to one or more aspects;
[0012] FIGS. 5A-5C depict example user interactivity of a report according to one or more aspects;
[0013] FIG. 6 depicts filtering a report according to one or more aspects;
[0014] FIGS. 7A-7B depict user-interactive views to analyze selected surgical procedures according to one or more aspects;
[0015] FIG. 8 depicts an example of a display on a mobile device according to one or more aspects;
[0016] FIG. 9 depicts a flowchart of a method according to one or more aspects;
[0017] FIG. 10 depicts a surgical procedure system according to one or more aspects; and
[0018] FIG. 11 depicts a block diagram of a computer system according to one or more aspects.
[0019] The diagrams depicted herein are illustrative. There can be many variations to the diagram, or the operations described therein without departing from the spirit of the invention. For instance, the actions can be performed in a differing order, or actions can be added, deleted, or modified. Also, the term “coupled,” and variations thereof describe having a communications path between two elements and do not imply a direct connection between the elements with no intervening elements/connections between them. All of these variations are considered a part of the specification.
DETAILED DESCRIPTION
[0020] Exemplary aspects of technical solutions described herein relate to, among other things, devices, systems, methods, computer-readable media, techniques, and methodologies for using machine learning and computer vision to improve a computerized management system of a medical center. Particularly, aspects of the technical solutions herein improve computerized management system of operating rooms in the medical center.
[0021] An operating room dashboard can provide insights into the utilization of a medical center’s resources, such as operating rooms, and other resources (human and materials) used during surgical procedures. The dashboard can provide such insights based on surgical video(s) captured of the surgical procedures being performed. The video(s) can include intracorporeal video from within a patient’s body and extracorporeal video captured from within the operating room(s). The insights provided can be automatically determined to facilitate operators, such as staff, administrators, surgeons, etc. of the medical center to take one or more responsive actions.
[0022] In one or more aspects, structures are predicted dynamically and substantially in real-time as the surgical data is being captured and analyzed by technical solutions described herein. A predicted structure can be an anatomical structure, a surgical instrument, etc. Exemplary aspects of technical solutions described herein further facilitate generating augmented views of surgical sites using semantic surgical representations based on the predictions of the one or more structures in the surgical data. Further, based on the detected structures and features from the captured surgical data, technical solutions herein can determine phases, actions, and other specific aspects of a surgical procedure being performed. Based on the determination of such information about the ongoing surgical procedures, aspects of the technical solutions herein facilitate providing insights (e.g., graphs, notifications, highlights, etc.) for an administrator to manage one or more resources. Here, managing resources can include scheduling resources, assigning resources, replacing resources, etc. As noted elsewhere, resources can include human resources (e.g., surgeons, staff, etc.) and/or material resources (e.g., operating rooms, tools, equipment, pharmaceuticals, dyes, etc.).
[0023] Technical effects can include management of computer memory resources and network traffic by allowing user selection and configuration of the user-informative elements on an operating room dashboard. For example, allowing a user to add or remove user-informative elements to only show the content of interest reduces the memory resource, processing resource, and network resource usage for content that is not relevant or useful for a user. Further, the ability to rearrange the user-informative elements on the operating room dashboard by an order preference of the user can allow a user to group more relevant or related views of interest in close physical proximity on a display to reduce scrolling and searching actions that may otherwise result in more resource consumption. For instance, as user-informative elements and insights are dynamically populated when shifted into a viewable position on a display, excessive scrolling and searching for a desired user-informative element can result in additional resource utilization in populating content that is not of interest if selection of the user-informative elements was not configurable. Further, insights can be dynamically determined based on data set filters. Dynamic determination of insights can reduce data storage requirements as compared to prepopulating a database with possible insights that may not be accessed or deemed useful.
[0024] FIG. 1 depicts an example system according to one or more aspects. The system 100 includes at least a surgical data capture system 102, a video recording system 104, and a surgical instrumentation system 106, in each operating room (OR) 101. There may be N operating rooms, OR-1 101 A, OR-2 101B, . . ., OR-N 101N, N being any integer.
[0025] In each operating room 101, actor 112 can be medical personnel that uses the system 100 to perform a surgical procedure on a patient 110. Medical personnel can be a surgeon, assistant, nurse, administrator, or any other actor that interacts with the system 100 in a surgical environment. The surgical procedure can be any type of surgery such as but not limited to cataract surgery, laparoscopic cholecystectomy, endoscopic endonasal transsphenoidal approach (eTSA) to resection of pituitary adenomas, or any other surgical procedure. In other examples, the actor 112 can be a technician, an administrator, an engineer, or any other such personnel that interacts with the system 100. For example, the actor 112 can record data from the system 100, configure/update one or more attributes of the system 100, review past performance of the system 100, repair the system 100, etc.
[0026] A surgical procedure can include multiple phases, and each phase can include one or more surgical actions. A “surgical action” can include an incision, a compression, a stapling, a clipping, a suturing, a cauterization, a sealing, or any other such actions performed to complete a phase in the surgical procedure. A “phase” represents a surgical event that is composed of a series of steps (e.g., closure). A “step” refers to the completion of a named surgical objective (e.g., hemostasis). During each step, certain surgical instruments 108 (e.g., forceps) are used to achieve a specific objective by performing one or more surgical actions.
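As a non-limiting illustration only, the following Python sketch shows one possible in-memory representation of this procedure/phase/step/action hierarchy. The class and field names (SurgicalProcedure, Phase, Step, SurgicalAction) are hypothetical and are not drawn from this disclosure.

```python
from __future__ import annotations
from dataclasses import dataclass, field
from typing import List

@dataclass
class SurgicalAction:
    name: str          # e.g., "incision", "stapling", "suturing", "cauterization"
    instrument: str    # e.g., "forceps"

@dataclass
class Step:
    objective: str     # a named surgical objective, e.g., "hemostasis"
    actions: List[SurgicalAction] = field(default_factory=list)

@dataclass
class Phase:
    name: str          # a surgical event composed of a series of steps, e.g., "closure"
    steps: List[Step] = field(default_factory=list)

@dataclass
class SurgicalProcedure:
    procedure_type: str
    phases: List[Phase] = field(default_factory=list)

# Example: one phase with one step containing a single action.
procedure = SurgicalProcedure(
    procedure_type="laparoscopic cholecystectomy",
    phases=[Phase(name="closure",
                  steps=[Step(objective="hemostasis",
                              actions=[SurgicalAction("cauterization", "forceps")])])],
)
```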
[0027] The surgical instrumentation system 106 provides electrical energy to operate one or more surgical instruments 108 to perform the surgical actions. The electrical energy triggers an activation in the surgical instrument 108. The electrical energy can be provided in the form of an electrical current or an electrical voltage. The activation can cause a surgical action to be performed. The surgical instrumentation system 106 can further include electrical energy sensors, electrical impedance sensors, force sensors, bubble and occlusion sensors, and various other types of sensors. The electrical energy sensors can measure and indicate an amount of electrical energy applied to one or more surgical instruments 108 being used for the surgical procedure. The impedance sensors can indicate an amount of impedance measured by the surgical instruments 108, for example, from the tissue being operated upon. The force sensors can indicate an amount of force being applied by the surgical instruments 108. Measurements from various other sensors, such as position sensors, pressure sensors, flow meters, can also be input.
[0028] The video recording system 104 includes one or more cameras, such as operating room cameras, endoscopic cameras, etc. The cameras capture video data of the surgical procedure being performed. The video recording system 104 includes one or more video capture devices that can include cameras placed in the surgical room to capture events surrounding (i.e., outside) the patient being operated upon. The video recording system 104 further includes cameras that are passed inside (e.g., endoscopic cameras) the patient to capture endoscopic data. The endoscopic data provides video and images of the surgical procedure (e.g., FIG. 4).
[0029] The surgical data capture system 102 includes one or more memory devices, one or more processors, a user interface device, among other components. The surgical data capture
system 102 can execute one or more computer executable instructions. The execution of the instructions facilitates the surgical data capture system 102 to perform one or more methods, including those described herein. The surgical data capture system 102 can communicate with other computing systems via a wired and/or a wireless network. In one or more examples, the surgical data capture system 102 includes one or more trained machine learning models that can detect and/or predict features of/from the surgical procedure that is being performed, or has been performed earlier. Features can include structures, such as anatomical structures and surgical instruments 108 used in the surgical procedure. Features can further include events, such as phases and actions in the surgical procedure. Features that are detected can further include the actor 112 and the patient 110. Based on the detection, the surgical data capture system 102, in one or more examples, can provide recommendations for subsequent actions to be taken by the actor 112. Alternatively, or in addition, the surgical data capture system 102 can provide one or more reports based on the detections. The detections by the machine learning models can be performed in an autonomous or semi-autonomous manner.
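A minimal, non-authoritative sketch of such feature detection follows. It assumes a hypothetical trained model object exposing a predict(frame) method that returns (label, confidence) pairs; the actual machine learning models and their interfaces are not specified by this disclosure.

```python
from dataclasses import dataclass
from typing import Iterable, List

@dataclass
class Detection:
    frame_index: int
    label: str          # e.g., an anatomical structure, instrument, phase, or action
    confidence: float

def detect_features(frames: Iterable, model, threshold: float = 0.5) -> List[Detection]:
    """Run a trained detector over captured video frames and keep confident detections.

    `model` is any object with a predict(frame) -> list of (label, confidence) pairs;
    it stands in for the trained machine learning models described above.
    """
    detections: List[Detection] = []
    for index, frame in enumerate(frames):
        for label, confidence in model.predict(frame):
            if confidence >= threshold:
                detections.append(Detection(index, label, confidence))
    return detections
```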
[0030] The machine learning models can include artificial neural networks, such as deep neural networks, convolutional neural networks, recurrent neural networks, encoders, decoders, or any other type of machine learning models. The machine learning models can be trained in a supervised, unsupervised, or hybrid manner. The machine learning models can be trained to perform detection and/or prediction using one or more types of data acquired by the system 100. For example, the machine learning models can use the video data captured via the video recording system 104. Alternatively, or in addition, the machine learning models use the surgical instrumentation data from the surgical instrumentation system 106. In yet other examples, the machine learning models use a combination of the video and the surgical instrumentation data.
[0031] Additionally, in some examples, the machine learning models can also use audio data captured during the surgical procedure. The audio data can include sounds emitted by the surgical instrumentation system 106 while activating one or more surgical instruments 108. Alternatively, or in addition, the audio data can include voice commands, snippets, or dialog from one or more actors 112. The audio data can further include sounds made by the surgical instruments 108 during their use.
[0032] In one or more examples, the machine learning models can detect surgical actions, surgical phases, anatomical structures, surgical instruments, and various other features from the data associated with a surgical procedure. The detection can be performed in real-time in some examples. Alternatively, or in addition, the surgical data capture system 102 analyzes the surgical data, i.e., the various types of data captured during the surgical procedure, in an offline manner (e.g., post-surgery).
[0033] The surgical data capture system 102 from each operating room 101 can be in communication with an OR dashboard 120. The OR dashboard 120 can be (or be a part of) a computing system. For example, the OR dashboard 120 can be a part of a hospital management system (HMS), or any other such enterprise level system. In some aspects, the OR dashboard 120 can include the machine learning capabilities that are described herein. Alternatively, or in addition, the OR dashboard receives information and commands from the one or more surgical data capture systems 102. Further, in some aspects, the OR dashboard 120 provides data and commands to the surgical data capture system 102.
[0034] FIG. 2 depicts an example report of an analysis performed by the surgical data capture system 102 using the surgical data. The report 200 is user-interactive and displayed on the OR dashboard 120. The report 200 can be displayed via the user interface of the OR dashboard 120. In some aspects, the report can also be displayed by the surgical data capture system 102.
[0035] The report 200 can include a user-informative element 202 (e.g., a total case and case forecast view as a user-informative element) that indicates a number of surgeries performed during a selected duration, for example, a fiscal year, a month, or any other selected duration. Further, the user-informative element 202 includes a forecast of how many surgical procedures are expected to be performed in an upcoming duration, for example, a quarter, a month, etc. The user-informative element 202 can include one or more interactive elements that the user can configure. In an example, the number of surgical procedures performed and predicted can be shown in the form of a graph. The user can configure the type of the graph, e.g., line graph, bar graph, pie graph, etc. The user can also configure one or more visual attributes, e.g., color, shape, etc., associated with the user-informative element 202.
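For illustration only, the sketch below derives monthly case counts and a simple moving-average forecast from hypothetical case records having a `start` datetime field; the field name and the naive forecasting approach are assumptions, not the forecasting method of the dashboard.

```python
from collections import Counter

def monthly_case_counts(cases: list) -> dict:
    """Count completed cases per calendar month ("YYYY-MM"), sorted chronologically."""
    counts = Counter(c["start"].strftime("%Y-%m") for c in cases)
    return dict(sorted(counts.items()))

def forecast_next_month(counts: dict, window: int = 3) -> float:
    """Naive forecast: mean of the most recent `window` monthly totals."""
    recent = list(counts.values())[-window:]
    return sum(recent) / len(recent) if recent else 0.0
```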
[0036] In some aspects, the user can provide, or the OR dashboard 120 may receive, a target number of surgical procedures. The user-informative element 202 may show a comparison between the number of surgical procedures performed, predicted, and targeted.
[0037] Further, the report 200 can include user-informative elements 204 that provide further details about the surgical procedures performed so far in the selected duration. For example, the user-informative element 204 can list the performed surgical procedures according to a type of the surgical procedure. For example, a predetermined list of types of surgical procedures with volume of procedures for each type is provided. The information can be provided in a textual, graphical, or any other manner or a combination thereof. Alternatively, or in addition, the user-informative elements 204 can categorize the surgical procedures performed in the selected duration (in 202), according to the duration of each surgical procedure. The information can be provided in a textual, graphical, or any other manner or a combination thereof.
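One possible aggregation behind such user-informative elements 204 is sketched below, grouping hypothetical case records (with assumed `procedure` and `duration_min` fields) by procedure type and by duration bucket.

```python
from collections import Counter

def volume_by_type(cases: list) -> Counter:
    """Procedure volume per procedure type, e.g., for a procedures-by-volume element."""
    return Counter(c["procedure"] for c in cases)

def volume_by_duration(cases: list, edges=(30, 60, 120, 240)) -> Counter:
    """Bucket cases by duration in minutes, e.g., for a procedures-by-duration element."""
    buckets = Counter()
    for c in cases:
        label = next((f"<{e} min" for e in edges if c["duration_min"] < e),
                     f">={edges[-1]} min")
        buckets[label] += 1
    return buckets
```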
[0038] Additionally, the report 200 includes an insight view 206 that provides automatically detected anomalies or patterns in the surgical data associated with the surgical procedures selected (in 202), where the insight view 206 displays one or more insights as a user-informative element. For example, the anomalies or patterns associated with the surgical data can identify surgical procedures that took longer (or shorter) than a typical time for such types of surgical procedures. Alternatively, or in addition, the insights can be based on anomalies or patterns associated with types of surgical procedures performed within a certain number of days in the past. The anomalies or patterns that are detected can be configured by a user, for example, by specifying factors to use to detect such variation from normal. For example, the factors can include duration, type, surgeon, staff, equipment, etc., and a combination thereof.
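As one simple, non-authoritative way to surface duration anomalies of this kind, the sketch below applies a per-procedure-type z-score test to hypothetical case records (assumed fields: `procedure`, `duration_min`, `start`); the dashboard's actual anomaly and pattern detection is not limited to, nor defined by, this approach.

```python
from collections import defaultdict
from statistics import mean, pstdev

def duration_anomalies(cases: list, z_threshold: float = 2.0) -> list:
    """Flag cases whose duration deviates strongly from the typical duration
    for that procedure type; one possible basis for an insight entry."""
    by_type = defaultdict(list)
    for c in cases:
        by_type[c["procedure"]].append(c)
    insights = []
    for proc, group in by_type.items():
        durations = [c["duration_min"] for c in group]
        if len(durations) < 3:
            continue  # not enough history for a meaningful baseline
        mu, sigma = mean(durations), pstdev(durations)
        if sigma == 0:
            continue
        for c in group:
            z = (c["duration_min"] - mu) / sigma
            if abs(z) >= z_threshold:
                direction = "longer" if z > 0 else "shorter"
                insights.append(
                    f"{proc} case on {c['start']:%Y-%m-%d} ran {direction} than typical "
                    f"({c['duration_min']} min vs. mean {mu:.0f} min)")
    return insights
```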
[0039] It is understood that the number of insights shown, number of surgical procedures selected and/or analyzed, and other such aspects depicted in FIG. 2 are exemplary, and can vary in other aspects.
[0040] Further, each of the user-informative elements 202, 204, 206 that represents a surgical procedure, in response to a first interaction, such as a hover, a click, a right-click, a touch, a voice command, etc., provides detailed information about that surgical procedure via
another user-informative element (not shown). For example, the other user-informative element can identify the procedure being performed, phases of the procedure, a duration of the procedure, the personnel involved, equipment used, patient information, etc.
[0041] Further yet, in response to another interaction, e.g., a click, a double click, a right click, etc., with a user-interactive element of the report 200, a view 300 is displayed. FIG. 3A depicts an example view 300.
[0042] The view 300 includes a user-informative element 302 that provides detailed information about surgical procedures per operating room 101. For example, the user-informative element 302 indicates a number of procedures performed per operating room 101, average procedures per operating room 101, and an amount of video recording performed in each operating room 101. Additional parameters can also be configured to be displayed in some aspects.
[0043] Further, the view 300 includes a user-informative element 304 that provides further details about one or more parameters associated with the operating rooms 101. In some aspects, the user-informative element 304 is displayed in response to a user-interaction with one or more parameters depicted in the user-informative element 302. For example, upon selecting the average number of procedures per operating room in the user-informative element 302, the user-informative element 304 displays a timeline of the selected duration (in 202) and a graph with the average number of procedures performed per operating room 101 for all of the operating rooms 101. For example, the average daily procedures, or any other selected parameter, is color encoded for each operating room, with the color encoding displayed as an index. The color encoding may be performed dynamically or based on a preselected list of colors for each operating room. In other aspects, in addition to or alternative to color, other visual attributes may be used to represent the selected parameter.
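A minimal sketch of the aggregation behind such a per-operating-room parameter, assuming hypothetical case records with `operating_room` and `start` fields (the field names are illustrative only):

```python
from collections import defaultdict

def average_daily_cases_per_room(cases: list) -> dict:
    """Average cases per day for each operating room, over the days that room was used."""
    per_room_days = defaultdict(lambda: defaultdict(int))
    for c in cases:
        per_room_days[c["operating_room"]][c["start"].date()] += 1
    return {room: sum(day_counts.values()) / len(day_counts)
            for room, day_counts in per_room_days.items()}
```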
[0044] FIG. 3B depicts another example of the user-informative element 304 in which the surgical procedures performed are displayed using a color encoded line graph for each operating room 101, each line graph depicting durations of surgical procedures performed over the timeline. In some aspects, the user can interact with each line graph, which results in display
of another view (not shown) that provides a breakdown of the surgical procedures performed in the operating room 101 corresponding to the line graph selected (or interacted with).
[0045] Further, the user-informative element 304, in some aspects, shows an indication of the operating room that was used as an emergency room only for operating on emergency cases. Such an operating room may have specific/special attributes associated with it, which may result in exclusion of data from that operating room when generating particular insights. Alternatively, or in addition, specific insights may be generated only based on the operating room 101 used for the emergency cases.
[0046] FIG. 4 depicts an example of another view 400. In some aspects, the view 400 can be displayed in response to a user-interaction with one or more of the user-informative elements in other figures herein. The view 400 includes a user-informative element 402, which depicts the surgical procedures performed during the selected duration (in 202) based on one or more of the personnel involved, for example, based on the surgeon. The details shown per surgeon can include the types of procedures, total number of procedures, average procedures (per day), average duration per case, average duration per day, etc. In some aspects, the information can further include complexity ratings associated with the surgical procedures performed by each surgeon. The complexity ratings can be based on automated identification of features of the surgical procedures, where the ratings are assigned based on presence (or absence) of one or more surgical features. Alternatively, or in addition, the complexity ratings can be based on one or more annotations provided by the surgeon during the surgical procedures. The complexity ratings can be depicted using one or more visual attributes, such as color, in some aspects.
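The per-surgeon details could be aggregated along the lines of the following sketch, again using hypothetical record fields (`surgeon`, `procedure`, `duration_min`, `complexity`) for illustration; the actual computation of complexity ratings is described above and is not reproduced here.

```python
from collections import defaultdict
from statistics import mean

def surgeon_summaries(cases: list) -> dict:
    """Per-surgeon totals, procedure types, average case duration, and average complexity."""
    by_surgeon = defaultdict(list)
    for c in cases:
        by_surgeon[c["surgeon"]].append(c)
    summary = {}
    for surgeon, group in by_surgeon.items():
        summary[surgeon] = {
            "total_cases": len(group),
            "procedure_types": sorted({c["procedure"] for c in group}),
            "avg_duration_min": mean(c["duration_min"] for c in group),
            "avg_complexity": mean(c["complexity"] for c in group),
        }
    return summary
```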
[0047] The view 400 can further include a user-informative element 404 that displays the information of the surgical procedures based on the surgical data capture systems 102 in each operating room 101. The surgical data capture systems 102 can be moved from one operating room 101 to another, and accordingly, the number of surgical procedures recorded by a surgical data capture system 102 can differ from the number of procedures performed in the operating room 101 in which it is currently located. The depicted information in the user-informative element 404 includes a number of procedures recorded on each of the surgical data capture systems 102 connected to the OR dashboard 120, and details for each of such surgical procedures.
[0048] FIG. 5A depicts example user interactivity of the report 200 according to one or more aspects. For example, the report 200 includes a user interactive element 502 that facilitates adding more views (300, 400, etc.) to the report 200. Interaction (e.g., click, hover, double click, touch, etc.) with the user interactive element 502 causes a menu bar 504 to be displayed. The menu bar 504 includes several types of views that a user can add to the report 200. The types of views can be represented by labels, buttons, or any other selectable and interactive element in the menu bar 504. Interaction with the one or more types of views in the menu bar 504 causes a corresponding view to be added to the report 200. FIG. 5B depicts the report 200 with a corresponding view 506 for a case complexity rating view added per the interaction depicted in FIG. 5A.
[0049] FIG. 5C depicts additional example interactivity with the report 200. For example, one or more views of the report 200 can be moved from one position to another. In FIG. 5C, the positions of the view 506 and the view 404 are swapped. Further, each of the views (200, 300, 400, etc.) includes a user interactive element 508. Interaction with the user interactive element 508 displays a menu bar that is context specific to that view. The menu bar includes one or more selectable options. For example, the options can be selectable by being represented as buttons, icons, images, labels, etc. The options can facilitate removing the view from the report 200, sending feedback about the view to a developer of the OR dashboard 120, switching between a graphical/list view, etc.
[0050] FIG. 6 depicts a view 600 that is displayed in response to an interaction with a user interactive element 208 (see FIG. 2). The user interactive element 208 facilitates filtering the surgical data being used to generate the report 200, including the one or more views (300, 400, etc.) in the report. In some aspects, interaction with the user interactive element 208 displays the view 600. The view 600 includes a menu bar 602, with one or more selectable filters. The filters are selectable by representing the filters as menu items, such as labels, icons, buttons, etc. Each filter can include one or more parameters that the user can select to filter the surgical data accordingly. For example, the filters can include filtering the surgical data according to date, time, day of the week, or any other temporal aspect. Further, the filters can include filtering the surgical data according to specifics of each surgical procedure, for example, case duration, personnel (e.g., surgeon), procedure type, specialty, operating room, complexity rating, patient information, etc., or a combination thereof. Once the filters are configured and applied, all the views in the report are updated correspondingly.
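A minimal sketch of this update flow is shown below, in which each view of the report is a function recomputed over the same filtered dataset whenever a new filter configuration is applied; the Report class, its methods, and the predicate-based filtering are hypothetical illustrations rather than the dashboard's implementation.

```python
class Report:
    """Illustrative container: every registered view is recomputed from the
    filtered dataset whenever new filter criteria are applied."""

    def __init__(self, all_cases):
        self._all_cases = list(all_cases)
        self._views = {}       # view name -> function(filtered_cases) -> rendered content
        self.rendered = {}     # view name -> most recently rendered content

    def add_view(self, name, compute):
        self._views[name] = compute

    def apply_filters(self, predicate):
        """`predicate` maps a case record to True/False per the selected filters."""
        filtered = [case for case in self._all_cases if predicate(case)]
        # All views in the report are updated correspondingly from the same dataset.
        self.rendered = {name: compute(filtered) for name, compute in self._views.items()}
        return self.rendered

# Example usage (hypothetical view functions and fields):
# report = Report(cases)
# report.add_view("by_volume", lambda cs: len(cs))
# report.apply_filters(lambda c: c["operating_room"] == "OR-1")
```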
[0051] Referring to FIG. 2, the report 200 includes a menu bar 210. The menu bar 210 includes one or more selectable menu items. Each menu item can be a button, a label, an icon, etc., which can be interacted with to cause one or more responsive actions. For example, the menu bar 210 includes an item that causes the report 200 to be saved or downloaded. The report 200 can be saved in a digital form using one or more formats, such as portable document format (PDF), extensible markup language (XML), or any other file format. The menu bar 210 further includes an item that facilitates printing the report 200. The printing can be performed using any printer driver accessible by the OR dashboard 120. Further, the menu bar 210 includes an item that facilitates opening a manual related to the OR dashboard 120. Further yet, a menu item can facilitate opening a settings menu that facilitates configuring one or more settings of the OR dashboard 120 and/or the report 200. For example, the settings can include visual attributes, default values, etc.
[0052] Further, the menu bar 210 can include a menu item that facilitates sharing the report 200. The sharing can send a link to the report 200 to another user of the OR dashboard 120, with which the other user can access the report 200. In one or more aspects, the sharer can specify a level of accessibility of the other user, for example, view only, edit, etc.
[0053] The report 200 further includes user interactive elements 212, which allow an operator to display additional views to further analyze the surgical data. FIG. 7A depicts an example view 700 that is displayed in response to an interaction with the user interactive element 212 of the view 202. The view 700 facilitates identifying different durations in the timeline of the performed surgical procedures where particular attributes of the surgical procedures are commonly found. The view 700 includes characteristics or categories of surgical procedures listed as user interactive elements 702. Each of the user interactive elements 702 can be displayed using different visual attributes, for example, distinct colors. Further, in the view 700, the surgical procedures being analyzed from a particular duration (e.g., fiscal year, a quarter, one month, within specific dates, etc.) are represented along a timeline 704 spanning the particular duration.
[0054] The surgical procedures that meet a characteristic are marked with the same visual attributes that are associated with the user interactive element 702 of that characteristic. In some aspects, the marking is performed by generating and displaying, in the view 700, a visual element 706 (e.g., a line) that is orthogonal to the timeline 704 and that intersects the position of the surgical procedure meeting the characteristic. The visual element 706 has the same visual attributes as the user interactive element 702 of that characteristic.
[0055] The user interactive elements 702 representing the characteristics can be selected/interacted with by the operator. FIG. 7B depicts an example view generated upon interaction with a user interactive element 702 representing the characteristics. For example, the first user interactive element 702 selected by the operator is highlighted (or other user interactive elements 702 are faded) to depict selection of the first user interactive element 702. Further, the first visual element 706 corresponding to the first user interactive element 702 is highlighted (or other visual elements 706 corresponding to the other user interactive elements 702 are faded). In one or more aspects, the operator can select the first visual element 706 (instead of the first user interactive element 702) and reach the same view 700 as shown in FIG. 7B. Further, upon such selection of the first user interactive element 702 or the visual element 706, the view 700 is updated to show the updated number of surgical procedures that are now selected (compare top left in FIGS. 7A and 7B). The insights view 206 is also updated to show anomalies and patterns that are detected from the newly selected surgical procedures that match the selection of the user interactive element 702 (or visual element 706).
[0056] In one or more aspects, the operator can add an annotation 710 to the view 700 as a multi-user comment box. For example, s/he can interact with a certain position (e.g., right click, click, etc.) and add an annotation 710. The annotation can be input by typing, voice command, etc. The input annotation 710 can be shared/directed to a particular user of the OR dashboard 120 in some aspects. In some aspects, the annotation is stored as a thread of communication to which other users of the OR dashboard 120 can add/edit their respective comments/annotations.
[0057] It should be noted that while the depicted reports and views in the drawings are generally shown to have a landscape view, the reports and views can be configured to be
displayed in portrait mode, for example, if the OR dashboard 120 is viewed and interacted upon on a device such as a smartphone, tablet computer, or other such electronic devices. The positioning of the views, user interactive elements, visual elements, and other items that are shown in the figures herein are configurable and can be adjusted dynamically based on an orientation, resolution, and other characteristics of the device that is being used to operate the OR dashboard 120.
[0058] Examples described herein facilitate providing a user-interactive system to visualize and analyze substantial amounts of data associated with the system 100. Generating such user-interactive reports of the substantial amounts of data is not practical for a human, and hence, the technical solutions described herein provide a practical application to address technical challenges and provide improvements to computing systems, such as operating room management systems, hospital management systems, etc. For example, the technical solutions described herein facilitate service providers to review surgical procedures performed at a medical center over a certain period of time (e.g., month, quarter, etc.) and provide feedback to the hospital, actors, or any other stakeholder.
[0059] The technical solutions described herein facilitate improvement in the performance of a surgical action, such as by identifying the actors and cases where particular surgical actions have been performed in the past. Technical solutions herein can also identify to an actor, such as a first surgeon, all instances of a surgical action (e.g., sealing) that s/he performed in a surgical procedure and a comparison of the number of the same surgical actions performed by other surgeons. The first surgeon can interactively see the surgical actions being performed by himself/herself and by the other surgeons, and determine improvements. For example, the first surgeon can observe ranges of durations for various procedures and phases by other surgeons, and emulate protocols used in such other procedures.
[0060] The technical solutions described herein can further facilitate comparing hospital quality care, surgeons, operating room use, surgical data capture systems, etc.
[0061] The examples described herein can be performed using a computer such as a server computer, a desktop computer, a tablet computer, etc. In one or more examples the technical solutions herein can be implemented using cloud computing technology.
[0062] FIG. 8 depicts an example of a display 800 on a mobile device according to one or more aspects. The display 800 can provide a version of the OR dashboard 120 scaled for a mobile device, such as a cellular phone or tablet computer. The display 800 can be configured to receive touch-sensitive input, allowing a user to tap, scroll/swipe, and use other such gestures to navigate the mobile version of the OR dashboard 120. In the example of FIG. 8, a user-informative element 802 is depicted similar to the user-informative element 202 of FIG. 2. One or more insights 804 can also be output on the display 800. In some aspects, the OR dashboard 120 can be configured to display other user-informative elements responsive to a first user input 806 scrolling in a first direction and to display the one or more insights 804 responsive to a second user input 808 scrolling in a second direction that is orthogonal to the first direction. For instance, an up/down scrolling action can reveal and populate other user-informative elements, while a left/right scrolling action can reveal additional insights 804. A user interactive element 810 can be provided on the display 800 to allow filtering of underlying data sets, which can result in updates to the user-informative elements and insights 804. Further, selection and ordering of the user-informative elements can also be configurable through the display 800.
[0063] Turning now to FIG. 9, a flowchart of a method 900 for OR dashboard generation and display is generally shown in accordance with one or more aspects. All or a portion of method 900 can be implemented, for example, by all or a portion of system 100 of FIG. 1, the system 1000 of FIG. 10, and/or computer system 1500 of FIG. 11.
[0064] At block 902, a system can access surgical data captured from a plurality of surgical procedures on an operating room basis, the surgical data identifying one or more surgeons and an operating room associated with each of the surgical procedures.
[0065] At block 904, the system can determine, autonomously, one or more insights about the surgical data.
[0066] At block 906, the system can display a report (e.g., report 200) indicative of the one or more insights on an operating room dashboard (e.g., OR dashboard 120) including a plurality of user-informative elements.
[0067] At block 908, the system can receive one or more filter criteria, for instance, through user interactive element 208. The one or more filter criteria can include, but are not limited to, one or more of: a date range selection, a case duration selection, a surgeon selection, a procedure selection, a specialty selection, an operating room selection, a complexity score selection, and a patient selection.
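By way of example only, such filter criteria could be carried in a simple structure and turned into a predicate over case records, as in the sketch below. The criteria structure, the record field names (`start`, `duration_min`, `surgeon`, `procedure`, `specialty`, `operating_room`, `complexity`, `patient`), and the matching logic are illustrative assumptions, not part of this disclosure.

```python
from __future__ import annotations
from dataclasses import dataclass
from typing import Iterable, Optional, Sequence

@dataclass
class FilterCriteria:
    # Each field mirrors one selection listed above; None means "not filtered".
    date_range: Optional[tuple] = None          # (start_date, end_date)
    case_duration_min: Optional[tuple] = None   # (low_minutes, high_minutes)
    surgeons: Optional[Sequence[str]] = None
    procedures: Optional[Sequence[str]] = None
    specialties: Optional[Sequence[str]] = None
    operating_rooms: Optional[Sequence[str]] = None
    complexity_scores: Optional[Sequence[int]] = None
    patients: Optional[Sequence[str]] = None

def matches(case: dict, f: FilterCriteria) -> bool:
    """Return True when a captured case record satisfies every active criterion."""
    if f.date_range and not (f.date_range[0] <= case["start"].date() <= f.date_range[1]):
        return False
    if f.case_duration_min and not (
            f.case_duration_min[0] <= case["duration_min"] <= f.case_duration_min[1]):
        return False
    for attr, key in [("surgeons", "surgeon"), ("procedures", "procedure"),
                      ("specialties", "specialty"), ("operating_rooms", "operating_room"),
                      ("complexity_scores", "complexity"), ("patients", "patient")]:
        allowed = getattr(f, attr)
        if allowed is not None and case[key] not in allowed:
            return False
    return True

def apply_filters(cases: Iterable[dict], f: FilterCriteria) -> list:
    return [c for c in cases if matches(c, f)]
```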
[0068] At block 910, the system can filter the surgical data based on the one or more filter criteria to adjust an underlying dataset of the user-informative elements.
[0069] At block 912, the system can update the one or more insights about the surgical data based on the adjustment to the underlying dataset.
[0070] At block 914, the system can display updates to the user-informative elements and the one or more insights based on the adjustment to the underlying dataset.
[0071] According to some aspects, the one or more insights can be determined based on video streams of the surgical procedures. The one or more insights can be viewed, for example, through insight view 206 and/or as insights 804.
[0072] According to some aspects, a plurality of user-informative elements can include, but are not limited to, one or more of: a total case and case forecast view (e.g., user-informative element 202), a per-operating-room view (e.g., user-informative element 302), an average daily case and usage view (e.g., user-informative element 304), a procedures-by-duration view (e.g., user-informative element 204), a procedure-by-volume view (e.g., user-informative element 204), a case complexity view (e.g., user-informative element 506), a per-capture-computer view (e.g., user-informative element 404), and a surgeon summary view (e.g., user-informative element 402).
[0073] In some aspects, the total case and case forecast view (e.g., user-informative element 202) can include user interactive elements 702 that display filtering options to highlight one or more of: long case durations, short case durations, high volume complex cases, and reduced staff cases, for example. The one or more insights can be autonomously updated based on detection of a user selection of one of the displayed filtering options from the user interactive elements 702.
[0074] In some aspects, the OR dashboard 120 can be configured to add a multi-user comment box (e.g., annotation 710) to a user-selected portion (e.g., on visual element 706) of the user-informative elements based on a user-initiated comment command. The multi-user comment box can trigger sending of a message to a targeted user to request a response to a comment entered in the multi-user comment box.
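A minimal sketch of such an annotation thread and targeted notification follows. The class names, the `anchor` identifier, and the caller-supplied `notify` callback are hypothetical and merely stand in for whatever storage and messaging mechanisms are used.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import Callable, List, Optional

@dataclass
class Comment:
    author: str
    text: str
    created: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

@dataclass
class AnnotationThread:
    anchor: str                      # identifier of the user-selected portion of a view
    comments: List[Comment] = field(default_factory=list)

    def add_comment(self, author: str, text: str,
                    target_user: Optional[str] = None,
                    notify: Optional[Callable[[str, str], None]] = None) -> Comment:
        """Append a comment; if a target user is named, trigger a notification
        requesting a response (the delivery mechanism is not specified here)."""
        comment = Comment(author, text)
        self.comments.append(comment)
        if target_user and notify:
            notify(target_user, f"{author} requested your response: {text}")
        return comment
```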
[0075] In some aspects, the OR dashboard 120 can be configured to add, delete, or reposition user-informative elements based on a user command, e.g., through user interactive elements such as user interactive element 502. Adding to the OR dashboard 120 can be performed through a menu, such as menu 504. In some aspects, the OR dashboard 120 can be configured to display a menu of items available to display in the user-informative elements and add a new user-informative element to the OR dashboard 120 based on a user selection from the menu. As a non-limiting example, the menu of items can include two or more of: an approach, a complexity score, an instrument score, an engagement level, specialties, trainees, an average case timing, daily usage, computer identifiers, insights, operating rooms, surgeons, total number of cases, and types of procedures.
[0076] In some aspects, the OR dashboard 120 is configured for display on a mobile device by displaying the user-informative elements responsive to a first user input 806 scrolling in a first direction and displaying the one or more insights responsive to a second user input 808 scrolling in a second direction that is orthogonal to the first direction.
[0077] The processing shown in FIG. 9 is not intended to indicate that the operations are to be executed in any particular order or that all of the operations shown in FIG. 9 are to be included in every case. Additionally, the processing shown in FIG. 9 can include any suitable number of additional operations.
[0078] Turning now to FIG. 10, a surgical procedure system 1000 is generally shown according to one or more aspects. The example of FIG. 10 depicts a surgical procedure support system 1002 that can include or may be coupled to the system 100 of FIG. 1. The surgical procedure support system 1002 can acquire image or video data using one or more cameras 1004. The surgical procedure support system 1002 can also interface with one or more sensors 1006 and/or one or more effectors 1008. The sensors 1006 may be associated with surgical
support equipment and/or patient monitoring. The effectors 1008 can be robotic components or other equipment controllable through the surgical procedure support system 1002. The surgical procedure support system 1002 can also interact with one or more user interfaces 1010, such as various input and/or output devices. The surgical procedure support system 1002 can store, access, and/or update surgical data 1014 associated with a training dataset and/or live data as a surgical procedure is being performed on patient 110 of FIG. 1. The surgical procedure support system 1002 can store, access, and/or update surgical objectives 1016 to assist in training and guidance for one or more surgical procedures. User configurations 1018 can track and store user preferences.
[0079] The surgical procedure support system 1002 can also communicate with other systems through a network 1030. For example, the surgical procedure support system 1002 can communicate with a surgical data dashboard viewer 1040 and a surgical data post-processing system 1050 through the network 1030. Other types of devices, such as a computing device 1034 (e.g., a mobile phone, laptop, personal computer, or tablet computer), can communicate directly with the surgical procedure support system 1002 or through the network 1030. As one example, user interfaces 1010 may be connected to or integrated with the surgical procedure support system 1002 by a wired connection while the computing device 1034 connects to the surgical procedure support system 1002 via a wireless connection. In some aspects, the computing device 1034 can execute or link to another computer system that executes a surgical data management system to access various data sources through the network 1030.
[0080] The surgical data post-processing system 1050 can receive surgical data and associated data generated by the surgical procedure support system 1002, which may be separately stored and secured through other data storage. Access to specific data or portions of data through the surgical data post-processing system 1050 may be limited by associated permissions. The surgical data post-processing system 1050 may include features such as video viewing, video sharing, data analytics, and selective data extraction.
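The permission check below is a hedged illustration of how access to specific post-processed data could be limited by associated permissions; the storage layout and permission model are assumptions.

```python
class PostProcessingStore:
    """Toy store in which each record carries its own set of permitted users."""

    def __init__(self) -> None:
        self._records: dict[str, dict] = {}
        self._permissions: dict[str, set[str]] = {}

    def put(self, record_id: str, record: dict, allowed_users: set[str]) -> None:
        self._records[record_id] = record
        self._permissions[record_id] = set(allowed_users)

    def get(self, record_id: str, user: str) -> dict:
        """Return a record only if the requesting user holds a permission for it."""
        if user not in self._permissions.get(record_id, set()):
            raise PermissionError(f"{user} may not access {record_id}")
        return self._records[record_id]
```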
[0081] The surgical data dashboard viewer 1040 can provide viewing access to various data sources, such as surgical data 1014, data collected in the one or more storage devices, and post-processed data generated by the surgical data post-processing system 1050. For example,
surgical instrument data, video data, and artificial intelligence generated data can be accessed for viewing through the surgical data dashboard viewer 1040. The surgical data post-processing system 1050 may generate surgical performance metrics and comparison data across data sets collected at multiple locations, making analytics data available to the surgical data dashboard viewer 1040. The surgical data dashboard viewer 1040 can access one or more surgical dashboards, such as the OR dashboard 120 of FIG. 1. In some aspects, the surgical data dashboard viewer 1040 and/or surgical data post-processing system 1050 can be components of a surgical data management system.
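For the comparison data across data sets collected at multiple locations, a simple (assumed) metric such as average case duration per site could be computed as follows.

```python
def compare_sites(durations_by_site: dict[str, list[float]]) -> dict[str, float]:
    """Average case duration (minutes) per site, for side-by-side comparison."""
    return {site: sum(d) / len(d) for site, d in durations_by_site.items() if d}


print(compare_sites({"Site A": [85.0, 92.0], "Site B": [70.0, 75.0, 80.0]}))
# {'Site A': 88.5, 'Site B': 75.0}
```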
[0082] One or more computing device 1064 (e.g., a mobile phone, laptop, personal computer, or tablet computer), can execute a surgical data management system to access various data sources through a network 1060. The network 1030 may be within a facility or multiple facilities maintained within a private network. The network 1060 may be a wider area network, such as the internet. Accordingly, the networks 1030 and 1060 may have access to different files and data sets along with shared access to select files and data sets. In some aspects, networks 1030 and 1060 can be combined.
[0083] Turning now to FIG. 11, a computer system 1500 is generally shown in accordance with an aspect. The computer system 1500 can be an electronic computer framework comprising and/or employing any number and combination of computing devices and networks utilizing various communication technologies, as described herein. The computer system 1500 can be easily scalable, extensible, and modular, with the ability to change to different services or reconfigure some features independently of others. The computer system 1500 may be, for example, a server, desktop computer, laptop computer, tablet computer, or smartphone. In some examples, computer system 1500 may be a cloud computing node. Computer system 1500 may be described in the general context of computer-executable instructions, such as program modules, being executed by a computer system. Generally, program modules may include routines, programs, objects, components, logic, data structures, and so on that perform particular tasks or implement particular abstract data types. Computer system 1500 may be practiced in distributed cloud computing environments where tasks are performed by remote processing devices that are linked through a communications network. In a distributed cloud computing
environment, program modules may be located in both local and remote computer system storage media, including memory storage devices.
[0084] As shown in FIG. 11, the computer system 1500 has one or more central processing units (CPU(s)) 1501a, 1501b, 1501c, etc. (collectively or generically referred to as processor(s) 1501). The processors 1501 can be a single-core processor, multi-core processor, computing cluster, or any number of other configurations. The processors 1501 can be any type of circuitry capable of executing instructions. The processors 1501, also referred to as processing circuits, are coupled via a system bus 1502 to a system memory 1503 and various other components. The system memory 1503 can include one or more memory devices, such as read-only memory (ROM) 1504 and a random-access memory (RAM) 1505. The ROM 1504 is coupled to the system bus 1502 and may include a basic input/output system (BIOS), which controls certain basic functions of the computer system 1500. The RAM 1505 is read-write memory coupled to the system bus 1502 for use by the processors 1501. The system memory 1503 provides temporary memory space for instructions during operation and can include random access memory (RAM), read-only memory, flash memory, or any other suitable memory systems.
[0085] The computer system 1500 comprises an input/output (I/O) adapter 1506 and a communications adapter 1507 coupled to the system bus 1502. The I/O adapter 1506 may be a small computer system interface (SCSI) adapter that communicates with a hard disk 1508 and/or any other similar component. The I/O adapter 1506 and the hard disk 1508 are collectively referred to herein as a mass storage 1510.
[0086] Software 1511 for execution on the computer system 1500 may be stored in the mass storage 1510. The mass storage 1510 is an example of a tangible storage medium readable by the processors 1501, where the software 1511 is stored as instructions for execution by the processors 1501 to cause the computer system 1500 to operate, as described herein with respect to the various figures. Examples of computer program products and the execution of such instructions are discussed herein in more detail. The communications adapter 1507 interconnects the system bus 1502 with a network 1512, which may be an outside network, enabling the computer system 1500 to communicate with other such systems. In one aspect, a
portion of the system memory 1503 and the mass storage 1510 collectively store an operating system, which may be any appropriate operating system to coordinate the functions of the various components shown in FIG. 11.
[0087] Additional input/output devices are shown as connected to the system bus 1502 via a display adapter 1515 and an interface adapter 1516. In one aspect, the adapters 1506, 1507, 1515, and 1516 may be connected to one or more I/O buses that are connected to the system bus 1502 via an intermediate bus bridge (not shown). A display 1519 (e.g., a screen or a display monitor) is connected to the system bus 1502 by the display adapter 1515, which may include a graphics controller to improve the performance of graphics-intensive applications and a video controller. A keyboard, a mouse, a touchscreen, one or more buttons, a speaker, etc., can be interconnected to the system bus 1502 via the interface adapter 1516, which may include, for example, a Super I/O chip integrating multiple device adapters into a single integrated circuit. Suitable I/O buses for connecting peripheral devices such as hard disk controllers, network adapters, and graphics adapters typically include common protocols, such as the Peripheral Component Interconnect (PCI). Thus, as configured in FIG. 11, the computer system 1500 includes processing capability in the form of the processors 1501, storage capability including the system memory 1503 and the mass storage 1510, input capability such as the keyboard, mouse, touchscreen, and buttons, and output capability including the speaker 1523 and the display 1519.
[0088] In some aspects, the communications adapter 1507 can transmit data using any suitable interface or protocol, such as the internet small computer system interface, among others. The network 1512 may be a cellular network, a radio network, a wide area network (WAN), a local area network (LAN), or the Internet, among others. An external computing device may connect to the computer system 1500 through the network 1512. In some examples, an external computing device may be an external web server or a cloud computing node.
[0089] It is to be understood that the block diagram of FIG. 11 is not intended to indicate that the computer system 1500 is to include all of the components shown in FIG. 11. Rather, the computer system 1500 can include any appropriate fewer or additional components not illustrated in FIG. 11 (e.g., additional memory components, embedded controllers, modules, additional network interfaces, etc.). Further, the aspects described herein with respect to
computer system 1500 may be implemented with any appropriate logic, wherein the logic, as referred to herein, can include any suitable hardware (e.g., a processor, an embedded controller, or an application-specific integrated circuit, among others), software (e.g., an application, among others), firmware, or any suitable combination of hardware, software, and firmware, in various aspects. Various aspects can be combined to include two or more of the aspects described herein.
[0090] Aspects disclosed herein may be a system, a method, and/or a computer program product at any possible technical detail level of integration. The computer program product may include a computer-readable storage medium (or media) having computer-readable program instructions thereon for causing a processor to carry out aspects of the present invention.
[0091] In some aspects, a method includes accessing surgical data captured from a plurality of surgical procedures, the surgical data identifying one or more surgeons and an operating room associated with each of the surgical procedures. The method also includes analyzing the surgical data to determine one or more insights about use of the operating rooms and actions performed by the one or more surgeons and displaying the one or more insights on an operating room dashboard. The method further includes filtering the surgical data based on one or more filter criteria, updating the one or more insights about the surgical data based on the filtering, and displaying updates to the one or more insights on the operating room dashboard.
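The following end-to-end sketch ties the steps of the method together: access surgical data, derive an insight, filter on criteria, and recompute. The record fields, the example insight, and the filter keys are simplified assumptions rather than the disclosed analytics.

```python
from dataclasses import dataclass
from datetime import date


@dataclass
class SurgicalRecord:
    surgeon: str
    operating_room: str
    procedure: str
    duration_min: float
    case_date: date


def derive_insights(records: list[SurgicalRecord]) -> dict[str, float]:
    """Example insight: average case duration per operating room."""
    per_room: dict[str, list[float]] = {}
    for r in records:
        per_room.setdefault(r.operating_room, []).append(r.duration_min)
    return {room: sum(d) / len(d) for room, d in per_room.items()}


def apply_filters(records: list[SurgicalRecord], criteria: dict) -> list[SurgicalRecord]:
    """Keep records matching every supplied criterion (surgeon, room, date range)."""
    out = records
    if "surgeon" in criteria:
        out = [r for r in out if r.surgeon == criteria["surgeon"]]
    if "operating_room" in criteria:
        out = [r for r in out if r.operating_room == criteria["operating_room"]]
    if "date_range" in criteria:
        start, end = criteria["date_range"]
        out = [r for r in out if start <= r.case_date <= end]
    return out


records = [
    SurgicalRecord("Dr. A", "OR-1", "cholecystectomy", 95.0, date(2023, 5, 2)),
    SurgicalRecord("Dr. B", "OR-2", "appendectomy", 40.0, date(2023, 5, 3)),
]
print(derive_insights(apply_filters(records, {"operating_room": "OR-1"})))
# {'OR-1': 95.0}
```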
[0092] In some aspects, the one or more insights can be determined based on video streams of the surgical procedures. In some aspects, the method can include displaying a plurality of user-informative elements on the operating room dashboard, adjusting an underlying dataset of the user-informative elements based on the filtering, and displaying updates to the user-informative elements based on the adjustment to the underlying dataset. In some aspects, the method can include displaying a menu of items available to display in the user-informative elements, adding a new user-informative element to the operating room dashboard based on a user selection from the menu, and deleting or repositioning the user-informative elements on the operating room dashboard based on a user command.
[0093] In some aspects, the menu of items can include two or more of: an approach, a complexity score, an instrument score, an engagement level, specialties, trainees, an average case timing, daily usage, computer identifiers, insights, operating rooms, surgeons, total number of cases, and types of procedures, and the one or more filter criteria can include one or more of: a date range selection, a case duration selection, a surgeon selection, a procedure selection, a specialty selection, an operating room selection, a complexity score selection, and a patient selection.
[0094] In some aspects, a computer program product includes a memory device with computer readable instructions stored thereon (e.g., a computer-readable storage medium), where executing the computer readable instructions by one or more processing units causes the one or more processing units to perform a plurality of operations that include accessing surgical data captured from a plurality of surgical procedures, the surgical data identifying one or more surgeons and an operating room associated with each of the surgical procedures. The operations also include analyzing the surgical data to determine one or more insights about the surgical procedures and displaying the one or more insights. The operations further include filtering the surgical data based on one or more filter criteria, updating the one or more insights about the surgical data based on the filtering, and displaying updates to the one or more insights.
[0095] In some aspects, the one or more insights can be determined based on videos of the surgical procedures. In some aspects, the operations can include displaying a plurality of user-informative elements on an operating room dashboard, adjusting an underlying dataset of the user-informative elements based on the filtering, and displaying updates to the user-informative elements based on the adjustment to the underlying dataset, where the user-informative elements can include one or more of: a total case and case forecast view, a per-operating-room view, an average daily case and usage view, a procedures-by-duration view, a procedure-by-volume view, a case complexity view, a per-capture-computer view, and a surgeon summary view. In some aspects, the operating room dashboard can be configured for display on a mobile device by displaying the user-informative elements responsive to a first user input scrolling in a first direction and displaying the one or more insights responsive to a second user input scrolling in a second direction that is orthogonal to the first direction.
[0096] The computer-readable storage medium can be a tangible device that can retain and store instructions for use by an instruction execution device. The computer-readable storage medium may be, for example, but is not limited to, an electronic storage device, a magnetic storage device, an optical storage device, an electromagnetic storage device, a semiconductor storage device, or any suitable combination of the foregoing. A non-exhaustive list of more specific examples of the computer-readable storage medium includes the following: a portable computer diskette, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or Flash memory), a static random access memory (SRAM), a portable compact disc read-only memory (CD-ROM), a digital versatile disk (DVD), a memory stick, a floppy disk, a mechanically encoded device such as punch-cards or raised structures in a groove having instructions recorded thereon, and any suitable combination of the foregoing. A computer-readable storage medium, as used herein, is not to be construed as being transitory signals per se, such as radio waves or other freely propagating electromagnetic waves, electromagnetic waves propagating through a waveguide or other transmission media (e.g., light pulses passing through a fiber-optic cable), or electrical signals transmitted through a wire.
[0097] Computer-readable program instructions described herein can be downloaded to respective computing/processing devices from a computer-readable storage medium or to an external computer or external storage device via a network, for example, the Internet, a local area network, a wide area network, and/or a wireless network. The network may comprise copper transmission cables, optical transmission fibers, wireless transmission, routers, firewalls, switches, gateway computers, and/or edge servers. A network adapter card or network interface in each computing/processing device receives computer-readable program instructions from the network and forwards the computer-readable program instructions for storage in a computer-readable storage medium within the respective computing/processing device.
[0098] Computer-readable program instructions for carrying out operations of the present invention may be assembler instructions, instruction-set-architecture (ISA) instructions, machine instructions, machine-dependent instructions, microcode, firmware instructions, state-setting data, configuration data for integrated circuitry, or either source code or object code written in any combination of one or more programming languages, including an object-oriented
programming language such as Smalltalk, C++, or the like, and procedural programming languages, such as the “C” programming language or similar programming languages. The computer-readable program instructions may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or server. In the latter scenario, the remote computer may be connected to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet Service Provider). In some aspects, electronic circuitry including, for example, programmable logic circuitry, field-programmable gate arrays (FPGA), or programmable logic arrays (PLA) may execute the computer-readable program instructions by utilizing state information of the computer-readable program instructions to personalize the electronic circuitry, in order to perform aspects of the present invention.
[0099] Aspects of the present invention are described herein with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to aspects of the invention. It will be understood that each block of the flowchart illustrations and/or block diagrams, and combinations of blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer-readable program instructions.
[0100] These computer-readable program instructions may be provided to a processor of a general-purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks. These computer- readable program instructions may also be stored in a computer-readable storage medium that can direct a computer, a programmable data processing apparatus, and/or other devices to function in a particular manner, such that the computer-readable storage medium having instructions stored therein comprises an article of manufacture including instructions which implement aspects of the function/act specified in the flowchart and/or block diagram block or blocks.
[0101] The computer-readable program instructions may also be loaded onto a computer, other programmable data processing apparatus, or other device to cause a series of operational steps to be performed on the computer, other programmable apparatus or other devices to produce a computer-implemented process, such that the instructions which execute on the computer, other programmable apparatus, or other device implement the functions/acts specified in the flowchart and/or block diagram block or blocks.
[0102] The flowchart and block diagrams in the Figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods, and computer program products according to various aspects of the present invention. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of instructions, which comprises one or more executable instructions for implementing the specified logical function(s). In some alternative implementations, the functions noted in the blocks may occur out of the order noted in the Figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems that perform the specified functions or acts or carry out combinations of special purpose hardware and computer instructions.
[0103] The descriptions of the various aspects of the present invention have been presented for purposes of illustration but are not intended to be exhaustive or limited to the aspects disclosed. Many modifications and variations will be apparent to those of ordinary skill in the art without departing from the scope and spirit of the described aspects. The terminology used herein was chosen to best explain the principles of the aspects, the practical application or technical improvement over technologies found in the marketplace, or to enable others of ordinary skill in the art to understand the aspects described herein.
[0104] Various aspects of the invention are described herein with reference to the related drawings. Alternative aspects of the invention can be devised without departing from the scope of this invention. Various connections and positional relationships (e.g., over, below, adjacent,
etc.) are set forth between elements in the following description and in the drawings. These connections and/or positional relationships, unless specified otherwise, can be direct or indirect, and the present invention is not intended to be limiting in this respect. Accordingly, a coupling of entities can refer to either a direct or an indirect coupling, and a positional relationship between entities can be a direct or indirect positional relationship. Moreover, the various tasks and process steps described herein can be incorporated into a more comprehensive procedure or process having additional steps or functionality not described in detail herein.
[0105] The following definitions and abbreviations are to be used for the interpretation of the claims and the specification. As used herein, the terms “comprises,” “comprising,” “includes,” “including,” “has,” “having,” “contains,” or “containing,” or any other variation thereof are intended to cover a non-exclusive inclusion. For example, a composition, a mixture, process, method, article, or apparatus that comprises a list of elements is not necessarily limited to only those elements but can include other elements not expressly listed or inherent to such composition, mixture, process, method, article, or apparatus.
[0106] Additionally, the term “exemplary” is used herein to mean “serving as an example, instance or illustration.” Any aspect or design described herein as “exemplary” is not necessarily to be construed as preferred or advantageous over other aspects or designs. The terms “at least one” and “one or more” may be understood to include any integer number greater than or equal to one, i.e., one, two, three, four, etc. The term “a plurality” may be understood to include any integer number greater than or equal to two, i.e., two, three, four, five, etc. The term “connection” may include both an indirect “connection” and a direct “connection.”
[0107] The terms “about,” “substantially,” “approximately,” and variations thereof are intended to include the degree of error associated with measurement of the particular quantity based upon the equipment available at the time of filing the application. For example, “about” can include a range of ±8%, ±5%, or ±2% of a given value.
[0108] For the sake of brevity, conventional techniques related to making and using aspects of the invention may or may not be described in detail herein. In particular, various aspects of computing systems and specific computer programs to implement the various technical features described herein are well known. Accordingly, in the interest of brevity, many
conventional implementation details are only mentioned briefly herein or are omitted entirely without providing the well-known system and/or process details.
[0109] It should be understood that various aspects disclosed herein may be combined in different combinations than the combinations specifically presented in the description and accompanying drawings. It should also be understood that, depending on the example, certain acts or events of any of the processes or methods described herein may be performed in a different sequence, may be added, merged, or left out altogether (e.g., all described acts or events may not be necessary to carry out the techniques). In addition, while certain aspects of this disclosure are described as being performed by a single module or unit for purposes of clarity, it should be understood that the techniques of this disclosure may be performed by a combination of units or modules associated with, for example, a medical device.
[0110] In one or more examples, the described techniques may be implemented in hardware, software, firmware, or any combination thereof. If implemented in software, the functions may be stored as one or more instructions or code on a computer-readable medium and executed by a hardware-based processing unit. Computer-readable media may include non-transitory computer-readable media, which corresponds to a tangible medium such as data storage media (e.g., RAM, ROM, EEPROM, flash memory, or any other medium that can be used to store desired program code in the form of instructions or data structures and that can be accessed by a computer).
[0111] Instructions may be executed by one or more processors, such as one or more digital signal processors (DSPs), general purpose microprocessors, application specific integrated circuits (ASICs), field programmable logic arrays (FPGAs), or other equivalent integrated or discrete logic circuitry. Accordingly, the term “processor” as used herein may refer to any of the foregoing structure or any other physical structure suitable for implementation of the described techniques. Also, the techniques could be fully implemented in one or more circuits or logic elements.
Claims
1. A system comprising:
a memory device; and
one or more processors coupled with the memory device, the one or more processors configured to:
access surgical data captured from a plurality of surgical procedures on an operating room basis, the surgical data identifying one or more surgeons and an operating room associated with each of the surgical procedures;
determine, autonomously, one or more insights about the surgical data;
display a report indicative of the one or more insights on an operating room dashboard comprising a plurality of user-informative elements;
receive one or more filter criteria;
filter the surgical data based on the one or more filter criteria to adjust an underlying dataset of the user-informative elements;
update the one or more insights about the surgical data based on the adjustment to the underlying dataset; and
display updates to the user-informative elements and the one or more insights based on the adjustment to the underlying dataset.
2. The system of claim 1, wherein the one or more insights are determined based on video streams of the surgical procedures.
3. The system of claim 1, wherein the plurality of user-informative elements comprises one or more of: a total case and case forecast view, a per-operating-room view, an average daily case
and usage view, a procedures-by-duration view, a procedure-by-volume view, a case complexity view, a per-capture-computer view, and a surgeon summary view.
4. The system of claim 3, wherein the total case and case forecast view comprises display filtering options to highlight one or more of: long case durations, short case durations, high volume complex cases, and reduced staff cases.
5. The system of claim 4, wherein the one or more insights are autonomously updated based on detection of a user selection of one of the display filtering options.
6. The system of claim 1, wherein the one or more processors are configured to add a multi-user comment box to a user-selected portion of the user-informative elements based on a user-initiated comment command, and wherein the multi-user comment box triggers sending of a message to a targeted user to request a response to a comment entered in the multi-user comment box.
7. The system of claim 1, wherein the operating room dashboard is configured to add, delete, or reposition the user-informative elements based on a user command.
8. The system of claim 1, wherein the one or more processors are configured to:
display a menu of items available to display in the user-informative elements; and
add a new user-informative element to the operating room dashboard based on a user selection from the menu.
9. The system of claim 8, wherein the menu of items comprises two or more of: an approach, a complexity score, an instrument score, an engagement level, specialties, trainees, an average case timing, daily usage, computer identifiers, insights, operating rooms, surgeons, total number of cases, and types of procedures.
10. The system of claim 1, wherein the one or more filter criteria comprise one or more of: a date range selection, a case duration selection, a surgeon selection, a procedure selection, a specialty selection, an operating room selection, a complexity score selection, and a patient selection.
11. The system of claim 1, wherein the operating room dashboard is configured for display on a mobile device by displaying the user-informative elements responsive to a first user input scrolling in a first direction and displaying the one or more insights responsive to a second user input scrolling in a second direction that is orthogonal to the first direction.
12. A method comprising:
accessing surgical data captured from a plurality of surgical procedures, the surgical data identifying one or more surgeons and an operating room associated with each of the surgical procedures;
analyzing the surgical data to determine one or more insights about use of the operating rooms and actions performed by the one or more surgeons;
displaying the one or more insights on an operating room dashboard;
filtering the surgical data based on one or more filter criteria;
updating the one or more insights about the surgical data based on the filtering; and
displaying updates to the one or more insights on the operating room dashboard.
13. The method of claim 12, wherein the one or more insights are determined based on video streams of the surgical procedures.
14. The method of claim 12, further comprising:
displaying a plurality of user-informative elements on the operating room dashboard;
adjusting an underlying dataset of the user-informative elements based on the filtering; and
displaying updates to the user-informative elements based on the adjustment to the underlying dataset.
15. The method of claim 14, further comprising:
displaying a menu of items available to display in the user-informative elements;
adding a new user-informative element to the operating room dashboard based on a user selection from the menu; and
deleting or repositioning the user-informative elements on the operating room dashboard based on a user command.
16. The method of claim 15, wherein the menu of items comprises two or more of: an approach, a complexity score, an instrument score, an engagement level, specialties, trainees, an average case timing, daily usage, computer identifiers, insights, operating rooms, surgeons, total number of cases, and types of procedures, and wherein the one or more filter criteria comprise one or more of: a date range selection, a case duration selection, a surgeon selection, a procedure selection, a specialty selection, an operating room selection, a complexity score selection, and a patient selection.
17. A computer program product comprising a memory device with computer readable instructions stored thereon, wherein executing the computer readable instructions by one or more processing units causes the one or more processing units to perform a plurality of operations comprising:
accessing surgical data captured from a plurality of surgical procedures, the surgical data identifying one or more surgeons and an operating room associated with each of the surgical procedures;
analyzing the surgical data to determine one or more insights about the surgical procedures;
displaying the one or more insights;
filtering the surgical data based on one or more filter criteria;
updating the one or more insights about the surgical data based on the filtering; and
displaying updates to the one or more insights.
18. The computer program product of claim 17, wherein the one or more insights are determined based on videos of the surgical procedures.
19. The computer program product of claim 17, wherein the operations further comprise:
displaying a plurality of user-informative elements on an operating room dashboard;
adjusting an underlying dataset of the user-informative elements based on the filtering; and
displaying updates to the user-informative elements based on the adjustment to the underlying dataset,
wherein the user-informative elements comprise one or more of: a total case and case forecast view, a per-operating-room view, an average daily case and usage view, a procedures-by-duration view, a procedure-by-volume view, a case complexity view, a per-capture-computer view, and a surgeon summary view.
20. The computer program product of claim 19, wherein the operating room dashboard is configured for display on a mobile device by displaying the user-informative elements responsive to a first user input scrolling in a first direction and displaying the one or more insights responsive to a second user input scrolling in a second direction that is orthogonal to the first direction.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title
---|---|---|---
US202263422393P | 2022-11-03 | 2022-11-03 |
US63/422,393 | 2022-11-03 | |
Publications (1)
Publication Number | Publication Date
---|---
WO2024094838A1 (en) | 2024-05-10
Family
ID=88697681
Family Applications (1)
Application Number | Priority Date | Filing Date | Title
---|---|---|---
PCT/EP2023/080643 (WO2024094838A1) | 2022-11-03 | 2023-11-03 | Operating room dashboard
Country Status (1)
Country | Link
---|---
WO (1) | WO2024094838A1 (en)
Patent Citations (2)
Publication number | Priority date | Publication date | Assignee | Title
---|---|---|---|---
US20210192914A1 * | 2017-12-28 | 2021-06-24 | Ethicon Llc | Surgical hub and modular device response adjustment based on situational awareness
US20220285017A1 * | 2021-03-08 | 2022-09-08 | QuiviQ, Inc. | Optimized operating room block management and related systems and methods
Legal Events
Date | Code | Title | Description
---|---|---|---
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 23801369; Country of ref document: EP; Kind code of ref document: A1