CN115363752A - Intelligent operation path guiding system - Google Patents

Intelligent operation path guiding system

Info

Publication number
CN115363752A
Authority
CN
China
Prior art keywords
surgical
surgeon
module
path
database
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202211005301.4A
Other languages
Chinese (zh)
Other versions
CN115363752B (en)
Inventor
袁本祥
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Huaping Xiangsheng Shanghai Medical Technology Co ltd
Original Assignee
Huaping Xiangsheng Shanghai Medical Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Huaping Xiangsheng Shanghai Medical Technology Co ltd filed Critical Huaping Xiangsheng Shanghai Medical Technology Co ltd
Priority to CN202211005301.4A priority Critical patent/CN115363752B/en
Publication of CN115363752A publication Critical patent/CN115363752A/en
Application granted granted Critical
Publication of CN115363752B publication Critical patent/CN115363752B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B 34/10 Computer-aided planning, simulation or modelling of surgical operations
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B 34/10 Computer-aided planning, simulation or modelling of surgical operations
    • A61B 2034/101 Computer-aided simulation of surgical operations

Landscapes

  • Health & Medical Sciences (AREA)
  • Surgery (AREA)
  • Engineering & Computer Science (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Biomedical Technology (AREA)
  • Robotics (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Medical Informatics (AREA)
  • Molecular Biology (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Instructional Devices (AREA)

Abstract

The invention provides an intelligent surgical path guidance system 100 for providing visual or tactile feedback and real-time guidance to a surgeon during a procedure, so as to reduce the surgical risk caused by surgical errors. The system comprises: a surgical control module 110, a data storage module 120, and a graphical user interface GUI 130. The surgical control module 110 is connected to the data storage module 120 to facilitate data transfer between the surgical control module 110 and the data storage module 120. The data storage module 120 may include a real-time health record unit 121 for storing patient-related data, a 3D model database 122, a surgical path database 123, a surgical annotation database 124, a vibration database 125, and a procedure database 126. The surgical control module 110 includes a surgical planning module 111, a surgical analysis module 112, a surgical simulation module 113, a surgical assistance module 114, and an AI path module 115. The graphical user interface GUI 130 is used for interaction between the surgeon and the surgical planning module 111; the GUI includes different icons or radio buttons, each programmed to activate execution of a corresponding function.

Description

Intelligent operation path guiding system
Technical Field
The invention relates to the technical field of intelligent surgical path guidance, in particular to a surgical path guidance system based on artificial intelligence.
Background
Each surgery carries its own degree of associated risk. Based on the pre-operative assessment, the surgeon determines which surgical procedure to perform on the patient. The surgeon may also rehearse the surgery beforehand, for example by practicing with a Virtual Reality (VR) system.
VR simulation training improves the efficiency of surgical procedures in terms of time and cost and gives the surgeon powerful tools for rehearsing a procedure. It fuses virtual and real environments, optimizes preoperative planning, improves the surgical approach, and has even been extended to the most advanced surgical robots. The technique provides trainees with a large number of simulated surgery training opportunities without increasing the risk of iatrogenic injury to patients. In neurosurgical sub-specialties such as craniotomy, endovascular intervention, and spinal surgery, its application value and scope of use continue to grow, and it is expected to substantially improve the diagnosis and treatment of neurosurgical diseases.
However, in the more than half a century since its advent, VR technology has gradually encountered problems such as unclear clinical utility, high cost of use, and emerging questions of medical ethics. Solving these problems and further promoting the application and progress of the technology requires deeper cooperation between medical training institutions and technology development companies. By reducing the cost of use, establishing VR equipment sharing mechanisms, and carrying out multi-center, large-sample randomized controlled clinical studies, the practicality of VR technology can be established and its barrier to use lowered, so that the technology can be widely and effectively applied in neurosurgery, to the benefit of both doctors and patients.
Disclosure of Invention
The invention aims to realize an intelligent surgical path guidance system 100 for providing visual or tactile feedback and real-time guidance to a surgeon during a procedure, so as to reduce the surgical risk caused by surgical errors, the system specifically comprising: a surgical control module 110, a data storage module 120, and a graphical user interface GUI 130;
the surgical control module 110 is connected to the data storage module 120 to facilitate data transfer between the surgical control module 110 and the data storage module 120;
the data storage module 120 may include a real-time health record unit 121 for storing patient-related data, a 3D model database 122, a surgical path database 123, a surgical annotation database 124, a vibration database 125, and a procedure database 126;
wherein the real-time health record unit 121 is configured to store data of the patient in real time, which may correspond to medical imaging data and/or diagnostic data; the 3D model database 122 stores 3D models of the affected regions of the patient, the 3D models including all regions or tissue types classified by the surgeon as tactile barriers and hard barriers; the surgical path database 123 stores details of the methods of performing various surgical procedures, the different methods and paths of performing various surgical procedures, and the surgical paths that a particular type of surgical procedure may follow; the surgical annotation database 124 is configured to accept annotations provided by the surgeon during surgical practice or training sessions, during the pre-planning phase, or even during the actual surgical procedure; the surgeon may add annotations at any time during the virtual reality simulation, and the annotations may be stored in the surgical annotation database 124; the vibration database 125 is configured to store information about the margins, ranges or windows of deviation along the surgical path that may be selected by the surgeon, which is used to provide tactile feedback to the surgeon through, for example, a tactile feedback hand controller; the procedure database 126 stores video and sensor data related to previously performed surgical procedures;
the surgical control module 110 includes a surgical planning module 111, a surgical analysis module 112, a surgical simulation module 113, a surgical assistance module 114, and an AI path module 115;
a graphical user interface GUI 130 is used for interaction between the surgeon and the surgical planning module 111, the GUI including different icons or radio buttons, each programmed to activate execution of a respective function.
Preferably, the surgical planning module 111 retrieves identification details of the subject patient from the real-time health record unit 121, retrieves a diagnosis of the patient and may identify a recommended procedure for the subject patient, creates a 3D model based on data in the 3D model database 122, and retrieves possible surgical paths for the recommended procedure for the subject patient.
Preferably, the surgical planning module 111 selects a region of the 3D model to highlight; the highlighted region helps the surgeon define tactile and hard barriers for different types of tissue;
the surgical planning module 111 retrieves surgical paths from the surgical path database 123 to meet the surgical needs of the subject patient; each surgical path may represent a line drawn from one point to another in an image of a body part, previously defined by a surgeon or other expert and stored in the surgical path database 123.
Preferably, the AI path module 115 is configured to analyze images of a particular surgical area from a plurality of patients who have previously undergone the recommended procedure and to store the results of these procedures for each of the plurality of patients, obtaining all possible surgical paths; the AI path module 115 calculates a cost function for each surgical path and evaluates whether the predicted outcome of each surgical path provides satisfactory results.
Preferably, the AI path module 115 evaluates the results based at least on: 1) the ability to provide adequate access to the surgical site, 2) an acceptable anatomical access time to the site, and 3) the ability of the patient to tolerate the requirements of the procedure; the AI path module 115 selects a surgical path that balances the minimum cost function of the surgical path against other risk factors, including the risk of approaching and possibly penetrating organs, and the expected postoperative impact of a given path on the patient's recovery time or long-term health.
Preferably, the surgical analysis module 112 scans the area along the surgical path to identify whether the surgical tool has moved into either the tactile barrier or the hard barrier, and provides tactile feedback to the surgeon through a tactile controller;
the surgical analysis module 112 determines whether a surgical tool is present within or beyond the hard barrier, stops the advancement or movement of the surgical tool, and prevents the surgical tool from invading critical tissue.
Preferably, the surgical analysis module 112 identifies anomalies or deviations of the surgical instrument from the surgical path; when the surgical instrument deviates from the surgical path, an alarm is issued based on predefined conditions; once the surgical analysis module 112 generates the alarm, it may take the form of highlighting the surgical path or changing the color of the surgical path; a second input may then be accepted from the surgeon to continue working along the same path, select a different surgical path, or select a new surgical procedure.
Preferably, the surgical simulation module 113 implements a surgical simulation procedure, highlights the surgical path on the 3D model, simulates the surgical procedure, and may present one or more virtual organs that the surgeon will manipulate; the movement, cutting, suturing and coagulation involved in operating on different organs are simulated, providing realistic practice of the surgical procedure.
Preferably, the surgical simulation module 113 creates a VR simulation to train the surgeon using the 3D model; during this training a virtual representation of the surgical tool is displayed on the 3D model, and the surgical path displayed on the 3D model may be highlighted; a margin of deviation is set based on the surgeon's input to provide visual and tactile feedback to the surgeon during the actual surgical procedure; the surgeon's movements on the virtual reality simulation are tracked as the surgeon operates; the surgeon may set one or more annotations at the deviation points, the one or more annotations being stored in the surgical annotation database 124 along with the visual and tactile feedback; the annotations are presented to the surgeon at specified times along with the visual and tactile feedback, and may assist the surgeon by providing reminders and/or warnings at specified times during the procedure.
Preferably, the surgical assistance module 114 provides support to the surgeon during the actual surgical procedure on the subject patient, using camera-integrated AR glasses to identify the position and orientation of the surgeon, the subject patient and the surgical tool; each step of the surgical procedure may be monitored by the camera-integrated AR glasses, and annotations are presented to the surgeon at predetermined surgical steps.
Drawings
FIG. 1 is a system block diagram of the intelligent surgical path guidance system of the present invention;
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, the technical solutions of the present invention will be further described below.
Hereinafter, an intelligent surgical path guidance system according to an embodiment of the present invention will be described in detail with reference to the accompanying drawings. As shown in FIG. 1, in an embodiment of the present invention, an intelligent surgical path guidance system 100 is provided for reducing the risk of surgical errors by providing visual or tactile feedback and real-time guidance to the surgeon during the surgical procedure.
the intelligent surgical path guidance system 100 comprises a surgical control module 110, a data storage module 120, and a graphical user interface GUI 130;
The surgical control module 110 is connected to the data storage module 120 to facilitate data transfer between the surgical control module 110 and the data storage module 120.
The data storage module 120 may include a real-time health record unit 121 for storing patient-related data, a three-dimensional model database 122, a surgical path database 123, a surgical annotation database 124, a vibration database 125, and a procedure database 126.
The real-time health record unit 121 is configured to store patient data in real time, which may correspond to medical imaging data and/or diagnostic data, such as the patient's medical records, medical history and test results, and records from surgeons/doctors or other healthcare providers.
The 3D model database 122 may store 3D models of the affected regions of the patient. The 3D model may be created using images captured from different sources, which may include, but are not limited to, camera images, Magnetic Resonance Imaging (MRI) images, ultrasound images, and X-ray images. The 3D model may include all regions or tissue types classified by the surgeon as tactile and hard barriers.
The surgical path database 123 may store details of the methods of performing various surgical procedures, the different methods and paths of performing various surgical procedures, and the surgical paths that a particular type of surgical procedure may follow. Data relating to all surgical paths may be stored in the surgical path database, including details and specifications for each surgical path, assignments of surgical participants, the tools and resources required for the surgical path, feasible response paths to adverse conditions of the respective surgery, and the like. The surgeon may access the surgical path database using a user device connected to the system 100.
The surgical annotation database 124 may be configured to accept annotations provided by the surgeon during surgical practice or training sessions, during the pre-planning stage, or even during the actual surgery. The surgeon may add annotations at any time during the virtual reality simulation, and the annotations may be stored in the surgical annotation database 124. The surgeon may add annotations using the system 100 or a user device.
The vibration database 125 may be configured to store information about margins, ranges, or windows of deviation along the surgical path that may be selected by the surgeon. Information related to the margin of deviation can be used to provide tactile feedback to the surgeon through, for example, a tactile feedback hand controller.
The procedure database 126 may store video and sensor data related to previously performed surgical procedures. The video and sensor data are recorded in real time during the surgical procedure and stored in procedure database 126. The surgeon may use the stored video and surgical data to select a surgical procedure for the patient, and more particularly, to select a particular surgical path based on the medical needs of the patient.
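By way of illustration only (this sketch is not part of the original disclosure), the data storage module 120 and its databases could be represented as simple record types along the following lines; all class and field names are assumptions introduced for clarity:

    from dataclasses import dataclass, field
    from typing import Dict, List, Optional

    @dataclass
    class HealthRecord:                  # real-time health record unit 121
        patient_id: str
        imaging_data: List[str] = field(default_factory=list)   # e.g. MRI/CT image references
        diagnoses: List[str] = field(default_factory=list)

    @dataclass
    class SurgicalPath:                  # entry in the surgical path database 123
        path_id: str
        procedure: str
        waypoints: List[tuple]           # (x, y, z) points drawn on the 3D model
        outcome: Optional[str] = None    # recorded result of a previous use of this path

    @dataclass
    class Annotation:                    # entry in the surgical annotation database 124
        timestamp_s: float               # seconds from the start of the procedure
        text: str

    @dataclass
    class VibrationSetting:              # entry in the vibration database 125
        margin_mm: float                 # deviation margin along the surgical path
        intensity: int                   # 1 (lowest) to 10 (highest)

    @dataclass
    class DataStorageModule:             # module 120
        health_records: Dict[str, HealthRecord] = field(default_factory=dict)
        models_3d: Dict[str, object] = field(default_factory=dict)       # 3D model database 122
        surgical_paths: List[SurgicalPath] = field(default_factory=list)
        annotations: List[Annotation] = field(default_factory=list)
        vibration_settings: List[VibrationSetting] = field(default_factory=list)
        procedures: List[dict] = field(default_factory=list)             # procedure database 126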
The surgical control module 110 includes a surgical planning module 111, a surgical analysis module 112, a surgical simulation module 113, a surgical assistance module 114, and an AI path module 115.
The surgical planning module 111 may assist a surgeon or other user in identifying a subject patient, and may then receive the identity of the subject patient. The surgical planning module 111 may store and facilitate retrieval, from the real-time health record unit 121, of identification details of the subject patient on whom the procedure is expected to be performed. A diagnosis of the subject patient is retrieved, and a recommended procedure can be identified for the subject patient based on an analysis of that diagnosis. A three-dimensional (3D) model of the entire body of the subject patient, or only of the affected area where surgery is recommended, is prepared. The 3D model may be created using images captured from different sources, as described above, using known image reconstruction techniques. In at least one embodiment, the 3D model of the subject patient may be stored in the 3D model database 122. Possible surgical paths for the recommended surgery for the subject patient are then retrieved. In at least one embodiment, the surgical paths may be stored in and retrieved from the surgical path database 123. Each surgical path may represent a potential sequence of action steps taken during a particular iteration of the recommended surgery. The surgical paths may have been previously defined by a surgeon, a subject matter expert, or the Artificial Intelligence (AI) path module 115. The AI path module 115 may be configured to analyze images of a particular surgical field from a plurality of patients who have previously undergone the recommended procedure and also to store the results of these procedures for each of the plurality of patients. After analysis, the AI path module 115 can present the surgeon with data relating to any number of previously performed surgical paths, unique surgical paths, and frequently used surgical paths, along with their respective outcomes. The surgical paths are displayed as an overlay on the 3D model, allowing the surgeon or other surgical participant to select a surgical path from all of the retrieved surgical paths. The selected surgical path may then be displayed overlaid on the 3D model. The surgeon may select a particular surgical path based on a comparison and/or review of the medical results of other patients who underwent the same or a similar surgical procedure using that same surgical path.
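A minimal sketch of the planning flow just described, reusing the record types sketched above; the helper functions recommend_procedure, build_3d_model and display_overlay are hypothetical stand-ins for behaviour of modules 111, 122 and 123 and are not taken from the disclosure:

    def plan_surgery(planning_module, storage, patient_id):
        # 1. Retrieve the subject patient's record and latest diagnosis (unit 121).
        record = storage.health_records[patient_id]
        diagnosis = record.diagnoses[-1]

        # 2. Identify a recommended procedure from the diagnosis (assumed lookup).
        procedure = planning_module.recommend_procedure(diagnosis)

        # 3. Build or load the 3D model of the affected region (database 122).
        model = storage.models_3d.get(patient_id) \
            or planning_module.build_3d_model(record.imaging_data)

        # 4. Retrieve candidate surgical paths for the procedure (database 123).
        candidates = [p for p in storage.surgical_paths if p.procedure == procedure]

        # 5. Overlay the candidates on the 3D model for the surgeon to choose from.
        planning_module.display_overlay(model, candidates)
        return procedure, model, candidates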
The surgical planning module 111 may facilitate selection of regions of the 3D model. The selected region, which can be highlighted, can help identify a type of tissue in the affected area, thereby enabling the surgeon to define tactile and hard barriers for different types of tissue. The surgeon may associate feedback with the tactile barrier and the hard barrier. As a further example, the tactile and hard barriers may be defined by the surgeon based on the degree of caution associated with each type of tissue. For example, skin cells, muscles and bones may be associated with lower levels of caution, and so the surgeon may set tactile barriers and may define tactile feedback for these tissues. In another example, blood vessels, or the brain and Central Nervous System (CNS), may be associated with the highest level of caution.
The surgical planning module 111 may facilitate identification of the subject patient by the surgeon. The surgical planning module 111 may store and facilitate retrieval of identification details of the subject patient from the real-time health record unit 121. A record of the subject patient may be retrieved, and a surgical procedure may be recommended for the subject patient based on an analysis of that record. A 3D model of the subject patient may be prepared from images captured using different sources, as described above; furthermore, known image reconstruction techniques can be used to create the 3D model. The surgical paths may be retrieved from the surgical path database 123 to meet the surgical needs of the subject patient. Each surgical path may represent a line drawn from one point to another in an image of a body part, previously defined by a surgeon or other expert and stored in the surgical path database 123. The surgical path database 123 may facilitate analysis and storage of images of a particular surgical area for a plurality of patients who have previously undergone a particular procedure, together with its results. The surgical path database 123 may also facilitate retrieval by the surgeon of all previously used surgical paths, unique surgical paths, and frequently used surgical paths, together with their respective outcomes.
Once all possible surgical paths have been considered, a cost function is calculated for each candidate path from each possible entry point. The cost functions of the various paths are compared based on a predetermined set of rules and quantitative parameters, including the traversability values assigned to the organs involved. The system searches for the path of least resistance, i.e. the lowest cost function. The potential paths for a given procedure include various entry points converging on a common surgical target; thus, each path may pass through or around some of the same organs, albeit from a different side or angle of approach. The potential path with the least cost is identified and selected as the chosen path for the given patient and surgical procedure, thereby specifying a particular surgical approach and deciding the path to take to achieve the surgical goal.
The system evaluates whether the predicted outcome of the path will provide satisfactory results. In this context, satisfactory results would include at least the following: 1) the ability to provide adequate access to the surgical site, 2) an acceptable anatomical access time to the site, and 3) the ability of the patient to tolerate the requirements of the surgical procedure. The ability of the system to make accurate predictions is expected to increase over time as data is collected and stored in searchable databases. Thus, in the final path selection, the system can incorporate not only the minimum cost function of the potential path, but also other factors. These factors include, for example, the risk of approaching and possibly penetrating organs, which can be life-threatening, and the expected postoperative consequences of a given route for the patient's recovery time or long-term health. In the event that the path with the lowest cost function carries a high intra-operative risk or undesirable post-operative consequences for the patient, the system may select another path that balances the lowest cost function against the other identified factors.
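The cost-function-based selection described in the two preceding paragraphs could be sketched as follows; the linear combination, the traversability values and the outcome predicate are assumptions, since the disclosure does not specify the exact formula, and paths are assumed to list the organs they cross:

    def path_cost(path, traversability, organ_risk):
        # Lower cost = less resistance along the path (assumed linear combination).
        resistance = sum(1.0 - traversability.get(organ, 0.5) for organ in path.organs_crossed)
        risk = sum(organ_risk.get(organ, 0.0) for organ in path.organs_crossed)
        return resistance + risk

    def select_path(candidates, traversability, organ_risk, predict_outcome):
        scored = sorted(candidates, key=lambda p: path_cost(p, traversability, organ_risk))
        for path in scored:                      # start from the lowest-cost path
            outcome = predict_outcome(path)      # assumed predictor trained on prior procedures
            if outcome.adequate_access and outcome.access_time_ok and outcome.tolerable:
                return path                      # lowest-cost path with a satisfactory outcome
        return scored[0] if scored else None     # fall back to the minimum-cost path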
After the surgical path is selected by the surgeon, the surgical analysis module 112 scans the region defined by the surgical path using the ultrasound imaging system. The area along the surgical path may be scanned to identify movement of a surgical tool of the robotic surgical system into either the tactile barrier or the hard barrier. When a surgical tool enters one of the barriers (tactile or hard), the surgical analysis module 112 may determine whether the surgical tool has entered the tactile barrier. If the surgical tool has entered the tactile barrier, tactile feedback will be provided to the surgeon. The haptic feedback may be provided to the surgeon through a haptic controller, which may be present on the surgical tool or on a glove worn by the surgeon. Other tactile feedback sensors and devices may also be used in different embodiments, and the surgeon may customize the level and type of tactile feedback based on the classification of the different types of tissue.
In another example, the surgical tool may be determined to be within or beyond the hard barrier even though it was not detected entering the tactile barrier. In this case, the surgical analysis module 112 may stop the advancement or movement of the surgical tool, preventing the surgical tool from invading critical tissue. For example, a surgical tool operated by a surgeon may enter a critical area such as the Central Nervous System (CNS), and upon detecting such activity, the surgical analysis module 112 may immediately stop the robotic surgical system completely to prevent any damage. Since the surgical procedure has not yet ended, the ultrasound imaging system may again begin scanning the region along the surgical path, and the robotic surgical system may continue to operate after the surgeon removes the surgical tool from the hard barrier.
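A simplified sketch of the barrier check described above, assuming the tool position and barrier positions are available as (x, y, z) coordinates in millimetres; the function names, the margin attribute and the priority given to hard barriers are illustrative assumptions:

    import math

    def distance_mm(a, b):
        return math.dist(a, b)   # Euclidean distance between two (x, y, z) points, in mm

    def check_barriers(tool_pos, tactile_barriers, hard_barriers, haptics, robot):
        # Hard barriers take priority: stop the robot before critical tissue is reached.
        for barrier in hard_barriers:
            if distance_mm(tool_pos, barrier.position) <= barrier.margin_mm:
                robot.stop()                          # halt advancement of the surgical tool
                return "hard_barrier"
        # Tactile barriers only warn the surgeon through the haptic controller.
        for barrier in tactile_barriers:
            if distance_mm(tool_pos, barrier.position) <= barrier.margin_mm:
                haptics.vibrate(intensity=barrier.intensity)
                return "tactile_barrier"
        return "clear"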
In at least one example embodiment, the ultrasound imaging system may detect that the surgical tool is approaching a major artery and has reached a tactile barrier preset to, for example, "3 mm". The haptic controls on the surgical tool may activate and begin vibrating to alert the surgeon. Alternatively, an audible alarm may be sounded and/or a pre-recorded message may be played; for example, the message might read "the border of the artery is now 4 mm".
In another example embodiment, the ultrasound imaging system may detect that the surgical tool is approaching a movable muscle and has reached a tactile barrier preset, for example, to "2 mm". The haptic controls on the surgical tool may activate and begin vibrating to alert the surgeon; alternatively, an audible alarm may be sounded and/or a pre-recorded message may be played. The ultrasound imaging system may also detect that the surgical tool has contacted the bone on which the procedure is to be performed, in which case an audible alarm may be sounded and/or a pre-recorded message may be played. It may likewise be detected that the surgical tool has drilled a hole in the bone to or beyond a predetermined threshold depth, again triggering an audible alarm and/or a pre-recorded message.
The surgical analysis module 112 may identify anomalies or deviations of the surgical instrument from the surgical path. When the surgical instrument deviates from the surgical path, an alert may be issued based on predefined conditions stored in memory. These conditions may include, for example: the surgeon or a surgical participant holding a surgical instrument outside of a fixed step; using the wrong surgical instrument; losing a surgical instrument; picking up a surgical instrument not included in the plan for a particular step; picking up a defective or contaminated surgical instrument; using a surgical instrument at the wrong place in the procedure; using a surgical instrument at an inappropriate time; misalignment of the surgical instrument with respect to the surgical path; or other conditions that may result in or create undesirable events.
Once the surgical analysis module 112 generates the alarm, the surgical path may be highlighted in the AR display. For example, the color of the surgical path may change from green to red to highlight the deviation. A number of options for proceeding may be presented to the surgeon. In at least one example embodiment, the options presented to the surgeon may include a) continuing to work on the same surgical path, b) selecting a different surgical path, and c) selecting a new surgical procedure. A second input provided by the surgeon may be accepted to continue. In at least one example embodiment, the deviation from the surgical path may be due to an error in the surgical procedure, a physical abnormality, an emergency, or some unexpected development. For example, a surgeon may be operating on a ruptured Achilles tendon and a deviation may occur due to the presence of infected tissue. After the deviation occurs, the surgeon may be prompted to choose between continuing on the surgical path, selecting a different surgical path, or selecting a new surgical procedure. In another example embodiment, the surgeon may choose to use a different surgical path. The surgical path database 123 can facilitate retrieving the surgical paths for the surgical procedure and filtering out surgical paths that do not pass through the waypoint of the current surgical path. The waypoint may be the point in the surgical path at which the deviation of the surgical instrument occurred. After filtering, the remaining surgical paths may be displayed on the AR display overlaid on the 3D model of the subject patient.
Upon determining a deviation from the surgical path, an alarm may be generated to notify the surgeon. The alert may take the form of highlighting the surgical path or changing the color of the surgical path; for example, the color may change from green to red. A second input may be accepted from the surgeon to continue working along the same path, select a different surgical path, or select a new surgical procedure. The user may be allowed to proceed based on the second input; for example, after a deviation in the surgical path, the surgeon may proceed by selecting a different surgical path.
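The deviation-handling sequence just described could be sketched as follows; the colour change and the three continuation options mirror the description, while the function and object names (display, surgeon_input, path_db) are assumptions:

    def handle_deviation(display, surgeon_input, current_path, path_db, waypoint):
        # Visual alarm: highlight the current path and change its colour from green to red.
        display.highlight(current_path, color="red")

        # Offer the surgeon the three continuation options described above.
        choice = surgeon_input.ask(
            "Deviation detected. Continue same path, select different path, or new procedure?",
            options=["same_path", "different_path", "new_procedure"],
        )
        if choice == "same_path":
            return current_path
        if choice == "different_path":
            # Keep only stored paths that pass through the waypoint where the deviation occurred.
            alternatives = [p for p in path_db if waypoint in p.waypoints]
            return surgeon_input.select(alternatives)
        return None   # "new_procedure": planning starts over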
A Graphical User Interface (GUI) 130 is used for interaction between the surgeon and the surgical planning module 111. The GUI may include different icons or radio buttons, each programmed to activate execution of a corresponding function. For example, a "view example" icon is used by the surgeon to scroll through data of previous patients, enabling the surgeon to apply a filter to previous patient data and/or images. Filters may include, but are not limited to, "show surgical procedures with the highest success rate" and "show previous patients with a patient history similar to the subject patient".
As another example, a "select path" icon allows a surgeon to select a surgical path using a tracking device such as a mouse and pointer. The surgical path may include lines or trajectories displayed as a graphical overlay on the 3D model selected by the surgeon. The surgeon may select the surgical path and then view all metadata associated with the 3D model displayed below the surgical path. The surgeon may also rotate the 3D model around the surgical path so that the 3D model may be viewed from multiple angles.
As another example, a "highlight path" icon may be used to enhance each image present in the stored data relating to the previous patient and the subject patient to display the surgical path. The surgeon may use the "highlight path" icon to select and highlight one surgical path from the stored surgical paths using the tracking device. The "highlight path" icon may also allow the surgeon to turn on, turn off, and highlight elements present in the 3D model, such as arteries, muscle groups, bones, and skin.
As another example, the surgeon may use a "view patient" icon to view and scroll through details of the subject patient, including the numerous 3D models present in the 3D model database 122. The surgeon may use a "duplicate path" icon to select and duplicate a surgical path using the tracking device, and may then modify the duplicated surgical path using a resize or edit function. The surgeon may select a surgical path using a "highlight boundary" icon and continuously highlight the boundary of the surgical path using the tracking device. The boundary may be created virtually by drawing on the image. After creation, the boundaries may be labeled; for example, the first bone in the surgical path may be labeled "first bone", and a region of the surgical path for which a warning is issued may be labeled "attention region".
In at least one example, the surgeon may click on a displayed element or feature, such as an artery, and the entire artery may be illuminated using color filtering or image recognition techniques. The surgeon may use a "define boundary" icon to select a surgical path, highlight a boundary, and define the boundary. For example, the boundary may be labeled "first bone" and may be defined as "2mm"; this definition refers to the first bone in the surgical path and requires a 2mm virtual boundary to be drawn around the highlighted boundary. The surgeon may use a "determine boundary" icon to select a surgical path, highlight a boundary, define the boundary, and further characterize it. For example, a boundary labeled "first bone" and defined as "2mm" may be further characterized as critical or non-critical, indicating whether movement across the boundary is recommended. Such information can be used to determine the criticality of the haptic interface and the haptic type.
The surgeon may use the "determine haptic type and number" icon to select the haptic type to be employed and set the strength of the haptic feedback. In at least one example, the haptic sensation may be selected from a buzzer, a vibration, a sound, a visual, and an AI. Further, the intensity of the haptic feedback may indicate the amount of time, vibration speed, and volume of the buzzer. The "determine haptic type and number" icon may allow the surgeon to select the type of haptic interface (e.g., buzzer and vibration), the positioning of the haptic interface (e.g., on the drill bit, on the scalpel), or the surgical robot), and the strength of the haptic feedback from a set of options.
The surgical simulation module 113 may facilitate retrieving a record of the subject patient from an Electronic Health Record (EHR) stored in the real-time health record unit 121, which may include image data and/or diagnostic data. Once the record of the subject patient has been retrieved, a recommended procedure may be identified for the subject patient based at least on an analysis of the diagnosis of the subject patient.
The surgical simulation module 113 may facilitate the creation of a 3D model of the affected region/body part of the subject patient. The 3D model may be created based at least on the records retrieved from the Electronic Health Record (EHR), e.g. using at least one image captured during the diagnosis of the subject patient. The surgical paths may be retrieved from the surgical path database 123 and may be used to perform a surgical procedure on the subject patient; for example, all possible surgical paths for repairing a knee joint may be retrieved from the surgical path database 123. The surgical simulation module 113 can facilitate displaying all surgical paths overlaid on the 3D model, and the surgeon may select a surgical path from the displayed surgical paths; as an example, the surgical path may be selected based on input from the surgeon. The surgical simulation module 113 may facilitate display of a virtual representation of a surgical tool on the 3D model. For example, the surgical tool may be a robotic surgical arm with a drill attachment, a scalpel, or any other surgical tool that the surgeon desires. The surgical simulation module 113 may cause the surgical path on the 3D model to be highlighted, e.g. the surgical path may be displayed in green on the 3D model. The user may then begin a VR simulation of the procedure using the VR surgical practice system. The VR surgical practice system may allow simulation of the surgical procedure and may present one or more virtual organs on which the surgeon will operate. A virtual organ may include a plurality of elements, and each element may have adjacent elements; a plurality of tensioned connections may connect adjacent elements such that a force exerted on one element propagates through the respective adjacent elements, thereby providing a distributed reaction across the virtual organ.
The VR surgical practice system may also include a physical manipulation device to be manipulated by the user, and a tracking device for tracking the physical manipulation device and converting its motion into a force to be applied to the virtual organ. The VR surgical practice system can simulate the movement, cutting, suturing, coagulation, and other aspects of a surgical procedure on different organs, and may thereby facilitate realistic practice of the surgical procedure.
While the surgeon performs the simulated surgery, the surgical simulation module 113 may allow the surgeon to set the deviation margin used to provide visual and tactile feedback during the procedure. In at least one example, the surgeon may define the intensity of the vibration to be provided on the haptic feedback hand controller or the actual surgical drill; e.g. the surgeon may define the intensity of the vibration from 1 to 10, from the lowest intensity to the highest intensity. In at least one embodiment, the intensity may be increased or decreased based on the deviation of the haptic feedback hand controller or the actual surgical drill from the surgical path. In another example embodiment, the vibration setting may be accompanied by a color change to provide visual feedback.
the surgical simulation module 113 may help track the surgeon's movements. The surgeon's motion can be tracked as the surgeon performs the surgical procedure on the VR simulation using the VR surgical practice system. In at least one example embodiment, the surgeon's motion may be tracked using data received from camera-integrated AR glasses worn by the surgeon. In addition, the movement of the surgical tool relative to the surgical path may also be tracked.
In at least one example implementation, if the surgical tool deviates from the surgical path while the surgeon is performing the surgical procedure using the VR surgical practice system, visual and tactile feedback may be provided to the surgeon immediately at the point of deviation. The deviation point may be indicated by a deviation of the surgical tool from the surgical path within the deviation margin. In one example, visual feedback may be provided by modifying the color of the surgical path highlighted on the 3D model. For example, as described above, the highlighted surgical path may be displayed in green; for non-critical areas, the color may change from green to yellow. As another non-limiting example, for a critical region, the color may change from yellow to red, indicating a warning.
The surgical simulation module 113 may cause haptic feedback to be provided to the surgeon through the haptic feedback hand controller. For example, for non-critical paths the vibration frequency may be set to a lower frequency, while for more critical paths the vibration may be more intense, and the intensity may be higher the further the surgical tool deviates from the surgical path.
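A sketch of how the deviation margin could be mapped to the visual and haptic feedback described in the two preceding paragraphs; the exact scaling of colour and intensity with the amount of deviation is an assumption, since the disclosure only describes the qualitative behaviour:

    def feedback_for_deviation(deviation_mm, margin_mm, critical):
        # Return (path colour, vibration intensity 0-10) for the current tool deviation.
        if deviation_mm <= margin_mm:
            return "green", 0                          # on the planned path: no feedback
        overshoot = deviation_mm - margin_mm
        if critical:
            # Critical region: red warning, intensity grows quickly with the deviation.
            return "red", min(10, 5 + int(overshoot))
        # Non-critical region: yellow warning, low-intensity vibration.
        return "yellow", min(10, 1 + int(overshoot // 2))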
The surgical simulation module 113 may allow the surgeon to set one or more annotations at the deviation points. The annotations may be provided by accessing the system 100 or a user device. Annotations may include, but are not limited to, text, audio notes, instructions to extract specific medical data from a patient's Electronic Health Record (EHR), and audio-video files related to the procedure.
The surgical simulation module 113 can facilitate storing the annotations in the surgical annotation database 124 along with the visual and tactile feedback. Any changes to the vibration settings, along with a timestamp, may be stored in the vibration database 125. When the real surgery is performed, it may be desirable to present the annotations to the surgeon when the corresponding deviation point occurs. The deviation point can be referenced using the timestamp, and the annotations may be presented based on the timestamps associated with them.
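The timestamp-based storage and replay of annotations could be sketched as follows; it assumes the annotations and vibration changes are kept in simple lists keyed by elapsed time, with names introduced only for illustration:

    def store_annotation(annotation_db, elapsed_s, text):
        annotation_db.append({"timestamp_s": elapsed_s, "text": text})

    def store_vibration_change(vibration_db, elapsed_s, intensity):
        vibration_db.append({"timestamp_s": elapsed_s, "intensity": intensity})

    def annotations_due(annotation_db, elapsed_s, window_s=1.0):
        # During the real procedure, return annotations whose timestamp has just been reached.
        return [a for a in annotation_db
                if elapsed_s - window_s <= a["timestamp_s"] <= elapsed_s]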
Subsequently, the surgical simulation module 113 can display a list of deviation points, annotations, and vibration settings defined during the virtual reality simulation. Thereafter, the surgical simulation module 113 can facilitate surgeon approval of the surgical path, procedure, and annotation.
A recommended procedure for the subject patient may be identified based on the medical condition of the patient. A 3D model of the affected body part of the subject patient may be created using at least one image captured during the diagnosis of the subject patient. A VR simulation may be created to train the surgeon using the 3D model. During training, a virtual representation of the surgical tool on the 3D model may be displayed, and the surgical path displayed on the 3D model may be highlighted. In addition, the margin of deviation may be set based on the surgeon's input to provide visual and tactile feedback to the surgeon during the actual surgical procedure. In addition, the surgical simulation module 113 can facilitate tracking of the surgeon's motion over the virtual reality simulation as the surgeon operates. Based on the deviation of the surgical tool from the surgical path, the surgical simulation module 113 can facilitate providing at least one of visual and tactile feedback to the surgeon at the point of deviation. In addition, the surgeon may set one or more annotations at the deviation points. Thereafter, the one or more annotations may be stored in the surgical annotation database 124 along with the visual feedback and the tactile feedback. The annotations may be presented to the surgeon at specified times along with the visual and tactile feedback, and may assist the surgeon by providing reminders and/or warnings at specified times during the procedure.
The surgical assistance module 114 may provide support to the surgeon during the actual surgery on the subject patient. The surgical assistance module 114 may facilitate identification of the subject patient by the surgeon or other users. The surgical assistance module 114 may store and facilitate retrieval of the details of a particular surgical procedure, e.g. the surgical path along with its annotations, from the surgical annotation database 124. The surgical path selected by the surgeon may be overlaid on the subject patient. Camera-integrated AR glasses may be used to identify the location and orientation of the surgeon, the subject patient, and the surgical tool. In at least one alternative embodiment, an operating room camera may be used to identify the location and orientation of the surgeon, the subject patient, and the surgical tool.
In at least one example embodiment, the surgeon may begin the procedure and, while performing the procedure, the annotations may be presented to the user. In at least one example, the appropriate time to present an annotation can be determined based on a timestamp set by the surgeon during the training phase; for example, the annotation may be set to be presented five minutes after the start of the surgical procedure. In another case, the appropriate time to present the annotation can be determined based on the steps of the surgical procedure; for example, the annotation may be set to be presented at the beginning of the third step of the surgical procedure. Each step of the surgical procedure may be monitored by the camera-integrated AR glasses, and the system can therefore present the annotation to the surgeon at the predetermined surgical step.
The annotations may help the surgeon store important details related to any step of the surgical procedure. These details may be presented to the surgeon at a specified time as a reminder and/or warning. The surgeon is thus aided by their own input, recorded during a training session prior to the actual surgery. This improves the accuracy and efficiency of the surgeon by allowing them to attend to every minute but important detail, thereby reducing the incidence of errors.
In at least one example embodiment, the color of the highlighted surgical path may change when the surgical tool deviates from the surgical path. For example, in a low priority region, the color of the surgical path may change from green to yellow while the surgical tool begins to deviate from the surgical path. Further, the vibration setting may gradually change from a low intensity to a medium intensity over time. In another example, in a high priority region, the color of the surgical path may change from yellow to red, and tactile feedback is provided to the surgeon during deviation of the surgical tool from the path.
In the foregoing, an intelligent surgical path guidance system according to the present invention has been described. It should be understood that the technical configuration of the present invention can be implemented in other specific forms by those skilled in the art without changing the technical spirit or essential features of the present invention.
The above description is only a preferred embodiment of the present invention and is not intended to limit the present invention in any way. It will be understood by those skilled in the art that various changes, substitutions and alterations can be made herein without departing from the spirit and scope of the invention as defined by the appended claims.

Claims (10)

1. An intelligent surgical path guidance system 100 for providing visual feedback or tactile feedback and real-time guidance to a surgeon during a procedure to reduce the surgical risk caused by surgical errors, comprising: a surgical control module 110, a data storage module 120, and a graphical user interface GUI 130;
the surgical control module 110 is connected to the data storage module 120 to facilitate data transfer between the surgical control module 110 and the data storage module 120;
the data storage module 120 may include a real-time health record unit 121 for storing patient-related data, a 3D model database 122, a surgical path database 123, a surgical annotation database 124, a vibration database 125, and a procedure database 126;
wherein the real-time health record unit 121 is configured to store data of the patient in real time, which may correspond to medical imaging data and/or diagnostic data; the 3D model database 122 stores 3D models of the affected regions of the patient, the 3D models including all regions or tissue types classified by the surgeon as tactile barriers and hard barriers; the surgical path database 123 stores details of the methods of performing various surgical procedures, the different methods and paths of performing various surgical procedures, and the surgical paths that a particular type of surgical procedure may follow; the surgical annotation database 124 is configured to accept annotations provided by the surgeon during surgical practice or training sessions, during the pre-planning phase, or even during the actual surgical procedure; the surgeon may add annotations at any time during the virtual reality simulation, and the annotations may be stored in the surgical annotation database 124; the vibration database 125 is configured to store information about the deviation margins, ranges or windows along the surgical path that may be selected by the surgeon, which is used to provide tactile feedback to the surgeon through, for example, a tactile feedback hand controller; the procedure database 126 stores video and sensor data related to previously performed surgical procedures;
the surgical control module 110 includes a surgical planning module 111, a surgical analysis module 112, a surgical simulation module 113, a surgical assistance module 114, and an AI path module 115;
a graphical user interface GUI 130 is used for interaction between the surgeon and the surgical planning module 111, the GUI including different icons or radio buttons, each programmed to activate execution of a respective function.
2. The intelligent surgical path guidance system according to claim 1, wherein the surgical planning module 111 retrieves identification details of the subject patient from the real-time health record unit 121, retrieves a diagnosis of the patient and may identify a recommended procedure for the subject patient, creates a 3D model based on data in the 3D model database 122, and retrieves possible surgical paths of the recommended procedure for the subject patient.
3. The intelligent surgical path guidance system of claim 2, wherein the surgical planning module 111 selects a region of the 3D model to highlight, and the highlighted region may help the surgeon define tactile and hard barriers for different types of tissue;
the surgical planning module 111 retrieves surgical paths from the surgical path database 123 to meet the surgical needs of the subject patient; each surgical path may represent a line drawn from one point to another in an image of a body part, previously defined by a surgeon or other expert and stored in the surgical path database 123.
4. The intelligent surgical path guidance system of claim 1, wherein the AI path module 115 is configured to analyze images of a particular surgical field from a plurality of patients who have previously undergone the recommended procedure and to store the results of these procedures for each of the plurality of patients, obtaining all possible surgical paths; the AI path module 115 calculates a cost function for each surgical path and evaluates whether the predicted outcome of each surgical path provides satisfactory results.
5. The intelligent surgical path guidance system of claim 4, wherein the AI path module 115 evaluates the results based at least on: 1) the ability to provide adequate access to the surgical site, 2) an acceptable anatomical access time to the site, and 3) the ability of the patient to tolerate the requirements of the surgical procedure; the AI path module 115 selects a surgical path that balances the minimum cost function of the surgical path against other risk factors, including the risk of approaching and possibly penetrating organs, and the expected postoperative impact of a given path on the patient's recovery time or long-term health.
6. The intelligent surgical path guidance system of claim 1, wherein the surgical analysis module 112 scans the area along the surgical path to identify whether a surgical tool has moved into either the tactile barrier or the hard barrier, and provides tactile feedback to the surgeon through a tactile controller;
the surgical analysis module 112 determines whether a surgical tool is present within or beyond the hard barrier, stops the advancement or movement of the surgical tool, and prevents the surgical tool from invading critical tissue.
7. The intelligent surgical path guidance system of claim 6, wherein the surgical analysis module 112 identifies anomalies or deviations of surgical instruments from the surgical path; when a surgical instrument deviates from the surgical path, an alert is issued based on predefined conditions; once the surgical analysis module 112 generates the alert, it may take the form of highlighting the surgical path or changing the color of the surgical path; a second input may be accepted from the surgeon to continue working along the same path, select a different surgical path, or select a new surgical procedure.
8. The intelligent surgical path guidance system of claim 7, wherein the surgical simulation module 113 implements a simulation process of the surgery, highlights the surgical path on the 3D model, simulates the surgical procedure, and may present one or more virtual organs to be operated on by the surgeon; the movement, cutting, suturing and coagulation involved in operating on different organs are simulated, providing realistic practice of the surgical procedure.
9. The intelligent surgical path guidance system of claim 7, wherein the surgical simulation module 113 creates a VR simulation to train the surgeon using the 3D model; during this training a virtual representation of a surgical tool is displayed on the 3D model, and the surgical path displayed on the 3D model can be highlighted; a margin of deviation is set based on the surgeon's input to provide visual and tactile feedback to the surgeon during the actual surgical procedure; the surgeon's motion on the virtual reality simulation is tracked as the surgeon operates; the surgeon may set one or more annotations at the deviation points, the one or more annotations being stored in the surgical annotation database 124 along with the visual and tactile feedback; the annotations are presented to the surgeon at specified times along with the visual and tactile feedback, and may assist the surgeon by providing reminders and/or warnings at specified times during the procedure.
10. The intelligent surgical path guidance system of claim 1, wherein the surgical assistance module 114 provides support to the surgeon during the actual surgical procedure on the subject patient, identifies the position and orientation of the surgeon, subject patient and surgical tool using camera-integrated AR glasses, each step of the surgical procedure can be monitored by the camera-integrated AR glasses, and presents annotations to the surgeon at predetermined surgical steps.
CN202211005301.4A 2022-08-22 2022-08-22 Intelligent operation path guiding system Active CN115363752B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211005301.4A CN115363752B (en) 2022-08-22 2022-08-22 Intelligent operation path guiding system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202211005301.4A CN115363752B (en) 2022-08-22 2022-08-22 Intelligent operation path guiding system

Publications (2)

Publication Number Publication Date
CN115363752A (en) 2022-11-22
CN115363752B CN115363752B (en) 2023-03-28

Family

ID=84067896

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211005301.4A Active CN115363752B (en) 2022-08-22 2022-08-22 Intelligent operation path guiding system

Country Status (1)

Country Link
CN (1) CN115363752B (en)

Citations (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102117378A (en) * 2009-12-31 2011-07-06 苏州瑞派宁科技有限公司 Hepatic tumor comprehensive surgical planning analogy method and system thereof based on three-dimensional multimode images
WO2011108994A1 (en) * 2010-03-05 2011-09-09 Agency For Science, Technology And Research Robot assisted surgical training
WO2015154069A1 (en) * 2014-04-04 2015-10-08 Surgical Theater LLC Dynamic and interactive navigation in a surgical environment
CN106251751A (en) * 2016-10-12 2016-12-21 大连文森特软件科技有限公司 A kind of simulated medical surgery analogue system based on VR technology
US20180368930A1 (en) * 2017-06-22 2018-12-27 NavLab, Inc. Systems and methods of providing assistance to a surgeon for minimizing errors during a surgical procedure
DE102018111180A1 (en) * 2018-05-09 2019-11-14 Olympus Winter & Ibe Gmbh Operating method for a medical system and medical system for performing a surgical procedure
CN111417353A (en) * 2017-10-10 2020-07-14 威布鲁尼克斯公司 Surgical shape sensing fiber optic apparatus and method
CN112155729A (en) * 2020-10-15 2021-01-01 中国科学院合肥物质科学研究院 Intelligent automatic planning method and system for surgical puncture path and medical system
CN112885436A (en) * 2021-02-25 2021-06-01 刘春煦 Dental surgery real-time auxiliary system based on augmented reality three-dimensional imaging
CN113197665A (en) * 2021-04-30 2021-08-03 曹立华 Minimally invasive surgery simulation method and system based on virtual reality
CN113672478A (en) * 2020-05-14 2021-11-19 中兴通讯股份有限公司 Log obtaining method, device, terminal, server and storage medium
CN113749769A (en) * 2020-06-03 2021-12-07 格罗伯斯医疗有限公司 Surgical guiding system
US20220000565A1 (en) * 2018-11-15 2022-01-06 Comofi Medtech Private Limited System for renal puncturing assistance
WO2022103254A1 (en) * 2020-11-11 2022-05-19 Elitac B.V. Device, method and system for aiding a surgeon while operating


Also Published As

Publication number Publication date
CN115363752B (en) 2023-03-28

Similar Documents

Publication Publication Date Title
US20220168059A1 (en) Systems and methods of providing assistance to a surgeon for minimizing errors during a surgical procedure
US20230011507A1 (en) Surgical system with ar/vr training simulator and intra-operative physician image-guided assistance
US11304761B2 (en) Artificial intelligence guidance system for robotic surgery
CN109996508B (en) Teleoperated surgical system with patient health record based instrument control
US11737841B2 (en) Configuring surgical system with surgical procedures atlas
US20200243199A1 (en) Methods and systems for providing an episode of care
EP1919390B1 (en) Computer assisted surgery system
CN113015480A (en) Patient specific surgical methods and systems
KR20180058656A (en) Reality - Enhanced morphological method
KR101862360B1 (en) Program and method for providing feedback about result of surgery
US11810360B2 (en) Methods for arthroscopic surgery video segmentation and devices therefor
EP3413774A1 (en) Database management for laparoscopic surgery
KR102146672B1 (en) Program and method for providing feedback about result of surgery
CN116313028A (en) Medical assistance device, method, and computer-readable storage medium
CN115363752B (en) Intelligent operation path guiding system
KR20190133424A (en) Program and method for providing feedback about result of surgery
CN115363773A (en) Orthopedic surgery robot system based on optical positioning navigation and control method thereof
Lemke et al. IT Architecture and Standards for a Therapy Imaging and Model Management System (TIMMS)

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant