US20230238109A1 - Method and program for providing feedback on surgical outcome - Google Patents

Method and program for providing feedback on surgical outcome

Info

Publication number
US20230238109A1
Authority
US
United States
Prior art keywords
surgical
cue sheet
sheet data
actual
data
Prior art date
Legal status
Pending
Application number
US18/194,067
Inventor
Jong Hyuck Lee
Woo Jin HYUNG
Hoon Mo Yang
Ho Seung Kim
Current Assignee
Hutom Inc
University Industry Foundation UIF of Yonsei University
Original Assignee
Hutom Co Ltd
Priority date
Filing date
Publication date
Application filed by Hutom Co Ltd filed Critical Hutom Co Ltd
Priority to US18/194,067
Assigned to HUTOM CO., LTD. reassignment HUTOM CO., LTD. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: HYUNG, WOO JIN, KIM, HO SEUNG, LEE, JONG HYUCK, YANG, HOON MO
Publication of US20230238109A1
Assigned to UIF (UNIVERSITY INDUSTRY FOUNDATION), YONSEI UNIVERSITY, HUTOM INC. reassignment UIF (UNIVERSITY INDUSTRY FOUNDATION), YONSEI UNIVERSITY ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: HUTOM CO., LTD.

Classifications

    • G16H 20/40: ICT specially adapted for therapies or health-improving plans relating to mechanical, radiation or invasive therapies, e.g. surgery, laser therapy, dialysis or acupuncture
    • G16H 30/00: ICT specially adapted for the handling or processing of medical images
    • G16H 30/40: ICT specially adapted for processing medical images, e.g. editing
    • G16H 40/20: ICT specially adapted for the management or administration of healthcare resources or facilities, e.g. managing hospital staff or surgery rooms
    • G16H 50/20: ICT specially adapted for computer-aided diagnosis, e.g. based on medical expert systems
    • G16H 50/50: ICT specially adapted for simulation or modelling of medical disorders
    • G16H 50/70: ICT specially adapted for mining of medical data, e.g. analysing previous cases of other patients
    • A61B 34/10: Computer-aided planning, simulation or modelling of surgical operations
    • A61B 34/20: Surgical navigation systems; devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B 34/30: Surgical robots
    • A61B 34/37: Master-slave robots
    • A61B 90/36: Image-producing devices or illumination devices not otherwise provided for
    • A61B 90/361: Image-producing devices, e.g. surgical cameras
    • A61B 2017/00119: Electrical control of surgical instruments with audible or visual output alarm, indicating an abnormal situation
    • A61B 2034/101: Computer-aided simulation of surgical operations
    • A61B 2034/105: Modelling of the patient, e.g. for ligaments or bones
    • A61B 2034/107: Visualisation of planned trajectories or target regions
    • A61B 2034/2065: Tracking using image or pattern recognition
    • A61B 2034/256: User interfaces for surgical systems having a database of accessory information, e.g. including context-sensitive help or scientific articles
    • A61B 2090/364: Correlation of different images or relation of image positions in respect to the body
    • A61B 2090/365: Augmented reality, i.e. correlating a live optical image with another image

Definitions

  • Embodiments of the inventive concept described herein relate to a method and a program for providing a feedback on a surgical outcome.
  • Thus, the medical image or the advice of a skilled surgeon may not be utilized as an auxiliary means for optimizing the surgical process for a specific surgery target patient.
  • Deep learning is defined as a set of machine-learning algorithms that attempt high-level abstractions (abstracting of key content or functions in a large amount of data or complex data) via a combination of several nonlinear transformation schemes. Deep learning may be largely considered as a field of machine learning that teaches a human mindset to a computer.
  • Embodiments of the inventive concept provide a method and a program for providing a feedback on a surgical outcome.
  • a method for providing a feedback on a surgical outcome by a computer includes dividing, by the computer, actual surgical data obtained in an actual surgical process into a plurality of detailed surgical operations to obtain actual surgical cue sheet data composed of the plurality of detailed surgical operations, obtaining, by the computer, reference cue sheet data about the actual surgery, and comparing, by the computer, the actual surgical cue sheet data with the reference cue sheet data, and providing, by the computer, the feedback based on the comparison result.
  • the actual surgical data may be divided into the plurality of detailed surgical operations, based on at least one of a surgery target portion, a type of surgical tool, a number of surgical tools, a position of the surgical tool, an orientation of the surgical tool, and movement of the surgical tool included in the actual surgical data.
  • At least one of a standardized name or standardized code data may be assigned to each of the plurality of detailed surgical operations.
  • the providing of the feedback may include obtaining search information used for searching for at least one of the plurality of detailed surgical operations, extracting at least one detailed surgical operation corresponding to the search information based on the standardized name or the standardized code data, and providing a feedback on the extracted at least one detailed surgical operation.
  • reference cue sheet data may include optimized cue sheet data about the actual surgery, or referenced virtual surgical cue sheet data.
  • the providing of the feedback may include comparing the plurality of detailed surgical operations included in the actual surgical cue sheet data with a plurality of detailed surgical operations included in the reference cue sheet data, and determining, based on the comparison result, whether an unnecessary detailed surgical operation, a missing detailed surgical operation, or an incorrect detailed surgical operation is present in the actual surgical cue sheet data.
  • the determining of whether the incorrect detailed surgical operation is present in the actual surgical cue sheet data may include comparing movement of a surgical tool corresponding to a detailed surgical operation included in the reference cue sheet data with movement of a surgical tool corresponding to a detailed surgical operation included in the actual surgical cue sheet data, and determining, based on the comparison result, whether the detailed surgical operation included in the actual surgical cue sheet data is incorrect.
  • the method may further include adding the actual surgical cue sheet data to to-be-learned cue sheet data, obtaining a model for obtaining optimized cue sheet data using the to-be-learned cue sheet data having the actual surgical cue sheet data added thereto, and performing reinforcement learning on the obtained model.
  • the method may further include detecting at least one surgical error situation from the obtained surgical information, and providing a feedback on the detected surgical error situation.
  • the method may further include obtaining information about prognosis corresponding to each of one or more actual surgical cue sheet data including the actual surgical cue sheet data, performing reinforcement learning based on the one or more actual surgical cue sheet data and the prognosis information thereof, and determining a correlation between at least one detailed surgical operation included in the one or more actual surgical cue sheet data and the prognosis, based on the reinforcement learning result.
  • a computer program is stored in a computer-readable storage medium, wherein the computer program is configured to perform the method as defined above in combination with a computer as hardware.
  • FIG. 1 is a view showing a robot-based surgery system according to the disclosed embodiment
  • FIG. 2 is a flowchart showing a method for providing a feedback on a surgical outcome according to one embodiment
  • FIG. 3 is a flowchart showing a method for calculating an optimized cue sheet data according to one embodiment.
  • FIG. 4 is a flowchart showing a method for obtaining a feedback according to one embodiment.
  • inventive concept is not limited to the embodiments disclosed below, but may be implemented in various forms.
  • present embodiments are provided merely to complete the disclosure of the inventive concept and to fully inform those skilled in the art of the scope of the inventive concept.
  • inventive concept is only defined by the scope of the claims.
  • Although terms such as “first” and “second” are used to describe various components, the components are not limited by these terms. These terms are only used to distinguish one component from another component. Therefore, a first component as mentioned below may be a second component within the technical idea of the inventive concept.
  • a term “unit” or “module” used herein means software or a hardware component such as an FPGA or an ASIC.
  • the “unit” or “module” performs a predetermined role.
  • the “unit” or “module” is not meant to be limited to the software or the hardware.
  • the “unit” or “module” may be configured to reside in an addressable storage medium and may be configured to execute on one or more processors.
  • the “unit” or “module” includes components such as software components, object-oriented software components, class components and task components, processes, functions, attributes, procedures, subroutines, segments of a program code, drivers, firmware, a microcode, circuitry, data, database, data structures, tables, arrays and variables. Functionality provided within the components and the “units” or “modules” may be combined into a smaller number of components and “units” or “modules” or may be further divided into additional components and “units” or “modules”.
  • image refers to multi-dimensional data composed of discrete image elements (e.g., pixels in a 2D image and voxels in a 3D image).
  • image may include a medical image of an object obtained by a CT imaging apparatus.
  • the term “object” may be a person or an animal, or a portion or an entirety of a person or animal.
  • the object may include at least one of organs such as a liver, a heart, a uterus, a brain, a breast, and an abdomen, and a blood vessel.
  • a term “user” may be a medical expert such as a surgeon, a nurse, a clinical pathologist, or a medical image expert, or may be a technician repairing a medical apparatus.
  • the present disclosure is not limited thereto.
  • a term “medical image data” refers to a medical image captured by a medical imaging apparatus, and includes all medical images from which a body of an object may be modeled in three dimensions.
  • the “medical image data” may include a computed tomography (CT) image, a magnetic resonance imaging (MRI) image, a positron emission tomography (PET) image, etc.
  • a term “virtual body model” refers to a model created in a conforming manner to an actual patient's body based on the medical image data.
  • the “virtual body model” may be created by modeling the medical image data in three dimensions or by correcting the modeled data to be adapted to an actual surgery situation.
  • a term “virtual surgical data” means data including detailed surgical operations of a rehearsal or simulation performed on the virtual body model.
  • the “virtual surgical data” may be image data of rehearsal or simulation performed on the virtual body model in a virtual space or may be data in which a surgical operation performed on the virtual body model is recorded.
  • a term “actual surgical data” refers to data obtained as a medical staff performs actual surgery.
  • the “actual surgical data” may be image data obtained by imaging a surgical target portion in an actual surgical process, or may be data in which a surgical operation performed in the actual surgical process is recorded.
  • a term “detailed surgical operation” means a minimum unit of a surgical operation as divided according to specific criteria.
  • a term “cue sheet data” refers to data in which detailed surgical operations into which a specific surgical process is divided are recorded in order.
  • a term “to-be-executed cue sheet data” refers to cue sheet data obtained based on the virtual surgical data obtained via simulation by the user.
  • a term “training virtual surgical cue sheet data” is included in the to-be-executed cue sheet data, and means cue sheet data created based on the virtual surgical data obtained via surgery simulation by the user.
  • a term “referenced virtual surgical cue sheet data” refers to cue sheet data about virtual surgery performed by a specific medical person for construction of to-be-learned big data or surgery process guidance.
  • a term “optimized cue sheet data” means cue sheet data about an optimized surgical process in terms of a surgery time or prognosis.
  • a term “to-be-learned cue sheet data” means cue sheet data used for learning for optimized cue sheet data calculation.
  • a term “surgery guide data” means data used as guide information during actual surgery.
  • a term “computer” includes all of various devices capable of performing computation and providing the computation result to the user.
  • the computer may include not only a desktop (PC) and a notebook, but also a smart phone, a tablet PC, a cellular phone, a PCS (Personal Communication Service) phone, a mobile terminal of synchronous/asynchronous IMT-2000 (International Mobile Telecommunication-2000), a palm personal computer (PC), and a personal digital assistant (PDA).
  • a head mounted display (HMD) apparatus includes a computing function
  • the HMD apparatus may be a computer.
  • the computer may be a server that receives a request from a client and performs information processing.
  • FIG. 1 is a view showing a robot-based surgery system according to a disclosed embodiment.
  • FIG. 1 a schematic diagram of the system capable of performing robot-based surgery according to the disclosed embodiment is illustrated.
  • the robot-based surgery system includes a medical imaging apparatus 10 , a server 20 , a controller 30 , an imaging unit 36 , a display 32 , and a surgery robot 34 provided in an operating room.
  • the medical imaging apparatus 10 may be omitted from the robot-based surgery system according to the disclosed embodiment.
  • the robot-based surgery may be performed by the user controlling the surgery robot 34 using the controller 30 . In one embodiment, the robot-based surgery may be performed automatically by the controller 30 without the user control.
  • the server 20 is a computing device including at least one processor and a communication unit.
  • the controller 30 includes a computing device including at least one processor and a communication unit. In one embodiment, the controller 30 includes hardware and software interfaces for controlling the surgery robot 34 .
  • the imaging unit 36 includes at least one image sensor. That is, the imaging unit 36 includes at least one camera to image a surgery target portion. In one embodiment, the imaging unit 36 is used in conjunction with the surgery robot 34 .
  • the imaging unit 36 may include at least one camera coupled with a surgery arm of the surgery robot 34 .
  • the image captured by the imaging unit 36 is displayed on the display 32 .
  • the controller 30 receives information necessary for surgery from the server 20 or creates information necessary for surgery and provides the information to the user. For example, the controller 30 displays the created or received information necessary for the surgery on the display 32 .
  • the user controls movement of the surgery robot 34 by manipulating the controller 30 while looking at the display 32 to perform robot-based surgery.
  • the server 20 creates information necessary for robot-based surgery using medical image data of the object (patient) previously imaged by the medical imaging apparatus 10 and provides the created information to the controller 30 .
  • the controller 30 may display the information received from the server 20 on the display 32 to present the information to the user, or may use the information received from the server 20 to control the surgery robot 34 .
  • imaging means that may be used in the medical imaging apparatus 10 is not particularly limited.
  • various medical imaging means such as CT, X-Ray, PET, and MRI may be used.
  • FIG. 2 is a flowchart showing a method for providing a feedback on a surgical outcome according to one embodiment.
  • Steps shown in FIG. 2 may be performed in time series by the server 20 or the controller 30 shown in FIG. 1 .
  • each step is described as being performed by a computer, but a subject to perform each step is not limited to a specific device. All or some of the steps may be performed by the server 20 or the controller 30 .
  • a method for providing a feedback on a surgical outcome includes dividing, by the computer, actual surgical data obtained from an actual surgical process into a plurality of detailed surgical operations to obtain actual surgical cue sheet data composed of the plurality of detailed surgical operations (S 200 ); obtaining, by the computer, reference cue sheet data about the actual surgery (S 400 ); and comparing, by the computer, the actual surgical cue sheet data and the reference cue sheet data to provide a feedback based on the comparison result (S 600 ).
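  • As a minimal sketch only (the patent does not prescribe a data layout or an implementation language), the cue sheet data may be modeled as a time-ordered list of standardized operation records, and step S 600 as a comparison of two such lists. All names below are illustrative assumptions.

```python
# Hypothetical representation of cue sheet data and the S600 comparison;
# a sketch under assumptions, not the patent's prescribed implementation.
from dataclasses import dataclass
from typing import List

@dataclass
class DetailedOperation:
    code: str       # standardized code data, e.g. "2-01-1-1"
    name: str       # standardized name of the detailed surgical operation
    start_s: float  # position within the surgery recording, in seconds
    end_s: float

CueSheet = List[DetailedOperation]

def provide_feedback(actual: CueSheet, reference: CueSheet) -> List[str]:
    """Compare the actual cue sheet with the reference cue sheet (S600)."""
    actual_codes = [op.code for op in actual]
    reference_codes = [op.code for op in reference]
    feedback = []
    for code in reference_codes:
        if code not in actual_codes:
            feedback.append(f"missing detailed surgical operation: {code}")
    for code in actual_codes:
        if code not in reference_codes:
            feedback.append(f"possibly unnecessary operation: {code}")
    return feedback
```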
  • the computer may divide the actual surgical data obtained from the actual surgical process into the plurality of detailed surgical operations to obtain the actual surgical cue sheet data composed of the plurality of detailed surgical operations (S 200 ).
  • the computer creates the actual surgical cue sheet data based on the surgical image imaged in the surgical process by the surgery robot or based on data obtained in a control process of the surgery robot.
  • the detailed surgical operation constituting the cue sheet data refers to a minimum operation unit constituting the surgical process.
  • the actual surgical data may be divided into the detailed surgical operations based on several criteria.
  • the actual surgical data may be divided into the detailed surgical operations based on surgery types (e.g., laparoscopic surgery, robot-based surgery), anatomical body parts where surgery is performed, surgical tools as used, a number of surgical tools, an orientation or a position of the surgical tool displayed on a screen, movement of the surgical tool (for example, forward/backward movement), etc.
  • the division criteria and the detailed categories included within the division criteria may be set directly by the medical staff for learning of the actual surgical data.
  • the computer may perform supervised learning based on the division criteria and the detailed categories set by the medical staff to divide the actual surgical data into the detailed surgical operations as a minimum operation unit.
  • the division criteria and detailed categories included within the division criteria may be extracted via learning of a surgical image by the computer.
  • the computer may calculate the division criteria and detailed categories included within the division criteria via deep learning (i.e., unsupervised learning) of actual surgical data accumulated as big data. Subsequently, the computer may divide the actual surgical data based on the division criteria created via the learning of the actual surgical data to create the cue sheet data.
  • the actual surgical data may be divided based on a result of determining whether the actual surgical data satisfies the division criteria via image recognition. That is, the computer may recognize, within the image of the actual surgical data, the anatomical organ positions on the screen and the number of surgical tools appearing on the screen as the division criteria, and may divide the actual surgical data into the detailed surgical operation units based on the recognized division criteria.
  • the computer may perform the division process for cue sheet data creation based on surgical tool movement data included in the actual surgical data.
  • the actual surgical data may include various information input in a process of controlling the surgery robot, such as the type and the number of surgical tools selected by the user and information about the movement of each surgical tool when the user performs robot-based surgery. Accordingly, the computer may perform the division based on the information included in the actual surgical data at each time point thereof.
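  • The division step can be pictured as grouping consecutive frames that share the same division-criterion attributes into one detailed surgical operation. The sketch below assumes that frame-level attributes (target portion, tool signature) have already been extracted by image recognition; the attribute names are hypothetical.

```python
# Sketch of dividing actual surgical data into detailed surgical operations
# by grouping consecutive frames with identical division-criterion
# attributes. Frame attributes are assumed to come from image recognition.
from itertools import groupby
from typing import List, Tuple

Frame = Tuple[float, str, str]  # (timestamp_s, target_portion, tool_signature)

def divide_into_operations(frames: List[Frame]) -> List[dict]:
    segments = []
    # each run of frames with the same (target portion, tools) becomes
    # one candidate detailed surgical operation
    for key, run in groupby(frames, key=lambda f: (f[1], f[2])):
        run = list(run)
        segments.append({
            "target_portion": key[0],
            "tools": key[1],
            "start_s": run[0][0],
            "end_s": run[-1][0],
        })
    return segments

frames = [
    (0.0, "stomach", "grasper"),
    (0.5, "stomach", "grasper"),
    (1.0, "stomach", "grasper+cautery"),
    (1.5, "lymph_node", "cautery"),
]
print(divide_into_operations(frames))  # three detailed-operation segments
```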
  • the actual surgical data includes various types of detailed surgical operation such as ablation and suture. Division is performed based on the division criteria. Specifically, a process of dividing the actual surgical data (for example, actual surgery image) about the actual gastric cancer surgery into the detailed surgical operations to create the cue sheet data is as follows.
  • gastric cancer surgery includes a detailed surgical operation to ablate a portion or an entirety of a stomach containing a tumor, and a detailed surgical operation to ablate a lymph node.
  • various resections and connections are used depending on a state of the gastric cancer.
  • each detailed surgical operation may be divided into a plurality of detailed surgical operations based on a specific location where the detailed surgical operation is taken and a direction of movement of the surgical tool.
  • the detailed operation of the gastric cancer surgery may be divided into an opening step, an ablation step, a connection step, and a suture step.
  • a method of changing a disconnected state of an organ to a connected state includes an in vitro anastomosis method of incising and connecting at least 4 to 5 cm of an end of an anticardium, and an in vivo anastomosis method in which about 3 cm of umbilicus is incised and incision and anastomosis occur in an abdominal cavity.
  • the above-described connection stage may be divided into detailed sub-steps according to the specific connection method as described above.
  • each surgery operation may be divided into a plurality of detailed surgical operations according to the position and the movement of the surgical tool.
  • Each of the divided detailed surgical operations has a standardized name allocated thereto based on a location where the detailed surgical operation is performed and a pathway of the surgical tool.
  • the standardized name may be variously defined.
  • the name of the portion may be a name commonly used in the medical field. More comprehensive or detailed names defined in the system according to the disclosed embodiment may be used.
  • the surgery image about the actual surgery by the user may be organized into information in a form of a cue sheet in which a plurality of detailed surgical operations are sequentially arranged based on the standardized names.
  • the cue sheet data may be created as code data of specific digits based on the division criteria for dividing the surgical data into the detailed surgical operations. That is, the computer divides the actual surgical data into standardized detailed surgical operations by applying standardized division criteria and designating detailed categories within the division criteria. The computer may allocate a standardized code value to each detailed category and allocate standardized code data to distinguish each detailed surgical operation.
  • the computer allocates, to each detailed surgical operation, digitized code data obtained by allocating numbers or letters to the categories to which the specific detailed surgical operation belongs, in order from the highest category to the lowest category, according to the order of application of the division criteria.
  • the computer may create cue sheet data in a form in which the standardized code data of the detailed surgical operations, rather than the images of the divided detailed surgical operations, are listed.
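  • For illustration, the hierarchical code allocation might look like the following sketch; the category tables, digit widths, and separator are assumptions, since the patent only specifies that numbers or letters are allocated per category from the higher to the lower division criteria.

```python
# Hypothetical digit scheme for standardized code data; the real category
# tables and digit allocation are not fixed by the patent.
SURGERY_TYPE = {"laparoscopic": "1", "robotic": "2"}
TARGET_PORTION = {"stomach": "01", "lymph_node": "02"}
TOOL = {"grasper": "1", "cautery": "2", "stapler": "3"}
MOVEMENT = {"forward": "1", "backward": "2", "rotate": "3"}

def encode_operation(surgery_type: str, target: str,
                     tool: str, movement: str) -> str:
    # concatenate category codes in the order the division criteria apply,
    # from the highest category to the lowest
    return "-".join([
        SURGERY_TYPE[surgery_type],
        TARGET_PORTION[target],
        TOOL[tool],
        MOVEMENT[movement],
    ])

print(encode_operation("robotic", "stomach", "grasper", "forward"))  # 2-01-1-1
```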
  • the user may share or deliver the actual surgical process by providing only the cue sheet data composed of the standardized code data.
  • the computer when the computer is a client terminal disposed in the operating room or corresponds to the controller 30 , the computer may acquire the standardized code data from the surgery image, and transmit the obtained code data to the server 20 such that the actual surgical process is shared or delivered.
  • the surgical image is transmitted to the server 20 .
  • the server 20 may create the cue sheet data and the code data.
  • the computer may allocate a standardized name to a standardized code of each detailed surgical operation.
  • the user may select and identify only a desired detailed surgical operation within the entire cue sheet. Further, in this case, the user may easily grasp a progress of the surgery or the rehearsal by simply viewing the cue sheet in which the detailed surgical operations are sequentially arranged based on the standardized names thereof, without viewing an entirety of the surgery image.
  • the cue sheet data may be converted into a surgical image using an image database for each detailed surgical operation.
  • an image matching each code data may be stored in the image database.
  • a plurality of images matching each code data may be stored therein depending on a situation.
  • specific detailed code data may include different detailed surgical operation images in the image database according to previously performed operations.
  • the computer may reproduce the cue sheet data as a surgical simulation image by sequentially applying the detailed surgical operations included in the cue sheet data to the virtual body model.
  • the image corresponding to the cue sheet data may be reproduced in the same point of view as that of the surgery image.
  • the image corresponding to the cue sheet data may be reconstructed in a point of view different from that of the surgery image and the reconstructed image may be reproduced.
  • the image may be modeled in a 3D manner, and thus the viewpoint and a position thereof may be adjusted according to the user's manipulation.
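  • A sketch of the reproduction step, assuming a simple dictionary-backed image database keyed by standardized code data (a real system could also condition each lookup on previously performed operations and on the virtual body model):

```python
# Sketch: map each standardized code in the cue sheet to a stored clip and
# play the clips in order. The database layout is an assumption.
from typing import Dict, List

def render_cue_sheet(codes: List[str], image_db: Dict[str, str]) -> List[str]:
    clips = []
    for code in codes:
        clip = image_db.get(code)
        if clip is None:
            raise KeyError(f"no clip stored for operation code {code}")
        clips.append(clip)
    return clips  # played back in order, this reconstructs the surgery

image_db = {"2-01-1-1": "clips/grasp_forward.mp4",
            "2-01-2-1": "clips/cauterize.mp4"}
print(render_cue_sheet(["2-01-1-1", "2-01-2-1"], image_db))
```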
  • the computer acquires the reference cue sheet data about the actual surgery (S 400 ).
  • the reference cue sheet data means optimized cue sheet data created by the computer.
  • the reference cue sheet data refers to referenced virtual surgical cue sheet data.
  • FIG. 3 is a flowchart showing a method for calculating the optimized cue sheet data according to one embodiment.
  • the computer acquires one or more to-be-learned cue sheet data (S 420 ).
  • the to-be-learned cue sheet data refers to learning target data that is to be learned for calculation of the optimized cue sheet data.
  • the to-be-learned cue sheet data may include cue sheet data created based on the actual surgical data (i.e. actual surgical cue sheet data) or cue sheet data created based on the simulated actual surgical data for reference (i.e. referenced virtual surgical cue sheet data).
  • the actual surgical cue sheet data is created by the computer dividing the actual surgical data according to the division criteria.
  • the referenced virtual surgical cue sheet data is not obtained in the user's surgery simulation process, but is created by performing simulation for the purpose of constructing the learning target data or providing the same to practitioners for reference.
  • the reinforcement learning refers to an area of machine learning in which an agent defined in a certain environment recognizes a current state and selects, from among selectable detailed surgical operations or sequences thereof, the detailed surgical operation or sequence of detailed surgical operations that maximizes reward.
  • the reinforcement learning may be summarized as learning a scheme that maximizes reward, based on state transitions and the reward given according to each state transition.
  • the computer calculates the optimized cue sheet data using the reinforcement learning result (S 460 ).
  • the optimized cue sheet data is calculated, based on the reinforcement learning result, in consideration of the shortest surgical time (which may reduce the patient's anesthesia time), a minimum bleeding amount, an essential operation group, an essential performance sequence, and the like.
  • the essential operation group refers to a group of detailed surgical operations that must be performed together to perform a specific detailed surgical operation.
  • the essential performance sequence refers to a sequence in which surgical operations must be sequentially performed in a course of performing specific surgery. For example, surgical operations that must be performed sequentially and an order thereof may be determined according to the type of surgery or the type of the surgical operation.
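  • One way to read this is as a reward signal for the reinforcement learning: shorter surgery time and less bleeding increase the reward, while omitting a member of the essential operation group or violating the essential performance sequence is penalized. The weights and penalties below are illustrative assumptions only.

```python
# Hedged sketch of a reward function for optimized cue sheet learning.
def reward(surgery_time_min: float, bleeding_ml: float,
           performed: list, essential_group: set,
           essential_sequence: list) -> float:
    r = -1.0 * surgery_time_min - 0.5 * bleeding_ml  # illustrative weights
    # essential operation group: every member must be performed
    r -= 100.0 * len(essential_group - set(performed))
    # essential performance sequence: required operations must be in order
    idx = [performed.index(op) for op in essential_sequence if op in performed]
    if len(idx) < len(essential_sequence) or idx != sorted(idx):
        r -= 100.0
    return r

print(reward(182.0, 40.0,
             performed=["open", "ablate", "connect", "suture"],
             essential_group={"ablate", "suture"},
             essential_sequence=["open", "ablate", "suture"]))  # -202.0
```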
  • the computer may calculate situation-based optimized cue sheet data, based on the patient's physical condition and the condition of the surgery target portion (e.g., a size and a location of tumor tissue), via the reinforcement learning.
  • the computer utilizes the patient condition, the surgery target portion condition, and the like along with the to-be-learned cue sheet data for learning.
  • the computer may perform virtual simulation surgery on its own.
  • the computer may create a surgical process according to the type of surgery and the type of the patient based on the disclosed surgical process optimizing method, and may perform the virtual surgery simulation based on the created surgical process.
  • the computer evaluates results of the virtual surgery simulation.
  • the computer may perform the reinforcement learning based on virtual surgical simulation information and evaluation information on the result thereof, thereby to obtain an optimized surgical process.
  • a model trained to create the optimal surgical process may not create an optimized surgical process for an individual patient and a type of surgery thereof, because in actual surgery, body structures and types of surgery of patients are different from each other.
  • the computer may create a surgical process based on the patient's body structure and the type of surgery thereof using the trained model, and may perform virtual surgery simulation, based on the created surgical process. In this way, the computer may create an optimized surgical process for an individual patient and a type of surgery thereof via the reinforcement learning.
  • the computer compares the actual surgical cue sheet data and the reference cue sheet data with each other and provides a feedback based on a comparison result (S 600 ).
  • the operation of comparing the actual surgical cue sheet data and the reference cue sheet data with each other may be performed by the computer or the controller 30 disposed in the operating room, or may be performed on the server 20 .
  • the server 20 acquires the reference cue sheet data, and compares the reference cue sheet data with the surgical image or the cue sheet data (code data) obtained from the controller 30 .
  • the computer acquires the cue sheet data from the surgical image, and compares the cue sheet data with the reference cue sheet data received from the server 20 .
  • the feedback may be provided through a website or an application.
  • the feedback may be provided through an application installed in a mobile terminal of the surgeon.
  • a notification related to the feedback may be provided to the mobile terminal of the surgeon.
  • FIG. 4 is a flowchart showing a method for obtaining a feedback according to one embodiment.
  • the computer may compare a type and an order of detailed surgical operations included in the actual surgical cue sheet data with a type and an order of detailed surgical operations included in the reference cue sheet data, and may provide a feedback on the surgical outcome based on the comparison result (S 620 ).
  • the computer may determine whether, compared to the detailed surgical operations included in the reference cue sheet data, a missing detailed surgical operation, an unnecessary detailed surgical operation, or an incorrect detailed surgical operation is present among the detailed surgical operations included in the actual surgical cue sheet data, and may provide a feedback on the surgical outcome based on the determination result.
  • a required detailed surgical operation included in the reference cue sheet data may be omitted from the actual surgical cue sheet data.
  • a detailed surgical operation included in the reference cue sheet data may be included in the actual surgical cue sheet data, but details thereof may be modified or incorrect.
  • the type of the detailed surgical operation in the reference cue sheet data and the type of the detailed surgical operation in the actual surgical cue sheet data are identical with each other, but details thereof may be different from each other.
  • the computer determines whether each detailed surgical operation has been performed correctly.
  • even when a specific detailed surgical operation is included in an entire surgical process, the specific detailed surgical operation may not have been performed normally.
  • for example, when the specific detailed surgical operation is an operation of catching a tissue at a specific location using a tongs-shaped surgical tool, the tool may catch the tissue at a position deeper than an optimal depth, or the tissue may be caught in only a small area of the tongs such that the tissue slips out of the tongs after being caught. Such a situation may be determined to be an incorrect detailed surgical operation.
  • the computer may recognize the tissue and the surgical tool in the actual surgery image via image recognition to accurately recognize the detailed surgical operation, and may compare the recognized detailed surgical operation with a correct detailed surgical operation in the optimized cue sheet data for evaluation and feedback.
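  • As an illustration of the incorrect-operation check, the actual tool trajectory for a detailed surgical operation can be scored against the reference trajectory; the mean point-to-point distance and the threshold below are assumptions, not the patent's stated method.

```python
# Sketch: flag a detailed surgical operation whose tool trajectory deviates
# too far from the reference cue sheet's trajectory for the same operation.
import math
from typing import List, Tuple

Point = Tuple[float, float, float]  # tool tip position (x, y, z)

def trajectory_deviation(actual: List[Point], reference: List[Point]) -> float:
    n = min(len(actual), len(reference))
    return sum(math.dist(actual[i], reference[i]) for i in range(n)) / n

actual = [(0, 0, 0), (1, 0.5, 0), (2, 0.9, 0)]
reference = [(0, 0, 0), (1, 0, 0), (2, 0, 0)]
if trajectory_deviation(actual, reference) > 0.25:  # threshold is illustrative
    print("flag as possibly incorrect detailed surgical operation")
```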
  • the computer may analyze the actual surgical cue sheet data itself, and provide information on a record of an abnormal situation or an unexpected situation based on the analysis result. For example, when bleeding occurs at a specific site, the computer may provide at least one of the fact that the bleeding has occurred, a location of the bleeding, or an amount of the bleeding.
  • the computer may analyze records of the detailed surgical operation included in the actual surgical cue sheet data, may determine a cause of the bleeding based on the analysis result and may provide the cause.
  • the computer analyzes a completed actual surgical cue sheet to search for a surgical error (S 640 ).
  • the computer may provide the feedback indicating the surgical error.
  • the computer may detect an event in which a foreign object remains in the patient's body, and may provide a feedback indicating the event.
  • the computer recognizes all objects included in the surgical image, as well as a location of the surgical tool, and the bleeding site, based on the obtained surgical image, and analyzes each of the objects.
  • the computer determines locations, numbers, and invasion times of the objects included in the surgery image. Therefore, the computer may generate a warning when it determines that a foreign substance introduced into the surgery target portion has not been removed by the time the surgery is completed, and may provide a feedback that asks the user to check the surgery target portion.
  • the computer may ask the user to check the surgery target portion even when the object introduced into the surgery target portion is not identified in the image. For example, when it is not confirmed that the object that has invaded the surgery target portion has been removed therefrom, the object may not appear in the surgery image but may remain at an invisible site. Thus, the computer may provide the feedback asking the user to check the surgery target portion even in this case.
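  • A sketch of the retained-foreign-object check: record every object observed entering the surgical field and, at the end of surgery, warn about any object never observed being removed. Object detection itself is assumed to exist and is not shown.

```python
# Sketch: warn about objects that entered the surgery target portion but
# were never seen being removed, even if they are no longer visible.
def retained_objects(events):
    """events: ordered (time_s, object_id, 'enter' | 'remove') tuples."""
    inside = {}
    for t, obj, kind in events:
        if kind == "enter":
            inside[obj] = t  # remember the invasion time
        elif kind == "remove":
            inside.pop(obj, None)
    return inside  # objects never observed being removed

events = [(10.0, "gauze_1", "enter"), (12.0, "clip_3", "enter"),
          (300.0, "gauze_1", "remove")]
for obj, t in retained_objects(events).items():
    print(f"warning: {obj} entered at {t}s and was not observed being "
          f"removed; please check the surgery target portion")
```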
  • the computer may detect a surgical operation event of a wrong patient, and may provide a feedback indicating the event.
  • the computer analyzes the surgery image in real time, and performs registration between organs in the 3D modeled image and actual organs.
  • the computer tracks locations of the camera and the surgical tool in real time, determines the surgery situation, and obtains information for the simulation to follow the actual surgical process.
  • when a mismatch is detected in this process (e.g., the registered organ model does not conform to the actual patient), the computer may ask the user to check this situation.
  • the computer may detect a surgical operation event of a wrong surgical target portion, and may provide a feedback indicating the event.
  • the computer determines a surgical situation, and provides a surgical guide image according to a rehearsal result or an optimal surgical scheme. In this process, when a process of the actual surgery is different from the rehearsal result or the optimal surgery scheme, the surgery may be performed on a wrong site or a different type of surgery may have been performed. Thus, the computer may ask the user to check this situation.
  • the computer may recognize a situation of nerve damage and may provide a feedback indicating this situation.
  • in this case, a warning may be provided to the user.
  • the computer visually presents blood vessels, nerves, and ganglions that are invisible, overlaid on the surgical image, using image registration and AR (augmented reality) or MR (mixed reality) technology.
  • the method may further include adding the actual surgical cue sheet data to the to-be-learned cue sheet data used for calculation of the optimized cue sheet data, and performing reinforcement learning (S 660 ).
  • the actual surgical cue sheet data may be improved compared to the optimized cue sheet data. Therefore, the actual surgical cue sheet data may be added to the to-be-learned cue sheet data and then the reinforcement learning thereof may be performed, thereby to obtain a model capable of creating improved optimized cue sheet data.
  • the computer tracks prognosis of the patient corresponding to each surgical record (S 680 ).
  • the computer may perform machine learning using each surgery record and the prognosis corresponding to each surgery record as to-be-learned data, thereby determining a combination of surgical operations resulting in each prognosis.
  • the computer may analyze the surgical operations of patients with specific side effects, and thus the computer may derive detailed surgical operations or combinations of detailed surgical operations that may cause even minor surgical side effects.
  • the computer may acquire information about detailed surgical operations that bring about each prognosis via the reinforcement learning. For example, the computer may perform the reinforcement learning based on operations performed in the surgical process, and learned data about prognosis occurring when the operations are included or are performed in a specific order. Based on the reinforcement learning result, the computer may determine what prognosis (i.e., what side effect) may occur due to a specific detailed surgical operation, continuous detailed surgical operations, or a combination of detailed surgical operations.
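  • As a simplified stand-in for the learning described above, the correlation between a detailed surgical operation and a prognosis can be screened with a lift-style statistic over surgical records; the statistic is an assumption for illustration, not the patent's specified method.

```python
# Sketch: side-effect rate of surgeries containing an operation code,
# relative to the overall rate (lift). Values above 1.0 suggest a
# correlation worth examining.
from collections import defaultdict

def operation_side_effect_lift(records):
    """records: list of (set_of_operation_codes, had_side_effect: bool)."""
    base = sum(effect for _, effect in records) / len(records)
    counts = defaultdict(lambda: [0, 0])  # code -> [with_effect, total]
    for ops, effect in records:
        for code in ops:
            counts[code][0] += effect
            counts[code][1] += 1
    return {code: (w / n) / base if base else 0.0
            for code, (w, n) in counts.items()}

records = [({"A", "B"}, True), ({"A"}, False), ({"B"}, True), ({"A"}, False)]
print(operation_side_effect_lift(records))  # B's lift stands out
```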
  • the computer may output and provide the feedback to the user in various ways.
  • the computer may extract detailed surgical operations having defects and provide the extracted detailed surgical operations as the feedback.
  • the computer may extract and reproduce only images of the detailed surgical operations having the defects, thereby to help the user to grasp the defects.
  • the computer may search for and provide detailed surgical operations included in the surgical process. For example, the surgery is usually performed for several hours or more, such that it is difficult for the user to check an entire surgery image after the surgery for feedback. Therefore, the computer may provide the cue sheet data.
  • the computer may extract only the selected detailed surgical operation and may provide the feedback related thereto. For example, the computer may extract and reproduce only images of the selected detailed surgical operations.
  • each detailed surgical operation has a standardized name and standardized code data. Therefore, the user may search for each detailed surgical operation based on the standardized name or the standardized code data. The user may look at the cue sheet data to easily check the progress of the surgery.
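  • A sketch of the search feature, assuming each cue sheet entry carries its standardized name and code; matching entries point the user to the relevant moments of a multi-hour recording.

```python
# Sketch: find detailed surgical operations by standardized name or code.
def search_operations(cue_sheet, query: str):
    q = query.lower()
    return [op for op in cue_sheet
            if q in op["name"].lower() or q in op["code"]]

cue_sheet = [
    {"code": "2-01-1-1", "name": "Grasp stomach wall", "start_s": 540.0},
    {"code": "2-02-2-3", "name": "Ablate lymph node", "start_s": 1980.0},
]
for op in search_operations(cue_sheet, "lymph"):
    print(op["code"], op["name"], "at", op["start_s"], "s")
```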
  • the method for providing the feedback on the surgical outcome may be implemented using a program (or application) to be executed in combination with a computer as hardware and stored in a medium.
  • the program may include codes coded in computer languages such as C, C++, JAVA, and machine language that a processor (CPU) of the computer may read through a device interface thereof, in order for the computer to read the program and execute methods implemented using the program.
  • the code may include a functional code related to a function defining functions required to execute the methods, and an execution procedure-related control code necessary for the processor of the computer to execute the functions in a predetermined procedure.
  • the code may further include a memory reference-related code indicating a location (address) of an internal memory of the computer or an external memory thereto in which additional information or media necessary for the processor to execute the functions is stored.
  • the code may further include a communication-related code indicating how to communicate with any other remote computer or server using a communication module of the computer, and indicating information or media to be transmitted and received during the communication.
  • the storage medium means a medium that stores data semi-permanently, rather than a medium for storing data for a short moment, such as a register, a cache, or a memory, and that may be readable by a machine.
  • examples of the storage medium may include, but may not be limited to, ROM, RAM, CD-ROM, a magnetic tape, a floppy disk, and an optical data storage device. That is, the program may be stored in various recording media on various servers to which the computer may access or on various recording media on the user's computer.
  • the medium may be distributed over a networked computer system so that a computer readable code may be stored in a distributed scheme.
  • the steps of the method or the algorithm described in connection with the embodiments of the inventive concept may be implemented directly in hardware, a software module executed by hardware, or a combination thereof.
  • the software modules may reside in random access memory (RAM), read only memory (ROM), erasable programmable ROM (EPROM), electrically erasable programmable ROM (EEPROM), a flash memory, a hard disk, a removable disk, CD-ROM, or any form of a computer readable recording medium well known in the art.
  • the feedback on the course and the outcome of the surgery may be provided to the user, based on the result of the comparison between the actual surgical process and the reference.
  • the feedback may be provided by extracting the necessary portion from the entire surgery image.
  • the user may check the necessary portion thereof.

Abstract

A method for providing a feedback on a surgical outcome by a computer includes dividing, by the computer, actual surgical data obtained in an actual surgical process into a plurality of detailed surgical operations to obtain actual surgical cue sheet data composed of the plurality of detailed surgical operations, obtaining, by the computer, reference cue sheet data about the actual surgery, and comparing, by the computer, the actual surgical cue sheet data with the reference cue sheet data, and providing, by the computer, the feedback based on the comparison result.

Description

    CROSS-REFERENCE TO RELATED APPLICATIONS
  • The present application is a Continuation of U.S. patent application Ser. No. 16/914,141 filed Jun. 26, 2020, which is a continuation of International Patent Application No. PCT/KR2018/010329, filed on Sep. 5, 2018, which is based upon and claims the benefit of priority to Korean Patent Application No. 10-2017-0182889, filed on Dec. 28, 2017. The disclosures of the above-listed applications are hereby incorporated by reference herein in their entirety.
  • BACKGROUND
  • Embodiments of the inventive concept described herein relate to a method and a program for providing a feedback on a surgical outcome.
  • There is a need to develop a scheme capable of providing information to assist a surgeon in a surgical process. In order to provide the information to assist with the surgery, detailed surgical operations of the surgery must be recognized.
  • Conventionally, in order to design a scenario for optimizing the surgical process, a previously captured medical image is referred to, or advice from a highly skilled surgeon is sought. However, it is difficult to identify unnecessary processes based only on the medical image, and the advice of the experienced surgeon is not customized to a specific patient.
  • Therefore, the medical image or the advice of the skilled surgeon may not be utilized as an auxiliary means for optimizing the surgical process for a specific surgery target patient.
  • Accordingly, there is a need to develop a method that minimizes unnecessary processes in performing the surgery using a 3D medical image (for example, virtual images of 3D surgical tool movements and of internal organ changes caused by the movement of the tool) to optimize the surgical process, and that provides surgery-assisting information based on the optimized surgical process.
  • Further, recently, deep learning has been widely used for analysis of medical images. Deep learning is defined as a set of machine-learning algorithms that attempt high-level abstractions (abstracting of key content or functions in a large amount of data or complex data) via a combination of several nonlinear transformation schemes. Deep learning may be largely considered as a field of machine learning that teaches a human mindset to a computer.
  • SUMMARY
  • Embodiments of the inventive concept provide a method and a program for providing a feedback on a surgical outcome.
  • Purposes that the inventive concept intends to achieve are not limited to those mentioned above. Other purposes not mentioned herein will be clearly understood by those skilled in the art from the following descriptions.
  • According to an exemplary embodiment, a method for providing a feedback on a surgical outcome by a computer includes dividing, by the computer, actual surgical data obtained in an actual surgical process into a plurality of detailed surgical operations to obtain actual surgical cue sheet data composed of the plurality of detailed surgical operations, obtaining, by the computer, reference cue sheet data about the actual surgery, comparing, by the computer, the actual surgical cue sheet data with the reference cue sheet data, and providing, by the computer, the feedback based on the comparison result.
  • Further, the actual surgical data may be divided into the plurality of detailed surgical operations, based on at least one of a surgery target portion, a type of surgical tool, a number of surgical tools, a position of the surgical tool, an orientation of the surgical tool, and movement of the surgical tool included in the actual surgical data.
  • Further, at least one of a standardized name or standardized code data may be assigned to each of the plurality of detailed surgical operations.
  • Further, the providing of the feedback may include obtaining search information used for searching for at least one of the plurality of detailed surgical operations, extracting at least one detailed surgical operation corresponding to the search information based on the standardized name or the standardized code data, and providing a feedback on the extracted at least one detailed surgical operation.
  • Further, the reference cue sheet data may include optimized cue sheet data about the actual surgery, or referenced virtual surgical cue sheet data.
  • Further, the providing of the feedback may include comparing the plurality of detailed surgical operations included in the actual surgical cue sheet data with a plurality of detailed surgical operations included in the reference cue sheet data, and determining, based on the comparison result, whether an unnecessary detailed surgical operation, a missing detailed surgical operation, or an incorrect detailed surgical operation is present in the actual surgical cue sheet data.
  • Further, the determining of whether the incorrect detailed surgical operation is present in the actual surgical cue sheet data may include comparing movement of a surgical tool corresponding to a detailed surgical operation included in the reference cue sheet data with movement of a surgical tool corresponding to a detailed surgical operation included in the actual surgical cue sheet data, and determining, based on the comparison result, whether the detailed surgical operation included in the actual surgical cue sheet data is incorrect.
  • Further, the method may further include adding the actual surgical cue sheet data to to-be-learned cue sheet data, obtaining a model for obtaining optimized cue sheet data using the to-be-learned cue sheet data having the actual surgical cue sheet data added thereto, and performing reinforcement learning on the obtained model.
  • Further, the method may further include detecting at least one surgical error situation from the obtained surgical information, and providing a feedback on the detected surgical error situation.
  • Further, the method may further include obtaining information about prognosis corresponding to each of one or more actual surgical cue sheet data including the actual surgical cue sheet data, performing reinforcement learning based on the one or more actual surgical cue sheet data and the prognosis information thereof, and determining a correlation between at least one detailed surgical operation included in the one or more actual surgical cue sheet data and the prognosis, based on the reinforcement learning result.
  • According to an exemplary embodiment, a computer program is stored in a computer-readable storage medium, wherein the computer program is configured to perform the method as defined above in combination with a computer as hardware.
  • BRIEF DESCRIPTION OF THE FIGURES
  • The above and other objects and features will become apparent from the following description with reference to the following figures, wherein like reference numerals refer to like parts throughout the various figures unless otherwise specified, and wherein:
  • FIG. 1 is a view showing a robot-based surgery system according to the disclosed embodiment;
  • FIG. 2 is a flowchart showing a method for providing a feedback on a surgical outcome according to one embodiment;
  • FIG. 3 is a flowchart showing a method for calculating an optimized cue sheet data according to one embodiment; and
  • FIG. 4 is a flowchart showing a method for obtaining a feedback according to one embodiment.
  • DETAILED DESCRIPTION
  • Advantages and features of the inventive concept, and methods of achieving them, will become apparent with reference to embodiments described below in detail in conjunction with the accompanying drawings. However, the inventive concept is not limited to the embodiments disclosed below, but may be implemented in various forms. The present embodiments are provided merely to complete the disclosure of the inventive concept, and to fully inform those skilled in the art of the scope of the inventive concept. The inventive concept is only defined by the scope of the claims.
  • The terminology used herein is for the purpose of describing the embodiments only and is not intended to limit the inventive concept. As used herein, the singular forms “a” and “an” are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will be further understood that the terms “comprises”, “comprising”, “includes”, and “including” when used in this specification, specify the presence of the stated features, integers, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, operations, elements, components, and/or portions thereof. Like reference numerals refer to like elements throughout the disclosure. As used herein, the term “and/or” includes any and all combinations of one or more of the associated listed items. Although terms “first”, “second”, etc. are used to describe various components, it goes without saying that the components are not limited by these terms. These terms are only used to distinguish one component from another component. Therefore, it goes without saying that a first component as mentioned below may be a second component within a technical idea of the inventive concept.
  • Unless otherwise defined, all terms including technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this inventive concept belongs. It will be further understood that terms, such as those defined in commonly used dictionaries, should be interpreted as having a meaning that is consistent with their meaning in the context of the relevant art and will not be interpreted in an idealized or overly formal sense unless expressly so defined herein.
  • A term “unit” or “module” used herein means software or a hardware component such as FPGA, or ASIC. The “unit” or “module” performs a predetermined role. However, the “unit” or “module” is not meant to be limited to the software or the hardware. The “unit” or “module” may be configured to reside in an addressable storage medium and may be configured to execute one or more processors. Thus, in an example, the “unit” or “module” includes components such as software components, object-oriented software components, class components and task components, processes, functions, attributes, procedures, subroutines, segments of a program code, drivers, firmware, a microcode, circuitry, data, database, data structures, tables, arrays and variables. Functionality provided within the components and the “units” or “modules” may be combined into a smaller number of components and “units” or “modules” or may be further divided into additional components and “units” or “modules”.
  • As used herein, a term “image” refers to multi-dimensional data composed of discrete image elements (e.g., pixels in a 2D image and voxels in a 3D image). For example, the image may include a medical image of an object obtained by a CT imaging apparatus.
  • As used herein, the term “object” may be a person or an animal, or a portion or an entirety of a person or animal. For example, the object may include at least one of organs such as a liver, a heart, a uterus, a brain, a breast, and an abdomen, and a blood vessel.
  • As used herein, a term “user” may be a medical expert such as a surgeon, a nurse, a clinical pathologist, or a medical image expert, or may be a technician repairing a medical apparatus. However, the present disclosure is not limited thereto.
  • As used herein, a term “medical image data” refers to a medical image that is captured by a medical imaging apparatus, and includes all medical images that may represent a body of an object in a 3D modelling manner. The “medical image data” may include a computed tomography (CT) image, a magnetic resonance imaging (MRI) image, a positron emission tomography (PET) image, etc.
  • As used herein, a term “virtual body model” refers to a model created to conform to an actual patient's body based on the medical image data. The “virtual body model” may be created by modeling the medical image data in three dimensions or by correcting the modeled data to be adapted to an actual surgery situation.
  • As used herein, a term “virtual surgical data” means data including a detailed surgical operation of a rehearsal or simulation performed on the virtual body model. The “virtual surgical data” may be image data of a rehearsal or simulation performed on the virtual body model in a virtual space, or may be data in which a surgical operation performed on the virtual body model is recorded.
  • As used herein, a term “actual surgical data” refers to data obtained as a medical staff performs actual surgery. The “actual surgical data” may be image data obtained by imaging a surgical target portion in an actual surgical process, or may be data in which a surgical operation performed in the actual surgical process is recorded.
  • As used herein, a term “detailed surgical operation” means a minimum unit of a surgical operation as divided according to specific criteria.
  • As used herein, a term “cue sheet data” refers to data in which detailed surgical operations into which a specific surgical process is divided are recorded in order.
  • As used herein, a term “to-be-executed cue sheet data” refers to cue sheet data obtained based on the virtual surgical data obtained via simulation by the user.
  • As used herein, a term “training virtual surgical cue sheet data” is included in the to-be-executed cue sheet data, and means cue sheet data created based on the virtual surgical data obtained via surgery simulation by the user.
  • As used herein, a term “referenced virtual surgical cue sheet data” refers to cue sheet data about virtual surgery performed by a specific medical person for construction of to-be-learned big data or surgery process guidance.
  • As used herein, a term “optimized cue sheet data” means cue sheet data about an optimized surgical process in terms of a surgery time or prognosis.
  • As used herein, a term “to-be-learned cue sheet data” means cue sheet data used for learning for optimized cue sheet data calculation.
  • As used herein, a term “surgery guide data” means data used as guide information during actual surgery.
  • As used herein, a term “computer” includes all of various devices capable of performing computation and providing the computation result to the user. For example, the computer may include not only a desktop computer and a notebook, but also a smart phone, a tablet PC, a cellular phone, a PCS (Personal Communication Service) phone, a mobile terminal of synchronous/asynchronous IMT-2000 (International Mobile Telecommunication-2000), a palm personal computer (PC), and a personal digital assistant (PDA). Further, when a head mounted display (HMD) apparatus includes a computing function, the HMD apparatus may be a computer. Further, the computer may be a server that receives a request from a client and performs information processing.
  • Hereinafter, embodiments of the inventive concept will be described in detail with reference to the accompanying drawings.
  • FIG. 1 is a view showing a robot-based surgery system according to a disclosed embodiment.
  • Referring to FIG. 1, a schematic diagram of the system capable of performing robot-based surgery according to the disclosed embodiment is illustrated.
  • According to FIG. 1, the robot-based surgery system includes a medical imaging apparatus 10, a server 20, a controller 30, an imaging unit 36, a display 32, and a surgery robot 34 provided in an operating room. According to an embodiment, the medical imaging apparatus 10 may be omitted from the robot-based surgery system according to the disclosed embodiment.
  • In one embodiment, the robot-based surgery may be performed by the user controlling the surgery robot 34 using the controller 30. In one embodiment, the robot-based surgery may be performed automatically by the controller 30 without the user control.
  • The server 20 is a computing device including at least one processor and a communication unit.
  • The controller 30 includes a computing device including at least one processor and a communication unit. In one embodiment, the controller 30 includes hardware and software interfaces for controlling the surgery robot 34.
  • The imaging unit 36 includes at least one image sensor. That is, the imaging unit 36 includes at least one camera to image a surgery target portion. In one embodiment, the imaging unit 36 is used in conjunction with the surgery robot 34. For example, the imaging unit 36 may include at least one camera coupled with a surgery arm of the surgery robot 34.
  • In one embodiment, the image captured by the imaging unit 36 is displayed on the display 32.
  • The controller 30 receives information necessary for surgery from the server 20 or creates information necessary for surgery and provides the information to the user. For example, the controller 30 displays the created or received information necessary for the surgery on the display 32.
  • For example, the user controls movement of the surgery robot 34 by manipulating the controller 30 while looking at the display 32 to perform robot-based surgery.
  • The server 20 creates information necessary for robot-based surgery using medical image data of the object (patient) previously imaged by the medical imaging apparatus 10 and provides the created information to the controller 30.
  • The controller 30 may display the information received from the server 20 on the display 32 to present the information to the user, or may use the information received from the server 20 to control the surgery robot 34.
  • In one embodiment, imaging means that may be used in the medical imaging apparatus 10 is not particularly limited. For example, various medical imaging means such as CT, X-Ray, PET, and MRI may be used.
  • Hereinafter, a method for providing a feedback on a surgical outcome will be described in detail with reference to the drawings.
  • FIG. 2 is a flowchart showing a method for providing a feedback on a surgical outcome according to one embodiment.
  • Steps shown in FIG. 2 may be performed in time series by the server 20 or the controller 30 shown in FIG. 1 . Hereinafter, for convenience of description, each step is described as being performed by a computer, but a subject to perform each step is not limited to a specific device. All or some of the steps may be performed by the server 20 or the controller 30.
  • Referring to FIG. 2, a method for providing a feedback on a surgical outcome according to an embodiment of the present disclosure includes dividing, by the computer, actual surgical data obtained from an actual surgical process into a plurality of detailed surgical operations to obtain actual surgical cue sheet data composed of the plurality of detailed surgical operations (S200); obtaining, by the computer, reference cue sheet data about the actual surgery (S400); and comparing, by the computer, the actual surgical cue sheet data and the reference cue sheet data to provide a feedback based on the comparison result (S600). Hereinafter, each step will be described in detail.
  • The computer may divide the actual surgical data obtained from the actual surgical process into the plurality of detailed surgical operations to obtain the actual surgical cue sheet data composed of the plurality of detailed surgical operations (S200). The computer creates the actual surgical cue sheet data based on the surgical image imaged in the surgical process by the surgery robot or based on data obtained in a control process of the surgery robot.
  • The detailed surgical operation constituting the cue sheet data refers to a minimum operation unit constituting the surgical process. The actual surgical data may be divided into the detailed surgical operations based on several criteria. For example, the actual surgical data may be divided into the detailed surgical operations based on surgery types (e.g., laparoscopic surgery, robot-based surgery), anatomical body parts where surgery is performed, surgical tools used, a number of surgical tools, an orientation or a position of the surgical tool displayed on a screen, movement of the surgical tool (for example, forward/rearward movement), etc.
  • The division criteria and detailed categories included within the division criteria may be directly set by the medical staff learning the actual surgical data. The computer may perform supervised learning based on the division criteria and the detailed categories set by the medical staff to divide the actual surgical data into the detailed surgical operations as a minimum operation unit.
  • Further, the division criteria and detailed categories included within the division criteria may be extracted via learning of a surgical image by the computer. For example, the computer may calculate the division criteria and detailed categories included within the division criteria via deep learning (i.e., unsupervised learning) of actual surgical data accumulated as big data. Subsequently, the computer may divide the actual surgical data based on the division criteria created via the learning of the actual surgical data to create the cue sheet data.
  • Further, in another embodiment, the actual surgical data may be divided based on a result of determining whether the actual surgical data satisfies the division criteria via image recognition. That is, the computer may recognize, within the image of the actual surgical data, the anatomical organ position on the screen and the number and positions of the surgical tools appearing on the screen as the division criteria, and may divide the actual surgical data into the detailed surgical operation units based on the recognized division criteria.
  • Further, in another embodiment, the computer may perform the division process for cue sheet data creation based on surgical tool movement data included in the actual surgical data. The actual surgical data may include various information input in a process of controlling the surgery robot, such as the type and the number of surgical tools selected by the user and information about the movement of each surgical tool while the user performs robot-based surgery. Accordingly, the computer may perform the division based on the information included in the actual surgical data at each time point thereof.
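  • As a minimal illustration of this division step, the following Python sketch segments a stream of time-ordered surgical records into detailed surgical operations whenever the surgical tool or the surgery target portion changes between consecutive records. The field names (tool_type, target_portion) are hypothetical, and a real system may instead apply division criteria learned as described above.

    # A hedged sketch of dividing actual surgical data into detailed
    # surgical operations; field names are illustrative assumptions.
    def divide_into_detailed_operations(records):
        """Start a new detailed operation whenever the tool or target changes."""
        operations, current = [], []
        for record in records:
            if current and (record["tool_type"] != current[-1]["tool_type"]
                            or record["target_portion"] != current[-1]["target_portion"]):
                operations.append(current)
                current = []
            current.append(record)
        if current:
            operations.append(current)
        return operations

    # Example: a tool change splits three records into two detailed operations.
    records = [
        {"t": 0.0, "tool_type": "forceps", "target_portion": "stomach"},
        {"t": 0.5, "tool_type": "forceps", "target_portion": "stomach"},
        {"t": 1.0, "tool_type": "scalpel", "target_portion": "stomach"},
    ]
    print(len(divide_into_detailed_operations(records)))  # prints 2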
  • Further, in one embodiment, the actual surgical data includes various types of detailed surgical operations, such as ablation and suture, and the division is performed based on the division criteria. Specifically, a process of dividing the actual surgical data (for example, an actual surgery image) of an actual gastric cancer surgery into the detailed surgical operations to create the cue sheet data is as follows.
  • For example, gastric cancer surgery includes a detailed surgical operation to ablate a portion or an entirety of a stomach containing a tumor, and a detailed surgical operation to ablate a lymph node. In addition, various resection and connection methods are used depending on a state of the gastric cancer. In addition, each detailed surgical operation may be divided into a plurality of finer detailed surgical operations based on a specific location where the detailed surgical operation is performed and a direction of movement of the surgical tool.
  • For example, the detailed operation of the gastric cancer surgery may be divided into an opening step, an ablation step, a connection step, and a suture step.
  • Further, methods of changing a disconnected state of an organ to a connected state include an in vitro anastomosis method in which at least 4 to 5 cm of an end of the anticardium is incised and connected, and an in vivo anastomosis method in which about 3 cm of the umbilicus is incised and the incision and anastomosis occur in an abdominal cavity. The above-described connection step may be divided into detailed sub-steps according to the specific connection method as described above.
  • Furthermore, each surgery operation may be divided into a plurality of detailed surgical operations according to the position and the movement of the surgical tool.
  • Each of the divided detailed surgical operations has a standardized name allocated thereto based on a location where the detailed surgical operation is performed and a pathway of the surgical tool.
  • In one embodiment, the standardized name may be variously defined. For example, when a specific portion, such as the lower right portion of the stomach, is dealt with, the name of the portion may be a name commonly used in the medical field, or a more comprehensive or more detailed name defined in the system according to the disclosed embodiment may be used.
  • Therefore, the surgery image about the actual surgery by the user may be organized into information in a form of a cue sheet in which a plurality of detailed surgical operations are sequentially arranged based on the standardized names.
  • Further, in one embodiment, the cue sheet data may be created as code data of specific digits based on the division criteria for dividing the surgical data into the detailed surgical operations. That is, the computer divides the actual surgical data into standardized detailed surgical operations by applying standardized division criteria and designating detailed categories within the division criteria. The computer may allocate a standardized code value to each detailed category and allocate standardized code data to distinguish each detailed surgical operation.
  • The computer allocates, to each detailed surgical operation, digitized code data obtained by allocating numbers or letters, in order from a higher category to a lower category to which the specific detailed surgical operation belongs, according to the order of application of the division criteria. Thus, the computer may create cue sheet data in a form in which the standardized code data of the detailed surgical operations, rather than the images of the divided detailed surgical operations, are listed. Further, the user may share or deliver the actual surgical process by providing only the cue sheet data composed of the standardized code data.
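  • For illustration only, the sketch below composes such digitized code data by concatenating one code per category, from the highest division criterion to the lowest. The category tables and digit widths are hypothetical; the embodiment specifies only the higher-to-lower concatenation order.

    # A hedged sketch of allocating standardized code data to a detailed
    # surgical operation; the category tables are illustrative assumptions.
    SURGERY_TYPE = {"laparoscopic": "1", "robotic": "2"}
    TARGET_PORTION = {"stomach": "01", "lymph_node": "02"}
    OPERATION = {"ablation": "1", "suture": "2", "connection": "3"}

    def standardized_code(surgery_type, target_portion, operation):
        """Concatenate category codes from the higher to the lower category."""
        return (SURGERY_TYPE[surgery_type]
                + TARGET_PORTION[target_portion]
                + OPERATION[operation])

    # A robotic ablation of the stomach is encoded as "2011".
    print(standardized_code("robotic", "stomach", "ablation"))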
  • In one embodiment, when the computer is a client terminal disposed in the operating room or corresponds to the controller 30, the computer may acquire the standardized code data from the surgery image, and transmit the obtained code data to the server 20 such that the actual surgical process is shared or delivered.
  • In one embodiment, when the computer corresponds to the server 20, the surgical image is transmitted to the server 20. Then, the server 20 may create the cue sheet data and the code data.
  • Further, in one embodiment, the computer may allocate a standardized name to a standardized code of each detailed surgical operation. Thus, the user may select and identify only a desired detailed surgical operation within the entire cue sheet. Further, in this case, the user may easily grasp a progress of the surgery or the rehearsal by simply viewing the cue sheet in which the detailed surgical operations are sequentially arranged based on the standardized names thereof, without viewing an entirety of the surgery image.
  • The cue sheet data may be converted into a surgical image using an image database for each detailed surgical operation. In the image database, an image matching each code data may be stored, and a plurality of images matching each code data may be stored therein depending on a situation. For example, specific code data may be matched with different detailed surgical operation images in the image database according to previously performed operations.
  • Further, as each cue sheet data is matched with a specific virtual body model and the matching result is stored, the computer may reproduce the cue sheet data as a surgical simulation image by sequentially applying the detailed surgical operations included in the cue sheet data to the virtual body model.
  • Therefore, the image corresponding to the cue sheet data may be reproduced from the same point of view as that of the surgery image. Alternatively, the image corresponding to the cue sheet data may be reconstructed from a point of view different from that of the surgery image and the reconstructed image may be reproduced. Alternatively, the image may be modeled in a 3D manner, and thus the viewpoint and the position thereof may be adjusted according to the user's manipulation.
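  • A minimal sketch of this conversion follows, under the assumption of a simple image database keyed by (code, previous code) pairs: when a variant conditioned on the previously performed operation exists it is chosen, and otherwise a default clip is used. The database contents and file names are hypothetical.

    # A hedged sketch of converting cue sheet code data into a surgery
    # image sequence via an image database; keys and clips are assumptions.
    IMAGE_DB = {
        ("2011", None):   "clip_ablation_default.mp4",
        ("2011", "2010"): "clip_ablation_after_opening.mp4",
        ("2031", None):   "clip_suture_default.mp4",
    }

    def render_cue_sheet(codes):
        """Pick, per code, the clip keyed by the preceding operation if present."""
        clips, previous = [], None
        for code in codes:
            clip = IMAGE_DB.get((code, previous)) or IMAGE_DB.get((code, None))
            clips.append(clip)
            previous = code
        return clips

    print(render_cue_sheet(["2011", "2031"]))
    # ['clip_ablation_default.mp4', 'clip_suture_default.mp4']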
  • The computer acquires the reference cue sheet data about the actual surgery (S400).
  • In one embodiment, the reference cue sheet data means optimized cue sheet data created by the computer.
  • In another embodiment, the reference cue sheet data refers to referenced virtual surgical cue sheet data.
  • FIG. 3 is a flowchart showing a method for calculating the optimized cue sheet data according to one embodiment.
  • The computer acquires one or more to-be-learned cue sheet data (S420). The to-be-learned cue sheet data refers to learning target data that is to be learned for calculation of the optimized cue sheet data. The to-be-learned cue sheet data may include cue sheet data created based on the actual surgical data (i.e., actual surgical cue sheet data) or cue sheet data created based on virtual surgical data simulated for reference (i.e., referenced virtual surgical cue sheet data). The actual surgical cue sheet data is created by the computer dividing the actual surgical data according to the division criteria. The referenced virtual surgical cue sheet data is not obtained in the user's surgery simulation process, but is created by performing simulation for the purpose of constructing the learning target data or providing it to practitioners for reference.
  • Thereafter, the computer performs reinforcement learning using the to-be-learned cue sheet data (S440). The reinforcement learning refers to one area of machine learning, in which an agent defined in a certain environment recognizes a current state and selects a detailed surgical operation, or a sequence of detailed surgical operations, that maximizes a reward among the selectable detailed surgical operations or sequences thereof. The reinforcement learning may be summarized as learning a scheme of maximizing the reward based on state transitions and the rewards given according to the state transitions.
  • Then, the computer calculates the optimized cue sheet data using the reinforcement learning result (S460). The optimized cue sheet data is calculated, based on the reinforcement learning result, in consideration of the shortest surgical time that may reduce the patient's anesthesia time, a minimum bleeding amount, an essential operation group, an essential performance sequence, and the like.
  • The essential operation group refers to a group of detailed surgical operations that must be performed together to perform a specific detailed surgical operation. The essential performance sequence refers to an order in which operations must be sequentially performed in a course of performing a specific surgery. For example, the surgical operations that must be performed sequentially, and the order thereof, may be determined according to the type of the surgery or the type of the surgical operation.
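  • One hypothetical way to encode these criteria for the reinforcement learning is sketched below: a candidate cue sheet receives an invalid (negative-infinite) reward when it violates the essential operation group or the essential performance sequence, and otherwise a reward that improves as the surgical time and the bleeding amount decrease. The weights are assumptions, not values taken from the embodiment.

    # A hedged sketch of a reward for learning optimized cue sheet data;
    # the weights and penalty scheme are illustrative assumptions.
    def cue_sheet_reward(ops, total_time_min, bleeding_ml,
                         essential_group, essential_sequence):
        # Every operation in the essential operation group must appear.
        if not essential_group.issubset(set(ops)):
            return float("-inf")
        # Essential operations must appear in the required order.
        positions = [ops.index(op) for op in essential_sequence if op in ops]
        if len(positions) != len(essential_sequence) or positions != sorted(positions):
            return float("-inf")
        # Shorter surgery (less anesthesia) and less bleeding score higher.
        return -(1.0 * total_time_min + 0.5 * bleeding_ml)

    ops = ["opening", "ablation", "connection", "suture"]
    print(cue_sheet_reward(ops, total_time_min=120.0, bleeding_ml=40.0,
                           essential_group={"ablation", "suture"},
                           essential_sequence=["opening", "ablation", "suture"]))
    # prints -140.0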
  • Further, the computer may calculate situation-based optimized cue sheet data based on the patient's physical condition and the condition of the surgery target portion (e.g., a size and a location of a tumor in the case of tumor tissue) via the reinforcement learning. To this end, the computer utilizes the patient condition, the surgery target portion condition, and the like, along with the to-be-learned cue sheet data, for learning.
  • In one embodiment, the computer may perform virtual simulation surgery on its own. For example, the computer may create a surgical process according to the type of surgery and the type of the patient based on the disclosed surgical process optimizing method, and may perform the virtual surgery simulation based on the created surgical process.
  • The computer evaluates results of the virtual surgery simulation. The computer may perform the reinforcement learning based on virtual surgical simulation information and evaluation information on the result thereof, thereby obtaining an optimized surgical process.
  • A model trained to create the optimal surgical process may not create an optimized surgical process for an individual patient and a type of surgery thereof, because in actual surgery, the body structures of patients and the types of surgery differ from one another.
  • Therefore, the computer may create a surgical process based on the patient's body structure and the type of surgery thereof using the trained model, and may perform virtual surgery simulation, based on the created surgical process. In this way, the computer may create an optimized surgical process for an individual patient and a type of surgery thereof via the reinforcement learning.
  • The computer compares the actual surgical cue sheet data and the reference cue sheet data with each other and provides a feedback based on a comparison result (S600).
  • In one embodiment, the operation of comparing the actual surgical cue sheet data and the reference cue sheet data with each other may be performed by the computer or the controller 30 disposed in the operating room, or may be performed on the server 20.
  • When the comparison is performed by the server 20, the server 20 acquires the reference cue sheet data, and compares the reference cue sheet data with the surgical image or the cue sheet data (code data) obtained from the controller 30.
  • When the comparison is performed by the computer or the controller 30 placed in the operating room, the computer acquires the cue sheet data from the surgical image, and compares the cue sheet data with the reference cue sheet data received from the server 20.
  • In one embodiment, the feedback may be provided through a website or an application. For example, the feedback may be provided through an application installed in a mobile terminal of the surgeon. When the surgery is finished, a notification related to the feedback may be provided to the mobile terminal of the surgeon.
  • FIG. 4 is a flowchart showing a method for obtaining a feedback according to one embodiment.
  • In one embodiment, the computer may compare a type and an order of detailed surgical operations included in the actual surgical cue sheet data with a type and an order of detailed surgical operations included in the reference cue sheet data, and may provide a feedback on the surgical outcome based on the comparison result (S620).
  • For example, the computer may determine whether a missing detailed surgical operation, an unnecessary detailed surgical operation, or an incorrect detailed surgical operation is present among the detailed surgical operations included in the actual surgical cue sheet data, compared to the detailed surgical operations included in the reference cue sheet data, and may provide a feedback on the surgical outcome based on the determination result.
  • For example, a required detailed surgical operation included in the reference cue sheet data may be omitted from the actual surgical cue sheet data.
  • Further, an unnecessary detailed surgical operation not included in the reference cue sheet data may have been included in the actual surgical cue sheet data.
  • Further, a detailed surgical operation included in the reference cue sheet data may be included in the actual surgical cue sheet data, but details thereof may be modified or incorrect.
  • In this case, the type of the detailed surgical operation in the reference cue sheet data and the type of the detailed surgical operation in the actual surgical cue sheet data are identical with each other, but details thereof may be different from each other.
  • Thus, the computer determines whether each detailed surgical operation has been performed correctly. Although a specific detailed surgical operation is included in an entire surgical process, the specific detailed surgical operation may not be performed normally. For example, when the specific detailed surgical operation is an operation of catching a tissue at a specific location using a tongs-shaped surgical tool, the tool may catch the tissue at a position deeper than an optimal depth, or the tissue may be caught in a small area of the tongs such that the tissue slips out of the tongs after being caught. Such a situation may be determined to be an incorrect detailed surgical operation. The computer may recognize the tissue and the surgical tool in the actual surgery image via image recognition to accurately recognize the detailed surgical operation, and may compare the recognized detailed surgical operation with the correct detailed surgical operation in the optimized cue sheet data for evaluation and feedback.
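  • As a minimal sketch of the comparison itself, the code below aligns two lists of standardized code data with Python's difflib: codes present only in the reference correspond to missing operations, codes present only in the actual cue sheet to unnecessary operations, and replaced spans to candidates for the incorrect-operation check described above. The code values are hypothetical.

    # A hedged sketch of comparing actual and reference cue sheet data;
    # a real comparison would also inspect tool movement within matches.
    from difflib import SequenceMatcher

    def compare_cue_sheets(reference, actual):
        missing, unnecessary, candidates = [], [], []
        matcher = SequenceMatcher(a=reference, b=actual, autojunk=False)
        for tag, i1, i2, j1, j2 in matcher.get_opcodes():
            if tag == "delete":      # only in reference: omitted in surgery
                missing.extend(reference[i1:i2])
            elif tag == "insert":    # only in actual: unnecessary operation
                unnecessary.extend(actual[j1:j2])
            elif tag == "replace":   # differs: possibly incorrect operation
                candidates.extend(zip(reference[i1:i2], actual[j1:j2]))
        return missing, unnecessary, candidates

    reference = ["2011", "2012", "2023", "2031"]
    actual = ["2011", "2023", "2024", "2031"]
    print(compare_cue_sheets(reference, actual))
    # (['2012'], ['2024'], [])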
  • Further, the computer may analyze the actual surgical cue sheet data itself, and provide information on a record of an abnormal situation or an unexpected situation based on the analysis result. For example, when a bleeding occurs in a specific site, the computer may provide at least one of the fact that the bleeding has occurred, a location of the bleeding, or an amount of the bleeding.
  • Further, the computer may analyze records of the detailed surgical operation included in the actual surgical cue sheet data, may determine a cause of the bleeding based on the analysis result and may provide the cause.
  • Specifically, the computer analyzes a completed actual surgical cue sheet to search for a surgical error (S640). When a surgical error situation is detected according to a preset rule, the computer may provide the feedback indicating the surgical error.
  • For example, a general surgical error situation and a method of providing the feedback accordingly are as described below.
  • In an example, when a foreign object remains in the patient's body, the computer may detect the event and may provide a feedback indicating the event. According to a disclosed embodiment, based on the obtained surgical image, the computer recognizes all objects included in the surgical image, as well as the location of the surgical tool and the bleeding site, and analyzes each of the objects. The computer determines the locations, numbers, and invasion times of the objects included in the surgery image. Therefore, the computer may generate a warning when it is determined that a foreign substance introduced into the surgery target portion has not been removed by the time the surgery is completed, and may provide a feedback asking the user to check the surgery target portion. In one embodiment, the computer may ask the user to check the surgery target portion even when the object introduced into the surgery target portion is not identified in the image: when it is not confirmed that the object has been removed from the surgery target portion, the object may no longer appear in the surgery image but may remain at an invisible site.
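  • A hedged sketch of the foreign-object check follows, under the assumption that the image-recognition stage emits per-object invasion and removal events; any object whose invasion is not matched by a removal at the end of the surgery triggers the warning. The event structure is an assumption.

    # A hedged sketch of detecting a retained foreign object from
    # recognized events; the event fields are illustrative assumptions.
    def retained_objects(events):
        """Return ids of objects that entered the surgical site but never left."""
        inside = set()
        for event in events:
            if event["kind"] == "invasion":
                inside.add(event["object_id"])
            elif event["kind"] == "removal":
                inside.discard(event["object_id"])
        return inside

    events = [
        {"kind": "invasion", "object_id": "gauze-1", "t": 10.2},
        {"kind": "invasion", "object_id": "gauze-2", "t": 15.7},
        {"kind": "removal",  "object_id": "gauze-1", "t": 80.3},
    ]
    leftover = retained_objects(events)
    if leftover:
        print("Warning: check the surgery target portion for", sorted(leftover))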
  • In one example, the computer may detect a surgical operation event on a wrong patient, and may provide a feedback indicating the event. According to a disclosed embodiment, the computer analyzes the surgery image in real time, and performs registration between organs in the 3D modeled image and the actual organs. The computer tracks the locations of the camera and the surgical tool in real time, determines the surgery situation, and obtains information that a simulator follows in the actual surgical process. In one embodiment, when the actual organ and the organ in the 3D modeled image do not match each other in the process of performing the registration between the patient's actual organ and the organ in the 3D modeled image, the surgery may be performed on a wrong patient. Thus, the computer may ask the user to check this situation.
  • In an example, the computer may detect a surgical operation event of a wrong surgical target portion, and may provide a feedback indicating the event. According to a disclosed embodiment, the computer determines a surgical situation, and provides a surgical guide image according to a rehearsal result or an optimal surgical scheme. In this process, when a process of the actual surgery is different from the rehearsal result or the optimal surgery scheme, the surgery may be performed on a wrong site or a different type of surgery may have been performed. Thus, the computer may ask the user to check this situation.
  • In an example, the computer may recognize a situation of nerve damage and may provide a feedback indicating this situation. In a disclosed embodiment, when the actual surgery is different from the rehearsal process, as described above, the computer may provide a feedback indicating this situation. When an important nerve or ganglion is cut depending on a positional relationship between the patient's organ and the surgical tool, or when a risk that the surgical tool approaches the nerve is predicted, a warning may be provided to the user. Further, the computer visually presents blood vessels, nerves, and ganglia that are invisible, in an overlapping manner with the surgical image, using image registration and AR (augmented reality) or MR (mixed reality) technology. Thus, even after the surgery, the user may see an important body portion well, which helps in reviewing the surgical process.
  • Further, in another embodiment, the method may further include adding the actual surgical cue sheet data to the to-be-learned cue sheet data used for calculation of the optimized cue sheet data, and performing reinforcement learning (S660).
  • In one embodiment, the actual surgical cue sheet data may be improved compared to the optimized cue sheet data. Therefore, the actual surgical cue sheet data may be added to the to-be-learned cue sheet data and then the reinforcement learning thereof may be performed, thereby obtaining a model capable of creating improved optimized cue sheet data.
  • Further, in another embodiment, the computer tracks prognosis of the patient corresponding to each surgical record (S680). The computer may perform machine learning using each surgery record and the prognosis corresponding to each surgery record as to-be-learned data, thereby determining a combination of surgical operations resulting in each prognosis.
  • For example, the computer may analyze the surgical operations of patients with specific side effects, and thus the computer may derive detailed surgical operations or combinations of detailed surgical operations that may cause even minor surgical side effects.
  • In one embodiment, the computer may acquire information about detailed surgical operations that bring about each prognosis via the reinforcement learning. For example, the computer may perform the reinforcement learning based on operations performed in the surgical process, and learned data about prognosis occurring when the operations are included or are performed in a specific order. Based on the reinforcement learning result, the computer may determine what prognosis (i.e., what side effect) may occur due to a specific detailed surgical operation, continuous detailed surgical operations, or a combination of detailed surgical operations.
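  • As a simplified stand-in for the learning described above, the sketch below computes, for each standardized operation code, the rate of a given side effect among the surgeries containing the code. A real system would use the reinforcement learning described in the embodiment rather than raw co-occurrence rates, which are used here only for illustration.

    # A hedged sketch of relating detailed surgical operations to prognosis;
    # raw co-occurrence rates stand in for the reinforcement learning.
    from collections import defaultdict

    def side_effect_rates(records):
        """records: list of (set of operation codes, had_side_effect bool)."""
        counts = defaultdict(lambda: [0, 0])  # code -> [with_effect, total]
        for ops, had_effect in records:
            for code in ops:
                counts[code][1] += 1
                if had_effect:
                    counts[code][0] += 1
        return {code: counts[code][0] / counts[code][1] for code in sorted(counts)}

    records = [
        ({"2011", "2023"}, True),
        ({"2011", "2031"}, False),
        ({"2023", "2031"}, True),
    ]
    print(side_effect_rates(records))
    # {'2011': 0.5, '2023': 1.0, '2031': 0.5}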
  • Further, the computer may output and provide the feedback to the user in various ways. In one embodiment, the computer may extract detailed surgical operations having defects and provide the extracted detailed surgical operations as the feedback. For example, the computer may extract and reproduce only images of the detailed surgical operations having the defects, thereby helping the user grasp the defects.
  • Further, the computer may search for and provide detailed surgical operations included in the surgical process. For example, the surgery is usually performed for several hours or more, such that it is difficult for the user to check an entire surgery image after the surgery for feedback. Therefore, the computer may provide the cue sheet data. When the user selects one or more detailed surgical operations included in the cue sheet data, the computer may extract only the selected detailed surgical operation and may provide the feedback related thereto. For example, the computer may extract and reproduce only images of the selected detailed surgical operations.
  • According to a disclosed embodiment, each detailed surgical operation has a standardized name and standardized code data. Therefore, the user may search for each detailed surgical operation based on the standardized name or the standardized code data. The user may look at the cue sheet data to easily check the progress of the surgery.
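  • A minimal sketch of such a search follows, assuming each cue sheet entry carries a standardized name, standardized code data, and a time range within the surgery image; entries matching the query are returned as playable segments. The entry fields are hypothetical.

    # A hedged sketch of searching cue sheet data by standardized name or
    # code and extracting matching video segments; fields are assumptions.
    def find_segments(cue_sheet, query):
        """Return (start, end) second ranges of entries matching the query."""
        return [(entry["start"], entry["end"])
                for entry in cue_sheet
                if query in (entry["name"], entry["code"])]

    cue_sheet = [
        {"name": "opening",  "code": "2010", "start": 0,    "end": 310},
        {"name": "ablation", "code": "2011", "start": 310,  "end": 1490},
        {"name": "suture",   "code": "2031", "start": 1490, "end": 1800},
    ]
    print(find_segments(cue_sheet, "ablation"))  # [(310, 1490)]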
  • The method for providing the feedback on the surgical outcome according to one embodiment of the inventive concept as described above may be implemented using a program (or application) to be executed in combination with a computer as hardware and stored in a medium.
  • The program may include code written in computer languages such as C, C++, JAVA, and machine language, which a processor (CPU) of the computer may read through a device interface thereof, in order for the computer to read the program and execute the methods implemented using the program. The code may include a functional code related to a function defining the functions required to execute the methods, and an execution-procedure-related control code necessary for the processor of the computer to execute the functions in a predetermined procedure. Moreover, the code may further include a memory-reference-related code indicating a location (address) of an internal memory of the computer, or of a memory external thereto, in which additional information or media necessary for the processor to execute the functions is stored. Moreover, when the processor of the computer needs to communicate with any other remote computer or server to execute the functions, the code may further include a communication-related code indicating how to communicate with the other remote computer or server using a communication module of the computer, and indicating information or media to be transmitted and received during the communication.
  • The storage medium means a medium that stores data semi-permanently, rather than a medium for storing data for a short moment, such as a register, a cache, or a memory, and that may be readable by a machine. Specifically, examples of the storage medium may include, but may not be limited to, ROM, RAM, CD-ROM, a magnetic tape, a floppy disk, and an optical data storage device. That is, the program may be stored in various recording media on various servers to which the computer may access or on various recording media on the user's computer. Moreover, the medium may be distributed over a networked computer system so that a computer readable code may be stored in a distributed scheme.
  • The steps of the method or the algorithm described in connection with the embodiments of the inventive concept may be implemented directly in hardware, a software module executed by hardware, or a combination thereof. The software modules may reside in random access memory (RAM), read only memory (ROM), erasable programmable ROM (EPROM), electrically erasable programmable ROM (EEPROM), a flash memory, a hard disk, a removable disk, CD-ROM, or any form of a computer readable recording medium well known in the art.
  • According to the disclosed embodiment, the feedback on the course and the outcome of the surgery may be provided to the user, based on the result of comparing the actual surgical process with the reference.
  • According to the disclosed embodiment, the feedback may be provided by extracting the necessary portion from the entire surgery image. Thus, even when the user does not look through the entire surgery image, the user may check the necessary portion thereof.
  • The effects of the inventive concept are not limited to the effects mentioned above. Other effects not mentioned will be clearly understood by those skilled in the art from the above description.
  • While the inventive concept has been described with reference to exemplary embodiments, it will be apparent to those skilled in the art that various changes and modifications may be made without departing from the spirit and scope of the inventive concept. Therefore, it should be understood that the above embodiments are not limiting, but illustrative.
  • REFERENCE NUMERALS
    • 10: Medical imaging apparatus
    • 20: Server
    • 30: Controller
    • 32: Display
    • 34: Surgery robot
    • 36: Imaging unit

Claims (11)

What is claimed is:
1. A device for providing a feedback on a surgical outcome based on artificial intelligence, comprising:
an image sensor configured to capture a surgical image of an actual surgical process; and
a processor configured to:
generate actual surgical data based on the surgical image,
divide the actual surgical data into a plurality of detailed surgical operations to obtain actual surgical cue sheet data composed of the plurality of detailed surgical operations,
obtain reference cue sheet data about the actual surgery, and
provide feedback by comparing the actual surgical cue sheet data with the reference cue sheet data,
wherein the reference cue sheet data includes at least one of optimized cue sheet data about the actual surgery and referenced virtual surgical cue sheet data,
wherein the optimized cue sheet data includes cue sheet data calculated for an optimized surgical process by learning one or more cue sheet data, and
wherein the referenced virtual surgical cue sheet data includes cue sheet data for virtual surgery or actual surgery performed for constructing big data for learning or for guiding a surgical process.
2. The device of claim 1, wherein the actual surgical data is divided into the plurality of detailed surgical operations, based on at least one of a surgery target portion, a type of surgical tool, a number of surgical tools, a position of the surgical tool, an orientation of the surgical tool, and movement of the surgical tool included in the actual surgical data.
3. The device of claim 1, wherein at least one of a standardized name and standardized code data is assigned to each of the plurality of detailed surgical operations.
4. The device of claim 1, wherein the processor is further configured to determine whether at least one surgical error is included in the actual surgical cue sheet data by comparing the plurality of detailed surgical operations included in the actual surgical cue sheet data with a plurality of detailed surgical operations included in the reference cue sheet data, and
wherein the at least one surgical error includes at least one of an unnecessary detailed surgical operation, a missing detailed surgical operation, and an incorrect detailed surgical operation.
5. The device of claim 4, wherein the processor is further configured to determine whether a detailed surgical operation included in the actual surgical cue sheet data is incorrect by comparing motion of surgical tool corresponding to a detailed surgical operation included in the reference cue sheet data with motion of surgical tool corresponding to the detailed surgical operation included in the actual surgical cue sheet data.
6. The device of claim 1, wherein the processor is further configured to:
detect an occurrence of an event from the actual surgical data, and
determine a cause of the event by analyzing a detailed surgical operation, included in the actual surgical cue sheet data, that occurred before the event,
wherein the event includes at least one of bleeding information, foreign object information and nerve damage information.
7. The device of claim 1, wherein the processor is further configured to obtain optimized cue sheet data for each situation classified according to a physical condition and a surgical target portion condition when obtaining the reference cue sheet data.
8. The device of claim 1, wherein the processor is further configured to:
add the actual surgical cue sheet data to to-be-learned cue sheet data, and
perform reinforcement learning on a model for obtaining optimized cue sheet data using the to-be-learned cue sheet data.
9. The device of claim 1, wherein the processor is further configured to:
obtain information about prognosis corresponding to each of one or more actual surgical cue sheet data including the actual surgical cue sheet data,
perform reinforcement learning based on information about the one or more actual surgical cue sheet data and the prognosis, and
determine a correlation between at least one detailed surgical operation included in the one or more actual surgical cue sheet data and the prognosis based on a result of the reinforcement learning.
10. The device of claim 1, further comprising:
a display,
wherein the processor is further configured to:
extract a first image of one or more detailed surgical operations corresponding to a surgical error situation or a second image of one or more detailed surgical operations including an event situation, and
play the first image or the second image through the display.
11. A method for providing a feedback on a surgical outcome based on artificial intelligence, performed by a device, the method comprising:
capturing, by an image sensor of the device, a surgical image of an actual surgical process;
generating, by a processor of the device, actual surgical data based on the surgical image;
dividing, by the processor, the actual surgical data into a plurality of detailed surgical operations to obtain actual surgical cue sheet data composed of the plurality of detailed surgical operations;
obtaining, by the processor, reference cue sheet data about the actual surgery; and
providing, by the processor, feedback by comparing the actual surgical cue sheet data with the reference cue sheet data,
wherein the reference cue sheet data includes at least one of optimized cue sheet data about the actual surgery and referenced virtual surgical cue sheet data,
wherein the optimized cue sheet data includes cue sheet data calculated for an optimized surgical process by learning one or more cue sheet data, and
wherein the referenced virtual surgical cue sheet data includes cue sheet data for virtual surgery or actual surgery performed for constructing big data for learning or for guiding a surgical process.
US18/194,067 2017-12-28 2023-03-31 Method and program for providing feedback on surgical outcome Pending US20230238109A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US18/194,067 US20230238109A1 (en) 2017-12-28 2023-03-31 Method and program for providing feedback on surgical outcome

Applications Claiming Priority (5)

Application Number Priority Date Filing Date Title
KR1020170182889A KR101862360B1 (en) 2017-12-28 2017-12-28 Program and method for providing feedback about result of surgery
KR10-2017-0182889 2017-12-28
PCT/KR2018/010329 WO2019132165A1 (en) 2017-12-28 2018-09-05 Method and program for providing feedback on surgical outcome
US16/914,141 US11636940B2 (en) 2017-12-28 2020-06-26 Method and program for providing feedback on surgical outcome
US18/194,067 US20230238109A1 (en) 2017-12-28 2023-03-31 Method and program for providing feedback on surgical outcome

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US16/914,141 Continuation US11636940B2 (en) 2017-12-28 2020-06-26 Method and program for providing feedback on surgical outcome

Publications (1)

Publication Number Publication Date
US20230238109A1 2023-07-27

Family

ID=62780721

Family Applications (2)

Application Number Title Priority Date Filing Date
US16/914,141 Active 2039-09-04 US11636940B2 (en) 2017-12-28 2020-06-26 Method and program for providing feedback on surgical outcome
US18/194,067 Pending US20230238109A1 (en) 2017-12-28 2023-03-31 Method and program for providing feedback on surgical outcome

Family Applications Before (1)

Application Number Title Priority Date Filing Date
US16/914,141 Active 2039-09-04 US11636940B2 (en) 2017-12-28 2020-06-26 Method and program for providing feedback on surgical outcome

Country Status (5)

Country Link
US (2) US11636940B2 (en)
EP (1) EP3734608A4 (en)
KR (1) KR101862360B1 (en)
CN (1) CN111771244A (en)
WO (1) WO2019132165A1 (en)

Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR102149167B1 (en) * 2018-11-07 2020-08-31 주식회사 삼육오엠씨네트웍스 Artificial intelligence based cannula surgery diagnostic apparatus
WO2020159276A1 (en) * 2019-02-01 2020-08-06 주식회사 아이버티 Surgical analysis apparatus, and system, method, and program for analyzing and recognizing surgical image
KR102321157B1 (en) * 2020-04-10 2021-11-04 (주)휴톰 Method and system for analysing phases of surgical procedure after surgery
KR102640314B1 (en) * 2021-07-12 2024-02-23 (주)휴톰 Artificial intelligence surgery system amd method for controlling the same
CN113591757A (en) * 2021-08-07 2021-11-02 王新 Automatic operation device and equipment for eye reshaping
CN115775621B (en) * 2023-02-13 2023-04-21 深圳市汇健智慧医疗有限公司 Information management method and system based on digital operating room

Family Cites Families (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
SE9804147D0 (en) * 1998-12-01 1998-12-01 Siemens Elema Ab System for three-dimensional imaging of an internal organ or body structure
KR100961661B1 (en) * 2009-02-12 2010-06-09 주식회사 래보 Apparatus and method of operating a medical navigation system
WO2010108128A2 (en) * 2009-03-20 2010-09-23 The Johns Hopkins University Method and system for quantifying technical skill
US8392342B2 (en) * 2009-11-18 2013-03-05 Empire Technology Development Llc Method and apparatus for predicting movement of a tool in each of four dimensions and generating feedback during surgical events using a 4D virtual real-time space
KR101235044B1 (en) * 2010-11-02 2013-02-21 서울대학교병원 (분사무소) Method of operation simulation and automatic operation device using 3d modelling
KR101795720B1 (en) * 2011-05-12 2017-11-09 주식회사 미래컴퍼니 Control method of surgical robot system, recording medium thereof, and surgical robot system
KR101302595B1 (en) 2012-07-03 2013-08-30 한국과학기술연구원 System and method for predict to surgery progress step
US10350008B2 (en) * 2014-12-02 2019-07-16 X-Nav Technologies, LLC Visual guidance display for surgical procedure
KR101655940B1 (en) * 2015-02-06 2016-09-08 경희대학교 산학협력단 Apparatus for generating guide for surgery design information and method of the same
CN111329554B (en) * 2016-03-12 2021-01-05 P·K·朗 Devices and methods for surgery
CN106901834A (en) * 2016-12-29 2017-06-30 陕西联邦义齿有限公司 The preoperative planning of minimally invasive cardiac surgery and operation virtual reality simulation method
US11158415B2 (en) * 2017-02-16 2021-10-26 Mako Surgical Corporation Surgical procedure planning system with multiple feedback loops
US11112770B2 (en) * 2017-11-09 2021-09-07 Carlsmed, Inc. Systems and methods for assisting a surgeon and producing patient-specific medical devices

Also Published As

Publication number Publication date
EP3734608A1 (en) 2020-11-04
US20200357502A1 (en) 2020-11-12
CN111771244A (en) 2020-10-13
WO2019132165A1 (en) 2019-07-04
EP3734608A4 (en) 2021-09-22
US11636940B2 (en) 2023-04-25
KR101862360B1 (en) 2018-06-29


Legal Events

Date Code Title Description
AS Assignment

Owner name: HUTOM CO., LTD., KOREA, REPUBLIC OF

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:LEE, JONG HYUCK;HYUNG, WOO JIN;YANG, HOON MO;AND OTHERS;REEL/FRAME:063192/0262

Effective date: 20200623

STPP Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER

STPP Information on status: patent application and granting procedure in general

Free format text: NOTICE OF ALLOWANCE MAILED -- APPLICATION RECEIVED IN OFFICE OF PUBLICATIONS

AS Assignment

Owner name: UIF (UNIVERSITY INDUSTRY FOUNDATION), YONSEI UNIVERSITY, KOREA, REPUBLIC OF

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:HUTOM CO., LTD.;REEL/FRAME:066943/0497

Effective date: 20240314

Owner name: HUTOM INC., KOREA, REPUBLIC OF

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:HUTOM CO., LTD.;REEL/FRAME:066943/0497

Effective date: 20240314