WO2019132165A1 - Method and program for providing feedback on a surgical result - Google Patents

Method and program for providing feedback on a surgical result

Info

Publication number
WO2019132165A1
WO2019132165A1 (PCT/KR2018/010329)
Authority
WO
WIPO (PCT)
Prior art keywords
surgical
data
actual
detailed
surgery
Prior art date
Application number
PCT/KR2018/010329
Other languages
English (en)
Korean (ko)
Inventor
이종혁
형우진
양훈모
김호승
Original Assignee
(주)휴톰
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by (주)휴톰 filed Critical (주)휴톰
Priority to CN201880088961.9A priority Critical patent/CN111771244B/zh
Priority to EP18896154.4A priority patent/EP3734608A4/fr
Publication of WO2019132165A1 publication Critical patent/WO2019132165A1/fr
Priority to US16/914,141 priority patent/US11636940B2/en
Priority to US18/194,067 priority patent/US20230238109A1/en

Classifications

    • A61B 34/10: Computer-aided planning, simulation or modelling of surgical operations
    • A61B 34/20: Surgical navigation systems; devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B 34/30: Surgical robots
    • A61B 34/37: Master-slave robots
    • A61B 90/36: Image-producing devices or illumination devices not otherwise provided for
    • A61B 90/361: Image-producing devices, e.g. surgical cameras
    • G16H 20/40: ICT specially adapted for therapies or health-improving plans relating to mechanical, radiation or invasive therapies, e.g. surgery, laser therapy, dialysis or acupuncture
    • G16H 30/00: ICT specially adapted for the handling or processing of medical images
    • G16H 30/40: ICT specially adapted for processing medical images, e.g. editing
    • G16H 40/20: ICT specially adapted for the management or administration of healthcare resources or facilities, e.g. managing hospital staff or surgery rooms
    • G16H 50/20: ICT specially adapted for computer-aided diagnosis, e.g. based on medical expert systems
    • G16H 50/50: ICT specially adapted for simulation or modelling of medical disorders
    • G16H 50/70: ICT specially adapted for mining of medical data, e.g. analysing previous cases of other patients
    • A61B 2017/00119: Electrical control of surgical instruments with audible or visual output alarm; indicating an abnormal situation
    • A61B 2034/101: Computer-aided simulation of surgical operations
    • A61B 2034/105: Modelling of the patient, e.g. for ligaments or bones
    • A61B 2034/107: Visualisation of planned trajectories or target regions
    • A61B 2034/2065: Tracking using image or pattern recognition
    • A61B 2034/256: User interfaces for surgical systems having a database of accessory information, e.g. including context-sensitive help or scientific articles
    • A61B 2090/364: Correlation of different images or relation of image positions in respect to the body
    • A61B 2090/365: Augmented reality, i.e. correlating a live optical image with another image

Definitions

  • the present invention relates to methods and programs for providing feedback on surgical results.
  • a 3D medical image (for example, a virtual image of changes in internal organs caused by the movement of a three-dimensional surgical tool) may be provided.
  • unnecessary processes are minimized to optimize the surgical process
  • Deep learning is defined as a set of machine learning algorithms that attempt a high level of abstraction (the task of summarizing key content or functions from large or complex data) through a combination of several nonlinear transformation techniques. Broadly speaking, deep learning can be viewed as a field of machine learning that teaches computers to think the way people do.
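As an illustrative aside (not part of the patent disclosure), the "combination of several nonlinear transformations" can be sketched as a composition of affine maps, each followed by a nonlinearity; the weights below are arbitrary toy values:

```python
import math

def layer(x, w, b):
    # One nonlinear transformation: affine map followed by tanh.
    return [math.tanh(sum(wi * xi for wi, xi in zip(row, x)) + bi)
            for row, bi in zip(w, b)]

def forward(x, layers):
    # A deep model is a composition of several such transformations.
    for w, b in layers:
        x = layer(x, w, b)
    return x

# Two stacked layers mapping a 2-D input to a 1-D abstraction.
layers = [
    ([[1.0, -1.0], [-1.0, 1.0]], [0.0, 0.0]),
    ([[1.0, 1.0]], [0.0]),
]
print(forward([0.5, -0.5], layers))  # → [0.0]
```

Stacking more such layers is what yields progressively more abstract representations of the input data.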
  • a problem to be solved by the present invention relates to a method and a program for providing feedback on a surgical result.
  • according to one aspect of the present invention, a method for providing feedback on a surgical result comprises: obtaining, by a computer, actual surgery cue sheet data composed of a plurality of detailed surgical operations by dividing actual surgery data obtained during a real surgical procedure; obtaining reference cue sheet data for the actual surgery; and providing feedback by comparing the actual surgery cue sheet data with the reference cue sheet data.
  • the actual surgery data may be divided into the plurality of detailed surgical operations based on at least one of the surgical site included in the actual surgery data, the type of surgical tool, the number of surgical tools, the position of the surgical tool, the direction of the surgical tool, and the movement of the surgical tool.
  • each of the plurality of detailed surgical operations may be assigned at least one of a standardized name and standardized code data.
  • the providing of the feedback may include obtaining search information for retrieving at least one of the plurality of detailed surgical operations, extracting, based on the standardized name or the standardized code data, at least one detailed surgical operation corresponding to the search information, and providing feedback for the extracted detailed surgical operation.
  • the reference cue sheet data may be cue sheet data optimized for the actual surgery or reference virtual surgery cue sheet data.
  • the step of providing feedback may include comparing the plurality of detailed surgical operations included in the actual surgery cue sheet data with those included in the reference cue sheet data, to determine whether the actual surgery cue sheet data contains an unnecessary detailed operation or an erroneously performed detailed operation.
  • the step of determining whether a detailed operation was performed erroneously may include comparing the movement of the surgical tool corresponding to a detailed operation included in the reference cue sheet data with the movement of the tool corresponding to the matching detailed operation included in the actual surgery cue sheet data, and determining from this comparison whether the detailed operation included in the actual surgery cue sheet data is erroneous.
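One way such a cue-sheet comparison could be sketched is as an edit-script diff between the two sequences of standardized codes; the codes below (OPEN01, RESECT02, ...) are hypothetical placeholders, not codes defined by the invention:

```python
from difflib import SequenceMatcher

def compare_cue_sheets(reference, actual):
    """Flag detailed operations in `actual` that are unnecessary or wrong
    (absent from the reference) and operations that are missing (present
    only in the reference). Operations are standardized code strings."""
    findings = []
    sm = SequenceMatcher(a=reference, b=actual)
    for tag, i1, i2, j1, j2 in sm.get_opcodes():
        if tag in ("insert", "replace"):
            findings += [("unnecessary_or_wrong", op) for op in actual[j1:j2]]
        if tag in ("delete", "replace"):
            findings += [("missing", op) for op in reference[i1:i2]]
    return findings

reference = ["OPEN01", "RESECT02", "CONNECT03", "SUTURE04"]
actual    = ["OPEN01", "RESECT02", "RESECT02", "CONNECT07", "SUTURE04"]
for kind, op in compare_cue_sheets(reference, actual):
    print(kind, op)
```

On this toy input the diff reports the duplicated RESECT02 and the deviating CONNECT07 as unnecessary or wrong, and CONNECT03 as missing.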
  • the method may further include adding the actual surgery cue sheet data to the learning cue sheet data, and performing reinforcement learning on the model for acquiring optimized cue sheet data using the learning cue sheet data.
  • the method may further include searching at least one surgical error condition in the obtained surgical information, and providing feedback on the searched surgical error condition.
  • according to another aspect of the present invention, a computer program stored in a computer-readable recording medium is provided for performing the method of providing feedback on a surgical result according to an embodiment of the present invention.
  • according to the disclosed embodiments, the actual surgical procedure is compared with a reference, thereby providing feedback on both the procedure and the result of the operation.
  • FIG. 1 is a diagram illustrating a robot surgery system in accordance with the disclosed embodiment.
  • FIG. 2 is a flow chart illustrating a method for providing feedback on surgical results in accordance with one embodiment.
  • FIG. 3 is a flow chart illustrating a method for computing optimized queue data in accordance with one embodiment.
  • FIG. 4 is a flow diagram illustrating a method for obtaining feedback in accordance with one embodiment.
  • the term "part" or "module" refers to a hardware component such as an FPGA or ASIC, or to software, and a "part" or "module" performs certain roles. However, "part" or "module" is not limited to software or hardware. A "part" or "module" may be configured to reside on an addressable storage medium and to run on one or more processors. Thus, by way of example, a "part" or "module" includes components such as software components, object-oriented software components, class components and task components, as well as processes, functions, attributes, procedures, subroutines, segments of program code, drivers, firmware, microcode, circuitry, data, databases, data structures, tables, arrays, and variables. The functionality provided in components and "parts" or "modules" may be combined into a smaller number of components and "parts" or "modules", or further separated into additional components and "parts" or "modules".
  • image may refer to multi-dimensional data composed of discrete image elements (e.g., pixels in a two-dimensional image and voxels in a 3D image).
  • image may include a medical image or the like of the object obtained by the CT photographing apparatus.
  • an " object" may be a person or an animal, or part or all of a person or an animal.
  • the object may comprise at least one of organs such as the liver, heart, uterus, brain, breast, and abdomen, and blood vessels.
  • the term "user" may refer to a doctor, a nurse, a clinical pathologist, a medical imaging specialist, or the like, and may also refer to a technician who repairs a medical device.
  • medical image data is a medical image captured by a medical image capturing apparatus, and includes all medical images capable of realizing a body of a subject as a three-dimensional model.
  • Medical image data may include a computed tomography (CT) image, a magnetic resonance imaging (MRI), a positron emission tomography (PET) image, and the like.
  • the term "virtual body model” refers to a model generated based on medical image data in accordance with an actual patient's body.
  • the "virtual body model" may be generated by modeling the medical image data in three dimensions as it is, or may be corrected after modeling to match the state of actual surgery.
  • virtual surgery data means data including a rehearsal or simulation action performed on a virtual body model.
  • the "virtual surgery data” may be image data that is rehearsed or simulated for a virtual body model in virtual space, or data recorded for a surgical operation performed on a virtual body model.
  • actual operation data means data obtained as an actual medical staff performs surgery.
  • the "actual operation data” may be image data obtained by photographing a surgical site in an actual operation procedure, or may be data recorded on a surgical operation performed in an actual operation procedure.
  • "cue sheet data" means data in which a specific surgical procedure is divided into detailed surgical operations and recorded in order.
  • "training cue sheet data" means cue sheet data obtained based on virtual surgery data for which the user has performed a simulation.
  • "training virtual surgery cue sheet data" is included in the training cue sheet data and refers to cue sheet data generated based on virtual surgery data obtained by performing a surgical simulation.
  • "reference virtual surgery cue sheet data" refers to cue sheet data for a virtual surgery performed by a specific medical professional, used to construct big data for learning or to guide surgical procedures.
  • "optimized cue sheet data" means cue sheet data for a surgical procedure optimized in terms of operation time or operation prognosis.
  • "learning cue sheet data" means cue sheet data used in learning for calculating the optimized cue sheet data.
  • surgical guide data means data used as guide information in actual surgery.
  • the term "computer” includes all of the various devices that can perform computational processing to provide results to a user.
  • the computer may be a smart phone, a tablet PC, a cellular phone, a personal communication service phone (PCS phone), a synchronous/asynchronous IMT-2000 (International Mobile Telecommunication-2000) mobile terminal, a Palm Personal Computer (PC), a personal digital assistant (PDA), and the like.
  • the HMD device when the head mounted display (HMD) device includes a computing function, the HMD device can be a computer.
  • the computer may correspond to a server that receives a request from a client and performs information processing.
  • FIG. 1 is a diagram illustrating a robot surgery system in accordance with the disclosed embodiment.
  • FIG. 1 there is shown a simplified schematic representation of a system capable of performing robotic surgery in accordance with the disclosed embodiments.
  • the robot surgery system includes a medical imaging apparatus 10, a server 20, and a control unit 30, an image capturing unit 36, a display 32, and a surgical robot 34 provided in an operating room.
  • the medical imaging equipment 10 may be omitted from the robotic surgery system according to the disclosed embodiment.
  • robotic surgery is performed by the user controlling the surgical robot 34 using the control unit 30.
  • robot surgery may be performed automatically by the control unit 30 without user control.
  • the server 20 is a computing device including at least one processor and a communication unit.
  • the control unit 30 includes a computing device including at least one processor and a communication unit. In one embodiment, the control unit 30 includes hardware and software interfaces for controlling the surgical robot 34.
  • the image capturing unit 36 includes at least one image sensor. That is, the image capturing unit 36 includes at least one camera device and is used to photograph a surgical site. In one embodiment, the image capturing unit 36 is used in combination with the surgical robot 34. For example, the image capturing unit 36 may include at least one camera coupled to a surgical arm of the surgical robot 34.
  • the image photographed by the image capturing unit 36 is displayed on the display 32.
  • the control unit 30 receives information necessary for surgery from the server 20, or generates information necessary for surgery and provides the information to the user. For example, the control unit 30 displays on the display 32 information necessary for surgery, which is generated or received.
  • the user operates the control unit 30 while viewing the display 32 to perform the robot surgery by controlling the movement of the surgical robot 34.
  • the server 20 generates information necessary for robot surgery using the medical image data of the object (patient) photographed beforehand from the medical imaging apparatus 10, and provides the generated information to the control unit 30.
  • the control unit 30 provides the information received from the server 20 to the user by displaying the information on the display 32 or controls the surgical robot 34 using the information received from the server 20.
  • the means that can be used in the medical imaging equipment 10 is not limited, and various other medical imaging acquiring means such as CT, X-Ray, PET, MRI and the like may be used.
  • FIG. 2 is a flow chart illustrating a method for providing feedback on surgical results in accordance with one embodiment.
  • each step shown in FIG. 2 is performed in a time-series manner in the server 20 or the control unit 30 shown in FIG.
  • each step is described as being performed by a computer, but the subject of each step is not limited to a specific device, and all or some of the steps may be performed in the server 20 or the control unit 30.
  • a method of providing feedback on a surgical result according to one embodiment includes: obtaining actual surgery cue sheet data composed of a plurality of detailed surgical operations by dividing the actual surgery data obtained during the actual surgical procedure (S200); obtaining reference cue sheet data for the actual surgery (S400); and providing feedback by comparing the actual surgery cue sheet data with the reference cue sheet data (S600).
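The three steps (S200, S400, S600) can be sketched as a simple pipeline; the stage functions below are toy stand-ins, not the actual division, reference-retrieval, or comparison logic:

```python
def provide_feedback(actual_surgery_data, divide, get_reference, compare):
    # S200: divide actual surgery data into detailed operations.
    actual_cue_sheet = divide(actual_surgery_data)
    # S400: obtain reference cue sheet data for this surgery.
    reference_cue_sheet = get_reference(actual_surgery_data)
    # S600: compare the two cue sheets and return feedback.
    return compare(actual_cue_sheet, reference_cue_sheet)

# Toy stand-ins for the three stages (hypothetical).
feedback = provide_feedback(
    "frame1|frame2|frame3",
    divide=lambda data: data.split("|"),
    get_reference=lambda data: ["frame1", "frame2"],
    compare=lambda a, r: [op for op in a if op not in r],
)
print(feedback)  # → ['frame3']
```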
  • the computer divides the actual surgery data obtained during the actual surgical procedure to obtain actual surgery cue sheet data composed of a plurality of detailed surgical operations (S200).
  • the computer generates the actual surgery cue sheet data on the basis of the surgical image photographed by the surgical robot or the data obtained while controlling the surgical robot.
  • the detailed surgery operation constituting the cue chart data is a minimum operation unit constituting the surgical process.
  • the detailed surgical operation can be divided by several criteria.
  • the detailed surgical operations may be divided by various criteria, including the type of surgery (e.g., laparoscopic surgery, robotic surgery), the anatomical body part undergoing surgery, the surgical tool used, the number of surgical tools, the position of the surgical tool, the direction of the surgical tool, and the movement of the surgical tool (e.g., forward/retract).
  • the division criterion and the subcategories included in the division criterion may be set directly by the medical staff.
  • the computer can perform supervised learning according to the division criteria and subcategories set by the medical staff, dividing the actual surgery data into minimum-unit detailed surgical operations.
  • the division criterion and the subcategories included in the division criterion can also be extracted through the computer's learning of surgical images.
  • the computer can calculate the division criterion and the categories within each division criterion by deep learning (i.e., unsupervised learning) over the actual surgery data accumulated as big data. Then, the computer divides the actual surgery data according to the division criterion generated through this learning and generates the cue sheet data.
  • the actual surgery data may be segmented by recognizing, through image recognition, which division criterion each portion corresponds to. That is, the computer recognizes the position of the anatomical organ, the surgical tools that appear, the number of each surgical tool, and the like in the intra-body image of the actual surgery data, and can perform the division into detailed-operation units.
  • the computer may perform a segmentation process for generating the cue sheet data based on the surgical tool motion data included in the actual operation data.
  • when the user performs robotic surgery, the actual surgery data may include various information entered while controlling the surgical robot, such as the type and number of surgical tools selected by the user and information about the motion of each surgical tool.
  • the computer can perform segmentation based on the information contained at each point in time of the actual surgical data.
  • the actual surgery data includes various kinds of detailed operations, such as resection and suturing, and segmentation is performed according to the division criterion.
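One plausible way to implement such division is to start a new detailed operation whenever one of the division criteria changes between frames; the frame annotations below (surgical site, tool count) are hypothetical:

```python
def segment_by_criteria(frames, keys=("site", "tools")):
    """Split a stream of annotated frames into detailed operations:
    a new segment starts whenever any division criterion changes."""
    segments, current = [], []
    prev = None
    for f in frames:
        sig = tuple(f[k] for k in keys)
        if prev is not None and sig != prev:
            segments.append(current)
            current = []
        current.append(f)
        prev = sig
    if current:
        segments.append(current)
    return segments

frames = [
    {"site": "stomach", "tools": 1, "t": 0},
    {"site": "stomach", "tools": 1, "t": 1},
    {"site": "stomach", "tools": 2, "t": 2},   # tool count changes
    {"site": "lymph",   "tools": 2, "t": 3},   # surgical site changes
]
print([len(s) for s in segment_by_criteria(frames)])  # → [2, 1, 1]
```

In practice the per-frame annotations would themselves come from image recognition or from the robot's control log, as described above.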
  • a procedure in which actual surgery data (for example, an actual surgical image) from an actually performed stomach cancer operation is divided into detailed surgical operations and generated as cue sheet data is described below.
  • stomach cancer surgery involves a detailed surgical operation to resect part or all of the stomach, including the tumor, and a detailed surgical operation to dissect the lymph nodes.
  • various resection and connection techniques are used depending on the condition of the stomach.
  • each detailed operation may be further divided into more granular operations according to the specific position at which it is performed and the direction of movement of the surgical tool.
  • stomach cancer surgery can be divided into an opening (laparotomy) step, a resection step, a connection step, and a suturing step.
  • the methods of connecting the resected organs include an extracorporeal anastomosis method, in which an abdominal incision of 4-5 cm or more is made and the connection is performed outside the body, and an intracorporeal anastomosis method, in which the anastomosis is performed inside the body cavity; the detailed operations may be further divided according to the specific connection method used.
  • each surgical method can be divided into a plurality of more detailed surgical operations according to the position and movement of the surgical tool.
  • each of the divided detailed surgical operations may be given a standardized name based on the position at which the detailed operation is performed and the pathway of the surgical tool.
  • the terms used in the standardized nomenclature may be variously defined.
  • the name of a site may be a name commonly used in the medical field, or a more comprehensive or more subdivided name defined in the system according to the disclosed embodiment may be used.
  • the surgical image in which the user actually performed the surgery can thus be organized into cue-sheet-type information, in which the plurality of detailed surgical operations are sequentially listed based on their standardized names.
  • the cue sheet data may be generated as code data with a certain number of digits according to the criteria for dividing into detailed surgical operations. That is, the computer divides the actual surgery data into standardized detailed surgical operations by applying the standardized division criteria and identifying the subcategory within each criterion, assigns a standardized code value to each subcategory, and thereby generates standardized code data.
  • the computer assigns numbers or letters to each detailed surgical operation in sequence, starting from the highest category to which the specific detailed operation belongs, following the order in which the division criteria are applied. Accordingly, the computer can generate the cue sheet data as a list of the standardized code data of each detailed surgical operation instead of the divided operation images. Also, a user can share or transmit the actual surgical procedure by providing only the cue sheet data composed of the standardized code data.
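As a sketch of how fixed-digit standardized codes might be built up category by category, applying the division criteria in order from the highest category downward (the criteria tables, digit widths, and category names are invented for illustration):

```python
# Hypothetical subcategory tables for each division criterion, applied
# in a fixed order so every detailed operation maps to a fixed-width code.
CRITERIA = [
    ("surgery_type", {"laparoscopic": "1", "robotic": "2"}),
    ("site",         {"stomach": "01", "lymph": "02"}),
    ("tool",         {"grasper": "1", "stapler": "2", "scalpel": "3"}),
    ("motion",       {"advance": "1", "retract": "2", "resect": "3"}),
]

def encode(op):
    # Concatenate one code digit group per criterion, upper category first.
    return "".join(table[op[name]] for name, table in CRITERIA)

op = {"surgery_type": "robotic", "site": "stomach",
      "tool": "scalpel", "motion": "resect"}
print(encode(op))  # → '20133'
```

A cue sheet is then just the ordered list of such codes, which is far more compact to share or transmit than the segmented operation images.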
  • when the computer is a client terminal located in the operating room or corresponds to the control unit 30, the computer obtains the standardized code data from the surgical image and transmits the obtained code data to the server 20, so that the surgical procedure can be shared or communicated.
  • a surgical image may be transmitted to the server 20, and the server 20 may generate the cue chart data and the code data.
  • the computer may associate a standardized name with the standardized code of each detailed surgical operation. Through this, the user can select and check only the desired detailed operation within the entire cue sheet. In this case, even without watching the entire surgical image, the user can easily grasp the progress of the operation simply by viewing the sequentially arranged cue sheet entries based on the standardized names of the detailed surgical operations.
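A search over such a cue sheet by standardized name or code could look like the following sketch (the names and codes are hypothetical placeholders):

```python
def search_cue_sheet(cue_sheet, query):
    """Return the detailed operations whose standardized name or
    standardized code matches the search information."""
    return [op for op in cue_sheet
            if query in op["name"] or query == op["code"]]

cue_sheet = [
    {"code": "20111", "name": "stomach-grasper-advance"},
    {"code": "20133", "name": "stomach-scalpel-resect"},
    {"code": "20233", "name": "lymph-scalpel-resect"},
]
hits = search_cue_sheet(cue_sheet, "resect")
print([op["code"] for op in hits])  # → ['20133', '20233']
```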
  • the cue sheet data can be converted back into a surgical image using an image database for each detailed operation.
  • the image database may store an image corresponding to each code data, and a plurality of images corresponding to each code data may be stored according to a situation. For example, specific code data may be loaded with different detailed operation motion images in the image database according to previously performed operations.
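A minimal sketch of such a context-dependent lookup, assuming a dictionary-backed image database keyed by code and by the previously played code (all codes and file names are made up):

```python
# Hypothetical image database: each standardized code maps to one or
# more clips; the clip chosen can depend on the previously played code.
IMAGE_DB = {
    "20133": {None: "resect_default.mp4", "20111": "resect_after_grasp.mp4"},
    "20111": {None: "grasp_default.mp4"},
}

def render(cue_codes):
    """Turn a list of cue sheet codes back into an ordered clip list."""
    clips, prev = [], None
    for code in cue_codes:
        variants = IMAGE_DB[code]
        clips.append(variants.get(prev, variants[None]))
        prev = code
    return clips

print(render(["20111", "20133"]))
# → ['grasp_default.mp4', 'resect_after_grasp.mp4']
```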
  • the computer can reproduce a surgical simulation image by sequentially applying each detailed surgical operation included in the cue sheet data to the virtual body model.
  • the image corresponding to the cue sheet data may be reproduced from the same viewpoint as the surgical image, or may be reconstructed and reproduced from a different viewpoint.
  • the image may be modeled in 3D, and the viewpoint and position may be adjusted according to the user's operation.
  • the computer obtains reference cue sheet data for the actual surgery (S400).
  • the reference cue sheet data may refer to optimized cue sheet data generated by the computer.
  • the reference cue sheet data may also refer to virtual-surgery cue sheet data used as a reference.
  • FIG. 3 is a flow chart illustrating a method for computing optimized cue sheet data in accordance with one embodiment.
  • the computer acquires one or more items of learning cue sheet data (S420).
  • the learning cue sheet data is the training data used for calculating the optimized cue sheet data.
  • the learning cue sheet data may include cue sheet data generated based on actual surgery data (i.e., actual surgery cue sheet data) or cue sheet data generated based on simulated virtual surgery data (i.e., reference virtual-surgery cue sheet data).
  • the actual surgery cue sheet data is generated by the computer dividing the actual surgery data according to the division criteria.
  • the reference virtual-surgery cue sheet data is generated not by a user in the course of surgical training but by simulation, for the purpose of constructing learning data or providing reference data to practitioners.
  • Reinforcement learning is an area of machine learning in which an agent defined in an environment recognizes the current state and selects, from the available actions, the action or sequence of actions that maximizes the reward. Reinforcement learning can be summarized as learning to maximize reward on the basis of state transitions and the rewards associated with those transitions.
  • the computer calculates the optimized cue sheet data using the reinforcement learning result (S460).
  • the optimized cue sheet data is calculated, based on the reinforcement learning result, in view of the shortest operation time (which can reduce the patient's time under anesthesia), the minimum amount of blood loss, the required action groups, and the mandatory execution order.
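A reward function of the kind such a reinforcement learning step might optimize can be sketched as below. Everything here is an illustrative assumption: the weights, the hard-constraint treatment of missing action groups and order violations, and the example operation names are not specified by the patent.

```python
def respects_order(sequence, required_order):
    """True if required_order appears as a (not necessarily contiguous)
    subsequence of sequence, i.e. the mandatory execution order is kept."""
    it = iter(sequence)
    return all(op in it for op in required_order)  # `in` consumes the iterator

def reward(cue_sheet, op_time_min, blood_loss_ml,
           mandatory_groups, required_order, w_time=1.0, w_blood=2.0):
    ops = set(cue_sheet)
    # Every required action group must be performed together (all members present).
    if any(not group <= ops for group in mandatory_groups):
        return float("-inf")
    # The mandatory execution order must be respected.
    if not respects_order(cue_sheet, required_order):
        return float("-inf")
    # Shorter operation time (less anesthesia) and less blood loss score higher.
    return -(w_time * op_time_min + w_blood * blood_loss_ml)

good = reward(["incise", "clamp", "suture"], 120, 50,
              mandatory_groups=[{"clamp", "suture"}],
              required_order=["incise", "suture"])
print(good)  # -> -220.0
```

An agent proposing candidate cue sheets and receiving this reward would be pushed toward complete, correctly ordered procedures that minimize time and blood loss.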
  • a required action group is a group of detailed surgical operations that must be performed together in order to perform a specific detailed surgical operation.
  • the mandatory execution order is a sequence of operations that must be performed in order in the course of performing a specific surgery. For example, depending on the type of surgery or operation, certain detailed surgical operations must appear in sequence, and their order may be predetermined.
  • the computer also calculates, through reinforcement learning, cue sheet data optimized for the context, according to the patient's physical condition and the condition of the surgical site (e.g., tumor tissue), such as tumor size and location. To do this, the computer uses the patient condition and the surgical site condition together with the learning cue sheet data at training time.
  • the computer can perform a simulated virtual surgery on its own.
  • the computer may generate a surgical process according to the type of surgery and the patient, based on the disclosed surgical process optimization method, and may perform a virtual surgery simulation based on the generated surgical process.
  • the computer can acquire the optimized surgical process by evaluating the virtual surgical simulation result and performing reinforcement learning based on the virtual surgical simulation information and the evaluation information on the result.
  • using the learned model, the computer creates a surgical process based on the patient's body structure and the type of surgery, performs reinforcement learning through virtual surgery simulation, and can thereby generate a surgical process optimized for each patient and type of surgery.
  • the computer compares the actual surgery cue sheet data with the reference cue sheet data and provides feedback (S600).
  • the operation of comparing the actual surgery cue sheet data with the reference cue sheet data may be performed in the computer or control unit 30 placed in the operating room, or may be performed in the server 20.
  • the server 20 acquires the reference cue sheet data and compares it with the surgical image or the cue sheet data (code data) obtained from the control unit 30.
  • the computer acquires the cue sheet data from the surgical image and compares it with the reference cue sheet data received from the server 20.
  • the feedback may be provided through a website or application.
  • the feedback may be provided through an application installed on the doctor's mobile terminal, and a notification relating to the feedback may be provided to the doctor's mobile terminal when the operation is terminated.
  • FIG. 4 is a flow diagram illustrating a method for obtaining feedback in accordance with one embodiment.
  • the computer may compare the types and order of the detailed surgical operations contained in the actual surgery cue sheet data with the types and order of the detailed surgical operations included in the reference cue sheet data to provide feedback on the surgical result (S620).
  • the computer may determine whether the actual surgery cue sheet data, compared to the detailed surgical operations included in the reference cue sheet data, is missing a necessary detailed surgical operation, includes an unnecessary detailed surgical operation, or includes an incorrect detailed surgical operation, and may provide feedback on the result.
  • for example, a necessary detailed surgical operation included in the reference cue sheet data may be missing from the actual surgery cue sheet data.
  • alternatively, a detailed surgical operation included in the reference cue sheet data may be included in the actual surgery cue sheet data, but the specific operation may be performed differently or erroneously.
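Because both cue sheets are sequences of standardized codes, the missing / unnecessary / incorrect classification above maps naturally onto sequence alignment. A possible sketch using Python's standard `difflib` (the operation labels are hypothetical single-letter codes):

```python
import difflib

def compare_cue_sheets(actual, reference):
    """Align the actual cue sheet against the reference one and report
    missing, unnecessary, and incorrect detailed operations."""
    sm = difflib.SequenceMatcher(a=reference, b=actual)
    feedback = []
    for tag, i1, i2, j1, j2 in sm.get_opcodes():
        if tag == "delete":       # present in reference, absent in actual
            feedback.append(("missing", reference[i1:i2]))
        elif tag == "insert":     # present in actual, absent in reference
            feedback.append(("unnecessary", actual[j1:j2]))
        elif tag == "replace":    # performed, but not the expected operation
            feedback.append(("incorrect", actual[j1:j2], reference[i1:i2]))
    return feedback

print(compare_cue_sheets(["A", "X", "C"], ["A", "B", "C"]))
# -> [('incorrect', ['X'], ['B'])]
```

Each reported span can then be traced back to the corresponding video segment for review.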
  • the computer also determines whether each detailed surgical operation was performed correctly. A certain detailed surgical operation may be included in the entire surgical procedure, yet may not have been performed normally. For example, in the case of an operation that grasps tissue at a specific position with a suture-type surgical tool (i.e., a detailed surgical operation), if the tissue is grasped deeper than the optimal depth, the computer can judge that the detailed surgical operation was not performed normally.
  • the computer recognizes the tissue and the surgical tool in the actual surgical image through image recognition, accurately identifies the detailed surgical operation, and performs evaluation and feedback by comparison with the corresponding normal detailed surgical operation in the optimized cue sheet data.
  • the computer can also analyze the actual surgery cue sheet data itself to provide information about anomalies or incidents that occurred. For example, if bleeding occurred at a particular site, the computer can provide at least one of the fact that the bleeding occurred, the location of the bleeding, and the amount of bleeding.
  • the computer may analyze the record of the detailed surgical operations included in the actual surgery cue sheet data to determine and provide the cause of the bleeding.
  • the computer analyzes the cue sheet data of the completed actual surgery to search for surgical errors (S640).
  • the computer may provide feedback if a surgical error condition according to predetermined rules is detected.
  • the computer may detect foreign objects left in the patient's body and provide feedback.
  • based on the obtained surgical image, the computer recognizes not only the positions of the surgical tools and bleeding sites but all objects included in the surgical image, and analyzes each object.
  • the computer determines the position, number, and inflow time of the objects included in the surgical image. Accordingly, when it determines at the end of the operation that a foreign object introduced into the surgical site has not been removed, the computer generates an alarm and can provide feedback requesting confirmation from the user.
  • the computer may ask the user for confirmation even when an object that entered the surgical site is not identified in the image. For example, if an object introduced into the surgical site has not been confirmed as removed, it may remain inside while not visible in the surgical image, so feedback can be provided to the user.
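The bookkeeping behind this foreign-object check can be sketched as a simple ledger: log every object that enters the site together with its inflow time, and at the end of surgery flag anything not confirmed removed, whether or not it is still visible. The class and its method names are illustrative assumptions, not an interface from the patent.

```python
class ForeignObjectTracker:
    """Ledger of objects that entered the surgical site, keyed by object id."""

    def __init__(self):
        self.inside = {}  # object id -> inflow time (seconds into surgery)

    def object_entered(self, obj_id, t):
        self.inside[obj_id] = t

    def object_removed(self, obj_id):
        # Removal confirmation clears the entry; unknown ids are ignored.
        self.inside.pop(obj_id, None)

    def end_of_surgery_alerts(self):
        """Objects never confirmed removed trigger a confirmation request,
        even if they are no longer visible in the surgical image."""
        return [f"confirm removal of {obj} (entered at t={t}s)"
                for obj, t in self.inside.items()]

tracker = ForeignObjectTracker()
tracker.object_entered("gauze-1", 310)
tracker.object_entered("clip-2", 542)
tracker.object_removed("clip-2")
print(tracker.end_of_surgery_alerts())  # one alert, for gauze-1
```

The key design point is that the alarm is driven by the entry/removal ledger rather than by the final image alone, which is exactly why an object hidden from the camera still raises a confirmation request.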
  • the computer analyzes the surgical image in real time and performs registration between the organs in the 3D modeling image and the actual organs.
  • the computer tracks the position of the camera and surgical tool in real time, determines the surgical situation, and obtains information that allows the simulator to follow the actual surgical procedure.
  • since the operation may be being performed on the wrong patient, the computer can request confirmation from the user.
  • the computer determines the surgical situation, and provides a rehearsal result or a surgical guide image according to the optimal surgical procedure.
  • the computer may ask the user for confirmation if the actual procedure differs from the rehearsal or the optimal surgical procedure, because a wrong operation may be being performed or a different type of surgery may have been started.
  • as in the method described above, the computer can provide feedback if the actual operation differs from the rehearsal, and can provide a warning if, based on the positions of the patient's organs and the surgical instruments, a risk is predicted, such as an instrument approaching or about to cut an important nerve or ganglion.
  • using image registration and, further, AR or Mixed Reality (MR) techniques, the computer can display invisible structures such as blood vessels, nerves, and ganglia superimposed on the surgical image, which can assist the surgical procedure.
  • the method further includes a step (S660) of adding the actual surgery cue sheet data to the learning cue sheet data used for calculating the optimized cue sheet data, and performing reinforcement learning.
  • the actual surgery cue sheet data may have portions that are better than the optimized cue sheet data. Therefore, by adding the actual surgery cue sheet data to the learning cue sheet data and performing reinforcement learning, it is possible to obtain a model capable of generating better optimized cue sheet data.
  • the computer tracks the prognosis of the patient corresponding to each surgical record (S680).
  • the computer can perform machine learning using each surgical record and the prognosis corresponding to it as learning data, to determine which combinations of surgical operations lead to which prognoses.
  • for example, the computer can analyze the surgical records of patients who experienced a particular side effect to determine which detailed surgical operation, or which combination of detailed surgical operations, can cause that side effect, even a minor one.
  • the computer may obtain information about the detailed surgical operations that bring about each prognosis through reinforcement learning. For example, the computer may perform reinforcement learning based on the operations performed during the surgical procedure and learning data on the prognoses that occur when those operations are included or performed in a particular order. Based on the reinforcement learning results, the computer can determine which prognosis (i.e., which side effects) can arise from a particular detailed surgical operation, a sequence of detailed surgical operations, or a combination of detailed surgical operations.
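As a toy illustration of associating operation combinations with a prognosis, one could count how often pairs of detailed operations co-occur in records of patients who experienced a given side effect. This co-occurrence counting is a deliberate simplification of the machine/reinforcement learning described above, and the records and side-effect names are invented.

```python
from collections import Counter
from itertools import combinations

def risky_pairs(records, side_effect, min_support=2):
    """records: list of (set_of_detailed_ops, set_of_prognoses).
    Returns operation pairs seen at least min_support times in surgeries
    whose outcome included the given side effect."""
    counts = Counter()
    for ops, prognoses in records:
        if side_effect in prognoses:
            counts.update(combinations(sorted(ops), 2))
    return [pair for pair, n in counts.items() if n >= min_support]

records = [
    ({"A", "B", "C"}, {"ileus"}),
    ({"A", "B"},      {"ileus"}),
    ({"A", "C"},      set()),
]
print(risky_pairs(records, "ileus"))  # -> [('A', 'B')]
```

A production system would of course control for confounders and operation order rather than rely on raw co-occurrence, but the sketch shows the shape of the record-to-prognosis mapping being learned.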
  • the computer can output the feedback to the user in various ways.
  • the computer may extract the detailed surgical operations for which a problem exists and provide feedback.
  • the computer can extract and reproduce only the images of the detailed surgical operations in which a problem exists, thereby helping the user identify the problem.
  • the computer may search for and provide specific detailed surgical operations included in the surgical procedure. For example, surgery commonly lasts several hours or more, so it is difficult for the user to receive feedback by checking the entire image after the surgery.
  • the computer provides the cue sheet data, and if the user selects one or more detailed surgical operations included in the cue sheet data, it can extract only the selected detailed surgical operations and provide feedback on them. For example, the computer can extract and reproduce only the images of the selected detailed surgical operations.
  • each detailed surgical operation has a standardized name and standardized code data. Accordingly, the user can search for each detailed surgical operation based on the standardized name or code data, and can easily judge the progress of the surgery from the cue sheet data alone.
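Searching a cue sheet by standardized name or code can be sketched as a simple filter. The entry fields (`code`, `name`, `clip`) and the example values are assumptions made for illustration.

```python
def find_operations(cue_sheet, query):
    """Return cue sheet entries whose standardized code or name matches query."""
    return [e for e in cue_sheet if query in (e["code"], e["name"])]

cue_sheet = [
    {"code": "1121", "name": "gastric suture",   "clip": "00:12:05-00:19:40"},
    {"code": "1131", "name": "gastric ablation", "clip": "00:19:40-00:31:02"},
]

for entry in find_operations(cue_sheet, "gastric suture"):
    print(entry["clip"])  # the clip range to extract and reproduce
```

Because every entry carries its clip time range, the matched entries point directly at the video segments to extract for playback, sparing the user a multi-hour review.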
  • the method of providing feedback on a surgical result according to an embodiment of the present invention described above may be implemented as a program (or application) to be executed in combination with a computer, which is hardware, and stored in a medium.
  • in order for the computer to read the program and execute the methods implemented as the program, the above-described program may include code written in a computer language, such as C, C++, Java, or machine language, that the processor (CPU) of the computer can read through the device interface of the computer.
  • the code may include functional code defining the functions necessary for executing the above methods, and may include control code related to the execution procedure necessary for the processor of the computer to execute those functions in a predetermined sequence.
  • the code may further include memory-reference-related code indicating at which location (address) of the internal or external memory of the computer the additional information or media needed for the processor to execute the functions should be referenced.
  • further, when the processor of the computer needs to communicate with a remote computer or server in order to execute the functions, the code may include communication-related code indicating how to communicate with the remote computer or server using the communication module of the computer, and what information or media should be transmitted or received during communication.
  • the storage medium is not a medium that stores data for a short time, such as a register, cache, or memory, but means a medium that stores data semi-permanently and is readable by a device.
  • examples of the storage medium include, but are not limited to, ROM, RAM, CD-ROM, magnetic tape, floppy disks, and optical data storage devices.
  • the program may be stored in various recording media on various servers that the computer can access, or in various recording media on the user's computer.
  • the medium may be distributed to a network-connected computer system so that computer-readable codes may be stored in a distributed manner.
  • the steps of a method or algorithm described in connection with the embodiments of the present invention may be embodied directly in hardware, in software modules executed in hardware, or in a combination of both.
  • the software module may reside in random access memory (RAM), read-only memory (ROM), erasable programmable ROM (EPROM), electrically erasable programmable ROM (EEPROM), flash memory, a hard disk, a removable disk, a CD-ROM, or any other form of computer-readable recording medium known in the art to which the invention pertains.

Landscapes

  • Health & Medical Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Medical Informatics (AREA)
  • Public Health (AREA)
  • Surgery (AREA)
  • General Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Biomedical Technology (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Primary Health Care (AREA)
  • Epidemiology (AREA)
  • Robotics (AREA)
  • Veterinary Medicine (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Molecular Biology (AREA)
  • Animal Behavior & Ethology (AREA)
  • Pathology (AREA)
  • Data Mining & Analysis (AREA)
  • Business, Economics & Management (AREA)
  • General Business, Economics & Management (AREA)
  • Databases & Information Systems (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Urology & Nephrology (AREA)
  • Radiology & Medical Imaging (AREA)
  • Image Analysis (AREA)
  • Software Systems (AREA)
  • Theoretical Computer Science (AREA)
  • Artificial Intelligence (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Evolutionary Computation (AREA)
  • Measuring And Recording Apparatus For Diagnosis (AREA)
  • Physics & Mathematics (AREA)
  • Computing Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Mathematical Physics (AREA)

Abstract

A method for providing feedback on a surgical result is disclosed, comprising the steps of: obtaining, by a computer, actual surgery cue sheet data composed of a plurality of detailed surgical operations by dividing actual surgery data obtained during an actual surgical procedure; obtaining reference cue sheet data for the actual surgery; and comparing the actual surgery cue sheet data with the reference cue sheet data to provide feedback.
PCT/KR2018/010329 2017-12-28 2018-09-05 Procédé et programme de fourniture de rétroaction sur un résultat chirurgical WO2019132165A1 (fr)

Priority Applications (4)

Application Number Priority Date Filing Date Title
CN201880088961.9A CN111771244B (zh) 2017-12-28 2018-09-05 针对手术结果的反馈提供方法
EP18896154.4A EP3734608A4 (fr) 2017-12-28 2018-09-05 Procédé et programme de fourniture de rétroaction sur un résultat chirurgical
US16/914,141 US11636940B2 (en) 2017-12-28 2020-06-26 Method and program for providing feedback on surgical outcome
US18/194,067 US20230238109A1 (en) 2017-12-28 2023-03-31 Method and program for providing feedback on surgical outcome

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR1020170182889A KR101862360B1 (ko) 2017-12-28 2017-12-28 수술결과에 대한 피드백 제공방법 및 프로그램
KR10-2017-0182889 2017-12-28

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US16/914,141 Continuation US11636940B2 (en) 2017-12-28 2020-06-26 Method and program for providing feedback on surgical outcome

Publications (1)

Publication Number Publication Date
WO2019132165A1 true WO2019132165A1 (fr) 2019-07-04

Family

ID=62780721

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2018/010329 WO2019132165A1 (fr) 2017-12-28 2018-09-05 Procédé et programme de fourniture de rétroaction sur un résultat chirurgical

Country Status (5)

Country Link
US (2) US11636940B2 (fr)
EP (1) EP3734608A4 (fr)
KR (1) KR101862360B1 (fr)
CN (1) CN111771244B (fr)
WO (1) WO2019132165A1 (fr)

Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR102149167B1 (ko) * 2018-11-07 2020-08-31 주식회사 삼육오엠씨네트웍스 인공지능 기반의 캐뉼라 수술 진단 장치
WO2020159276A1 (fr) * 2019-02-01 2020-08-06 주식회사 아이버티 Appareil d'analyse chirurgicale et système, procédé et programme pour analyser et reconnaître une image chirurgicale
KR102321157B1 (ko) * 2020-04-10 2021-11-04 (주)휴톰 수술 후 수술과정 분석 방법 및 시스템
KR102640314B1 (ko) * 2021-07-12 2024-02-23 (주)휴톰 인공지능 수술 시스템 및 그것의 제어방법
CN113591757A (zh) * 2021-08-07 2021-11-02 王新 一种眼部整形用自动化手术装置及设备
CN115775621B (zh) * 2023-02-13 2023-04-21 深圳市汇健智慧医疗有限公司 基于数字化手术室的信息管理方法及系统

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2000166927A (ja) * 1998-12-01 2000-06-20 Siemens Elema Ab 3次元イメ―ジング装置
KR20120046439A (ko) * 2010-11-02 2012-05-10 서울대학교병원 (분사무소) 3d 모델링을 이용한 수술 시뮬레이션 방법 및 자동 수술장치
KR20120126679A (ko) * 2011-05-12 2012-11-21 주식회사 이턴 수술 상황 판단 및 대응을 위한 수술 로봇 시스템의 제어 방법과 이를 기록한 기록매체 및 수술 로봇 시스템
KR20160066522A (ko) * 2014-12-02 2016-06-10 엑스-네브 테크놀로지스, 엘엘씨 수술 절차를 위한 시각적 안내 디스플레이
KR20160096868A (ko) * 2015-02-06 2016-08-17 경희대학교 산학협력단 수술용 가이드 설계정보 생성장치 및 생성방법

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR100961661B1 (ko) * 2009-02-12 2010-06-09 주식회사 래보 수술용 항법 장치 및 그 방법
US9196176B2 (en) * 2009-03-20 2015-11-24 The Johns Hopkins University Systems and methods for training one or more training users
US8392342B2 (en) * 2009-11-18 2013-03-05 Empire Technology Development Llc Method and apparatus for predicting movement of a tool in each of four dimensions and generating feedback during surgical events using a 4D virtual real-time space
KR101302595B1 (ko) 2012-07-03 2013-08-30 한국과학기술연구원 수술 진행 단계를 추정하는 시스템 및 방법
CN111329552B (zh) * 2016-03-12 2021-06-22 P·K·朗 包括机器人的用于引导骨切除的增强现实可视化
CN106901834A (zh) * 2016-12-29 2017-06-30 陕西联邦义齿有限公司 微创心脏外科手术的术前规划及手术虚拟现实模拟方法
US11158415B2 (en) * 2017-02-16 2021-10-26 Mako Surgical Corporation Surgical procedure planning system with multiple feedback loops
US11112770B2 (en) * 2017-11-09 2021-09-07 Carlsmed, Inc. Systems and methods for assisting a surgeon and producing patient-specific medical devices

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2000166927A (ja) * 1998-12-01 2000-06-20 Siemens Elema Ab 3次元イメ―ジング装置
KR20120046439A (ko) * 2010-11-02 2012-05-10 서울대학교병원 (분사무소) 3d 모델링을 이용한 수술 시뮬레이션 방법 및 자동 수술장치
KR20120126679A (ko) * 2011-05-12 2012-11-21 주식회사 이턴 수술 상황 판단 및 대응을 위한 수술 로봇 시스템의 제어 방법과 이를 기록한 기록매체 및 수술 로봇 시스템
KR20160066522A (ko) * 2014-12-02 2016-06-10 엑스-네브 테크놀로지스, 엘엘씨 수술 절차를 위한 시각적 안내 디스플레이
KR20160096868A (ko) * 2015-02-06 2016-08-17 경희대학교 산학협력단 수술용 가이드 설계정보 생성장치 및 생성방법

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP3734608A4 *

Also Published As

Publication number Publication date
EP3734608A1 (fr) 2020-11-04
US11636940B2 (en) 2023-04-25
CN111771244B (zh) 2024-05-07
EP3734608A4 (fr) 2021-09-22
KR101862360B1 (ko) 2018-06-29
US20200357502A1 (en) 2020-11-12
CN111771244A (zh) 2020-10-13
US20230238109A1 (en) 2023-07-27

Similar Documents

Publication Publication Date Title
WO2019132165A1 (fr) Procédé et programme de fourniture de rétroaction sur un résultat chirurgical
KR102014359B1 (ko) 수술영상 기반 카메라 위치 제공 방법 및 장치
KR102654065B1 (ko) 스캔 기반 배치를 갖는 원격조작 수술 시스템
CN109996508B (zh) 带有基于患者健康记录的器械控制的远程操作手术系统
WO2019132169A1 (fr) Procédé, appareil, et programme de commande de lecture d'image chirurgicale
KR102523779B1 (ko) 수술 절차 아틀라스를 갖는 수술 시스템의 구성
WO2019132168A1 (fr) Système d'apprentissage de données d'images chirurgicales
KR102458587B1 (ko) 진단 검사를 실시간 치료에 통합하기 위한 범용 장치 및 방법
KR102146672B1 (ko) 수술결과에 대한 피드백 제공방법 및 프로그램
WO2019132244A1 (fr) Procédé de génération d'informations de simulation chirurgicale et programme
KR20190080736A (ko) 수술영상 분할방법 및 장치
KR102008891B1 (ko) 수술보조 영상 표시방법, 프로그램 및 수술보조 영상 표시장치
KR102628324B1 (ko) 인공지능 기반의 사용자 인터페이스를 통한 수술 결과 분석 장치 및 그 방법
WO2019132166A1 (fr) Procédé et programme d'affichage d'image d'assistant chirurgical
CN116075901A (zh) 用于处理医疗数据的系统和方法
WO2022108387A1 (fr) Procédé et dispositif permettant de générer des données de dossier clinique
WO2020159276A1 (fr) Appareil d'analyse chirurgicale et système, procédé et programme pour analyser et reconnaître une image chirurgicale
WO2021206517A1 (fr) Procédé et système de navigation vasculaire peropératoire
KR20190088419A (ko) 수술 시뮬레이션 정보 생성방법 및 프로그램
KR20190133424A (ko) 수술결과에 대한 피드백 제공방법 및 프로그램
KR20190133425A (ko) 수술보조 영상 표시방법 및 프로그램
KR101940706B1 (ko) 수술 시뮬레이션 정보 생성방법 및 프로그램
WO2019164278A1 (fr) Procédé et dispositif permettant d'obtenir des informations chirurgicales à l'aide d'une image chirurgicale
WO2019164272A1 (fr) Procédé et dispositif pour la fourniture d'image chirurgicale
JP2003150714A (ja) 画像診断支援システム

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18896154

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

ENP Entry into the national phase

Ref document number: 2018896154

Country of ref document: EP

Effective date: 20200728