WO2019132166A1 - Method and program for displaying a surgical assistant image - Google Patents

Method and program for displaying a surgical assistant image

Info

Publication number
WO2019132166A1
Authority
WO
WIPO (PCT)
Prior art keywords
surgical
image
displaying
computer
data
Prior art date
Application number
PCT/KR2018/010330
Other languages
English (en)
Korean (ko)
Inventor
이종혁
형우진
양훈모
김호승
Original Assignee
(주)휴톰
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by (주)휴톰
Publication of WO2019132166A1

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/20 Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/30 Surgical robots
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/70 Manipulators specially adapted for use in surgery
    • A61B34/76 Manipulators having means for providing feel, e.g. force or tactile feedback
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00 Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/36 Image-producing devices or illumination devices not otherwise provided for
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T3/00 Geometric image transformations in the plane of the image
    • G06T3/14 Transformations for image registration, e.g. adjusting or mapping for alignment of images
    • G PHYSICS
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H30/00 ICT specially adapted for the handling or processing of medical images
    • G16H30/20 ICT specially adapted for the handling or processing of medical images for handling medical images, e.g. DICOM, HL7 or PACS
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/20 Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B2034/2046 Tracking techniques
    • A61B2034/2065 Tracking using image or pattern recognition
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00 Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/36 Image-producing devices or illumination devices not otherwise provided for
    • A61B2090/364 Correlation of different images or relation of image positions in respect to the body

Definitions

  • the present invention relates to a surgical assistant image display method and a program.
  • a 3D medical image, for example a virtual image of a three-dimensional surgical tool and of the changes in internal organs generated by the movement of the tool
  • unnecessary processes are minimized to optimize the surgical process
  • Deep learning is defined as a set of machine learning algorithms that attempt to achieve a high level of abstraction (the task of summarizing the key content or functions in large amounts of data or in complex data) through a combination of several nonlinear transformation techniques. In broad terms, deep learning can be viewed as a field of machine learning that teaches computers to think the way people do.
  • a method for displaying a surgical assistant image, comprising: obtaining a surgical image including at least a part of a body part of a subject and a surgical tool; displaying the obtained surgical image; determining a surgical situation corresponding to the surgical image; and displaying a surgical assistant image corresponding to the surgical situation.
  • the step of determining the surgical situation may include recognizing a body part of the subject and a surgical tool in the surgical image, and determining the surgical situation based on at least one of the position of the recognized body part, the position of the surgical tool, the direction of the surgical tool, and the movement of the surgical tool (a minimal recognition sketch follows this list).
  • the method may further include obtaining reference cue chart data for the surgery, wherein the step of determining the surgical situation includes determining the detailed surgical operation, included in the cue chart data, that corresponds to the surgical image, and the step of displaying the surgical assistant image may include displaying a 3D modeling image corresponding to that detailed surgical operation (a cue chart lookup sketch follows this list).
  • the step of obtaining the reference cue chart data may further include acquiring reference cue chart data updated according to the surgical situation.
  • the step of determining the surgical situation may further include transmitting standardized code data corresponding to the surgical situation to a server, wherein the step of displaying the surgical assistant image includes receiving, from the server, standardized code data of the surgical assistant image, obtaining a surgical assistant image corresponding to the received code data, and displaying the obtained surgical assistant image.
  • the step of displaying the surgical assistant image may include matching the surgical assistant image with the surgical image, and displaying the matched image.
  • the surgical assistant image may be a surgical reference image generated using a 3D modeling image that was generated based on a medical image of the surgical region of the subject.
  • the step of displaying the surgical assistant image may further include displaying information for guiding the moving direction of the surgical tool included in the surgical image based on the surgical reference image.
  • the step of displaying the surgical assistant image may further include displaying information on at least a part of the body part which is not displayed on the surgical image or is difficult to identify in the surgical image.
  • the step of providing the feedback may include providing haptic feedback corresponding to the determined positional relationship to the controller that controls the robot operation.
  • the step of displaying the surgical assistant image may include the step of displaying feedback on the detected surgical error condition when at least one surgical error condition is detected in the surgical image.
  • a computer program for performing the surgical assistant image display method according to an embodiment of the present invention, the computer program being combined with a computer, which is hardware, and stored in a computer-readable recording medium.
  • the actual image data is not transmitted between the server and the control unit; only a code representing the image data is transmitted, and each side acquires the image corresponding to the code. Therefore, the time delay due to data transmission can be minimized (a code-exchange sketch follows this list).
  • FIG. 1 is a diagram illustrating a robot surgery system in accordance with the disclosed embodiment.
  • FIG. 2 is a flowchart illustrating a method of displaying a surgical assistant image according to an embodiment.
  • FIG. 3 is a flowchart illustrating a method of displaying a surgical assistant image using reference cue chart data according to an embodiment.
  • FIG. 4 is a flowchart illustrating a method of computing optimized cue chart data in accordance with one embodiment.
  • FIG. 5 is a flow chart illustrating a method of displaying a surgical assistant image based on code data received from a server, according to one embodiment.
  • FIG. 6 is a flowchart illustrating a method of displaying various types of surgical assistant images by a computer according to an exemplary embodiment of the present invention.
  • each step shown in FIG. 2 is described as being performed in a time-series manner by the server 20 or the control unit 30 shown in FIG. 1.
  • each step is described as being performed by a computer, but the subject of each step is not limited to a specific device, and all or some of the steps may be performed by the server 20 or the control unit 30.
  • the computer searches the reference cue chart data for the detailed surgical operation corresponding to the actual surgical image, and displays a 3D modeling image corresponding to the found detailed surgical operation.
  • the computer may also search the reference cue chart data for the detailed surgical operation corresponding to the actual surgical image, display the 3D modeling image corresponding to the next detailed surgical operation following the one found, and thereby provide a guide for the direction of the surgery.
  • the computer can receive information about the surgical assistant image corresponding to the surgical situation in real time from the server, and display the surgical assistant image based on the received information.
  • FIG. 5 is a flow chart illustrating a method of displaying a surgical assistant image based on code data received from a server, according to one embodiment.
  • the server can determine the surgical situation in real time based on the received code data.
  • the server may determine a surgical assistant image corresponding to the determined surgical condition and transmit code data corresponding to the determined surgical assistant image to the computer.
  • the computer determines a detailed surgical operation corresponding to the current surgical situation based on the received code data, and obtains information on the next detailed operation based on the reference cue chart data.
  • the server transmits code data corresponding to the next detailed operation to the computer.
  • the computer renders and generates a surgical assistant image corresponding to the code data, and displays the generated image.
  • FIG. 6 is a flowchart illustrating a method of displaying various types of surgical assistant images by a computer according to an exemplary embodiment of the present invention.
  • the surgical assistant image may be a surgical reference image generated using a 3D modeling image generated based on a medical image of a surgical region of a subject in advance.
  • the surgical reference image may be an image corresponding to the reference cue chart data.
  • the computer can display information for guiding the moving direction of the surgical tool included in the actual surgical image based on the surgical reference image (S720).
  • the computer displays information about organs or blood vessels that are hidden behind other organs, and about blood vessels or nerves that are difficult to identify with the naked eye.
  • organs or blood vessels obscured by other organs, and blood vessels or nerves that are difficult to identify with the naked eye, may be displayed in augmented-reality form overlaid on the actual surgical images.
  • the computer may provide haptic feedback corresponding to the determined positional relationship to a controller that controls the robot surgery.
  • haptic feedback, such as vibration, may be provided to the controller as feedback on the positional relationship between the surgical tool and the body part (a distance-based haptic sketch follows this list).
  • situations that do not conform to the predetermined rules may include, for example, a malfunction of the surgical tool (for example, a movement that may interfere with or endanger the operation, such as a collision between surgical robot arms), a case in which the surgical tool approaches an artery so that the artery may be cut or injured, a dangerous case arising while a cutting operation is performed, and a case in which surgical supplies are not collected (for example, gauze, thread, clips, and the like are not retrieved).
  • the types of situations that are not in accordance with the prescribed rules are not limited thereto.
  • the unexpected situation may include, for example, a case where bleeding occurs in organs or blood vessels, but the present invention is not limited thereto.
  • the computer can determine a surgical situation, provide a notification when a situation as described above is detected, and forcibly stop the operation of the surgical tool or move the surgical tool to a safe position in an urgent situation.
  • the computer may provide feedback if a surgical error condition according to predetermined rules is detected (S780).
  • based on the obtained surgical image, the computer recognizes not only the position of the surgical tool and the bleeding site but also all objects included in the surgical image, and analyzes each object.
  • the computer determines the position, the number, and the inflow time of the objects included in the surgical image. Accordingly, when the operation ends, the computer generates an alarm if it determines that a foreign object introduced into the surgical site has not been removed, and can provide feedback requesting confirmation from the user (a foreign-object bookkeeping sketch follows this list).
  • the computer may ask the user for confirmation even if the object that entered the surgical site is not identified in the image. For example, if an object introduced into the surgical site has not been confirmed as removed, it may still remain inside the body even though it is not visible in the actual surgical image, so feedback can be provided to the user.
  • the computer analyzes the actual surgical images in real time and performs registration between the organs in the 3D modeling image and the actual organs (a simple registration-and-overlay sketch follows this list).
  • the computer tracks the position of the camera and surgical tool in real time, determines the surgical situation, and obtains information that allows the simulator to follow the actual surgical procedure.
  • if it is determined that the operation may be being performed on the wrong patient, confirmation can be requested from the user.
  • the computer determines the surgical situation, and provides a rehearsal result or a surgical guide image according to the optimal surgical procedure.
  • the computer may ask the user for confirmation when the actual progress of the surgery differs from the rehearsal or from the optimal surgical method, for example when the surgery is directed to the wrong site or a different type of surgery is being performed.
  • the computer can provide warnings or notifications when the actual operation differs from the rehearsal, as in the methods described above, and can provide a warning when a risk of cutting or approaching an important nerve or ganglion is predicted from the positions of the patient's organs and the surgical instruments.
  • using image registration and, further, AR or mixed reality, the computer can superimpose blood vessels, nerves, and ganglia that are not visible in the surgical images onto those images, helping the user see important structures.
  • the surgical assistant image display method may be implemented as a program (or application) to be executed in combination with a hardware computer and stored in the medium.
  • the code may further include a communication-related code for determining how to communicate with any other remote computer or server using the communication module of the computer, and what information or media should be transmitted or received during the communication.
  • the storage medium here does not mean a medium that stores data for a short time, such as a register, a cache, or a memory, but a medium that stores data semi-permanently and that can be read by a device.
  • examples of the medium to be stored include ROM, RAM, CD-ROM, magnetic tape, floppy disk, optical data storage, and the like, but are not limited thereto.
  • the program may be stored in various recording media on various servers to which the computer can access, or on various recording media on the user's computer.
  • the medium may be distributed to a network-connected computer system so that computer-readable codes may be stored in a distributed manner.
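
The following sketches illustrate, in simplified form, several of the steps described above; they are illustrative assumptions written for this summary, not excerpts from the application. The first one covers the situation-recognition step: recognizing a body part and a surgical tool in the surgical image and judging the situation from their positions, directions, and movements. The object detector itself is stubbed out; in practice a deep-learning model trained on surgical video would supply the detections, and all identifiers (Detection, determine_surgical_situation, the label strings, the distance thresholds) are invented for illustration.

    from dataclasses import dataclass
    from typing import List, Optional, Tuple

    @dataclass
    class Detection:
        """One object recognized in a surgical image frame."""
        label: str                      # e.g. "liver", "artery", "grasper", "scissors"
        position: Tuple[float, float]   # normalized (x, y) position in the frame
        direction: Tuple[float, float]  # unit vector of the tool axis, if the object is a tool
        speed: float                    # movement magnitude between consecutive frames

    def detect_objects(frame) -> List[Detection]:
        """Stub: a trained deep-learning detector would run on `frame` here."""
        raise NotImplementedError

    def distance(a: Tuple[float, float], b: Tuple[float, float]) -> float:
        return ((a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2) ** 0.5

    def determine_surgical_situation(detections: List[Detection]) -> Optional[str]:
        """Tiny rule set mapping detected objects to a named surgical situation."""
        tools = [d for d in detections if d.label in {"grasper", "scissors", "hook"}]
        organs = [d for d in detections if d.label in {"liver", "stomach", "artery"}]
        if not tools or not organs:
            return None
        for tool in tools:
            for organ in organs:
                if organ.label == "artery" and distance(tool.position, organ.position) < 0.05:
                    return "tool_near_artery"      # candidate error situation
                if tool.label == "scissors" and distance(tool.position, organ.position) < 0.10:
                    return f"cutting_{organ.label}"
        return "navigation"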
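
The reference cue chart data can be pictured as an ordered list of detailed surgical operations, each tied to a 3D modeling image. The sketch below shows one plausible way to look up the operation that matches the recognized situation and fetch the next operation, whose 3D modeling image would then be displayed as a guide for the surgical direction. The field names, the idea of keying operations by situation name, and next_guidance are assumptions made only for illustration.

    from dataclasses import dataclass
    from typing import List, Optional

    @dataclass
    class DetailedOperation:
        code: str          # standardized code for this detailed surgical operation
        situation: str     # situation name produced by the recognition step
        model_image: str   # identifier or path of the matching 3D modeling image

    def find_current_index(cue_chart: List[DetailedOperation], situation: str) -> Optional[int]:
        """Locate the detailed operation in the cue chart matching the recognized situation."""
        for i, op in enumerate(cue_chart):
            if op.situation == situation:
                return i
        return None

    def next_guidance(cue_chart: List[DetailedOperation], situation: str) -> Optional[DetailedOperation]:
        """Return the next detailed operation, whose 3D modeling image is shown as a guide."""
        i = find_current_index(cue_chart, situation)
        if i is None or i + 1 >= len(cue_chart):
            return None
        return cue_chart[i + 1]

    # usage (assuming reference_cue_chart is already loaded and a display() helper exists):
    # step = next_guidance(reference_cue_chart, "cutting_stomach")
    # if step is not None:
    #     display(step.model_image)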
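
The stated effect that only standardized codes, not image data, travel between the control unit and the server can be sketched as two small lookup tables held on both sides plus a thin message layer. The code values, the JSON framing, and the socket transport below are assumptions; the only point carried over from the description is that the network payload is a short code and each side resolves it against locally stored assets, which keeps the transmission delay small.

    import json
    import socket  # stand-in for whatever transport the actual system uses

    # Lookup tables held locally on both the control unit and the server; only the
    # short standardized codes ever cross the network, never the image data itself.
    SITUATION_CODES = {"cutting_stomach": "S-0203", "tool_near_artery": "E-0107"}
    ASSIST_IMAGE_BY_CODE = {"A-0204": "models/next_step_0204.obj"}

    def send_situation_code(sock: socket.socket, situation: str) -> None:
        """Send only the standardized code for the recognized surgical situation."""
        code = SITUATION_CODES[situation]
        sock.sendall(json.dumps({"situation_code": code}).encode())

    def handle_assist_code(message: bytes) -> str:
        """Resolve the server's reply code to a locally stored assistant image asset."""
        code = json.loads(message)["assist_code"]
        return ASSIST_IMAGE_BY_CODE[code]   # rendered and displayed locally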
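
Registration between the organs of the 3D modeling image and the actual organs, followed by an AR-style overlay of structures that are hard to see, is a large topic on its own. The sketch below uses plain OpenCV feature matching and a homography as a rough 2D stand-in, assuming a rendered view of the 3D model and a surgical video frame as inputs; a production pipeline would rely on anatomical landmarks or deformable 3D registration instead.

    import cv2
    import numpy as np

    def overlay_model_on_surgical_image(surgical_bgr: np.ndarray,
                                        model_render_bgr: np.ndarray,
                                        alpha: float = 0.4) -> np.ndarray:
        """Register a rendered view of the 3D model to the surgical frame and blend it on top."""
        orb = cv2.ORB_create(1000)
        k1, d1 = orb.detectAndCompute(cv2.cvtColor(model_render_bgr, cv2.COLOR_BGR2GRAY), None)
        k2, d2 = orb.detectAndCompute(cv2.cvtColor(surgical_bgr, cv2.COLOR_BGR2GRAY), None)
        if d1 is None or d2 is None:
            return surgical_bgr
        matches = sorted(cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True).match(d1, d2),
                         key=lambda m: m.distance)[:100]
        if len(matches) < 4:
            return surgical_bgr
        src = np.float32([k1[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
        dst = np.float32([k2[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)
        H, _ = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)
        if H is None:
            return surgical_bgr
        h, w = surgical_bgr.shape[:2]
        warped = cv2.warpPerspective(model_render_bgr, H, (w, h))
        # semi-transparent overlay, so hidden vessels or nerves from the model remain visible
        return cv2.addWeighted(surgical_bgr, 1.0, warped, alpha, 0.0)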
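
Haptic feedback corresponding to the positional relationship between the surgical tool and a body part can, in the simplest reading, be reduced to mapping a tool-to-structure distance onto a vibration intensity sent to the controller. The 10 mm threshold, the linear ramp, and the send_vibration callback below are assumptions standing in for the controller's real haptic interface.

    from typing import Callable, Tuple

    Point3 = Tuple[float, float, float]

    def haptic_intensity(tool_tip: Point3, structure: Point3, warn_dist_mm: float = 10.0) -> float:
        """Map the tool-to-structure distance to a vibration intensity in [0, 1]."""
        d = sum((a - b) ** 2 for a, b in zip(tool_tip, structure)) ** 0.5
        if d >= warn_dist_mm:
            return 0.0
        return 1.0 - d / warn_dist_mm   # stronger vibration as the tool gets closer

    def provide_haptic_feedback(send_vibration: Callable[[float], None],
                                tool_tip: Point3, structure: Point3) -> None:
        """`send_vibration` stands in for the controller's actual haptic interface."""
        level = haptic_intensity(tool_tip, structure)
        if level > 0.0:
            send_vibration(level)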
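
Checking at the end of the operation that introduced supplies (gauze, thread, clips) have actually been removed amounts to bookkeeping over inflow and removal events detected in the surgical image. The ledger below is a minimal sketch of that bookkeeping; request_user_confirmation in the usage comment is a hypothetical UI call, not something named in the application.

    from dataclasses import dataclass, field
    from typing import Dict, List

    @dataclass
    class ForeignObjectLedger:
        """Track when consumables enter the surgical site and whether they came back out."""
        inside: Dict[str, float] = field(default_factory=dict)   # object id -> inflow time (s)

        def observe_inflow(self, obj_id: str, t: float) -> None:
            self.inside.setdefault(obj_id, t)

        def observe_removal(self, obj_id: str) -> None:
            self.inside.pop(obj_id, None)

        def end_of_surgery_alarm(self) -> List[str]:
            """Objects still recorded as inside when the operation ends trigger a confirmation request."""
            return sorted(self.inside)

    # usage:
    # ledger = ForeignObjectLedger()
    # ledger.observe_inflow("gauze-3", t=1520.0)
    # ...
    # leftovers = ledger.end_of_surgery_alarm()
    # if leftovers:
    #     request_user_confirmation(leftovers)  # hypothetical UI call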

Landscapes

  • Health & Medical Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Surgery (AREA)
  • Medical Informatics (AREA)
  • Public Health (AREA)
  • General Health & Medical Sciences (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Veterinary Medicine (AREA)
  • Biomedical Technology (AREA)
  • Animal Behavior & Ethology (AREA)
  • Molecular Biology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Robotics (AREA)
  • Primary Health Care (AREA)
  • Radiology & Medical Imaging (AREA)
  • Pathology (AREA)
  • Epidemiology (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Processing Or Creating Images (AREA)
  • Apparatus For Radiation Diagnosis (AREA)

Abstract

Disclosed is a method for displaying a surgical assistant image, comprising the steps of: obtaining, by a computer, a surgical image including at least a part of a body part of a subject and a surgical tool; displaying the obtained surgical image; determining a surgical situation corresponding to the surgical image; and displaying a surgical assistant image corresponding to the surgical situation.
PCT/KR2018/010330 2017-12-28 2018-09-05 Method and program for displaying a surgical assistant image WO2019132166A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR10-2017-0182890 2017-12-28
KR1020170182890A KR101864411B1 (ko) 2017-12-28 2017-12-28 수술보조 영상 표시방법 및 프로그램

Publications (1)

Publication Number Publication Date
WO2019132166A1 true WO2019132166A1 (fr) 2019-07-04

Family

ID=62628297

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2018/010330 WO2019132166A1 (fr) 2017-12-28 2018-09-05 Procédé et programme d'affichage d'image d'assistant chirurgical

Country Status (2)

Country Link
KR (1) KR101864411B1 (fr)
WO (1) WO2019132166A1 (fr)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20230028818A (ko) * 2021-08-19 2023-03-03 한국로봇융합연구원 Image-information-based artificial intelligence surgical guide system for a laparoscopic robot

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR102213412B1 (ko) * 2018-11-15 2021-02-05 서울여자대학교 산학협력단 Pneumoperitoneum model generation method, apparatus, and program

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20100098055A (ko) * 2009-02-27 2010-09-06 한국과학기술원 Image-guided surgery system and control method thereof
KR20110036453A (ko) * 2009-10-01 2011-04-07 주식회사 이턴 Surgical image processing apparatus and method thereof
KR20120046439A (ko) * 2010-11-02 2012-05-10 서울대학교병원 (분사무소) Surgical simulation method using 3D modeling and automatic surgical apparatus
KR20120126679A (ko) * 2011-05-12 2012-11-21 주식회사 이턴 Control method of a surgical robot system for determining and responding to surgical situations, recording medium recording the same, and surgical robot system
KR101302595B1 (ko) * 2012-07-03 2013-08-30 한국과학기술연구원 System and method for estimating the stage of surgical progress

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR100428234B1 (ko) * 2001-11-23 2004-04-28 주식회사 인피니트테크놀로지 Medical image segmentation apparatus and method thereof

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20230028818A (ko) * 2021-08-19 2023-03-03 한국로봇융합연구원 Image-information-based artificial intelligence surgical guide system for a laparoscopic robot
KR102627401B1 (ko) 2021-08-19 2024-01-23 한국로봇융합연구원 Image-information-based artificial intelligence surgical guide system for a laparoscopic robot

Also Published As

Publication number Publication date
KR101864411B1 (ko) 2018-06-04

Similar Documents

Publication Publication Date Title
KR102654065B1 (ko) Teleoperated surgical system with scan-based placement
CN109996508B (zh) Teleoperated surgical system with instrument control based on patient health records
US20240024051A1 (en) Configuring surgical system with surgical procedures atlas
EP3212109B1 (fr) Détermination d'une configuration d'un bras robotisé médical
WO2019132165A1 (fr) Method and program for providing feedback on a surgical result
WO2019132614A1 (fr) Method and apparatus for segmenting a surgical image
US20210015343A1 (en) Surgical assistance apparatus, surgical method, non-transitory computer readable medium and surgical assistance system
US20210290317A1 (en) Systems and methods for tracking a position of a robotically-manipulated surgical instrument
WO2019132244A1 (fr) Method for generating surgical simulation information and program
KR102008891B1 (ko) Surgical assistant image display method, program, and surgical assistant image display apparatus
WO2019132166A1 (fr) Method and program for displaying a surgical assistant image
EP3200719B1 (fr) Détermination d'une configuration d'un bras robotisé médical
WO2020159276A1 (fr) Surgical analysis apparatus, and system, method, and program for analyzing and recognizing a surgical image
KR102084598B1 (ko) AI-based surgical assistance system for bone lesion surgery
WO2023008818A1 (fr) Device and method for matching an actual surgical image with a 3D-based virtual simulated surgical image on the basis of POI definition and phase recognition
KR20190133425A (ko) Surgical assistant image display method and program
WO2024063539A1 (fr) Surgical robot cutting path planning apparatus and method therefor
WO2023018138A1 (fr) Device and method for generating a virtual pneumoperitoneum model of a patient
WO2019164272A1 (fr) Method and device for providing surgical image

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18897168

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 18897168

Country of ref document: EP

Kind code of ref document: A1

32PN Ep: public notification in the ep bulletin as address of the addressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 06/04/2021)

122 Ep: pct application non-entry in european phase

Ref document number: 18897168

Country of ref document: EP

Kind code of ref document: A1