WO2016060308A1 - Needle insertion type robot apparatus for interventional surgery - Google Patents


Info

Publication number
WO2016060308A1
WO2016060308A1 · PCT/KR2014/009839
Authority
WO
WIPO (PCT)
Prior art keywords
needle
computer
robot
tumor
edge
Prior art date
Application number
PCT/KR2014/009839
Other languages
French (fr)
Korean (ko)
Inventor
박창민
김남국
Original Assignee
서울대학교병원
울산대학교산학협력단
Priority date
Filing date
Publication date
Application filed by 서울대학교병원, 울산대학교산학협력단 filed Critical 서울대학교병원
Publication of WO2016060308A1 publication Critical patent/WO2016060308A1/en

Classifications

    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B10/00Other methods or instruments for diagnosis, e.g. instruments for taking a cell sample, for biopsy, for vaccination diagnosis; Sex determination; Ovulation-period determination; Throat striking implements
    • A61B10/02Instruments for taking cell samples or for biopsy
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B17/00Surgical instruments, devices or methods, e.g. tourniquets
    • A61B17/34Trocars; Puncturing needles
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges

Definitions

  • The present disclosure relates generally to a needle insertion type interventional robot apparatus, and more particularly to one that effectively performs a biopsy of a target point on the border of a heterogeneous lesion.
  • Medical imaging-based biopsy is an interventional procedure that minimizes damage to surrounding normal tissue while extracting the samples needed for the pathological diagnosis of abnormal lesions. It is widely applied to retroperitoneal sites such as the adrenal glands, pancreas, and lymph nodes, to various organs in the abdominal cavity, and to the lungs, mediastinum, spine, and extremities. Because high-resolution images allow the lesion to be precisely localized in three dimensions and the biopsy needle to be seen once it has entered the tissue, medical image-based biopsy makes it easy to detect even small lesions.
  • In the procedure room where a medical image-based biopsy is performed, a CT or C-arm fluoroscopy device and its images can be used to guide the insertion path of the biopsy needle. For example, in planning the insertion path, the entry angle and insertion point of the biopsy needle on the patient's body are determined so that the insertion path can be planned accurately.
  • When the patient enters the procedure room and the operation begins, the image acquisition device placed there (e.g., a fluoroscopy or CBCT device) is aligned with the planned path, i.e., the orientation in which the biopsy needle will be inserted.
  • A navigation view is used to accurately guide the biopsy needle during the biopsy process. In a navigation view such as the Surgeon's Eye View shown in FIG. 1, when the biopsy needle pierces the entry point, the center point of the target is shown and the needle appears as a point at the insertion point. The target is displayed as a point with a circle drawn around it, and the plan can specify at what angle and how many millimeters the needle is to be advanced.
  • Recently, however, it has become accepted that a tumor is not homogeneous: its biological properties (e.g., DNA mutations, malignancy) differ from site to site within the tumor. The site from which tissue is collected has therefore become an important issue in diagnosing the tumor, predicting its response to therapy, and estimating the patient's prognosis.
  • For example, active cancer cells may be located at the edge of the tumor while the interior is necrotic and contains no tumor cells (necrosis). If a biopsy needle is inserted aiming at the center of such a tumor, a false-negative diagnosis may result. To biopsy such a heterogeneous tumor, the operator may therefore intentionally aim at the periphery of the tumor, guided only by experience while watching the fluoroscopy image.
  • In addition, heterogeneous tumors are best biopsied at multiple spots, i.e., at a plurality of target points, and it is medically very important to match each biopsy location with the characteristics of its sample so as to build a map of tissue properties by position within the tumor. Performing such a multi-spot biopsy based on the operator's experience alone is even more difficult.
  • FIG. 2 is a diagram illustrating an example of a navigation screen for an ablation procedure disclosed in U.S. Patent Application Publication No. 2013/0317363. It shows the target 138b of the ablation procedure and the expected treatment range 138a when the procedure is performed. That publication gives no consideration to the heterogeneity of the target and does not disclose an effective method of guiding a medical instrument to the target's edge.
  • According to one aspect of the present disclosure, a needle insertion type interventional robot apparatus is provided, comprising: a computer that integrates surgical planning information, including a target point on the border of a heterogeneous surgical target and an insertion path of a needle-type medical instrument, into an operating room image; a robot having the needle-type medical instrument, which operates according to the computer's instructions so that the instrument follows the insertion path; and a user interface (UI) that works with the computer and uses the operating room image, with the surgical plan integrated, to show the border of the surgical target and the expected arrival position of the tip of the needle-type medical instrument with respect to the target point when the robot operates according to the surgical plan.
  • FIG. 1 is a view showing an example of a Surgeon's Eye View;
  • FIG. 2 is a view showing an example of a navigation screen for the ablation procedure disclosed in US Patent Publication No. 2013/0317363;
  • FIG. 3 is a view for explaining an example of the needle insertion interventional robot apparatus according to the present disclosure
  • FIG. 4 is a view illustrating an example of a method of dividing a tumor and generating a surgical plan in a preoperative image
  • FIG. 5 is a view illustrating an example of a surgical plan including a plurality of target points, an insertion path, and an insertion point of a biopsy at the edge of a tumor;
  • FIG. 6 is a diagram illustrating an example of a preoperative image in which a tumor and an insertion path are visualized
  • FIG. 7 is a diagram illustrating an example of a method in which an operation plan is integrated into an operating room image
  • FIG. 8 is a view for explaining an example of the positioning means for determining the relative position information of the patient and the biopsy needle;
  • FIG. 9 is a view illustrating an example of the user interface;
  • FIG. 10 is a view for explaining an example of a robot equipped with revolver-type biopsy needles.
  • The needle insertion type interventional robot apparatus according to the present disclosure (hereinafter, the interventional robot apparatus) is a needle insertion type image-guided interventional robot system for biopsy and treatment that reduces radiation exposure and improves procedural accuracy.
  • the interventional robotic device can be used for biopsy and treatment of 1 cm-class lesions in the abdomen, chest, and the like.
  • An example of a needle-type medical tool is a biopsy needle.
  • The interventional robot apparatus includes a computer 600 that processes or generates medical images, a robot 100 that works in conjunction with the computer, and a user interface 500 that shows and guides the expected arrival position of the tip of the biopsy needle 111 at the edge of the tumor.
  • The interventional robot apparatus may further include a master device 200 for controlling the robot 100 in real time in conjunction with the user interface 500, an image capturing device 300 for capturing the position of the biopsy needle 111 in the human body, and a device for monitoring the position and posture of the robot 100, the patient 50, and peripheral devices.
  • The interventional robot apparatus may be applied to biopsies of organs such as the lungs, kidneys, and liver; application to other organs is not excluded. In this example, the lungs are described.
  • The preoperative image of the patient's lung is thresholded to segment the lesion 10 (e.g., a tumor) and to generate a surgical plan. The lung image is segmented to prepare a divided lung image, and anatomical structures (e.g., blood vessels, ribs, airways, lung boundaries) may be stored as a lung mask, a vessel mask, a rib mask, an airway mask, and the like.
  • The tumor 10 is segmented by a segmentation technique (e.g., adaptive thresholding) using an HU value appropriate for the tumor 10 as the threshold. FIG. 4 shows an example of an axial cross section of a lung image in which the tumor 10 has been segmented.
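As a minimal sketch of this thresholding step (Python/NumPy; the array shape and HU values are invented for illustration and are not taken from the disclosure), an HU window applied to a toy CT volume yields a binary tumor mask:

```python
import numpy as np

def segment_by_threshold(volume_hu, lo, hi):
    """Binary mask of voxels whose Hounsfield value lies in [lo, hi]."""
    return (volume_hu >= lo) & (volume_hu <= hi)

# Toy CT slab: air background (-1000 HU) containing a denser block (~40 HU)
# standing in for the tumor 10.
ct = np.full((32, 32, 32), -1000.0)
ct[10:20, 10:20, 10:20] = 40.0

tumor_mask = segment_by_threshold(ct, 0.0, 80.0)
print(int(tumor_mask.sum()))  # 1000 voxels (the 10x10x10 block)
```

An adaptive variant would choose `lo`/`hi` per region instead of globally; the fixed window above is only the simplest case.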
  • The computer 600 is loaded with a preoperative image of the patient, and the operating room image acquired in the procedure room is registered with the preoperative image by the computer. As a result of the registration, the surgical plan made using the preoperative image, including the insertion paths 82 and 84, the insertion point 41, and the target points on the tumor, is transferred to the operating room image. This is further described below.
  • FIG. 5 is a diagram illustrating an example of a surgical plan including a plurality of target points, an insertion path, and an insertion point of a biopsy at the edge of a tumor.
  • The tumor 10 may have active cancer cells at its edge 11 (outer wall) and necrotic material inside. When the edge 11 and the interior of the tumor have different intensities in the image, thresholding can distinguish them, so that, as shown in FIG. 5, the edge 11 and the interior 15 of the tumor are segmented separately.
  • An FDG-PET/CT image can be used to distinguish metabolically active sites, which show high FDG uptake, from low-metabolism sites. Thresholding the standardized uptake value (SUV) allows the tumor 10 to be segmented as shown in FIG. 5.
  • Alternatively, the entire tumor 10 may first be segmented, and its edge then defined by applying morphological operators, such as a distance map or erosion, from the boundary of the tumor 10.
  • The segmented tumor 10 may be generated as a 3D image. Cross sections of the tumor 10 can therefore be viewed in any direction required by the image processing software, the tumor 10 can be visualized so as to be distinguished from its surroundings, and the edge of the tumor 10 can be distinguished from its interior.
  • The tumor 10 can be viewed in representative directions such as the axial, coronal, and sagittal views, and a surgical plan can be created on that basis.
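The edge/interior separation can be sketched with a morphological erosion, along the lines of the distance-map/eroding operators mentioned above; the mask shape and shell thickness below are assumptions for illustration (Python with SciPy's `ndimage`):

```python
import numpy as np
from scipy import ndimage

# Toy 3D tumor mask (a solid 10x10x10 block); a real mask would come from
# the thresholding/segmentation step.
tumor = np.zeros((20, 20, 20), dtype=bool)
tumor[5:15, 5:15, 5:15] = True

# Eroding the mask removes a shell of voxels from the boundary; the removed
# shell approximates the edge 11, and what survives is the interior 15.
interior = ndimage.binary_erosion(tumor, iterations=2)
edge = tumor & ~interior

print(int(edge.sum()), int(interior.sum()))
```

A distance-transform variant (`ndimage.distance_transform_edt`) would give the same split by thresholding the distance from the boundary in millimetres rather than voxels.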
  • A plurality of biopsy target points are set at the edge 11 of the tumor, which the segmentation has distinguished from both the interior and the periphery. The interior of the tumor 10 may consist of dead cells (necrosis), while active cancer cells may be distributed along the edge 11.
  • Target points may also be set at various positions within the tumor 10, not only at the edge 11 but also in the interior.
  • Because the tumor 10 may be heterogeneous, its DNA mutations may differ by location, and so the effect of a particular drug or treatment may also vary with location within the tumor 10. If only a single biopsy is performed, tumor tissue elsewhere may survive and cause a relapse.
  • The thickness of the edge 11 can be estimated roughly from statistics. For example, if the tumor 10 is as large as 2 centimeters, a surgical plan may be made that sets a plurality of biopsy target points on the edge 11, either omitting a biopsy of the center of the tumor 10 or in addition to it. If the biopsy needle 111 has sub-millimeter accuracy, the surgical plan may target the edge 11 on a grid of about 1 millimeter by 1 millimeter for a tumor 10 that is 20 millimeters across.
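To make the spaced-grid targeting idea concrete, the sketch below greedily picks biopsy target points on an edge mask at a minimum spacing; the voxel size, spacing, and toy mask are all invented for illustration, and a real planner would work on the 3D edge mask from the segmentation step:

```python
import numpy as np

def edge_target_points(edge_mask, voxel_mm, spacing_mm):
    """Greedily pick target points on the edge mask, at least spacing_mm apart."""
    coords = np.argwhere(edge_mask) * voxel_mm  # voxel indices -> millimetres
    chosen = []
    for p in coords:
        if all(np.linalg.norm(p - q) >= spacing_mm for q in chosen):
            chosen.append(p)
    return np.array(chosen)

# Toy edge mask: 1 mm voxels along a 20 mm line of edge voxels.
edge = np.zeros((1, 1, 20), dtype=bool)
edge[0, 0, :] = True

targets = edge_target_points(edge, voxel_mm=1.0, spacing_mm=5.0)
print(len(targets))  # 4 points, at z = 0, 5, 10, 15 mm
```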
  • A biopsy at a plurality of target points in the tumor 10 is preferable; a map of the tumor 10 made by matching the properties of the samples to their positions is then medically meaningful.
  • An insertion path is made to reach each target point; for example, each path is created so that the blood vessels or other structures it intersects are minimized. The insertion point of the biopsy needle 111 on the patient's skin is determined accordingly. The number of insertion points may be smaller than the number of target points: for example, after the biopsy needle 111 has been inserted, other target points may be biopsied by changing its direction without completely withdrawing it from the lung.
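A minimal sketch of the "minimize intersected structures" criterion: sample points along each candidate straight path through a vessel mask and count the hits. The mask, geometry, and candidate entry points are invented for illustration only:

```python
import numpy as np

def structures_hit(mask, entry, target, n_samples=200):
    """Count distinct mask voxels crossed by the straight path entry -> target."""
    entry = np.asarray(entry, dtype=float)
    target = np.asarray(target, dtype=float)
    ts = np.linspace(0.0, 1.0, n_samples)
    pts = np.round(entry + ts[:, None] * (target - entry)).astype(int)
    pts = np.unique(pts, axis=0)
    return int(mask[pts[:, 0], pts[:, 1], pts[:, 2]].sum())

# Toy vessel mask: a slab of "vessel" voxels at y = 30..31.
vessels = np.zeros((64, 64, 64), dtype=bool)
vessels[:, 30:32, :] = True

target = (32, 40, 32)                    # target point on the tumor edge
candidates = [(0, 40, 32), (32, 0, 32)]  # two candidate skin entry points
best = min(candidates, key=lambda e: structures_hit(vessels, e, target))
print(best)  # (0, 40, 32): this path avoids the vessel slab entirely
```

A real planner would score against several masks (vessels, ribs, airways) and also penalize path length and entry angle; the straight-line voxel count is only the core idea.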
  • FIG. 6 illustrates an example of a preoperative image in which the tumor and an insertion path are visualized; an insertion path (e.g., 82) is visualized in 3D passing between ribs.
  • the plurality of target points, insertion points, and insertion paths determined as described above are added to the preoperative image to generate a surgical plan.
  • the preoperative image is a 3D image, and as shown in FIG. 6 through volume rendering, the surgical plan may be generated in 3D.
  • the tumor 10 is divided from the periphery and is marked so that the edge 11 is distinguished.
  • the insertion path is visualized in three dimensions, and a target point (eg, 21) is displayed at the edge 11 of the tumor.
  • The tumor 10 has little contrast and is hardly visible under fluoroscopy, where a tumor generally appears only as a rough circular shape. In this example, however, the tumor 10 is segmented so that its edge is distinguished in the preoperative image, and it therefore appears on the operating room image.
  • Alternatively, the biopsy may be performed by assuming that a layer of a certain thickness inward from the boundary between the tumor 10 and its periphery constitutes the edge 11.
  • FIG. 7 is a view illustrating an example of a method of integrating a surgical plan into an operating room image.
  • An operating room image is acquired in the operating room, and the preoperative image and the operating room image are registered so that the surgical plan is transferred into the operating room image.
  • For the registration, rigid and deformable registration methods, among others, may be used.
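A minimal sketch of how a rigid registration result transfers planned points: apply the 4x4 transform to the preoperative target and insertion points. The transform below (a 90-degree rotation about z plus a translation) is invented for illustration; in practice it would be the output of the registration algorithm:

```python
import numpy as np

def transfer_points(points, T):
    """Map Nx3 preoperative points into operating-room space via a 4x4 transform."""
    homog = np.hstack([points, np.ones((len(points), 1))])
    return (homog @ T.T)[:, :3]

# Hypothetical rigid transform: rotate 90 degrees about z, translate (5, 0, 2) mm.
c, s = np.cos(np.pi / 2), np.sin(np.pi / 2)
T = np.array([[c,  -s,  0.0, 5.0],
              [s,   c,  0.0, 0.0],
              [0.0, 0.0, 1.0, 2.0],
              [0.0, 0.0, 0.0, 1.0]])

plan = np.array([[10.0, 0.0, 0.0]])  # one planned target point (mm)
print(transfer_points(plan, T))      # approximately [[5. 10.  2.]]
```

A deformable registration would replace the single matrix with a displacement field, but the planned points are mapped the same way.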
  • the insertion path 82 may be modified through the user interface 500, and an inappropriate insertion path may be removed in consideration of breathing or movement.
  • FIG. 7(a) is an example of the preoperative image, and FIG. 7(b) is the operating room image after registration with the preoperative image, with the surgical plan transferred onto it.
  • The insertion path 82, the insertion point, and the target points may be overlaid and displayed on a multiplanar reconstruction (MPR: axial, coronal, and sagittal views); the axial view is illustrated in FIG. 7.
  • the biopsy needle 111 may be guided along the insertion path identified on the MPR to perform the procedure.
  • the final confirmed insertion path is transmitted to a robot or user interface (e.g. navigation device) or the like using TCP / IP or a dedicated communication protocol.
  • The biopsy needle 111 may of course be a single-needle type, but to biopsy multiple spots, a plurality of needles of a revolver type (see FIG. 10) may be mounted on the robot, which can make it more effective to biopsy each target point sequentially.
  • FIG. 8 is a view for explaining an example of the positioning means for determining the relative position information of the patient and the biopsy needle.
  • As positioning means for determining the relative positional relationship between the patient 960 and the biopsy needle 912, the patient 960, a robot 911 with the biopsy needle 912, an infrared camera 991, infrared reflector assemblies 911, 913, and 914, a monitor 920, and a computer 940 are provided.
  • The infrared camera 991 detects the plurality of infrared reflectors 911 and 914 indicating the position of the patient 960, and the plurality of infrared reflectors or infrared emitters 913 provided at the end of the biopsy needle 912, so that the positions of the needle 912 and the patient 960 can be identified.
  • a computer 940 is provided for overall operation of the master console, and a monitor 920 is also provided.
  • the computer 940 and the monitor 920 may correspond to the computer 600 and the user interface 500 described with reference to FIG. 3.
  • the computer 940 also functions as a surgical navigation device.
  • the biopsy needle 912 of the robot 911 is actuated by the computer 940.
  • the infrared reflector assembly 911 is fixed to the patient 960 to indicate the position of the patient 960
  • the infrared reflector assembly 913 is fixed to the biopsy needle 912 to indicate the position of the biopsy needle 912
  • The infrared reflector assembly 914 is positioned on the chest of the patient 960 to indicate patient movement, such as breathing or sneezing.
  • In this example an infrared camera and infrared reflectors are used, but a magnetic field can be used instead; any means capable of sensing position can be used. For example, a magnetic sensor may be attached to the biopsy needle to track how far it moves.
  • The infrared reflector assembly 911 may be used to indicate the location of the patient 960 and may function as the reference position of the entire system. It may be fixed to the patient 960 or to the operating table, or an additional infrared reflector assembly (not shown) on the operating table may serve as the reference position. The location of the biopsy needle 912 relative to the patient 960 can thereby be determined.
  • The robot 100 itself also knows its own location: since the robot 100 holds the biopsy needle 111, it can know its own coordinates during the procedure and can detect how many millimeters the biopsy needle 111 has moved. The computer can therefore calculate the orientation and position of the biopsy needle 111 in the space of the procedure image.
  • The computer may calculate the current position of the biopsy needle 111 in the registered operating image space by registering with the fluoroscopy device that acquires the procedure image.
  • The positioning means may use a plurality of methods in combination, rather than only one, to determine the positional relationship.
  • The distance between the biopsy needle 111 and a target point on the edge 11 of the tumor can be calculated by the computer from the relative positional relationship between the patient and the biopsy needle 111 identified by the one or more positioning means. When the needle is advanced along the insertion path, at the insertion angle, from the insertion point, and to the insertion distance determined in the surgical plan, the computer calculates the expected arrival position of the tip of the biopsy needle 111. The expected arrival position is displayed on the registered operating room image to inform the operator. This is further described below.
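The expected-arrival computation reduces to simple vector geometry: advance from the insertion point along the (unit) insertion direction by the planned depth, then report the residual distance to the target. All numbers below are invented for illustration:

```python
import numpy as np

def expected_tip(insertion_point, direction, depth_mm):
    """Expected needle-tip position after advancing depth_mm along direction."""
    d = np.asarray(direction, dtype=float)
    d = d / np.linalg.norm(d)
    return np.asarray(insertion_point, dtype=float) + depth_mm * d

# Hypothetical plan (mm): skin entry point, insertion direction, target on edge.
entry = (100.0, 50.0, 0.0)
direction = (0.0, 0.0, 1.0)
target = np.array([100.0, 50.0, 80.0])

tip = expected_tip(entry, direction, depth_mm=80.0)
gap = float(np.linalg.norm(tip - target))  # mismatch shown to the operator
print(tip.tolist(), gap)  # [100.0, 50.0, 80.0] 0.0
```

The `gap` value is what a user interface would display as the target/expected-arrival mismatch; a nonzero value would prompt a correction of angle or depth.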
  • FIG. 9 illustrates an example of the user interface; a plurality of screens 510, 520, 530, and 540 are displayed on the user interface 500. A CT image (CT volume; for example, an image transmitted from the fluoroscopy device) and masks showing various structures or lesions of the lung are displayed, along with buttons for executing the surgical plan and an information window displaying the procedure information or type.
  • the main screen 510 displays a matched operating room image displayed in 3D such that the edge 11 of the affected part is visually distinguished.
  • The MPR images (e.g., 520, 530, 540) show the tumor 10 with its edge 11 distinguished, in the axial, coronal, and sagittal views, and the expected arrival positions of the biopsy needle tip (e.g., 21, 22, 23, 24, 26) are indicated.
  • The relative positions of the biopsy needle 111 and the patient 50 can be determined in the same manner as described for FIG. 8, and the operating room image (e.g., 510) visually displays the registered tumor 10 with its edge 11. The computer can therefore calculate the distance between a target point on the edge 11 of the tumor and the biopsy needle 111, and the expected arrival position (e.g., 21, 22, 23, 24, 26) that the tip will reach if inserted to the planned depth at the currently aligned angle.
  • The computer can display the calculated expected arrival position on the registered operating room image, so the operator can see visually whether the target point and the expected arrival position (21, 22, 23, 24, 26) match. If they match, the operator instructs the computer 600 (see FIG. 3), and the biopsy needle 111 is inserted into the human body by the operation of the robot 100 linked to the computer 600.
  • Since the position of the tip of the biopsy needle 111 can be calculated in the operating room image space, whether the goal has been met can be rechecked by comparing it with the target on the registered tumor 10. If the expected arrival position and the target point do not match, the surgical plan can be modified: for example, the screen displays the difference between the target point and the expected arrival position, and the computer can create an instruction to correct the position of the robot. Alternatively, the operator can instruct the computer by modifying the insertion path or depth.
  • A needle insertion type interventional robot apparatus comprising: a computer that integrates surgical planning information, including a target point on the edge of a heterogeneous surgical target and an insertion path of a needle-type medical instrument, into an operating room image; a robot having the needle-type medical instrument, which operates according to the computer's instructions so that the instrument follows the insertion path; and a user interface (UI) that works with the computer and uses the operating room image, with the surgical plan integrated, to show the edge of the surgical target and the expected arrival position of the tip of the needle-type medical instrument with respect to the target point when the robot operates according to the surgical plan.
  • The needle insertion type interventional robot apparatus, wherein the surgical plan includes a plurality of target points on the edge of the lesion, and the computer stores information matching the location of each target point in the lesion with each sample of the lesion obtained at that target point by the needle-type medical instrument.
  • The needle insertion type interventional robot apparatus, wherein the user interface includes at least one screen showing a cross section of the affected part, the screen showing the edge of the affected part and the expected arrival position of the tip of the needle-type medical instrument.
  • The needle insertion type interventional robot apparatus, wherein the computer compares the target point with the expected arrival position of the tip of the needle-type medical instrument and, when there is a mismatch, displays on the user interface the position-change information of the robot needed to make them match.
  • The needle insertion type interventional robot apparatus, wherein the at least one screen includes at least one of an axial view, a coronal view, and a sagittal view of the affected area, and at least one of these views displays the edge of the affected area and the expected arrival position of the tip of the needle-type medical instrument.
  • The needle insertion type interventional robot apparatus, wherein the user interface includes an additional screen showing a three-dimensional operating room image incorporating the surgical plan, the additional screen showing the edge of the affected part and the expected arrival position of the tip of the needle-type medical instrument.
  • The needle insertion type interventional robot apparatus, wherein the computer calculates a changed expected arrival position, and the user interface displays the changed expected arrival position on the screen and the additional screen.
  • The needle insertion type interventional robot apparatus, wherein the positioning means includes markers for marking the needle-type medical instrument and the patient, and a sensing device for detecting the markers.
  • The needle insertion type interventional robot apparatus, wherein the needle-type medical instrument is a biopsy needle of a revolver type, a plurality of needles being mounted on the robot so as to sequentially biopsy each target point.
  • According to one of the needle insertion type interventional robot apparatuses of the present disclosure, it is possible to biopsy the edge of a heterogeneous tumor more accurately and to reduce the errors or risks arising from manual biopsy.

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Surgery (AREA)
  • Biomedical Technology (AREA)
  • Engineering & Computer Science (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Medical Informatics (AREA)
  • Molecular Biology (AREA)
  • Pathology (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Apparatus For Radiation Diagnosis (AREA)
  • Oral & Maxillofacial Surgery (AREA)

Abstract

The present disclosure relates to a needle insertion type robot apparatus for interventional surgery, comprising: as a computer for incorporating operation plan information into an operating room image, a computer which incorporates operation plan information into an operating room image, the operation plan information comprising a target point on the border of a heterogeneous surgical target and an insertion route of a needle-type medical instrument; as a robot having a needle-type medical instrument, a robot which operates according to the direction of the computer so that the needle-type medical instrument follows the insertion route; and as a user interface (UI) which interworks with the computer and displays the border of the surgical target using the operating room image into which the operation plan information has been incorporated, a UI which displays an estimated position that the tip of the needle-type medical instrument will reach with respect to the target point when the robot operates according to an operation plan.

Description

Needle Insertion Type Interventional Robot Apparatus
The present disclosure relates generally to a needle insertion type interventional robot apparatus, and more particularly to one that effectively performs a biopsy of a target point on the border of a heterogeneous lesion.
This section provides background information related to the present disclosure which is not necessarily prior art.
Medical imaging-based biopsy is an interventional procedure that minimizes damage to surrounding normal tissue while extracting the samples needed for the pathological diagnosis of abnormal lesions. It is widely applied to retroperitoneal sites such as the adrenal glands, pancreas, and lymph nodes, to various organs in the abdominal cavity, and to the lungs, mediastinum, spine, and extremities. Because high-resolution images allow the lesion to be precisely localized in three dimensions and the biopsy needle to be seen once it has entered the tissue, medical image-based biopsy makes it easy to detect even small lesions.
In the procedure room where a medical image-based biopsy is performed, a CT or C-arm fluoroscopy device and its images can be used to guide the insertion path of the biopsy needle; for example, in planning the insertion path, the entry angle and insertion point of the biopsy needle on the patient's body are determined so that the insertion path can be planned accurately. When the patient enters the procedure room and the operation begins, the image acquisition device placed there (e.g., a fluoroscopy or CBCT device) is aligned with the planned path, i.e., the orientation in which the biopsy needle will be inserted.
A navigation view is used to accurately guide the biopsy needle during the biopsy process. For example, in a navigation view such as the Surgeon's Eye View shown in FIG. 1, when the biopsy needle pierces the entry point, the center point of the target is shown and the needle appears as a point at the insertion point. In this navigation view, the target is displayed as a point with a circle drawn around it, and the plan can specify at what angle and how many millimeters the needle is to be advanced.
Recently, however, it has become accepted that a tumor is not homogeneous: its biological properties (e.g., DNA mutations, malignancy) differ from site to site within the tumor, so the site from which tissue is collected has become an important issue in diagnosing the tumor, predicting its response to therapy, and estimating the patient's prognosis. For example, active cancer cells may be located at the edge of the tumor while the interior is necrotic and contains no tumor cells (necrosis); if a biopsy needle is inserted aiming at the center of such a tumor, a false-negative diagnosis may result. To biopsy such a heterogeneous tumor, the operator may therefore intentionally aim at the periphery of the tumor, guided only by experience while watching the fluoroscopy image. However, it is not easy even to pierce the exact center of the tumor as planned, and accurately biopsying the tumor cells distributed along the more demanding tumor edge can be technically very difficult.
In addition, for a heterogeneous tumor it is medically very important to perform a multi-spot biopsy, sampling at a plurality of target points, and to match each biopsy location with the characteristics of its sample so as to build a map of tissue properties by position within the tumor; performing such a multi-spot biopsy based only on the operator's experience is even more difficult.
FIG. 2 shows an example of a navigation screen for an ablation procedure disclosed in U.S. Patent Application Publication No. 2013/0317363, illustrating the target 138b of the ablation procedure and the treatment range 138a expected when the procedure is performed. It gives no consideration to the heterogeneity of the target and discloses no effective guidance method for bringing a medical instrument to the edge of the target.
This will be described further in the later part of the 'Mode for Carrying Out the Invention' section.
This section provides a general summary of the disclosure and is not a comprehensive disclosure of its full scope or all of its features.
According to one aspect of the present disclosure, there is provided a needle-insertion-type robot apparatus for interventional surgery, comprising: a computer that integrates surgical planning information into an operating-room image, the surgical planning information including a target point on the border of a heterogeneous surgical target and an insertion path of a needle-type medical instrument; a robot equipped with the needle-type medical instrument, the robot operating according to instructions from the computer so that the needle-type medical instrument follows the insertion path; and a user interface (UI) that, in cooperation with the computer, shows the border of the surgical target using the operating-room image into which the surgical planning information has been integrated, the user interface showing the expected arrival position of the tip of the needle-type medical instrument relative to the target point when the robot operates according to the surgical plan.
This will be described further in the later part of the 'Mode for Carrying Out the Invention' section.
FIG. 1 shows an example of a Surgeon's Eye View;
FIG. 2 shows an example of a navigation screen for an ablation procedure disclosed in U.S. Patent Application Publication No. 2013/0317363;
FIG. 3 illustrates an example of a needle-insertion-type interventional robot apparatus according to the present disclosure;
FIG. 4 illustrates an example of a method of segmenting a tumor in a preoperative image and generating a surgical plan;
FIG. 5 illustrates an example of a surgical plan including a plurality of biopsy target points at the edge of a tumor, insertion paths, and entry points;
FIG. 6 illustrates an example of a preoperative image in which a tumor and an insertion path are visualized;
FIG. 7 illustrates an example of a method of integrating a surgical plan into an operating-room image;
FIG. 8 illustrates an example of positioning means for determining the relative position of a patient and a biopsy needle;
FIG. 9 illustrates an example of a user interface;
FIG. 10 illustrates an example of a robot equipped with a revolver-type biopsy needle.
The present disclosure will now be described in detail with reference to the accompanying drawing(s).
FIG. 3 illustrates an example of a needle-insertion-type interventional robot apparatus according to the present disclosure. The needle-insertion-type interventional robot apparatus (hereinafter, interventional robot apparatus) can be used in a needle-insertion-type image-guided interventional robot system for biopsy and treatment that reduces radiation exposure and improves procedural accuracy. The interventional robot apparatus can be used for biopsy and treatment of lesions on the order of 1 cm in the abdomen, chest, and the like. An example of the needle-type medical instrument is a biopsy needle.
For example, the interventional robot apparatus includes a computer 600 that processes or generates medical images, a robot 100 operating in cooperation with the computer, and a user interface 500 that shows and guides the expected arrival position of the tip of the biopsy needle 111 at the edge of the tumor. The interventional robot apparatus may further include a master device 200 that controls the robot 100 in real time in cooperation with the user interface 500, an imaging device 300 that images the position of the biopsy needle 111 inside the body, and a device 400 that monitors the positions and postures of the robot 100, the patient 50, and peripheral devices.
FIG. 4 illustrates an example of a method of segmenting a tumor in a preoperative image and generating a surgical plan. The interventional robot apparatus can be applied to biopsies of organs such as the lung, kidney, and liver, and its application to sites other than these organs is not excluded. This example is described with reference to the lung.
As shown in FIG. 4, the preoperative image of the patient's lung is thresholded to segment the lesion 10 (e.g., a tumor) and generate a surgical plan. For example, after volumetric chest CT images (hereinafter, lung images) are acquired, the lung image is segmented to prepare a segmented lung image. As a result of the segmentation, the anatomical structures contained in the lung image (e.g., vessels, ribs, airways, lung boundaries) can be extracted as three-dimensional sets of voxels, and the vessels, ribs, airways, and other anatomical structures segmented from the lung image can be stored as a lung mask, a vessel mask, a rib mask, an airway mask, and so on. Using an HU value appropriate for the tumor 10 as the threshold value, the tumor 10 is segmented by a segmentation technique (e.g., adaptive thresholding). FIG. 4 shows an example of an axial cross-section of a lung image in which the tumor 10 has been segmented.
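The HU-thresholding step described above can be sketched as follows. This is a minimal illustration on a synthetic volume with hypothetical HU bounds (`lo`, `hi`), not the actual segmentation pipeline of the apparatus:

```python
import numpy as np

def segment_by_threshold(volume_hu, lo, hi):
    """Boolean voxel mask of all voxels whose HU value lies in [lo, hi]."""
    return (volume_hu >= lo) & (volume_hu <= hi)

# Toy 3-D "CT" volume: air background (-1000 HU) with a soft-tissue cube (+40 HU)
vol = np.full((20, 20, 20), -1000.0)
vol[8:12, 8:12, 8:12] = 40.0          # hypothetical tumor-like region

tumor_mask = segment_by_threshold(vol, 0, 100)
print(int(tumor_mask.sum()))          # 64 voxels (a 4x4x4 cube)
```

An adaptive-threshold variant would choose `lo` and `hi` locally per region rather than using one global pair of bounds.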
The preoperative image of the patient is loaded into the computer 600, and the operating-room image of the patient acquired in the procedure room is registered with the preoperative image by the computer. As a result of the registration, the surgical plan created using the preoperative image, including the insertion paths 82 and 84, the entry point 41, and the target points on the tumor, is transferred to the operating-room image. This is described further below.
FIG. 5 illustrates an example of a surgical plan including a plurality of biopsy target points at the edge of a tumor, insertion paths, and entry points.
As described above, the tumor 10 may have active cancer cells at its edge 11 (border) or outer wall, while necrotic fluid may be present inside the tumor 10. Accordingly, the edge 11 and the interior of the tumor differ in intensity in the image, and thresholding so as to distinguish the edge of the tumor 10 from its interior segments the tumor so that the edge 11 and the interior 15 are separated, as shown in FIG. 5. Alternatively, according to the metabolic characteristics of the tumor 10, an FDG-PET/CT image may be used to distinguish metabolically active regions with high uptake of FDG or the like from regions with low metabolism; thresholding the standardized uptake value (SUV) of the tumor 10 likewise segments the tumor 10 as shown in FIG. 5. Alternatively, the entire tumor 10 may be segmented, and the peripheral region of the tumor 10 may be defined and segmented from the tumor boundary using a morphological operator such as a distance map or erosion.
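The last alternative mentioned above, defining the peripheral region with a morphological operator, can be sketched with a plain binary erosion; this is a minimal sketch on a synthetic mask, and a distance-map variant would instead threshold each voxel's distance from the boundary:

```python
import numpy as np

def erode_once(mask):
    """One 6-neighbourhood binary erosion: keep voxels whose axis neighbours are all set."""
    m = mask.copy()
    for axis in range(3):
        m &= np.roll(mask, 1, axis) & np.roll(mask, -1, axis)
    return m

def tumor_rim(mask, rim_voxels=1):
    """Peripheral shell of a binary tumor mask: the mask minus its eroded core."""
    core = mask
    for _ in range(rim_voxels):
        core = erode_once(core)
    return mask & ~core

mask = np.zeros((16, 16, 16), dtype=bool)
mask[4:12, 4:12, 4:12] = True            # hypothetical 8x8x8 "tumor", 512 voxels
rim = tumor_rim(mask)
print(int(mask.sum()), int(rim.sum()))   # 512 296  (512 minus the 6x6x6 core)
```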
The tumor 10 segmented in this way can be generated as a three-dimensional image. Accordingly, the image-processing software can present a cross-section of the tumor 10 in any required direction; in that cross-section the tumor 10 is visualized so as to be distinguished from its surroundings, and the edge of the tumor 10 can be distinguished from its interior. For example, the tumor 10 can be viewed in representative directions such as the axial view, the coronal view, and the sagittal view, and a surgical plan can be created on this basis.
As shown in FIG. 5, a plurality of biopsy target points (e.g., 21, 22, 23, 24, 26) are set on the edge 11 of the tumor. As described above, segmentation allows the edge 11 of the tumor to be separated from the interior and the surroundings. The interior of the tumor 10 may be necrotic, while active cancer cells may be distributed along the edge 11. Target points may be set at various positions in the tumor 10, inside the tumor 10 as well as on the edge 11. Because the tumor 10 is heterogeneous and its DNA mutations may differ by location, the effect of a given drug or treatment may differ by location within the tumor 10; therefore, if a biopsy is taken at only one point, the remaining regions may survive and the tumor may recur. The thickness of the edge 11 can be roughly estimated statistically from the size of the tumor 10. For example, if the tumor 10 is as large as about 2 cm, the surgical plan places a plurality of biopsy target points on the edge 11 instead of, or in addition to, biopsying the center of the tumor 10. If the biopsy needle 111 has sub-millimeter accuracy, then for a tumor 10 about 20 mm wide, the surgical plan can be made so that the edge 11 is punctured with an accuracy of 1 mm × 1 mm.
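One way to place several well-separated target points on the segmented edge is greedy farthest-point sampling over the rim voxels. The disclosure does not prescribe how the target points are chosen, so the strategy below is purely illustrative, with a hypothetical circular rim:

```python
import numpy as np

def spread_targets(rim_points, k):
    """Pick k target points spread over the rim by greedy farthest-point sampling."""
    pts = np.asarray(rim_points, float)
    chosen = [0]                                 # start from an arbitrary rim point
    d = np.linalg.norm(pts - pts[0], axis=1)
    while len(chosen) < k:
        nxt = int(np.argmax(d))                  # rim point farthest from those chosen so far
        chosen.append(nxt)
        d = np.minimum(d, np.linalg.norm(pts - pts[nxt], axis=1))
    return pts[chosen]

# Hypothetical rim voxels: points on a circle of radius 10 (one 2-D slice of the edge)
ang = np.linspace(0.0, 2.0 * np.pi, 100, endpoint=False)
rim = np.stack([10 * np.cos(ang), 10 * np.sin(ang), np.zeros(100)], axis=1)

targets = spread_targets(rim, 4)                 # four well-separated biopsy target points
print(targets.shape)                             # (4, 3)
```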
A biopsy at a plurality of target points in the tumor 10 is desirable, and building a map of the tumor 10 afterwards by matching the properties of each sample with its position is of great medical significance. Insertion paths reaching each target point can be created; for example, each insertion path is generated so that the vessels or other structures it crosses are minimized. The entry point of the biopsy needle 111 on the patient's skin is determined accordingly. The number of entry points may be smaller than the number of target points. For example, after the biopsy needle 111 has been inserted, its direction may be changed without completely withdrawing it from the lung so as to biopsy another target point.
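The path-selection criterion above, minimizing the structures an insertion path crosses, can be sketched by rasterizing each candidate straight path through a vessel mask and counting the vessel voxels it hits. The mask and the candidate entry points below are hypothetical:

```python
import numpy as np

def vessel_crossings(vessel_mask, entry, target, steps=200):
    """Count distinct vessel voxels that a straight entry->target line passes through."""
    entry, target = np.asarray(entry, float), np.asarray(target, float)
    ts = np.linspace(0.0, 1.0, steps)
    pts = np.round(entry + ts[:, None] * (target - entry)).astype(int)
    return len({tuple(p) for p in pts if vessel_mask[tuple(p)]})

vessels = np.zeros((30, 30, 30), dtype=bool)
vessels[15, :, 10] = True                    # hypothetical vessel running along the y axis

target = (15, 15, 15)
entries = [(15, 0, 0), (0, 15, 15)]          # two hypothetical candidate entry points
scores = {e: vessel_crossings(vessels, e, target) for e in entries}
best = min(scores, key=scores.get)
print(best)                                  # the entry whose path crosses fewest vessel voxels
```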
FIG. 6 illustrates an example of a preoperative image in which the tumor and an insertion path are visualized; the actual ribs and an insertion path (e.g., 82) passing between the ribs are visualized in 3D. The plurality of target points, entry points, and insertion paths determined as described above are added to the preoperative image to generate the surgical plan. The preoperative image is a three-dimensional image, and through volume rendering the surgical plan can be generated in three dimensions, as shown in FIG. 6. The tumor 10 is segmented from its surroundings and displayed so that its edge 11 is distinguished. The insertion path is visualized in three dimensions, and a target point (e.g., 21) is displayed on the edge 11 of the tumor.
Because the tumor 10 has little contrast, it is hardly visible on fluoroscopic CT, and the tumor 10 has conventionally been represented as roughly circular; in this example, however, the tumor 10 is segmented in the preoperative image so that its edge is distinguished, and this is registered so that it appears in the operating-room image.
Even if the edge 11 of the tumor is not clearly segmented, as long as the tumor 10 is clearly separated from its surroundings, a biopsy may be performed by assuming that a certain thickness inward from the boundary between the tumor 10 and its surroundings is the edge 11.
FIG. 7 illustrates an example of a method of integrating a surgical plan into an operating-room image. An operating-room image is acquired in the procedure room, the preoperative image and the operating-room image are registered, and the surgical plan, including the insertion paths, is transferred to the operating-room image. As methods of registration between medical images, rigid registration, deformable registration, and the like may be used.
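As a minimal sketch of the rigid case, the transform between the preoperative and operating-room coordinate frames can be estimated from corresponding fiducial points with the Kabsch algorithm and then used to transfer a planned target point; the fiducial positions and the target coordinates below are hypothetical:

```python
import numpy as np

def rigid_fit(P, Q):
    """Least-squares rigid transform (R, t) mapping point set P onto Q (Kabsch algorithm)."""
    Pc, Qc = P - P.mean(0), Q - Q.mean(0)
    U, _, Vt = np.linalg.svd(Pc.T @ Qc)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])  # keep det(R) = +1
    R = Vt.T @ D @ U.T
    t = Q.mean(0) - R @ P.mean(0)
    return R, t

# Hypothetical fiducial positions in preoperative (P) and operating-room (Q) coordinates
P = np.array([[0.0, 0.0, 0.0], [100.0, 0.0, 0.0], [0.0, 100.0, 0.0], [0.0, 0.0, 100.0]])
th = np.deg2rad(30)
R_true = np.array([[np.cos(th), -np.sin(th), 0.0],
                   [np.sin(th),  np.cos(th), 0.0],
                   [0.0,         0.0,        1.0]])
t_true = np.array([5.0, -3.0, 12.0])
Q = P @ R_true.T + t_true

R, t = rigid_fit(P, Q)
planned_target = np.array([40.0, 55.0, 20.0])   # hypothetical target from the preoperative plan
mapped = R @ planned_target + t                 # the same point in operating-room space
print(np.allclose(mapped, R_true @ planned_target + t_true))  # True
```

Deformable registration would replace the single (R, t) with a spatially varying displacement field, but the plan-transfer step is the same idea: map every planned point through the recovered transform.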
The insertion path 82 can be modified through the user interface 500, and inappropriate insertion paths can be removed in consideration of breathing or movement. FIG. 7(a) is an example of the preoperative image, and FIG. 7(b) is an example of an image in which the operating-room image and the preoperative image have been registered and the surgical plan has been transferred.
In order to confirm the 3D-visualized insertion path 82 and the target point on the edge 11 of the tumor more reliably, the insertion path 82, the entry point, and the target point can be overlaid and displayed on MPR views (multiplanar reconstruction; e.g., axial view, coronal view, sagittal view); an axial view is illustrated in FIG. 7.
In this way, the biopsy needle 111 can be guided along the insertion path confirmed on the MPR views so that the procedure is performed. For example, the finally confirmed insertion path is transmitted to the robot, a user interface (e.g., a navigation device), or the like using TCP/IP or a dedicated communication protocol. The biopsy needle 111 may of course be a single-needle type, but for multi-spot biopsy it may be more effective to mount a plurality of needles on the robot in a revolver type (see FIG. 10) and biopsy each target point in turn. Meanwhile, to reduce the number of times the lung is punctured and the biopsy needle 111 is withdrawn from the lung and reinserted, another target point may be punctured by changing the direction of the insertion path without withdrawing the biopsy needle 111, as shown in FIG. 5.
FIG. 8 illustrates an example of positioning means for determining the relative position of the patient and the biopsy needle.
Various means can be used to determine the relative positional relationship between the patient 960 and the biopsy needle 912. For example, as shown in FIG. 8, there are provided the patient 960, a robot 911 equipped with the biopsy needle 912, an infrared camera 991, infrared reflector assemblies 911, 913, and 914, a monitor 920, and a computer 940. The infrared camera 991 detects the plurality of infrared reflectors 911 and 914 indicating the position of the patient 960 and the plurality of infrared reflectors or infrared emitters 913 provided at the tip of the biopsy needle 912, whereby the positions of the needle 912 and the patient 960 can be determined. The computer 940 for overall operation of the master console is provided, and the monitor 920 is also provided. Here, the computer 940 and the monitor 920 may correspond to the computer 600 and the user interface 500 described with reference to FIG. 3.
When the relative positional relationship between the patient 960 and the needle 912 is used, the computer 940 also functions as a surgical navigation device. According to the operator's manipulation of the master 200 (see FIG. 3), the biopsy needle 912 of the robot 911 is actuated by the computer 940. The infrared reflector assembly 911 is fixed to the patient 960 and indicates the position of the patient 960; the infrared reflector assembly 913 is fixed to the biopsy needle 912 and indicates the position of the biopsy needle 912; and the infrared reflector assembly 914 is placed on the chest of the patient 960 and indicates patient movements such as breathing and sneezing. Although an infrared camera and infrared reflectors are used here as the position-sensing means, a magnetic field may also be used, and any means capable of position sensing may be employed. As one example, a magnetic sensor may be attached to the biopsy needle and its movement tracked with a camera.
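The marker-based tracking above reduces to expressing the camera-frame needle-tip position in the coordinate frame of the patient marker. A minimal sketch, with a hypothetical marker pose and tip position:

```python
import numpy as np

def to_patient_frame(R_patient, t_patient, point_cam):
    """Express a camera-frame point in the patient marker's coordinate frame."""
    return R_patient.T @ (np.asarray(point_cam, float) - t_patient)

# Hypothetical tracking output: the patient marker pose (R, t) and the needle-tip
# point, all reported by the infrared camera in its own coordinate frame
R_patient = np.eye(3)                      # patient frame aligned with the camera here
t_patient = np.array([100.0, 50.0, 0.0])   # marker origin in camera coordinates (mm)
tip_cam   = np.array([110.0, 65.0, 30.0])  # needle-tip reflector in camera coordinates

tip_patient = to_patient_frame(R_patient, t_patient, tip_cam)
print(tip_patient)                         # [10. 15. 30.], the tip relative to the patient
```

Because both poses are read each frame, patient motion such as breathing (reflector assembly 914) simply updates `R_patient` and `t_patient` before the conversion.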
The infrared reflector assembly 911 may be used to indicate the position information of the patient 960, may function as the reference position of the entire system, and may be fixed to the patient 960 or to the operating table; a separate infrared reflector assembly (not shown) functioning as a reference position may additionally be provided on the operating table. In this way the position of the biopsy needle 912 relative to the patient 960 can be determined.
Unlike the above examples, referring to FIG. 3, an example in which the robot 100 itself knows its position is also possible. For example, the robot 100 holds the biopsy needle 111, and the robot 100 itself can know its own coordinates within the procedure room. In addition, the robot 100 itself can sense how many millimeters the biopsy needle 111 has moved. Accordingly, the computer can calculate the orientation and position of the biopsy needle 111 in the space of the procedure-room image.
In addition, the current position of the biopsy needle 111 can be calculated by the computer in the registered operating-room image space by imaging with the fluoroscopic CT used to acquire the procedure-room image.
As for the positioning means, it is preferable in terms of accuracy and safety to determine the positional relationship using a plurality of methods rather than only one. By having the computer calculate the relative positional relationship between the patient and the biopsy needle 111 determined by one or more such positioning means, the distance between the biopsy needle 111 and the target point on the edge 11 of the tumor can be calculated. Accordingly, when the needle is inserted along the insertion path, insertion angle, entry point, and insertion depth determined in the surgical plan, the expected arrival position of the tip of the biopsy needle 111 is calculated by the computer. The expected arrival position is displayed on the registered operating-room image to inform the operator. This is described further below.
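Computing the expected arrival position reduces to extrapolating the tip along the aligned insertion direction by the planned depth, after which the remaining distance to the target point can be reported to the operator. A minimal sketch with hypothetical coordinates in millimeters:

```python
import numpy as np

def expected_tip(entry, direction, depth_mm):
    """Expected needle-tip position after inserting depth_mm along the aligned direction."""
    d = np.asarray(direction, float)
    return np.asarray(entry, float) + depth_mm * d / np.linalg.norm(d)

entry  = np.array([0.0, 0.0, 0.0])       # hypothetical skin entry point (mm)
target = np.array([30.0, 0.0, 40.0])     # hypothetical target on the tumor edge (mm)

tip = expected_tip(entry, target - entry, 40.0)  # planned insertion depth: 40 mm
gap = np.linalg.norm(target - tip)               # remaining distance shown to the operator
print(tip, gap)                                  # tip (24, 0, 32) mm, 10 mm short of the target
```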
FIG. 9 illustrates an example of the user interface. A plurality of screens 510, 520, 530, and 540 are displayed on the user interface 500. For example, the upper screen contains a CT image (CT volume; e.g., an image transmitted from fluoroscopy), and masks showing various structures or lesions of the lung are displayed. There are also buttons for carrying out the surgical plan and an information window displaying procedure information or type.
On the main screen 510, the registered operating-room image is displayed in 3D such that the edge 11 of the surgical target is visually distinguished. In addition, MPR images (e.g., 520, 530, 540) that the computer generates from the operating-room image are displayed on the right. For example, in directions such as the axial view 520, the coronal view 530, and the sagittal view 540, the tumor 10 with its edge 11 distinguished, the insertion path 82, the entry point 41, and the expected arrival positions of the biopsy needle tip at the edge 11 (e.g., 21, 22, 23, 24, 26) are displayed.
The relative position of the biopsy needle 111 and the patient 50 can be determined by a method such as that described with reference to FIG. 8, and the registered tumor 10 is visually displayed on the operating-room image (e.g., 510) together with the tumor edge 11. Accordingly, the computer can calculate the distance between the target point on the edge 11 of the tumor and the biopsy needle 111, as well as the expected arrival position (e.g., 21, 22, 23, 24, 26), that is, where the needle will reach if inserted to the planned depth at the currently aligned angle. The computer can then display the calculated expected arrival position on the registered operating-room image, so that whether the target point and the expected arrival position 21, 22, 23, 24, 26 coincide can be confirmed visually. If they coincide, the operator gives an instruction to the computer 600 (see FIG. 3) through the master console, and the biopsy needle 111 is inserted into the body by the operation of the robot 100 linked to the computer 600. Meanwhile, as described above, the current operating-room image can be re-acquired by fluoroscopy, cone-beam CT, or the like, the position of the tip of the biopsy needle 111 can be calculated in the operating-room image space, and this can be compared with the target point of the registered tumor 10 to check again whether it coincides with the target point. If the expected arrival position and the target point do not coincide, the surgical plan can be modified. For example, the difference between the target point and the expected arrival position is displayed on the screen, and the computer can generate an instruction to correct the position of the robot; alternatively, the operator can modify the insertion path or depth and instruct the computer.
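The mismatch handling described above, displaying the difference between the target point and the expected arrival position and deriving a robot position correction, can be sketched as follows; the tolerance value is hypothetical:

```python
import numpy as np

def correction(target, expected_tip, tol_mm=1.0):
    """Deviation between target and expected tip, and the offset that would cancel it."""
    delta = np.asarray(target, float) - np.asarray(expected_tip, float)
    if np.linalg.norm(delta) <= tol_mm:
        return "aligned", np.zeros(3)            # within tolerance: proceed with insertion
    return "adjust", delta                       # move the robot by delta before inserting

state, offset = correction([30.0, 0.0, 40.0], [28.0, 0.5, 41.0])
print(state, offset)                             # adjust, offset (2, -0.5, -1) mm
```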
Hereinafter, various embodiments of the present disclosure will be described.
(1) A needle-insertion-type robot apparatus for interventional surgery, characterized by comprising: a computer that integrates surgical planning information into an operating-room image, the surgical planning information including a target point on the border of a heterogeneous surgical target and an insertion path of a needle-type medical instrument; a robot equipped with the needle-type medical instrument, the robot operating according to instructions from the computer so that the needle-type medical instrument follows the insertion path; and a user interface (UI) that, in cooperation with the computer, shows the border of the surgical target using the operating-room image into which the surgical planning information has been integrated, the user interface showing the expected arrival position of the tip of the needle-type medical instrument relative to the target point when the robot operates according to the surgical plan.
(2) The needle-insertion-type robot apparatus for interventional surgery, characterized in that the surgical plan includes a plurality of target points on the border of the surgical target, and the computer stores position-sample information matching the position of each target point in the surgical target with the sample of the surgical target obtained at that target point by the needle-type medical instrument.
(3) 사용자 인터페이스는: 환부의 단면을 보여주는 적어도 하나의 화면;으로서, 환부의 가장 자리 및 바늘형 의료 도구의 끝의 예상 도달 위치를 보여주는 화면;을 포함하는 것을 특징으로 하는 바늘 삽입형 중재시술 로봇 장치.(3) the user interface includes: at least one screen showing a cross section of the affected part, the screen showing the edge of the affected part and the expected arrival position of the end of the needle-shaped medical tool; Device.
(4) 환부와 바늘형 의료 도구의 상대적 위치 정보를 파악하여 컴퓨터에 제공하는 위치 파악 수단;을 포함하는 것을 특징으로 하는 바늘 삽입형 중재시술 로봇 장치.And (4) a positioning means for grasping relative position information of the affected part and the needle-type medical tool and providing the same to a computer.
(5) 컴퓨터는 목표점과 바늘형 의료 도구의 끝의 예상 도달 위치를 비교하여 불일치할 때, 일치시키기 위한 로봇의 위치 변경 정보를 사용자 인터페이스에 표시하는 것을 특징으로 하는 바늘 삽입형 중재시술 로봇 장치.(5) The needle inserted interventional robot device, characterized in that the computer compares the estimated arrival position of the target point with the tip of the needle-type medical instrument and displays the position change information of the robot for matching on the user interface when there is a mismatch.
(6) 적어도 하나의 화면은 환부의 엑시얼 뷰(axial view), 코로날 뷰(Coronal view), 세지털 뷰(sagittal view) 중 적어도 하나를 포함하며, 엑시얼 뷰, 코로날 뷰 및 세지털 뷰 중 적어도 하나에는 환부의 가장 자리, 바늘형 의료 도구의 끝의 예상 도달 위치가 표시된 것을 특징으로 하는 바늘 삽입형 중재시술 로봇 장치.(6) The at least one screen includes at least one of an axial view, a coronal view, and a sagittal view of the affected area, wherein the at least one screen includes an axial view, a coronal view, and a digital view. At least one of the views is a needle insertion interventional robot device, characterized in that the edge of the affected area, the expected arrival position of the end of the needle-shaped medical tool is displayed.
(7) 사용자 인터페이스는: 수술 계획이 통합된 3차원의 수술장 영상을 보여주는 추가의 화면;으로서, 환부의 가장 자리 및 바늘형 의료 도구의 끝의 예상 도달 위치를 보여주는 추가의 화면;을 포함하는 것을 특징으로 하는 바늘 삽입형 중재시술 로봇 장치.(7) the user interface includes: an additional screen showing a three-dimensional operating room image incorporating a surgical plan; an additional screen showing an edge of the affected part and an expected arrival position of the end of the needle-shaped medical tool; Needle insertion type interventional robot device, characterized in that.
(8) 사용자 인터페이스를 통해 변경된 수술 계획이 입력된 때, 컴퓨터는 변경된 예상 도달 위치를 계산하고, 사용자 인터페이스는 화면 및 추가의 화면에 변경된 예상 도달 위치를 표시하는 것을 특징으로 하는 바늘 삽입형 중재시술 로봇 장치.(8) When the changed surgical plan is input through the user interface, the computer calculates the changed estimated arrival position, and the user interface displays the changed estimated arrival position on the screen and additional screens. Device.
(9) 위치 파악 수단은: 바늘형 의료 도구 및 환자를 표지하는 마커; 그리고(9) The locating means includes: a marker for marking the needle-shaped medical tool and the patient; And
마커를 감지하는 감지 장치;를 포함하는 것을 특징으로 하는 바늘 삽입형 중재시술 로봇 장치.Needle insertion interventional robot device comprising a; sensing device for detecting a marker.
(10) 바늘형 의료 도구는 생검 바늘이고, 생검 바늘은 리볼버 타입으로 복수 개가 로봇에 장착되어, 순차적으로 각 목표점을 생체검사하는 것을 특징으로 하는 바늘 삽입형 중재시술 로봇 장치.(10) The needle-type medical instrument is a biopsy needle, the needle of the biopsy needle is a revolver type, a plurality of dogs mounted on the robot, the needle insertion interventional robot device, characterized in that to sequentially biopsy each target point.
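The sequential, revolver-type biopsy of embodiment (10) amounts to assigning each planned border target point to a loaded needle slot and visiting them in order. A minimal sketch, with a hypothetical function name and slot model that are not part of the disclosure:

```python
def revolver_biopsy_sequence(target_points, num_slots):
    """Map each planned border target point to a revolver needle slot,
    in firing order. Raises if more points are planned than needles
    are loaded in the revolver."""
    if len(target_points) > num_slots:
        raise ValueError("more target points than loaded biopsy needles")
    return list(enumerate(target_points))

# Three border targets, six-slot revolver: slots 0-2 are used in order.
plan = revolver_biopsy_sequence([(12.0, 4.0, 30.0),
                                 (14.5, 6.0, 31.0),
                                 (13.0, 8.5, 29.0)], num_slots=6)
```

Each (slot, point) pair would then drive one insertion along the planned path before the revolver advances to the next needle.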
According to one needle-insertion-type interventional robot apparatus of the present disclosure, a biopsy can be performed more accurately at the border of a heterogeneous tumor, and the errors and risks that arise in manual biopsy can be reduced.
According to another needle-insertion-type interventional robot apparatus of the present disclosure, the apparatus is particularly effective when the surgical target is biopsied at multiple spots.
According to yet another needle-insertion-type interventional robot apparatus of the present disclosure, performing biopsies at a plurality of target points and recording each sample in association with its location in the tumor makes it possible to build a map for establishing a drug or treatment plan according to the location within the tumor.
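The location-sample map described above can be sketched as a simple record store keyed by biopsy location. This is an illustrative sketch only; the class name, record layout, and query method are assumptions, not part of the disclosure:

```python
from dataclasses import dataclass, field

@dataclass
class TumorSampleMap:
    """Records each biopsy sample together with the target-point location
    on the tumor border where it was taken, so that tumor heterogeneity
    can later inform drug or treatment planning by location."""
    records: list = field(default_factory=list)

    def add(self, target_point, sample_id):
        """Store one sample with its location in image coordinates (mm)."""
        self.records.append({"location": tuple(target_point),
                             "sample": sample_id})

    def samples_near(self, point, radius_mm):
        """Return all recorded samples within radius_mm of a query point."""
        def dist(a, b):
            return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5
        return [r for r in self.records
                if dist(r["location"], point) <= radius_mm]

# Two samples from opposite sides of the border; query near the first.
tumor_map = TumorSampleMap()
tumor_map.add((0.0, 0.0, 0.0), "S1")
tumor_map.add((10.0, 0.0, 0.0), "S2")
nearby = tumor_map.samples_near((1.0, 0.0, 0.0), radius_mm=5.0)
```

A store of this shape is what allows a pathology result for each sample to be projected back onto its position in the tumor when building the treatment-planning map.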

Claims (10)

  1. A needle-insertion-type interventional robot apparatus, comprising:
    a computer that integrates surgical planning information into an operating-room image, the surgical planning information including a target point on the border of a heterogeneous surgical target and an insertion path of a needle-type medical tool;
    a robot equipped with the needle-type medical tool, the robot operating according to instructions from the computer so that the needle-type medical tool follows the insertion path; and
    a user interface (UI) that, in cooperation with the computer, shows the border of the surgical target using the operating-room image into which the surgical planning information has been integrated, the user interface showing the expected arrival position of the tip of the needle-type medical tool relative to the target point when the robot operates according to the surgical plan.
  2. The apparatus of claim 1, wherein
    the surgical plan includes a plurality of target points on the border of the surgical target, and
    the computer stores location-sample information matching the location of each target point within the surgical target with the sample of the surgical target obtained at that target point by the needle-type medical tool.
  3. The apparatus of claim 1, wherein
    the user interface includes:
    at least one view showing a cross section of the surgical target, the view showing the border of the surgical target and the expected arrival position of the tip of the needle-type medical tool.
  4. The apparatus of claim 1, further comprising
    localization means that determines the relative positions of the surgical target and the needle-type medical tool and provides this information to the computer.
  5. The apparatus of claim 1, wherein
    the computer compares the target point with the expected arrival position of the tip of the needle-type medical tool and, when they do not match, displays on the user interface the change in robot position required to bring them into agreement.
  6. The apparatus of claim 3, wherein
    the at least one view includes at least one of an axial view, a coronal view, and a sagittal view of the surgical target, and
    at least one of the axial, coronal, and sagittal views displays the border of the surgical target and the expected arrival position of the tip of the needle-type medical tool.
  7. The apparatus of claim 3, wherein
    the user interface includes:
    an additional view showing a three-dimensional operating-room image into which the surgical plan has been integrated, the additional view showing the border of the surgical target and the expected arrival position of the tip of the needle-type medical tool.
  8. The apparatus of claim 7, wherein,
    when a modified surgical plan is entered through the user interface, the computer calculates the updated expected arrival position, and the user interface displays the updated position on the view and the additional view.
  9. The apparatus of claim 4, wherein
    the localization means includes:
    markers labeling the needle-type medical tool and the patient; and
    a sensing device that detects the markers.
  10. The apparatus of claim 2, wherein
    the needle-type medical tool is a biopsy needle, and
    a plurality of biopsy needles are mounted on the robot in a revolver arrangement so that the target points are biopsied sequentially.
PCT/KR2014/009839 2014-10-17 2014-10-20 Needle insertion type robot apparatus for interventional surgery WO2016060308A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR1020140140896A KR101862133B1 (en) 2014-10-17 2014-10-17 Robot apparatus for interventional procedures having needle insertion type
KR10-2014-0140896 2014-10-17

Publications (1)

Publication Number Publication Date
WO2016060308A1 true WO2016060308A1 (en) 2016-04-21

Family

ID=55746837

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2014/009839 WO2016060308A1 (en) 2014-10-17 2014-10-20 Needle insertion type robot apparatus for interventional surgery

Country Status (2)

Country Link
KR (1) KR101862133B1 (en)
WO (1) WO2016060308A1 (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR102168462B1 (en) * 2018-06-12 2020-10-21 경북대학교 산학협력단 Surgical navigation device, navigation surgery system and method using the device
KR102467282B1 (en) 2019-12-31 2022-11-17 주식회사 코어라인소프트 System and method of interventional procedure using medical images
CN115279267A (en) * 2020-04-02 2022-11-01 氧气医疗株式会社 Automatic body intrusion device

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH06165783A (en) * 1992-11-30 1994-06-14 Olympus Optical Co Ltd Optical diagnostic device
US20090149867A1 (en) * 2006-06-05 2009-06-11 Daniel Glozman Controlled steering of a flexible needle
WO2010110560A2 (en) * 2009-03-24 2010-09-30 주식회사 래보 Surgical robot system using augmented reality, and method for controlling same
WO2011040769A2 (en) * 2009-10-01 2011-04-07 주식회사 이턴 Surgical image processing device, image-processing method, laparoscopic manipulation method, surgical robot system and an operation-limiting method therefor
US20110280810A1 (en) * 2010-03-12 2011-11-17 Carl Zeiss Meditec, Inc. Surgical optical systems for detecting brain tumors

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
BR9609484A (en) * 1995-07-16 1999-12-14 Yoav Paltieli Process and apparatus for freehand targeting of a needle towards a target located in a body volume and needle apparatus

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2021518243A (en) * 2018-03-17 2021-08-02 キヤノン ユーエスエイ, インコーポレイテッドCanon U.S.A., Inc Methods for Virtual Device Placement on Skin Surface in 3D Medical Imaging Data
JP7252268B2 (en) 2018-03-17 2023-04-04 キヤノン ユーエスエイ,インコーポレイテッド A method for virtual device placement on the skin surface in 3D medical image data
CN113133813A (en) * 2021-04-01 2021-07-20 上海复拓知达医疗科技有限公司 Dynamic information display system and method based on puncture process
WO2022206416A1 (en) * 2021-04-01 2022-10-06 上海复拓知达医疗科技有限公司 Puncture process-based dynamic information display system and method

Also Published As

Publication number Publication date
KR101862133B1 (en) 2018-06-05
KR20160046012A (en) 2016-04-28

Similar Documents

Publication Publication Date Title
US11974865B2 (en) System and method of providing distance and orientation feedback while navigating in 3D
US20200146588A1 (en) Apparatuses and methods for endobronchial navigation to and confirmation of the location of a target tissue and percutaneous interception of the target tissue
US11871913B2 (en) Computed tomography enhanced fluoroscopic system, device, and method of utilizing the same
EP3289964B1 (en) Systems for providing proximity awareness to pleural boundaries, vascular structures, and other critical intra-thoracic structures during electromagnetic navigation bronchoscopy
CN106659373B (en) Dynamic 3D lung map view for tool navigation inside the lung
EP3133983B1 (en) Apparatuses and methods for registering a real-time image feed from an imaging device to a steerable catheter
US10123841B2 (en) Method for generating insertion trajectory of surgical needle
JP2022524443A (en) Guidance and tracking system for templated and targeted biopsies and treatments
WO2016060308A1 (en) Needle insertion type robot apparatus for interventional surgery
WO2011058516A1 (en) Systems & methods for planning and performing percutaneous needle procedures
WO2017043926A1 (en) Guiding method of interventional procedure using medical images, and system for interventional procedure therefor
US20210235983A1 (en) Marker placement
US20220379008A1 (en) Localization needle
EP3783568A2 (en) Systems and methods of fluoro-ct imaging for initial registration
WO2017043924A1 (en) Guiding method of interventional procedure using medical images, and system for interventional procedure therefor
KR102467282B1 (en) System and method of interventional procedure using medical images
WO2016056838A1 (en) Medical navigation device
JP7221190B2 (en) Structural masking or unmasking for optimized device-to-image registration
He et al. A minimally invasive multimodality image-guided (MIMIG) molecular imaging system for peripheral lung cancer intervention and diagnosis
WO2024079639A1 (en) Systems and methods for confirming position or orientation of medical device relative to target
Rai et al. Fluoroscopic image-guided intervention system for transbronchial localization
WO2014175608A1 (en) Method for comparing preoperative respiratory level with intraoperative respiratory level

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 14904207

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

32PN Ep: public notification in the ep bulletin as address of the adressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 11/09/2017)

122 Ep: pct application non-entry in european phase

Ref document number: 14904207

Country of ref document: EP

Kind code of ref document: A1