KR20160046012A - Robot apparatus for interventional procedures having needle insertion type - Google Patents

Robot apparatus for interventional procedures having needle insertion type

Info

Publication number
KR20160046012A
Authority
KR
South Korea
Prior art keywords
needle
computer
surgical
robot
tumor
Prior art date
Application number
KR1020140140896A
Other languages
Korean (ko)
Other versions
KR101862133B1 (en)
Inventor
박창민
김남국
Original Assignee
서울대학교병원
울산대학교 산학협력단
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 서울대학교병원, 울산대학교 산학협력단 filed Critical 서울대학교병원
Priority to KR1020140140896A priority Critical patent/KR101862133B1/en
Priority to PCT/KR2014/009839 priority patent/WO2016060308A1/en
Publication of KR20160046012A publication Critical patent/KR20160046012A/en
Application granted granted Critical
Publication of KR101862133B1 publication Critical patent/KR101862133B1/en

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B10/00 Other methods or instruments for diagnosis, e.g. instruments for taking a cell sample, for biopsy, for vaccination diagnosis; Sex determination; Ovulation-period determination; Throat striking implements
    • A61B10/02 Instruments for taking cell samples or for biopsy
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B17/00 Surgical instruments, devices or methods, e.g. tourniquets
    • A61B17/34 Trocars; Puncturing needles
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00 Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges

Abstract

The present invention relates to a needle insertion type interventional robot apparatus comprising: a computer that integrates surgical planning information, including a target point on the edge of a heterogeneous surgical target and an insertion path of a needle-like medical tool, into a surgical field image; a robot carrying the needle-like medical tool, which operates according to instructions from the computer so that the needle-like medical tool follows the insertion path; and a user interface (UI) linked to the computer, which displays the edge of the lesion using the surgical field image integrated with the surgical plan information and shows the expected arrival position of the tip of the needle-like medical tool relative to the target point when the robot is operated according to the surgical plan.

Description

Technical Field [0001] The present invention relates to a needle insertion type interventional procedure robot apparatus.

The present disclosure relates generally to a needle insertion type interventional robotic device, and more particularly to a needle insertion type interventional robotic device that effectively performs a biopsy at a target point on the edge of a heterogeneous lesion.

The background art related to the present disclosure is provided herein; it is not necessarily admitted to be known art.

Medical imaging based biopsy is an interventional procedure that extracts the samples necessary for the pathological diagnosis of an abnormal lesion while minimizing damage to the surrounding normal tissue. It is widely applied to various parts of the body, including the adrenal glands, pancreas, abdominal organs, lungs, mediastinum, spine, and limbs. A medical image-based biopsy can localize the lesion three-dimensionally using a high-resolution image and can track the biopsy needle after it has entered the tissue, so even small lesions are easy to target.

In a medical image-based biopsy, the insertion path of the biopsy needle can be guided using a CT or C-arm fluoroscopy apparatus and its images. For example, the insertion angle and insertion point of the biopsy needle are determined in advance so that the insertion path can be accurately planned. When the patient enters the procedure field and the procedure begins, the image acquisition device (e.g., a fluoroscopy device or a CBCT device) placed in the procedure field is set to the same orientation as the planned path, i.e., the orientation in which the biopsy needle is to be inserted.

A navigation view is used to accurately guide the biopsy needle during the biopsy procedure. For example, in a navigation view such as the Surgeon's Eye View shown in FIG. 1, when the biopsy needle pierces the entry point, the center point of the target is seen along the viewing axis, and a properly aligned biopsy needle appears as a single point. In this navigation view, the target is represented by one point with a circle drawn around it. The operator can then plan how many millimeters to advance the needle and at what angle, according to the planned insertion path.

In recent years, however, it has been hypothesized that tumor biology (e.g., DNA mutation, malignancy) is heterogeneous depending on the location within the tumor rather than uniform across the tumor. Accordingly, exactly where in the tumor a sample was taken has become an important issue for the diagnosis of the tumor, the prediction of its response to therapy, and the prediction of the patient's prognosis. For example, when active tumor cells are located at the edge of a tumor and necrosis is present in its interior, a biopsy needle inserted to the center of the tumor may yield a false-negative result and lead to misdiagnosis. Therefore, to biopsy a tumor with such heterogeneity, the operator may intentionally try to target the outer portion of the tumor based on experience with fluoroscopic images. However, it is not easy for a practitioner even to hit the center of a tumor as planned, and it is technically far more difficult to accurately biopsy tumor cells located at the edge of the tumor.

In addition, for a heterogeneous tumor it is medically very valuable to perform a multi-spot biopsy at a plurality of target points and to create a map indicating the nature of the tissue according to position within the tumor by matching each biopsy location with the characteristics of its sample; however, performing such a multi-spot biopsy relying only on the practitioner's experience is even more difficult.

FIG. 2 shows an example of a navigation screen for an ablation procedure disclosed in U.S. Patent Publication No. 2013/0317363, showing the target 138b of the ablation procedure and the expected treatment range 138a. It gives no consideration to the heterogeneity of the target and discloses no effective guiding method for bringing the medical tool to the edge of the target.

This will be described later in the detailed description of the invention.

SUMMARY OF THE INVENTION Herein, a general summary of the present disclosure is provided, which should not be construed as limiting the scope of the present disclosure or as an exhaustive description of its features.

According to one aspect of the present disclosure, there is provided a needle insertion interventional robotic device comprising: a computer that integrates surgical planning information, including a target point on the edge of a heterogeneous surgical target and an insertion path of a needle-like medical tool, into a surgical field image; a robot carrying the needle-like medical tool, which operates according to instructions from the computer so that the needle-like medical tool follows the insertion path; and a user interface (UI) linked to the computer, which displays the edge of the lesion using the surgical field image integrated with the surgical plan information and shows the expected arrival position of the tip of the needle-like medical tool relative to the target point when the robot is operated according to the surgical plan.

This will be described later in the detailed description of the invention.

FIG. 1 is a view showing an example of the Surgeon's Eye View,
FIG. 2 is a diagram showing an example of a navigation screen for the ablation procedure disclosed in U.S. Patent Application Publication No. 2013/0317363,
FIG. 3 is a view for explaining an example of a needle insertion type interventional procedure robot apparatus according to the present disclosure,
FIG. 4 is a view for explaining an example of a method of segmenting a tumor and generating a surgical plan in a preoperative image,
FIG. 5 is a diagram illustrating an example of a surgical plan including a plurality of biopsy target points at the edge of a tumor together with an insertion path and an insertion point,
FIG. 6 is a view illustrating an example of a preoperative image in which a tumor and an insertion path are visualized,
FIG. 7 is a view for explaining an example of a method of integrating a surgical plan into a surgical field image,
FIG. 8 is a view for explaining an example of positioning means for grasping the relative positional information of a patient and a biopsy needle,
FIG. 9 is a view for explaining an example of a user interface, and
FIG. 10 is a view for explaining an example of a robot equipped with biopsy needles of a revolver type.

The present disclosure will now be described in detail with reference to the accompanying drawings.

FIG. 3 is a view for explaining an example of a needle insertion type interventional procedure robot apparatus according to the present disclosure. The needle insertion type interventional procedure robot apparatus (hereinafter, the interventional procedure robot apparatus) can be used as a diagnostic and therapeutic needle insertion type image-guided intervention robot system. The interventional robotic device can be used for the biopsy and treatment of lesions on the order of 1 cm in the abdomen, chest, and the like. Needle-like medical tools include biopsy needles.

For example, the interventional robotic device includes a computer 600 that processes or generates medical images, a robot 100 that operates in cooperation with the computer, and a user interface 500 that displays and guides the expected arrival position of the tip of the biopsy needle 111 at the edge of the tumor. The interventional robotic device may further include a master device 200 for controlling the robot 100 in real time in cooperation with the user interface 500, a medical image capturing apparatus 300 for imaging the position of the biopsy needle 111 inside the human body, and an apparatus 400 for monitoring the position and posture of the robot 100, the patient 50, and peripheral devices.

FIG. 4 is a view for explaining an example of a method of segmenting a tumor and generating a surgical plan in a preoperative image. The interventional procedure robotic device can be applied to biopsies of organs such as the lung, kidney, and liver, and application to parts other than organs is not excluded. In this example, the lung is mainly described.

As shown in FIG. 4, the preoperative image of the patient's lungs is thresholded to segment the lesion 10 (e.g., a tumor) and create a surgical plan. For example, after volumetric chest CT images (lung images) are acquired, the lung image is segmented and a segmented lung image is prepared. As a result of the segmentation, the anatomical structures contained in the lung image (e.g., blood vessels, ribs, airways, lung boundaries) can be extracted as three-dimensional collections of voxels, and the blood vessels, ribs, airways, and the like may be stored as a lung mask, a vessel mask, a rib mask, an airway mask, and so on. The tumor 10 is segmented by a segmentation technique (e.g., adaptive thresholding) using an HU value appropriate for the tumor 10 as the threshold. FIG. 4 shows an example of an axial section of a lung in which the tumor 10 has been segmented.
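The thresholding-based segmentation step described above can be sketched in a few lines. The following is a minimal illustration in NumPy/SciPy, not the patent's actual implementation: the HU window, volume shape, and toy tumor are invented for the example.

```python
import numpy as np
from scipy import ndimage

def segment_lesion(hu_volume, hu_low=-50.0, hu_high=100.0):
    """Threshold a CT volume (Hounsfield units) and keep the largest
    connected component as the candidate lesion mask.
    The HU window is an illustrative assumption, not a clinical value."""
    mask = (hu_volume >= hu_low) & (hu_volume <= hu_high)
    labels, n = ndimage.label(mask)
    if n == 0:
        return np.zeros_like(mask)
    sizes = ndimage.sum(mask, labels, index=range(1, n + 1))
    return labels == 1 + int(np.argmax(sizes))

# Toy volume: a soft-tissue "tumor" block surrounded by air (-1000 HU).
vol = np.full((20, 20, 20), -1000.0)
vol[5:12, 5:12, 5:12] = 40.0        # tumor-like intensities
lesion = segment_lesion(vol)
print(int(lesion.sum()))            # 7 * 7 * 7 = 343 voxels
```

In practice an adaptive threshold, as the text mentions, would replace the fixed window, and the resulting mask would be stored alongside the lung, vessel, rib, and airway masks.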

In the computer 600, the preoperative image of the patient is loaded, and the surgical field image obtained in the procedure field and the preoperative image of the patient are registered by the computer. As a result of the registration, the surgical plan made using the preoperative image, including the insertion paths 82 and 84, the insertion point 41, and the target points on the tumor, is transferred to the surgical field image. This will be further described below.

FIG. 5 is a view for explaining an example of a surgical plan including a plurality of biopsy target points at the edge of a tumor together with an insertion path and an insertion point.

As described above, the tumor 10 may have active cancer cells at its edge 11 or outer wall, while necrotic fluid may be present inside the tumor 10. Therefore, when the image intensity of the interior of the tumor 10 differs from that of its edge 11, the edge 11 can be segmented separately from the interior 15 of the tumor. Alternatively, FDG-PET/CT images can be used to distinguish metabolically active sites from sites with low FDG uptake according to the metabolic characteristics of the tumor 10; if the standardized uptake value (SUV) is thresholded, the tumor 10 can be divided as shown in FIG. 5. Alternatively, the entire tumor 10 is segmented first, and the peripheral portion of the tumor 10 is then defined by segmentation using a distance map or a morphological operator such as erosion applied from the boundary of the tumor 10.
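The morphological alternative mentioned above (defining the edge by eroding inward from the tumor boundary) might look like this minimal sketch; the rim thickness and toy mask are illustrative assumptions.

```python
import numpy as np
from scipy import ndimage

def tumor_rim(tumor_mask, rim_voxels=2):
    """Define the tumor edge by peeling `rim_voxels` layers off the
    mask with binary erosion; the mask minus its eroded core is the rim."""
    core = ndimage.binary_erosion(tumor_mask, iterations=rim_voxels)
    return tumor_mask & ~core

tumor = np.zeros((15, 15, 15), dtype=bool)
tumor[3:12, 3:12, 3:12] = True      # toy 9 x 9 x 9 tumor block
rim = tumor_rim(tumor, rim_voxels=2)
print(int(tumor.sum()), int(rim.sum()))   # 729 voxels total, 604 on the rim
```

A distance-map variant (`scipy.ndimage.distance_transform_edt` on the mask, thresholded at the desired rim thickness) yields the same kind of edge region with finer control over thickness.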

The tumor 10 thus segmented can be rendered as a three-dimensional image. A section of the tumor 10 can then be viewed in any direction required by the image processing software; on that section, the tumor 10 is visualized separately from its surroundings, and the edge of the tumor 10 is distinguished from its interior. For example, the tumor 10 can be viewed in representative orientations such as an axial view, a coronal view, and a sagittal view.

As shown in FIG. 5, a plurality of biopsy target points (e.g., 21, 22, 23, 24, and 26) are placed on the edge 11 of the tumor. As described above, the edge 11 of the tumor can be separated from the interior and the periphery by segmentation. The interior of the tumor 10 may be necrotic, and active cancer cells may be distributed along the edge 11. Target points can be set at various positions, on the edge 11 as well as in the interior of the tumor 10. Since the tumor 10 may be heterogeneous and its DNA mutations may differ by location, the effect of a specific drug or treatment may also differ depending on the location within the tumor 10. Therefore, if the biopsy is performed at only one point, cancer cells elsewhere may remain alive and cause recurrence. The thickness of the edge 11 may be statistically approximated according to the size of the tumor 10; if the tumor 10 is as large as 2 centimeters, the surgeon plans not only a biopsy of the center of the tumor 10 but also multiple biopsy targets on the edge 11. If the biopsy needle 111 has sub-millimeter accuracy, the surgery can be planned so that, for a tumor 10 about 20 millimeters wide, the edge 11 can be targeted with an accuracy of about 1 millimeter by 1 millimeter.

A biopsy is preferably performed at a plurality of target points in the tumor 10, after which a map of the tumor 10 matching the properties and positions of the samples is made, which is medically meaningful. An insertion path can be made to reach each target point; for example, each insertion path is created so that the blood vessels or other structures it intersects are minimized. The insertion point of the biopsy needle 111 on the patient's skin is then determined. The number of insertion points may be smaller than the number of target points; for example, after the biopsy needle 111 is inserted, its direction can be changed without completely removing it, to biopsy another target point.
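The idea of preferring insertion paths that cross as few vessels or other structures as possible can be illustrated with a simple line-sampling cost; the voxel masks, sampling density, and coordinates below are hypothetical, not from the patent.

```python
import numpy as np

def path_cost(entry, target, vessel_mask, n_samples=200):
    """Count the distinct vessel-mask voxels crossed by the straight
    needle line from the skin entry point to the target (voxel coords)."""
    entry = np.asarray(entry, dtype=float)
    target = np.asarray(target, dtype=float)
    ts = np.linspace(0.0, 1.0, n_samples)
    pts = np.round(entry + ts[:, None] * (target - entry)).astype(int)
    return len({tuple(p) for p in pts if vessel_mask[tuple(p)]})

vessels = np.zeros((30, 30, 30), dtype=bool)
vessels[:, 15, 15] = True            # one vessel running along axis 0
safe = path_cost((0, 0, 0), (29, 0, 0), vessels)       # line misses the vessel
risky = path_cost((0, 15, 15), (29, 15, 15), vessels)  # line runs along it
print(safe, risky)
```

Ranking candidate paths by such a cost (possibly extended with rib and airway masks) selects the path with the fewest crossed structures.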

FIG. 6 is a diagram illustrating an example of a preoperative image in which a tumor and an insertion path are visualized; an insertion path (e.g., 82) passing between actual ribs is visualized in 3D. The plurality of target points, the insertion point, and the insertion path determined as described above are added to the preoperative image to generate the surgical plan. The preoperative image is a three-dimensional image, and through volume rendering, as shown in FIG. 6, the surgical plan can be generated in three dimensions. The tumor 10 is segmented from its surroundings, and the edge 11 is marked so as to be distinct. The insertion path is visualized in three dimensions, and a target point (e.g., 21) is displayed on the tumor edge 11.

The tumor 10 is almost invisible on fluoroscopy because there is almost no contrast, and the tumor 10 is generally shown as a simple circle. In this example, however, the edge 11 of the tumor 10 is segmented in the preoperative image, so that it appears on the surgical field image.

Even if the edge 11 of the tumor cannot be distinguished in the image, as long as the tumor 10 is clearly separated from its surroundings, a band of a certain thickness inward from the border of the tumor 10 can be assumed to be the edge 11.

FIG. 7 is a view for explaining an example of a method of integrating a surgical plan into a surgical field image. The surgical field image is acquired in the procedure field, the preoperative image and the surgical field image are registered, and the surgical plan is transferred. Rigid registration and deformable registration methods can be used to register the medical images.
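Once registration yields a transform between the two image spaces, transferring the plan amounts to mapping its coordinates through that transform. A sketch for the rigid case, assuming the registration step has already produced a 4 x 4 homogeneous matrix (the matrix and points below are invented):

```python
import numpy as np

def transfer_plan(points, T):
    """Map N x 3 plan coordinates (preoperative image space) through a
    4 x 4 homogeneous transform into surgical field image space."""
    pts = np.asarray(points, dtype=float)
    homog = np.hstack([pts, np.ones((len(pts), 1))])
    return (homog @ T.T)[:, :3]

# Invented rigid transform: 90-degree rotation about z plus a translation.
T = np.array([[0.0, -1.0, 0.0, 10.0],
              [1.0,  0.0, 0.0,  5.0],
              [0.0,  0.0, 1.0,  0.0],
              [0.0,  0.0, 0.0,  1.0]])
plan_points = [(1.0, 0.0, 0.0), (0.0, 2.0, 3.0)]  # e.g. a target and an entry point
print(transfer_plan(plan_points, T))
```

A deformable registration would replace the single matrix with a displacement field, but the plan-transfer step is conceptually the same.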

The insertion path 82 may be modified via the user interface 500, and an inappropriate insertion path may be eliminated in consideration of breathing or motion. FIG. 7(a) is an example of a preoperative image, and FIG. 7(b) is an example of an image in which the surgical plan has been transferred onto an image in which the surgical field image and the preoperative image are registered.

To confirm the 3D-visualized insertion path 82 and the target points on the tumor edge 11 more reliably, the insertion path 82, the insertion point, and the target points can be overlaid and displayed on a multiplanar reconstruction (MPR; e.g., axial view, coronal view, sagittal view). An axial view is illustrated in FIG. 7.

In this way, the biopsy needle 111 is guided along the insertion path identified on the MPR, so that the procedure can be performed. For example, the finally agreed-upon insertion path is transmitted to the robot or to a user interface (e.g., a navigation device) using TCP/IP or a proprietary communication protocol. The biopsy needle 111 can of course be of a single-needle type. However, to perform a multi-spot biopsy, it is more effective to mount a plurality of needles on the robot in a revolver type (see FIG. 10) and sequentially biopsy each target point. Meanwhile, to reduce the number of times the biopsy needle 111 is fully withdrawn from the lung and reinserted, the direction of the insertion path may be changed without pulling the biopsy needle 111 all the way out, so as to reach the other target points shown in the figure.

FIG. 8 is a view for explaining an example of positioning means for grasping the relative positional information of a patient and a biopsy needle.

Various types of positioning means can be used to grasp the relative positional relationship between the patient 960 and the biopsy needle 912. As shown in FIG. 8, a patient 960, a robot 911 with a biopsy needle 912, an infrared camera 991, infrared reflector assemblies 911, 913, and 914, a monitor 920, and a computer 940 are provided. The infrared camera 991 tracks a plurality of infrared reflectors 911 and 914 indicating the position of the patient 960 and a plurality of infrared reflectors or infrared emitters 913 provided at the end of the biopsy needle 912, so that the positions of the biopsy needle 912 and the patient 960 can be grasped. A computer 940 for the overall operation of the master console is provided, and a monitor 920 is also provided. The computer 940 and the monitor 920 may correspond to the computer 600 and the user interface 500 described above.

When the relative positional relationship of the patient 960 and the needle 912 is utilized, the computer 940 also functions as a surgical navigation device. The biopsy needle 912 of the robot 911 is operated by the computer 940 according to the operator's adjustment of the master 200 (see FIG. 3). The infrared reflector assembly 911 is fixed to the patient 960 to indicate the position of the patient 960, the infrared reflector assembly 913 is fixed to the biopsy needle 912 to indicate the position of the biopsy needle 912, and the infrared reflector assembly 914 is located on the chest of the patient 960 to indicate patient movement such as breathing or sneezing. Although an infrared camera and infrared reflectors are used here as position sensing means, a magnetic field can also be used, and any means capable of positioning can be used. As an example, it is possible to attach a magnetic sensor to the biopsy needle and track how much it moves.

The infrared reflector assembly 911 may be used to indicate the position information of the patient 960 and may serve as the reference position of the entire system. It may be fixed to the patient 960, but it may instead be fixed to the operating table, or an additional infrared reflector assembly (not shown) functioning as the reference position may be added. In any case, the position of the biopsy needle 912 with respect to the patient 960 can be grasped.

Unlike the above examples, referring to FIG. 3, it is also possible for the robot 100 itself to know its own position. For example, the robot 100 holds the biopsy needle 111, and the robot 100 itself can know its coordinates in the procedure field. In addition, the robot 100 itself can detect how many millimeters the biopsy needle 111 has moved. Therefore, the computer can calculate the orientation and position of the biopsy needle 111 in the space of the procedure field image.

In addition, the computer can calculate the current position of the biopsy needle 111 in the registered surgical field image space by imaging it with the fluoroscopy device with which the procedure field image was acquired.

From the viewpoints of accuracy and safety, it is preferable that the positioning means grasp the positional relationship using a plurality of methods rather than only one. The distance between the biopsy needle 111 and a target point on the tumor edge 11 can be calculated by the computer from the relative positional relationship between the patient and the biopsy needle 111 captured by the one or more positioning means. Therefore, when the needle is inserted with the insertion path, insertion angle, insertion point, and insertion distance determined in the surgical plan, the expected arrival position of the tip of the biopsy needle 111 is calculated by the computer. The expected arrival position is displayed on the registered surgical field image and gives information to the operator. This will be further described below.
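The computer's calculation of the expected arrival position from the planned insertion point, angle, and depth reduces to a point-plus-scaled-direction computation; a minimal sketch with invented coordinates:

```python
import numpy as np

def expected_tip(entry_point, direction, depth_mm):
    """Predict where the needle tip arrives if inserted from the entry
    point along the planned direction for the planned depth."""
    d = np.asarray(direction, dtype=float)
    d = d / np.linalg.norm(d)                # unit insertion direction
    return np.asarray(entry_point, dtype=float) + depth_mm * d

tip = expected_tip(entry_point=(0.0, 0.0, 0.0),
                   direction=(3.0, 0.0, 4.0),   # 3-4-5 direction, invented
                   depth_mm=50.0)
print(tip)    # 50 mm along the unit vector (0.6, 0, 0.8)
```

Comparing this predicted point against each planned target point gives the distances the interface displays to the operator.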

FIG. 9 is a view for explaining an example of a user interface. In the user interface 500, a plurality of screens 510, 520, 530, and 540 are displayed. For example, the upper screen contains a CT volume (e.g., an image delivered from a fluoroscope) and masks showing various structures or lesions of the lung. There is also an information window for displaying buttons for executing the surgical plan and information on the procedure or its type.

In the main screen 510, a registered surgical field image in which the edge 11 of the lesion is visually distinguished is displayed in 3D. In addition, MPR images (e.g., 520, 530, 540) generated by the computer from the surgical field image are displayed on the right side. For example, the tumor 10 with its edge 11 separated is shown in orientations such as an axial view, a coronal view 520, and a sagittal view 530, and the insertion path 82, the insertion point 41, and the expected arrival positions (e.g., 21, 22, 23, 24, and 26) of the needle tip on the edge 11 are displayed.

The relative position of the biopsy needle 111 and the patient 50 can be known in the same manner as described with reference to FIG. 8, and the surgical field image (e.g., 510) visually shows the registered tumor 10, on which the edge 11 appears. Thus, the computer can calculate the distance between a target point on the edge 11 of the tumor and the biopsy needle 111, as well as the expected arrival position (e.g., 21, 22, 23, 24, 26) the needle would reach if inserted to the planned depth at the currently aligned angle. The computer can then display the calculated expected arrival position on the registered surgical field image, so that whether the target point and the expected arrival position 21, 22, 23, 24, 26 coincide can be recognized visually. If they coincide, the operator instructs the computer 600 (see FIG. 3) via the master console, and the biopsy needle 111 is inserted into the human body by the operation of the robot 100 interlocked with the computer 600. By re-imaging the current surgical field with fluoroscopy, cone beam CT, or the like as described above, the position of the tip of the biopsy needle 111 can be calculated in the surgical field image space, and it can be checked again whether it coincides with the target point on the tumor 10. If the expected arrival position and the target point do not match, the surgical plan can be modified. For example, the screen displays the difference between the target point and the expected arrival position, and the computer can issue an instruction to correct the robot's position, or the operator can instruct the computer to modify the insertion path or depth.
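The coincidence check between the expected arrival position and the target point, and the correction the computer could suggest on a mismatch, can be sketched as follows; the 1 mm tolerance and the coordinates are assumptions for illustration.

```python
import numpy as np

def check_alignment(expected_tip, target_point, tol_mm=1.0):
    """Compare the predicted tip position with the planned target; return
    (within tolerance?, correction vector the robot would need)."""
    delta = np.asarray(target_point, float) - np.asarray(expected_tip, float)
    return bool(np.linalg.norm(delta) <= tol_mm), delta

ok, delta = check_alignment((30.0, 0.0, 40.0), (30.5, 0.0, 40.0))
print(ok, delta)     # 0.5 mm off: within tolerance, small correction vector
bad, _ = check_alignment((30.0, 0.0, 40.0), (35.0, 0.0, 40.0))
print(bad)           # 5 mm off: the plan needs modification
```

The correction vector is the quantity the interface would display as the difference between the target point and the expected arrival position.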

Various embodiments of the present disclosure will be described below.

(1) A needle insertion type interventional surgical robot apparatus comprising: a computer that integrates surgical planning information, including a target point on the edge of a heterogeneous surgical target and an insertion path of a needle-like medical tool, into a surgical field image; a robot carrying the needle-like medical tool, which operates according to instructions from the computer so that the needle-like medical tool follows the insertion path; and a user interface (UI) linked to the computer, which displays the edge of the lesion using the surgical field image integrated with the surgical plan information and shows the expected arrival position of the tip of the needle-like medical tool relative to the target point when the robot is operated according to the surgical plan.

(2) The surgical plan includes a plurality of target points on the edge of the lesion, and the computer stores lesion position-sample information matching the position of each target point in the lesion with the sample of the lesion obtained at each target point by the needle-like medical tool.

(3) The user interface comprises at least one screen showing a cross section of the affected part, the screen showing the edge of the affected part and the expected arrival position of the tip of the needle-like medical tool.

(4) The apparatus further comprises positioning means for grasping the relative positional information of the affected part and the needle-like medical tool and providing the information to the computer.

(5) The computer compares the expected arrival position of the tip of the needle-like medical tool with the target point and, in case of a mismatch, displays position change information of the robot for bringing them into coincidence on the user interface.

(6) The at least one screen includes at least one of an axial view, a coronal view, and a sagittal view of the lesion, and at least one of the axial view, the coronal view, and the sagittal view shows the edge of the affected part and the expected arrival position of the tip of the needle-like medical tool.

(7) The user interface includes an additional screen showing the surgical plan integrated with the three-dimensional surgical field image, the additional screen also showing the position of the lesion edge and the expected arrival position of the tip of the needle-like medical tool.

(8) When a modified surgical plan is input through the user interface, the computer calculates the changed expected arrival position, and the user interface displays the changed expected arrival position on the screen and the additional screen.

(9) The positioning means comprises: markers for marking the needle-like medical tool and the patient; and a sensing device for sensing the markers.

(10) The needle-like medical tool is a biopsy needle, and a plurality of biopsy needles are mounted on the robot in a revolver type.

According to one needle insertion type interventional treatment robot apparatus according to the present disclosure, a biopsy can be performed more accurately at the edge of a tumor having heterogeneity, and the errors and risks that arise when a human performs the biopsy can be reduced.

According to another needle insertion type interventional procedure robotic device according to the present disclosure, a multi-spot biopsy of the affected part can be performed more effectively.

According to another needle insertion type interventional treatment robot apparatus according to the present disclosure, a biopsy is performed at a plurality of target points and the samples are recorded in relation to their positions in the tumor, so that a map for establishing a drug or treatment plan according to position within the tumor can be created.

100: robot 111: biopsy needle
300: medical image capturing apparatus 500: user interface
600: computer 10: lesion 11: edge
21, 22, 23, 24, 26: target point, estimated arrival position

Claims (10)

1. A needle insertion type interventional procedure robotic device,
A computer that integrates surgical planning information, including a target point on the edge of a heterogeneous surgical target and an insertion path of a needle-like medical tool, into a surgical field image;
A robot carrying the needle-like medical tool, the robot operating in accordance with instructions from the computer such that the needle-like medical tool follows the insertion path; And
A user interface (UI) linked to the computer for displaying the edge of a lesion using the surgical field image integrated with the surgical planning information, the user interface showing the expected arrival position of the end of the needle-like medical tool relative to the target point when the robot is operated according to the surgical plan.
The device according to claim 1,
The surgical plan includes a plurality of target points on the edge of the lesion,
Wherein the computer stores lesion position-sample information matching the position of each target point in the lesion with the sample of the lesion acquired at each target point by the needle-like medical tool.
The device according to claim 1,
The user interface is:
At least one screen showing a cross section of the affected part, wherein the screen shows an edge of the affected part and an expected arrival position of the end of the needle-like medical instrument.
4. The apparatus according to claim 1, further comprising:
a position detecting unit that detects relative position information between the lesion and the needle-shaped medical tool and provides the information to the computer.
5. The apparatus according to claim 1, wherein the computer compares the target point with the expected arrival position of the tip of the needle-shaped medical tool and, when they do not match, displays on the user interface the position change information of the robot required to match them.
6. The apparatus according to claim 3, wherein:
the at least one screen includes at least one of an axial view, a coronal view, and a sagittal view of the lesion; and
at least one of the axial view, the coronal view, and the sagittal view shows the edge of the lesion and the expected arrival position of the tip of the needle-shaped medical tool.
7. The apparatus according to claim 3, wherein the user interface comprises:
an additional screen showing a three-dimensional surgical field image into which the surgical plan is integrated, the additional screen showing the position of the lesion and the expected arrival position of the tip of the needle-shaped medical tool.
8. The apparatus according to claim 7, wherein, when a changed surgical plan is input through the user interface, the computer calculates a changed expected arrival position, and the user interface displays the changed expected arrival position on the screen and the additional screen.
9. The apparatus according to claim 4, wherein the position detecting unit comprises:
markers for marking the needle-shaped medical tool and the patient; and
a sensing device for sensing the markers.
10. The apparatus according to claim 2, wherein:
the needle-shaped medical tool is a biopsy needle;
a plurality of biopsy needles are mounted on the robot in a revolver type; and
a biopsy is performed sequentially at each of the target points.
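The mismatch check of claim 5 — comparing a target point with the expected arrival position of the needle tip and displaying the robot position change needed to match them — can be sketched as follows. The function names, the straight-line path model, and the 1 mm tolerance are assumptions for illustration, not the patented method.

```python
import math

def expected_tip_position(entry_point, direction, depth):
    """Predict the needle tip position from a planned straight insertion path."""
    norm = math.sqrt(sum(d * d for d in direction))
    unit = [d / norm for d in direction]
    return [e + depth * u for e, u in zip(entry_point, unit)]

def position_correction(target, expected, tolerance_mm=1.0):
    """Return the (dx, dy, dz) the robot must move, or None if already matched."""
    delta = [t - e for t, e in zip(target, expected)]
    if math.sqrt(sum(d * d for d in delta)) <= tolerance_mm:
        return None  # within tolerance: no correction is displayed
    return delta

# Example: a 50 mm insertion along the z-axis, target offset 2 mm in y.
expected = expected_tip_position([0.0, 0.0, 0.0], [0.0, 0.0, 1.0], 50.0)
correction = position_correction([0.0, 2.0, 50.0], expected)
print(correction)  # [0.0, 2.0, 0.0]
```

In a real system the correction would be transformed into the robot's coordinate frame before display; here the two frames are assumed identical for simplicity.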
KR1020140140896A 2014-10-17 2014-10-17 Robot apparatus for interventional procedures having needle insertion type KR101862133B1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
KR1020140140896A KR101862133B1 (en) 2014-10-17 2014-10-17 Robot apparatus for interventional procedures having needle insertion type
PCT/KR2014/009839 WO2016060308A1 (en) 2014-10-17 2014-10-20 Needle insertion type robot apparatus for interventional surgery

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
KR1020140140896A KR101862133B1 (en) 2014-10-17 2014-10-17 Robot apparatus for interventional procedures having needle insertion type

Publications (2)

Publication Number Publication Date
KR20160046012A true KR20160046012A (en) 2016-04-28
KR101862133B1 KR101862133B1 (en) 2018-06-05

Family

ID=55746837

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020140140896A KR101862133B1 (en) 2014-10-17 2014-10-17 Robot apparatus for interventional procedures having needle insertion type

Country Status (2)

Country Link
KR (1) KR101862133B1 (en)
WO (1) WO2016060308A1 (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2019240424A1 (en) * 2018-06-12 2019-12-19 경북대학교 산학협력단 Surgical navigation apparatus, and navigation surgery system and method using same
WO2021201343A1 (en) * 2020-04-02 2021-10-07 주식회사 아티큐 Automatic body-invasive device

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11317972B2 (en) * 2018-03-17 2022-05-03 Canon U.S.A., Inc. Method for virtual device positioning on skin surface in 3D medical image data
KR102467282B1 (en) 2019-12-31 2022-11-17 주식회사 코어라인소프트 System and method of interventional procedure using medical images
CN113133813A (en) * 2021-04-01 2021-07-20 上海复拓知达医疗科技有限公司 Dynamic information display system and method based on puncture process

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR19990029038A (en) * 1995-07-16 1999-04-15 요아브 빨띠에리 Free aiming of needle ceramic
US20090149867A1 (en) * 2006-06-05 2009-06-11 Daniel Glozman Controlled steering of a flexible needle

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH06165783A (en) * 1992-11-30 1994-06-14 Olympus Optical Co Ltd Optical diagnostic device
US20110306986A1 (en) * 2009-03-24 2011-12-15 Min Kyu Lee Surgical robot system using augmented reality, and method for controlling same
WO2011040769A2 (en) * 2009-10-01 2011-04-07 주식회사 이턴 Surgical image processing device, image-processing method, laparoscopic manipulation method, surgical robot system and an operation-limiting method therefor
US9044142B2 (en) * 2010-03-12 2015-06-02 Carl Zeiss Meditec Ag Surgical optical systems for detecting brain tumors



Also Published As

Publication number Publication date
KR101862133B1 (en) 2018-06-05
WO2016060308A1 (en) 2016-04-21

Similar Documents

Publication Publication Date Title
US10123841B2 (en) Method for generating insertion trajectory of surgical needle
US11896414B2 (en) System and method for pose estimation of an imaging device and for determining the location of a medical device with respect to a target
CN107072736B (en) Computed tomography enhanced fluoroscopy systems, devices, and methods of use thereof
US20170065248A1 (en) Device and Method for Image-Guided Surgery
US8165660B2 (en) System and method for selecting a guidance mode for performing a percutaneous procedure
KR101720820B1 (en) Manual instrumented medical tool system
CN108135563B (en) Light and shadow guided needle positioning system and method
US10357317B2 (en) Handheld scanner for rapid registration in a medical navigation system
US20180263707A1 (en) System and method for mapping navigation space to patient space in a medical procedure
CN107530044B (en) Method and system for performing guided biopsy using digital tomosynthesis
KR101862133B1 (en) Robot apparatus for interventional procedures having needle insertion type
WO2011058516A1 (en) Systems & methods for planning and performing percutaneous needle procedures
EP1727471A1 (en) System for guiding a medical instrument in a patient body
CN111970986A (en) System and method for performing intraoperative guidance
US20210045813A1 (en) Systems, devices, and methods for surgical navigation with anatomical tracking
CN110916702B (en) Method of supporting a user, data carrier and imaging system
KR101635515B1 (en) Medical mavigation apparatus
KR102467282B1 (en) System and method of interventional procedure using medical images
Cheng et al. Robot Assisted Needle Placement: Developed Using Image Guided Surgery Toolkit (IGSTK)
WO2011083412A1 (en) Biopsy planning

Legal Events

Date Code Title Description
A201 Request for examination
N231 Notification of change of applicant
E902 Notification of reason for refusal
E701 Decision to grant or registration of patent right
GRNT Written decision to grant