KR20160046012A - Robot apparatus for interventional procedures having needle insertion type - Google Patents
Robot apparatus for interventional procedures having needle insertion type
- Publication number
- KR20160046012A (application KR1020140140896A)
- Authority
- KR
- South Korea
- Prior art keywords
- needle
- computer
- surgical
- robot
- tumor
- Prior art date
Classifications
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B10/00—Other methods or instruments for diagnosis, e.g. instruments for taking a cell sample, for biopsy, for vaccination diagnosis; Sex determination; Ovulation-period determination; Throat striking implements
- A61B10/02—Instruments for taking cell samples or for biopsy
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B17/00—Surgical instruments, devices or methods, e.g. tourniquets
- A61B17/34—Trocars; Puncturing needles
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
Abstract
The present invention relates to a needle insertion type interventional robot apparatus comprising: a computer that integrates surgical planning information, including a target point on the border of a heterogeneous surgical target and an insertion path of a needle-like medical tool, into a surgical field image; a robot carrying the needle-like medical tool, which operates according to instructions from the computer so that the needle-like medical tool follows the insertion path; and a user interface (UI), linked to the computer, that displays the edge of the lesion on the surgical field image integrated with the surgical planning information and, when the robot operates according to the surgical plan, shows the expected arrival position of the end of the needle-like medical tool relative to the target point.
Description
The present disclosure relates generally to a needle insertion type interventional robot apparatus, and more particularly to a needle insertion type interventional robot apparatus that effectively performs a biopsy on a target point on the border of a heterogeneous lesion.
This section provides background art related to the present disclosure, which is not necessarily prior art.
Medical-image-based biopsy is an interventional procedure that extracts the samples necessary for the pathological diagnosis of abnormal lesions while minimizing damage to the surrounding normal tissue. It is widely applied to the adrenal glands, pancreas, and other abdominal organs, as well as to the lungs, mediastinum, spine, and limbs. Because medical-image-based biopsy can localize a lesion three-dimensionally using a high-resolution image and can track the biopsy needle after it enters the tissue, it is well suited to sampling even small lesions.
In a medical-image-based biopsy, the insertion path of the biopsy needle can be guided using a CT or C-arm fluoroscopy apparatus and its images. For example, the insertion angle and insertion point of the biopsy needle are determined so that the insertion path can be accurately planned. When the patient enters the procedure room and the operation begins, the image acquisition device placed in the procedure room (e.g., a fluoroscopy or CBCT device) is set to the same orientation as the planned path, i.e., the orientation in which the biopsy needle is inserted.
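The insertion-angle and insertion-point planning described above can be sketched numerically. The following Python fragment is a minimal illustration (the function and variable names are not from the source, and coordinates are assumed to be in millimeters in image space); it computes the needle direction, insertion depth, and insertion angle from a chosen entry point and target point:

```python
import numpy as np

def plan_insertion(entry_point, target_point):
    """Compute direction, depth, and angle for a straight needle path.

    Points are (x, y, z) coordinates in millimeters, e.g. in CT image
    space. Returns the unit direction vector, the insertion depth, and
    the insertion angle measured from the axial (xy) plane.
    """
    entry = np.asarray(entry_point, dtype=float)
    target = np.asarray(target_point, dtype=float)
    path = target - entry
    depth = float(np.linalg.norm(path))
    if depth == 0.0:
        raise ValueError("entry and target points coincide")
    direction = path / depth
    # Angle between the path and the axial plane, via the z component.
    angle_deg = float(np.degrees(np.arcsin(abs(direction[2]))))
    return direction, depth, angle_deg

direction, depth, angle = plan_insertion((0, 0, 0), (30, 0, 40))
print(round(depth, 2))   # 50.0
print(round(angle, 2))   # 53.13
```

In practice the entry point is additionally constrained to avoid ribs, vessels, and other critical structures, which this sketch does not model.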
A navigation view is used to accurately guide the biopsy needle during the biopsy procedure. For example, in a navigation view such as the Surgeon's Eye View shown in FIG. 1, the view is aligned with the needle axis: the center point of the target appears as a point, and a correctly aimed biopsy needle entering at the entry point also appears as a point superimposed on it. In this navigation view the target is represented by a single point with a circle drawn around it, and, depending on the planned insertion path, the insertion can be planned to a depth of a few millimeters at a chosen angle.
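The on-axis check that a Surgeon's Eye View style display conveys visually can be expressed as a lateral-offset computation. The sketch below (names are illustrative, not from the source) measures how far the target center lies from the needle axis; a reading within the displayed circle's radius means the needle is on course:

```python
import numpy as np

def lateral_offset(entry, direction, target):
    """Perpendicular distance from the target to the needle axis.

    This is the quantity a Surgeon's Eye View style circle visualizes:
    zero offset means the target point sits exactly on the needle axis.
    """
    entry = np.asarray(entry, dtype=float)
    d = np.asarray(direction, dtype=float)
    d = d / np.linalg.norm(d)
    v = np.asarray(target, dtype=float) - entry
    # Remove the along-axis component; what remains is the lateral error.
    perp = v - np.dot(v, d) * d
    return float(np.linalg.norm(perp))

# Needle aimed along +z from the entry point; target 2 mm off-axis.
off = lateral_offset((0, 0, 0), (0, 0, 1), (2, 0, 50))
print(off)  # 2.0
```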
In recent years, however, it has been hypothesized that tumor biology (e.g., DNA mutation, malignancy) is heterogeneous depending on the location within the tumor rather than uniform across it, and where within the tumor the sample was taken has become an important issue in diagnosing the tumor, predicting the therapeutic effect, and predicting the patient's prognosis. For example, when active tumor cells are located at the edge of the tumor and necrosis is present in its interior, inserting the biopsy needle to the center of the tumor may yield a false-negative diagnosis. Therefore, to biopsy a tumor having such heterogeneity, the operator may intentionally target the outer portion of the tumor while viewing the fluoroscopic image. However, it is not easy for a practitioner even to pinpoint the center of a tumor as planned, and it is technically very difficult to accurately biopsy tumor cells located at the comparatively harder-to-reach edge of the tumor.
In addition, a heterogeneous tumor is biopsied at a plurality of target points (a multi-spot biopsy), and a map indicating the nature of the tissue as a function of position within the tumor can be created by matching each biopsy location with the characteristics of its sample. This is very important from a medical standpoint, yet performing a multi-spot biopsy is even more difficult when it depends on the practitioner's experience.
FIG. 2 shows an example of a navigation screen for an ablation procedure disclosed in U.S. Patent Application Publication No. 2013/0317363.
This will be described later in the Specification for Implementation of the Invention.
SUMMARY OF THE INVENTION Herein, a general summary of the present disclosure is provided, which should not be construed as limiting the scope of the present disclosure or its features.
According to one aspect of the present disclosure, there is provided a needle insertion type interventional robot apparatus comprising: a computer that integrates surgical planning information, including a target point on the border of a heterogeneous surgical target and an insertion path of a needle-like medical tool, into a surgical field image; a robot carrying the needle-like medical tool, which operates according to instructions from the computer so that the needle-like medical tool follows the insertion path; and a user interface (UI), linked to the computer, that displays the edge of the lesion on the surgical field image integrated with the surgical planning information and, when the robot operates according to the surgical plan, shows the expected arrival position of the end of the needle-like medical tool relative to the target point.
This will be described later in the Specification for Implementation of the Invention.
FIG. 1 is a view showing an example of a Surgeon's Eye View;
FIG. 2 is a diagram showing an example of the navigation screen for the ablation procedure disclosed in U.S. Patent Application Publication No. 2013/0317363;
FIG. 3 is a view for explaining an example of a needle insertion type interventional procedure robot apparatus according to the present disclosure;
FIG. 4 is a view for explaining an example of a method of segmenting a tumor and generating a surgical plan in a preoperative image;
FIG. 5 is a diagram illustrating an example of a surgical plan including a plurality of biopsy target points on the edge of a tumor together with the insertion path and insertion point;
FIG. 6 is a view illustrating an example of a preoperative image in which the tumor and the insertion path are visualized;
FIG. 7 is a view for explaining an example of a method of integrating a surgical plan into a surgical field image;
FIG. 8 is a view for explaining an example of positioning means for grasping the relative position information of the patient and the biopsy needle;
FIG. 9 is a view for explaining an example of a user interface; and
FIG. 10 is a view for explaining an example of a robot equipped with revolver-type biopsy needles.
The present disclosure will now be described in detail with reference to the accompanying drawings.
FIG. 3 is a view for explaining an example of a needle insertion type interventional procedure robot apparatus according to the present disclosure (hereinafter referred to as the interventional procedure robot apparatus), which can be used in a diagnostic or therapeutic needle insertion type image-guided intervention robot system. The interventional procedure robot apparatus can be used for biopsy and treatment of lesions on the order of 1 cm in the abdomen, chest, and so on. Needle-type medical tools include biopsy needles.
For example, the interventional procedure robot apparatus includes a robot (100), a medical image capturing apparatus (300), a user interface (500), and a computer (600).
FIG. 4 is a view for explaining an example of a method of segmenting a tumor and generating a surgical plan in a preoperative image. The interventional procedure robot apparatus can be applied to the biopsy of organs such as the lung, kidney, and liver, and application to parts other than organs is not excluded. In this example, the lung is mainly described.
As shown in FIG. 4, the preoperative image of the patient's lungs is thresholded to segment the lesion (10) (e.g., a tumor) and create a surgical plan. For example, after volumetric chest CT images (lung images) are acquired, the lung image is segmented and the segmented lung image is prepared. As a result of the segmentation, anatomical structures contained in the lung image (e.g., blood vessels, ribs, airways, lung boundaries) can be extracted as three-dimensional collections of voxels, and the blood vessels, ribs, airways, and the like may be stored as a lung mask, a vessel mask, a rib mask, an airway mask, and so on. The tumor (10) is segmented by a segmentation technique (e.g., adaptive thresholding) using an HU value appropriate for the tumor (10) as the threshold. FIG. 4 shows an example of an axial section of a lung tumor in which the tumor (10) has been segmented.
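The HU-threshold segmentation step described above can be sketched as follows. This is a fixed-threshold stand-in on a toy volume (the source mentions adaptive thresholding, and the HU window below is illustrative, not taken from the source):

```python
import numpy as np

def segment_tumor(ct_hu, lower_hu, upper_hu):
    """Binary mask of voxels whose HU value lies in [lower_hu, upper_hu].

    `ct_hu` is a 3-D array of Hounsfield units. A real pipeline would use
    adaptive thresholding plus connected-component and morphology cleanup.
    """
    return (ct_hu >= lower_hu) & (ct_hu <= upper_hu)

# Toy 3x3x3 volume: air (-1000 HU) with one soft-tissue 'tumor' voxel.
ct = np.full((3, 3, 3), -1000.0)
ct[1, 1, 1] = 40.0                     # soft tissue, roughly 20..80 HU
mask = segment_tumor(ct, 0.0, 100.0)
print(int(mask.sum()))  # 1
```

The resulting boolean mask plays the role of the vessel/rib/airway/tumor masks mentioned above: one mask per structure, all defined on the same voxel grid.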
FIG. 5 is a view for explaining an example of a surgical plan including a plurality of biopsy target points on the edge of a tumor together with the insertion path and insertion point.
As shown in FIG. 5, a plurality of biopsy target points (e.g., 21, 22, 23, 24, and 26) are set on the edge (11) of the tumor (10).
A biopsy is preferably performed at a plurality of target points in the tumor (10), i.e., as a multi-spot biopsy.
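Choosing multiple target points spread over the tumor edge can be sketched as below. The edge extraction and greedy farthest-point sampling here are one plausible implementation under stated assumptions, not the method claimed by the source:

```python
import numpy as np

def edge_voxels(mask):
    """Voxels of a binary 3-D mask that touch the background (6-neighbourhood).

    A voxel is 'interior' only if all six face neighbours are inside the
    mask; the edge is the mask minus its interior.
    """
    padded = np.pad(mask, 1)                      # False border
    interior = np.ones_like(mask, dtype=bool)
    for axis in range(3):
        for shift in (-1, 1):
            interior &= np.roll(padded, shift, axis=axis)[1:-1, 1:-1, 1:-1]
    return mask & ~interior

def pick_targets(points, k):
    """Greedy farthest-point sampling: spread k target points over a set
    of candidate edge points (an illustrative spreading strategy)."""
    points = np.asarray(points, dtype=float)
    chosen = [0]                                  # start from the first point
    while len(chosen) < k:
        # Distance of every candidate to its nearest already-chosen point.
        d = np.min(np.linalg.norm(points[:, None] - points[chosen], axis=-1),
                   axis=1)
        chosen.append(int(np.argmax(d)))
    return points[chosen]

# Demo: a 3x3x3 solid block inside a 5x5x5 volume has 26 edge voxels.
demo = np.zeros((5, 5, 5), dtype=bool)
demo[1:4, 1:4, 1:4] = True
print(int(edge_voxels(demo).sum()))  # 26
```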
FIG. 6 is a diagram illustrating an example of a preoperative image in which the tumor and the insertion path are visualized; an insertion path (for example, 82) passing between the ribs is visualized in 3D. The plurality of target points, the insertion point, and the insertion path determined as described above are added to the preoperative image to generate the surgical plan. Since the preoperative image is three-dimensional, the surgical plan can be generated in three dimensions through volume rendering, as shown in FIG. 6.
The edge (11) may be regarded as a band of a certain thickness extending from the border between the tumor (10) and its periphery, even when the tumor (10) and the periphery are clearly separated.
FIG. 7 is a view for explaining an example of a method of integrating a surgical plan into a surgical field image. The surgical field image is acquired in the procedure room, the preoperative image and the surgical field image are registered, and the surgical plan is thereby transferred to the surgical field image. Rigid registration and deformable (non-rigid) registration methods can be used to register the medical images to each other.
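The rigid-registration step mentioned above can be illustrated with a least-squares (Kabsch-style) fit between corresponding fiducial points. This is a minimal sketch of rigid registration only, under the assumption that point correspondences are already known; deformable registration would require a non-rigid model on top:

```python
import numpy as np

def rigid_register(src, dst):
    """Least-squares rigid transform (R, t) such that dst_i ≈ R @ src_i + t.

    `src` and `dst` are (N, 3) arrays of corresponding fiducial points,
    e.g. in preoperative-image and surgical-field-image coordinates.
    """
    src = np.asarray(src, dtype=float)
    dst = np.asarray(dst, dtype=float)
    src_c = src - src.mean(axis=0)
    dst_c = dst - dst.mean(axis=0)
    U, _, Vt = np.linalg.svd(src_c.T @ dst_c)
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    D = np.diag([1.0, 1.0, d])        # guard against an improper reflection
    R = Vt.T @ D @ U.T
    t = dst.mean(axis=0) - R @ src.mean(axis=0)
    return R, t

# Recover a known 90-degree rotation about z plus a translation.
R0 = np.array([[0.0, -1.0, 0.0],
               [1.0,  0.0, 0.0],
               [0.0,  0.0, 1.0]])
t0 = np.array([1.0, 2.0, 3.0])
src = np.array([[0, 0, 0], [1, 0, 0], [0, 1, 0], [0, 0, 1]], dtype=float)
dst = src @ R0.T + t0
R, t = rigid_register(src, dst)
print(np.allclose(R, R0), np.allclose(t, t0))  # True True
```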
FIG. 8 is a view for explaining an example of a positioning means for grasping relative position information of a patient and a biopsy needle.
Various types of positioning means may be used to grasp the relative positional relationship between the patient (960) and the biopsy needle (111), and this relative position information is provided to the computer. In addition, the computer can calculate the current position of the biopsy needle (111) from this information. It is preferable, from the viewpoints of accuracy and safety, that the positioning means determine the positional relationship using a plurality of methods rather than relying on only one.
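With the relative positions known, the computer can compare the predicted needle-tip arrival position with the planned target point and produce the position change the robot would need on a mismatch. A minimal sketch (the tolerance value and all names are illustrative):

```python
import numpy as np

def check_alignment(expected_tip, target, tol_mm=1.0):
    """Compare the predicted needle-tip arrival position with the target.

    Returns None when the tip would land within `tol_mm` of the target;
    otherwise returns the displacement vector (position change) needed to
    bring the tip onto the target.
    """
    expected_tip = np.asarray(expected_tip, dtype=float)
    target = np.asarray(target, dtype=float)
    error = target - expected_tip
    if float(np.linalg.norm(error)) <= tol_mm:
        return None            # on target, no correction needed
    return error               # correction the robot should apply

print(check_alignment((0, 0, 0), (0.5, 0, 0)))   # None (within 1 mm)
print(check_alignment((0, 0, 0), (3.0, 0, 0)))   # [3. 0. 0.]
```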
FIG. 9 is a view for explaining an example of a user interface.
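Displaying the expected arrival position on axial, coronal, and sagittal cross sections amounts to picking the slice indices that contain a 3-D point. A minimal sketch, assuming (x, y, z) voxel coordinates with z indexing axial slices, y coronal, and x sagittal (an assumed, common convention):

```python
def slice_indices(point_voxel):
    """Slice indices of the axial/coronal/sagittal views containing a
    3-D point given in (x, y, z) voxel coordinates, i.e. the cross
    sections on which a UI would draw the expected tip arrival position."""
    x, y, z = (int(round(c)) for c in point_voxel)
    return {"axial": z, "coronal": y, "sagittal": x}

print(slice_indices((12.2, 40.7, 33.0)))
# {'axial': 33, 'coronal': 41, 'sagittal': 12}
```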
Various embodiments of the present disclosure will be described below.
(1) A needle insertion type interventional surgical robot apparatus comprising: a computer that integrates surgical planning information, including a target point on the border of a heterogeneous surgical target and an insertion path of a needle-like medical tool, into a surgical field image; a robot carrying the needle-like medical tool, which operates according to instructions from the computer so that the needle-like medical tool follows the insertion path; and a user interface (UI), linked to the computer, that displays the edge of the lesion on the surgical field image integrated with the surgical planning information and, when the robot operates according to the surgical plan, shows the expected arrival position of the end of the needle-like medical tool relative to the target point.
(2) The surgical plan includes a plurality of target points on the edge of the lesion, and the computer stores lesion position-sample information matching the position of each target point within the lesion with the sample of the lesion obtained at that target point by the needle-like medical tool.
(3) The user interface comprises at least one screen showing a cross section of the affected part, the screen showing the edge of the affected part and the expected arrival position of the end of the needle-like medical tool.
(4) The apparatus further comprises positioning means for grasping the relative position information of the affected part and the needle-like medical tool and providing the information to the computer.
(5) The computer compares the expected arrival position of the end of the needle-like medical tool with the target point and, when they do not match, displays the position change information of the robot required for matching.
(6) The at least one screen includes at least one of an axial view, a coronal view, and a sagittal view of the lesion, and at least one of the axial view, the coronal view, and the sagittal view shows the edge of the affected part and the expected arrival position of the end of the needle-like medical tool.
(7) The user interface further comprises an additional screen showing the surgical plan integrated with the three-dimensional surgical field image, the additional screen showing the position of the lesion edge and the expected arrival position of the end of the needle-like medical tool.
(8) When a modified surgical plan is input through the user interface, the computer calculates the changed expected arrival position, and the user interface displays the changed expected arrival position on the screen and the additional screen.
(9) The positioning means comprises: markers for marking the needle-like medical tool and the patient; and a sensing device for sensing the markers.
(10) The needle insertion type interventional robot apparatus according to any one of (1) to (3), wherein the needle-like medical tool is a biopsy needle and the biopsy needle is mounted on the robot in a revolver type.
According to one needle insertion type interventional treatment robot apparatus of the present disclosure, a biopsy can be performed more accurately at the edge of a tumor having heterogeneity, and the errors and risks that arise when a human performs the biopsy can be reduced.
According to another needle insertion type interventional procedure robot apparatus of the present disclosure, the apparatus is more effective when the affected part is biopsied at multiple spots.
According to another needle insertion type interventional treatment robot apparatus of the present disclosure, a biopsy is performed at a plurality of target points and each sample is recorded in relation to its position within the tumor, so that a map for establishing a drug or treatment plan according to the position within the tumor can be created.
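The lesion position-sample map described above can be sketched as a simple mapping from target-point coordinates to sample characteristics (the field names below are illustrative, not from the source):

```python
def record_biopsy(biopsy_map, target_point, sample_info):
    """Store one lesion-position/sample pair, building up the in-tumor
    map of tissue characteristics described above."""
    biopsy_map[tuple(target_point)] = sample_info
    return biopsy_map

tumor_map = {}
record_biopsy(tumor_map, (21.0, 14.5, 33.0), {"malignant": True})
record_biopsy(tumor_map, (25.0, 10.0, 31.0), {"malignant": False})
print(len(tumor_map))  # 2
```

Keyed by position, such a record can later be queried to choose where within the tumor a drug or treatment plan should act.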
100: robot 111: biopsy needle
300: medical image capturing apparatus 500: user interface
600: computer 10: lesion 11: edge
21, 22, 23, 24, 26: target point, estimated arrival position
Claims (10)
A needle insertion type interventional robot apparatus comprising:
a computer that integrates surgical planning information, including a target point on a border of a heterogeneous surgical target and an insertion path of a needle-like medical tool, into a surgical field image;
a robot carrying the needle-like medical tool, the robot operating in accordance with instructions from the computer such that the needle-like medical tool follows the insertion path; and
a user interface (UI), linked to the computer, that displays the edge of a lesion using the surgical field image integrated with the surgical planning information and that shows the expected arrival position of the end of the needle-like medical tool relative to the target point when the robot is operated according to the surgical plan.
The surgical plan includes a plurality of target points on the edge of the lesion,
Wherein the computer stores lesion position-sample information matching the position of each target point within the lesion with the sample of the lesion acquired at that target point by the needle-like medical tool.
The user interface is:
At least one screen showing a cross section of the affected part, wherein the screen shows an edge of the affected part and an expected arrival position of the end of the needle-like medical instrument.
The apparatus further comprises positioning means for detecting relative position information between the affected part and the needle-like medical tool and providing the information to the computer.
Wherein the computer compares the target point with the expected arrival position of the end of the needle-like medical tool and, when they do not match, displays on the user interface the position change information of the robot required for matching.
The at least one screen includes at least one of an axial view, a coronal view, and a sagittal view of the lesion,
Wherein at least one of the axial view, the coronal view, and the sagittal view shows the edge of the affected part and the expected arrival position of the end of the needle-like medical tool.
The user interface is:
An additional screen showing the surgical plan integrated with the three-dimensional surgical field image, wherein the additional screen shows the position of the lesion edge and the expected arrival position of the end of the needle-like medical tool.
Wherein, when a changed surgical plan is input through the user interface, the computer calculates the changed expected arrival position, and the user interface displays the changed expected arrival position on the screen and the additional screen.
The positioning means comprises:
markers for marking the needle-like medical tool and the patient; and
a sensing device for sensing the markers.
The needle-like medical tool is a biopsy needle,
wherein a plurality of biopsy needles are mounted on the robot in a revolver type and a biopsy is sequentially performed at each of the target points.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020140140896A KR101862133B1 (en) | 2014-10-17 | 2014-10-17 | Robot apparatus for interventional procedures having needle insertion type |
PCT/KR2014/009839 WO2016060308A1 (en) | 2014-10-17 | 2014-10-20 | Needle insertion type robot apparatus for interventional surgery |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020140140896A KR101862133B1 (en) | 2014-10-17 | 2014-10-17 | Robot apparatus for interventional procedures having needle insertion type |
Publications (2)
Publication Number | Publication Date |
---|---|
KR20160046012A true KR20160046012A (en) | 2016-04-28 |
KR101862133B1 KR101862133B1 (en) | 2018-06-05 |
Family
ID=55746837
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
KR1020140140896A KR101862133B1 (en) | 2014-10-17 | 2014-10-17 | Robot apparatus for interventional procedures having needle insertion type |
Country Status (2)
Country | Link |
---|---|
KR (1) | KR101862133B1 (en) |
WO (1) | WO2016060308A1 (en) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2019240424A1 (en) * | 2018-06-12 | 2019-12-19 | 경북대학교 산학협력단 | Surgical navigation apparatus, and navigation surgery system and method using same |
WO2021201343A1 (en) * | 2020-04-02 | 2021-10-07 | 주식회사 아티큐 | Automatic body-invasive device |
Families Citing this family (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11317972B2 (en) * | 2018-03-17 | 2022-05-03 | Canon U.S.A., Inc. | Method for virtual device positioning on skin surface in 3D medical image data |
KR102467282B1 (en) | 2019-12-31 | 2022-11-17 | 주식회사 코어라인소프트 | System and method of interventional procedure using medical images |
CN113133813A (en) * | 2021-04-01 | 2021-07-20 | 上海复拓知达医疗科技有限公司 | Dynamic information display system and method based on puncture process |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR19990029038A * | 1995-07-16 | 1999-04-15 | Yoav Paltieli | Free-hand aiming of a needle guide
US20090149867A1 (en) * | 2006-06-05 | 2009-06-11 | Daniel Glozman | Controlled steering of a flexible needle |
Family Cites Families (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH06165783A (en) * | 1992-11-30 | 1994-06-14 | Olympus Optical Co Ltd | Optical diagnostic device |
US20110306986A1 (en) * | 2009-03-24 | 2011-12-15 | Min Kyu Lee | Surgical robot system using augmented reality, and method for controlling same |
WO2011040769A2 (en) * | 2009-10-01 | 2011-04-07 | 주식회사 이턴 | Surgical image processing device, image-processing method, laparoscopic manipulation method, surgical robot system and an operation-limiting method therefor |
US9044142B2 (en) * | 2010-03-12 | 2015-06-02 | Carl Zeiss Meditec Ag | Surgical optical systems for detecting brain tumors |
-
2014
- 2014-10-17 KR KR1020140140896A patent/KR101862133B1/en active IP Right Grant
- 2014-10-20 WO PCT/KR2014/009839 patent/WO2016060308A1/en active Application Filing
Patent Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR19990029038A * | 1995-07-16 | 1999-04-15 | Yoav Paltieli | Free-hand aiming of a needle guide
US20090149867A1 (en) * | 2006-06-05 | 2009-06-11 | Daniel Glozman | Controlled steering of a flexible needle |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2019240424A1 (en) * | 2018-06-12 | 2019-12-19 | 경북대학교 산학협력단 | Surgical navigation apparatus, and navigation surgery system and method using same |
WO2021201343A1 (en) * | 2020-04-02 | 2021-10-07 | 주식회사 아티큐 | Automatic body-invasive device |
Also Published As
Publication number | Publication date |
---|---|
KR101862133B1 (en) | 2018-06-05 |
WO2016060308A1 (en) | 2016-04-21 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US10123841B2 (en) | Method for generating insertion trajectory of surgical needle | |
US11896414B2 (en) | System and method for pose estimation of an imaging device and for determining the location of a medical device with respect to a target | |
CN107072736B (en) | Computed tomography enhanced fluoroscopy systems, devices, and methods of use thereof | |
US20170065248A1 (en) | Device and Method for Image-Guided Surgery | |
US8165660B2 (en) | System and method for selecting a guidance mode for performing a percutaneous procedure | |
KR101720820B1 (en) | Manual instrumented medical tool system | |
CN108135563B (en) | Light and shadow guided needle positioning system and method | |
US10357317B2 (en) | Handheld scanner for rapid registration in a medical navigation system | |
US20180263707A1 (en) | System and method for mapping navigation space to patient space in a medical procedure | |
CN107530044B (en) | Method and system for performing guided biopsy using digital tomosynthesis | |
KR101862133B1 (en) | Robot apparatus for interventional procedures having needle insertion type | |
WO2011058516A1 (en) | Systems & methods for planning and performing percutaneous needle procedures | |
EP1727471A1 (en) | System for guiding a medical instrument in a patient body | |
CN111970986A (en) | System and method for performing intraoperative guidance | |
US20210045813A1 (en) | Systems, devices, and methods for surgical navigation with anatomical tracking | |
CN110916702B (en) | Method of supporting a user, data carrier and imaging system | |
KR101635515B1 (en) | Medical mavigation apparatus | |
KR102467282B1 (en) | System and method of interventional procedure using medical images | |
Cheng et al. | Robot Assisted Needle Placement: Developed Using Image Guided Surgery Toolkit (IGSTK) | |
WO2011083412A1 (en) | Biopsy planning |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
A201 | Request for examination | ||
N231 | Notification of change of applicant | ||
E902 | Notification of reason for refusal | ||
E701 | Decision to grant or registration of patent right | ||
GRNT | Written decision to grant |