KR101758741B1 - Guiding method of interventional procedure using medical images and system for interventional procedure for the same - Google Patents
- Publication number
- KR101758741B1 (application KR1020150127611A)
- Authority
- KR
- South Korea
- Prior art keywords
- arm
- image
- biopsy needle
- patient
- medical
- Prior art date
Classifications
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/36—Image-producing devices or illumination devices not otherwise provided for
- A61B90/37—Surgical systems with images on a monitor during operation
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/20—Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/30—Surgical robots
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
-
- A61B2019/5244—
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/20—Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
- A61B2034/2046—Tracking techniques
- A61B2034/2055—Optical tracking systems
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/30—Surgical robots
- A61B2034/305—Details of wrist mechanisms at distal ends of robotic arms
Abstract
The present invention relates to an interventional procedure system using medical images, comprising an image acquisition device for acquiring a medical image of a patient, a patient table for supporting the patient, a robot base positioned at the side of the patient table, a robot arm mounted on one side of the robot base, and a control unit having a display for setting an insertion path of a medical instrument.
Description
The present disclosure relates generally to an interventional procedure guiding method using medical images and to an interventional procedure system for the same, and more particularly, to an integrated interventional procedure system that positions a robot arm according to a procedure plan and confirms the insertion conditions of a medical instrument, and to a guiding method for the same.
This section provides background art related to the present disclosure, which is not necessarily prior art.
FIG. 1 shows an example of a system for supporting percutaneous interventional procedures as disclosed in U.S. Patent No. 8,386,019. The system disclosed therein comprises a CT imaging system, a robot registered to that system, and a device for sensing the patient's movement. According to the disclosed method, the robot is equipped with an interventional instrument and is registered to the imaging device, and the system detects the movement of the patient and simultaneously transmits that movement to the robot. The system is designed to register the 3D image of the procedure field with the 3D image created before the procedure, and to prevent insertion of the robot's interventional instrument unless the registration has been performed.
FIG. 2 is a view for explaining an example in which an operator is exposed to radiation during interventional procedures. In needle-insertion interventions such as biopsy, minimally invasive procedures have been increasing rapidly in recent years. Such interventional procedures are generally performed under radiographic imaging guidance. They depend heavily on the practitioner's experience, and the radiation exposure of both the practitioner and the patient is a problem.
In interventional procedures in which a medical instrument such as a needle (e.g., a biopsy needle), a lead (e.g., a lead for deep brain stimulation), a probe, or a catheter is inserted or implanted into the body, it is important that blood vessels and anatomically important structures are not damaged, or are invaded only minimally. Biopsy is an intervention that extracts the specimens necessary for the pathological diagnosis of a target while minimizing damage to the surrounding normal tissue. Biopsy is widely applied to retroperitoneal organs such as the adrenal gland and pancreas, as well as to the lungs, spine, and limbs.
In such a medical image-based biopsy, the insertion route of the biopsy needle is generally planned in advance on the diagnostic image (pre-operative image), due to problems such as radiation exposure.
As a medical image-based biopsy, CT-based biopsy can precisely localize a lesion in three dimensions using a high-resolution image and can show the biopsy needle after it has entered the tissue, making lesions easy to detect. CT-based biopsy is superior to ultrasound or X-ray fluoroscopy for lesions over which other tissue is superimposed. In addition, CT-based biopsy shows the relationship with surrounding tissues, so that the clinician can set a trajectory to the lesion and can perform the procedure with the patient in various positions.
In CT-based biopsy, the initial entry angle of the biopsy needle into the patient's body is important. The surgeon adjusts the biopsy needle while an assistant may use a protractor to guide the needle, or the insertion path of the biopsy needle may be guided by CT or C-arm fluoroscopy images. In the latter case the procedure depends heavily on the operator's experience, the operation is performed with the operator exposed to radiation, and the exposure time varies according to that experience.
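The geometric content of the initial entry angle can be made concrete with a short sketch. This is a generic illustration, not the patent's method; the coordinate convention and function name are assumptions for the example.

```python
import math

def entry_angles(entry, target):
    """Return (axial_angle, craniocaudal_tilt) in degrees for the straight
    path from the skin entry point to the target, in patient coordinates."""
    dx, dy, dz = (t - e for e, t in zip(entry, target))
    axial = math.degrees(math.atan2(dy, dx))                 # angle in the axial plane
    tilt = math.degrees(math.atan2(dz, math.hypot(dx, dy)))  # out-of-plane tilt
    return axial, tilt

# A path running diagonally in the axial plane with no craniocaudal tilt:
print(entry_angles((0.0, 0.0, 0.0), (30.0, 30.0, 0.0)))  # -> (45.0, 0.0)
```

With such angles computed from the planned path, the initial needle pose can be set numerically instead of with a protractor.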
Therefore, there is a need to develop an interventional robot to solve problems such as the increasing radiation exposure time of the operator and the patient and the accuracy of the procedure. The use of such an interventional robot can reduce the radiation dose of the patient by shortening the procedure time, and is expected to reduce complications and maximize safety. In addition, it is possible to reduce or eliminate the radiation exposure of the practitioner and to improve the safety of the operator through automation.
FIG. 3 is a view showing an example of an interventional procedure robot disclosed in U.S. Published Patent Application No. 2010-0250000, in which an interventional procedure robot known as the da Vinci product is presented. The interventional procedure robot has a plurality of robot arms (201, 202, 203, 204).
However, conventional techniques using such interventional robots have limitations in achieving the automation of the interventional procedure and the accuracy, safety, and convenience of interventions using medical instruments such as a biopsy needle. The systems are too heavy, movement and installation are inconvenient, and medical expenses are increased. In addition, the operator and the patient may be unaware of radiation exposure and may be exposed to radiation for a long time.
This will be described later in the Specification for Implementation of the Invention.
SUMMARY OF THE INVENTION This section provides a general summary of the present disclosure, which should not be construed as limiting the scope of the present disclosure or all of its features.
According to one aspect of the present disclosure, there is provided an interventional procedure system using a medical image, comprising: an image acquisition device for acquiring a medical image of a patient; a patient table supporting the patient; a robot base positioned at the side of the patient table; a robot arm mounted on one side of the robot base and capable of advancing a medical instrument into the image acquisition device; and a control unit having a display for setting an insertion path of the medical instrument.
This will be described later in the Specification for Implementation of the Invention.
BRIEF DESCRIPTION OF THE DRAWINGS
FIG. 1 is a view showing an example of an interventional procedure system disclosed in U.S. Patent No. 8,386,019;
FIG. 2 is a view for explaining an example in which an operator is exposed to radiation when an interventional procedure is performed;
FIG. 3 is a view showing an example of an interventional procedure robot presented in U.S. Published Patent Application No. 2010-0250000;
FIGS. 4 and 5 are views showing an example of an interventional procedure system using a medical image according to the present disclosure;
FIG. 6 is a view for explaining an example of a robot arm according to the present disclosure;
FIG. 7 is a view for explaining an example of a multifunctional end effector;
FIG. 8 is a view for explaining an example in which an interventional procedure system using a medical image uses a camera;
FIG. 9 is a view for explaining examples of a master console;
FIG. 10 is a view for explaining an example of a process of controlling a biopsy needle mounted on an end effector by the master console;
FIG. 11 is a view for explaining an example of an interventional procedure guiding method using a medical image according to the present disclosure;
FIGS. 12 to 14 are views for explaining an example of a procedure plan generation method;
FIG. 15 is a view for explaining an example of the operation of the interventional procedure system;
FIG. 16 is a view for explaining an overall process of an example of an interventional procedure guiding method;
FIG. 17 is a view showing an example of a display screen in the segmentation mode;
FIG. 18 is a view showing an example of a display screen in the plan mode;
FIGS. 19 and 20 are views showing an example of a display screen in the matching mode;
FIG. 21 is a view for explaining an example of a matching method;
FIGS. 22 and 23 are views for explaining an example of a window in which a procedure plan can be modified;
FIGS. 24 to 26 are views showing an example of a display screen in the navigation mode; and
FIGS. 27 and 28 are views for explaining an example of a display screen in the insertion mode.
The present disclosure will now be described in detail with reference to the accompanying drawings.
FIGS. 4 and 5 are diagrams showing an example of the configuration of an interventional procedure system using a medical image according to the present disclosure. The interventional procedure system using a medical image (hereinafter referred to as the interventional procedure system) includes a control unit 500 (e.g., a computer) for controlling the system.
FIG. 6 is a view for explaining an example of a slave robot according to the present disclosure. The slave robot is composed of a sliding part, a first arm, a second arm, and a third arm.
5B, the
FIG. 6(c) is a view showing the structure of the
In the treatment field, the
FIG. 6(d) is a view for explaining the
It is desirable to have a function of releasing the
The structure of the
FIGS. 7(a) and 7(b) show a
FIG. 8 is a view for explaining an example in which additional equipment is added to the guide system of the interventional procedure using the medical image. In this example, the
FIG. 9 is a diagram for explaining the
The mechanism portion supporting the
The control unit may be divided into a positioning step for moving the
At this time, as described above, the
FIG. 10 is a view for explaining an example of a process of controlling the
FIG. 11 is a view for explaining an example of a guiding method of an interventional procedure using a medical image according to the present disclosure. Although the guiding method is described for organs such as the lung, kidney, and liver, application to other organs or body regions is not excluded.
In the interventional procedure guiding method using a medical image (hereinafter referred to as the interventional procedure guiding method), a pre-operative image is first acquired (S210). The pre-operative image is acquired using the image acquisition device.
Thereafter, an insertion path (e.g., 475; see FIG. 17) of the medical instrument is generated (procedure planning).
FIGS. 12 to 14 are views for explaining an example of a procedure plan generation method. First, the anatomical structures (e.g., blood vessels, bones, etc.) included in the pre-operative image are obtained as three-dimensional sets of voxels as a result of segmentation of the pre-operative image. For example, after volumetric chest CT images (lung images) are acquired, the anatomical structures included in the lung images (e.g., blood vessels, ribs, airways, lung boundaries, etc.) are segmented by a segmentation technique (e.g., adaptive thresholding). As a result of the segmentation, anatomical structures such as blood vessels are extracted as three-dimensional sets of voxels. FIG. 12 shows an axial view of a lung image in which anatomical structures such as blood vessels have been segmented. The anatomical structures segmented from the lung images, such as blood vessels, ribs, and airways, may be used as a lung mask, a vessel mask, a rib mask, an airway mask, and so on.
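One simple way to realize the "adaptive threshold" segmentation mentioned above is iterative threshold selection. The sketch below is a generic stand-in on toy intensities, not the system's actual segmentation code; the values and names are invented for illustration.

```python
def iterative_threshold(values, tol=0.5):
    """Iterative (Ridler-Calvard style) threshold selection: repeatedly set
    the threshold to the midpoint of the two class means until it stabilizes."""
    t = sum(values) / len(values)
    while True:
        low = [v for v in values if v < t]
        high = [v for v in values if v >= t]
        new_t = (sum(low) / len(low) + sum(high) / len(high)) / 2
        if abs(new_t - t) < tol:
            return new_t
        t = new_t

# Toy intensities: "background" near 10, "vessel" near 200.
voxels = [10, 12, 9, 11, 200, 205, 198, 202]
t = iterative_threshold(voxels)
mask = [v >= t for v in voxels]  # binary mask, analogous to a vessel mask
print(round(t, 1), mask)
```

Applied voxel-wise to a CT volume, the resulting binary masks play the role of the lung, vessel, rib, and airway masks described above.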
Thereafter, using the lung mask, the vessel mask, the rib mask, the airway mask, and the like, a distance map of the lung boundary, a distance map of the ribs, a distance map of the pulmonary vessels, and a distance map of the airway are generated.
The generation process of the pulmonary vessel distance map may include a process of assigning, to every voxel in the lung image, its distance from the blood vessel boundary. The generation processes of the lung boundary distance map, the rib distance map, and the airway distance map may likewise assign to each voxel its distance from the lung boundary, the rib boundary, and the airway boundary, respectively. Using such distance maps, the distance between the insertion path and an anatomical structure can be calculated; thus, an anatomical structure that intersects the insertion path of the biopsy needle can be identified.
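The distance-map construction just described can be sketched on a toy voxel grid. This is a brute-force illustration only; the grid size, vessel layout, and function names are invented for the example, and a real implementation would use an efficient distance transform.

```python
import math

# Toy 8x8x8 volume: a "vessel" occupies a line of voxels; the distance map
# stores, for every voxel, the Euclidean distance to the nearest vessel voxel.
SIZE = 8
vessel = {(4, 4, z) for z in range(SIZE)}  # vertical vessel at x=4, y=4

def build_distance_map(obstacles, size):
    dmap = {}
    for x in range(size):
        for y in range(size):
            for z in range(size):
                dmap[(x, y, z)] = min(
                    math.dist((x, y, z), o) for o in obstacles
                )
    return dmap

def path_clearance(dmap, entry, target, steps=50):
    """Minimum distance from any sampled point on the straight
    entry->target path to the nearest obstacle voxel."""
    best = float("inf")
    for i in range(steps + 1):
        t = i / steps
        p = tuple(round(e + t * (g - e)) for e, g in zip(entry, target))
        best = min(best, dmap[p])
    return best

dmap = build_distance_map(vessel, SIZE)
# A path that passes right through the vessel has zero clearance...
print(path_clearance(dmap, (4, 4, 0), (4, 4, 7)))  # -> 0.0
# ...while a parallel path one voxel away keeps a clearance of 1.0.
print(path_clearance(dmap, (4, 5, 0), (4, 5, 7)))  # -> 1.0
```

Once the map is precomputed, checking a candidate insertion path against a vessel, rib, or airway mask reduces to lookups along the path, as the text describes.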
The above-described distance maps can be used in calculating the invaded volume and the distances along the insertion path. In calculating the distance between the invaded volume and the insertion path, a method using a pulmonary vessel tree may also be considered in addition to the method using the distance maps. Using the pulmonary vessel tree, the number of blood vessels that meet the insertion path, and the extent to which those vessels would be invaded, can be calculated.
The distance to anatomical structures, such as vessels that meet the insertion path, is calculated using a distance map by 3D ray casting or by using the pulmonary vessel tree. Although the entire 360 degrees may be searched to find the insertion path, a user (e.g., a practitioner) may define the
FIG. 13 is a view showing the insertion paths reduced by a safety margin, and such a plurality of insertion paths can be represented by a three-dimensional cone.
FIG. 14 is a view showing an example in which the insertion paths described with reference to FIG. 13 are actually implemented. In FIG. 14, a cone-shaped region representing the allowed insertion paths is displayed.
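Combining the distance idea with the safety margin yields the cone of allowed paths. The following sketch filters candidate skin-entry points by clearance; the circle of candidates, the single obstacle point, and the margin value are all invented for this 2D-flavored illustration.

```python
import math

SAFETY_MARGIN = 2.0       # assumed margin, in voxel units
TARGET = (0.0, 0.0, 0.0)
VESSEL = (3.0, 0.0, 0.0)  # single obstacle point for illustration

def clearance_of_path(entry, target, obstacle, steps=100):
    """Minimum distance between the obstacle and sampled points of the
    straight entry->target segment."""
    return min(
        math.dist(
            obstacle,
            tuple(e + i / steps * (t - e) for e, t in zip(entry, target)),
        )
        for i in range(steps + 1)
    )

def candidate_entries(target, radius=10.0, n=36):
    """Candidate skin-entry points on a circle around the target
    (a stand-in for sweeping the full 360 degrees mentioned above)."""
    for k in range(n):
        a = 2 * math.pi * k / n
        yield (target[0] + radius * math.cos(a),
               target[1] + radius * math.sin(a),
               target[2])

safe = [e for e in candidate_entries(TARGET)
        if clearance_of_path(e, TARGET, VESSEL) >= SAFETY_MARGIN]
# Entries whose straight path passes near the vessel at (3, 0, 0) are rejected;
# the remaining set corresponds to the cone-like region of allowed paths.
print(len(safe))
```

The surviving entries form a contiguous angular region, which is what the cone-shaped display in FIG. 14 visualizes for the practitioner.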
FIG. 15A is a view for explaining an example of the operation of the interventional treatment system. In the process of acquiring the procedure field image and generating the insertion path of the
Thereafter, the immediately preceding procedure image (Ref-CT) is acquired. Referring to FIG. 15B, the
The pre-operative image and the procedure field image can be displayed on the display 350 (e.g., see FIG. 19). Thereafter, the pre-operative image and the procedure field image are matched (S240; see, e.g., FIGS. 19 and 20). For example, the procedure field image is matched to the pre-operative image using the coordinate system of the table 620 and then converted to the procedure field image scale.
As the matching method, rigid registration and non-rigid registration may be used together. Mutual information-based rigid registration matches the pre-operative image with the procedure field image; it assumes that tissue regions with similar shading in one image will correspond to regions of similar shading in the other image. Alternatively, other known matching methods can be used. As a result of the matching, the insertion path is mapped onto the procedure field image (e.g., see FIG. 20), together with the coordinate system of the
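The role of mutual information in such matching can be illustrated on a toy 1D example: the shift that maximizes mutual information between the two intensity sequences aligns them, even when the two "modalities" use different intensity labels. This is a didactic sketch with invented profiles, not the system's registration code; real systems register 3D volumes with dedicated libraries.

```python
import math
from collections import Counter

def mutual_information(a, b):
    """Mutual information of two equal-length discrete intensity sequences."""
    n = len(a)
    pa, pb = Counter(a), Counter(b)
    pab = Counter(zip(a, b))
    mi = 0.0
    for (x, y), c in pab.items():
        pxy = c / n
        mi += pxy * math.log(pxy / ((pa[x] / n) * (pb[y] / n)))
    return mi

def best_shift(fixed, moving, max_shift=5):
    """Exhaustively search the integer shift of `moving` that maximizes MI
    against `fixed` (a 1D stand-in for rigid registration)."""
    best, best_mi = 0, -1.0
    for s in range(-max_shift, max_shift + 1):
        lo, hi = max(0, s), min(len(fixed), len(moving) + s)
        mi = mutual_information(fixed[lo:hi], moving[lo - s:hi - s])
        if mi > best_mi:
            best, best_mi = s, mi
    return best

# "Pre-operative" profile, and the same structure shifted by 3 samples with
# different but consistent intensity labels, as in multi-modal matching.
fixed = [0] * 10 + [5] * 10 + [0] * 10
moving = [1] * 7 + [9] * 10 + [1] * 13
print(best_shift(fixed, moving))  # -> 3
```

The maximum of mutual information is reached exactly when similarly shaded regions in one image line up with similarly shaded regions in the other, which is the assumption stated above.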
After the images are registered, the
If there is a difference in level between the procedure field image and the pre-operation image, an offset may occur when the planned
Also, a process may be added in which the
Thereafter, in this example, a process for reducing the error due to breathing may be added between the confirmation of the alignment of the
Thereafter, the alignment of the respiration and
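The respiration-level check described in this passage can be sketched as a simple gate: the planned path is treated as valid only while the current respiration level is close to the level at which the planning image was taken. The normalized signal and the tolerance value below are assumptions for illustration, not values from the disclosure.

```python
PLANNING_LEVEL = 0.80   # assumed normalized respiration level at pre-operative imaging
TOLERANCE = 0.05        # assumed allowable deviation

def insertion_allowed(current_level, planning_level=PLANNING_LEVEL,
                      tolerance=TOLERANCE):
    """True when the respiration level matches the planning level closely
    enough for the planned insertion path to remain valid."""
    return abs(current_level - planning_level) <= tolerance

def gate(signal):
    """Pair each respiration sample with its allowed/blocked decision,
    e.g. to drive the display's indication or alarm."""
    return [(s, insertion_allowed(s)) for s in signal]

breathing = [0.2, 0.5, 0.78, 0.81, 0.6, 0.3]
print(gate(breathing))
```

Driving an indication or alarm from such a gate confines needle advancement to the respiratory phase in which the plan was made, which is the error-reduction idea described above.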
The sampling of the tissue by the
FIG. 16 is a view illustrating another example of the overall process of the interventional procedure guiding method. The presented process is displayed at the top of the
The workflow is divided into a planning stage (Planning Stage) and a navigation stage (Navigation Stage). First, before the planning stage, a pre-operative image of the patient is acquired and segmented. A high-resolution image is obtained, a three-dimensional image of the patient is created, and the organs are segmented on the image so that the
Each process will be described in detail below.
FIG. 17 is a view illustrating an example of a display screen when the segmentation mode is selected in the upper menu bar.
Next, FIG. 18 shows an example of a display screen of a plan mode. In FIG. 18, an
The boundary around the
An obstacle such as a blood vessel is checked while simulating the
FIGS. 19 and 20 are views showing an example of a display screen in the matching mode. FIG. 19 shows that the
As a matching method, a level-set motion registration method can be used. The
On the other hand, if the
The matching may be a rigid transformation (Rigid Transformation) combined with level-set registration (Level-set Registration), which is a non-rigid matching. FIG. 21 shows contents related to this.
FIGS. 22 and 23 are views for explaining an example of a window in which a procedure plan can be modified. Even if the matching is performed, as shown in FIGS. 22 and 23, when the levels of the
FIG. 23 shows an example in which the
FIGS. 24 to 26 are views showing examples of a display screen in the navigation mode. The
Referring to FIG. 26, the CT apparatus is operated in a ready state, and a real-time image is acquired.
The control unit may have a section acquisition module for generating a plurality of two-dimensional sectional images related to the
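A section acquisition module of the kind described might resample 2D sections along the needle axis from the 3D volume. The sketch below uses nearest-neighbor sampling on a toy volume; all names, sizes, and the sampling scheme are invented for this illustration.

```python
import math

# Toy 16^3 volume with intensity x+y+z, standing in for a CT volume.
N = 16
volume = [[[x + y + z for z in range(N)] for y in range(N)] for x in range(N)]

def sample(p):
    """Nearest-neighbor lookup with zero padding outside the volume."""
    i, j, k = (int(round(c)) for c in p)
    if 0 <= i < N and 0 <= j < N and 0 <= k < N:
        return volume[i][j][k]
    return 0

def section_along_path(entry, target, width=5, rows=9, cols=5):
    """Build a 2D section spanned by the needle axis (rows) and one
    perpendicular direction (cols), as a section-acquisition module might."""
    axis = [t - e for e, t in zip(entry, target)]
    norm = math.sqrt(sum(a * a for a in axis))
    axis = [a / norm for a in axis]
    # one vector perpendicular to the axis (assumes the axis is not along z)
    perp = [-axis[1], axis[0], 0.0]
    pnorm = math.sqrt(sum(p * p for p in perp)) or 1.0
    perp = [p / pnorm for p in perp]
    img = []
    for r in range(rows):
        d = norm * r / (rows - 1)          # depth along the needle axis
        row = []
        for c in range(cols):
            off = (c - cols // 2) * width / (cols - 1)  # lateral offset
            p = [e + d * a + off * q for e, a, q in zip(entry, axis, perp)]
            row.append(sample(p))
        img.append(row)
    return img

img = section_along_path((2, 2, 2), (10, 10, 10))
print(len(img), len(img[0]))  # -> 9 5
```

Sections generated this way always contain the planned path, so the display can show the needle and its surroundings in the same plane while the needle advances.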
FIGS. 27 and 28 are views for explaining an example of a display screen in the insertion mode. The insertion mode is selected when the indication or alarm concerning the respiration levels described above is provided, and the
The interventional procedure system can guide the insertion of the
On the other hand, as shown in FIGS. 27 and 28, the real-
A shape (e.g., a shape of a needle) related to the positional information of the
Also, as described above, in the interventional procedure method and system of the present example, the
The interventional treatment system may set a virtual wall around the target to signal the practitioner at each step as the
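The virtual-wall behavior can be sketched as concentric distance thresholds around the target that trigger signals to the practitioner as the needle tip approaches. The threshold values and labels are invented for the example.

```python
import math

# Concentric virtual walls around the target, from outermost to innermost.
WALLS = [(10.0, "approach"), (5.0, "caution"), (1.0, "at target")]

def wall_signal(tip, target):
    """Return the innermost virtual wall the needle tip has crossed, if any."""
    d = math.dist(tip, target)
    crossed = [label for radius, label in WALLS if d <= radius]
    return crossed[-1] if crossed else None

target = (0.0, 0.0, 0.0)
for tip in [(20, 0, 0), (8, 0, 0), (3, 0, 0), (0.5, 0, 0)]:
    print(tip, wall_signal(tip, target))
```

Each crossing can drive a distinct indication or alarm, stepping the practitioner through the insertion as the text describes.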
Various embodiments of the present disclosure are described below. The invention may be practiced by various combinations thereof.
(1) An interventional procedure system using a medical image, comprising: an image acquisition device for acquiring a medical image of a patient; a patient table supporting the patient; a robot base located at the side of the patient table; a robot arm mounted on one side of the robot base; and a control unit having a display for setting an insertion path of the medical instrument.
(2) The interventional procedure system using a medical image, wherein the robot base is equipped with a moving weight.
(3) The interventional procedure system using a medical image, wherein the moving weight moves automatically according to the movement of the robot arm.
(4) The interventional procedure system using a medical image, wherein the moving weight is mounted in the lower portion of the inside of the robot base so as to be movable in both directions toward the patient table.
(5) The interventional procedure system using a medical image, wherein the robot base includes a wheel for movement, a fixing device for fixing the position of the robot base, and a robot arm controller.
(6) The interventional procedure system using a medical image, wherein a part of the robot arm is radio-transparent.
(7) The interventional procedure system using a medical image, wherein the robot arm includes a sliding part movable from the robot base in the direction of the image acquisition device, a first arm and a second arm connected to each other to determine the height of the medical instrument, and a third arm rotatably mounted on the second arm and formed in the direction of the image acquisition device.
(8) The interventional procedure system using a medical image, wherein the sliding part, the first arm, and the second arm are rotatably connected to each other at their ends, and the first arm is mounted with two drive cover parts facing in opposite directions.
(9) The interventional procedure system using a medical image, wherein an end effector enabling a pitching motion of the medical instrument is mounted on the third arm.
(10) The interventional procedure system using a medical image, wherein the robot arm includes an end effector which has a medical tool drive, at least a portion of which is radio-transparent, and which inserts a medical tool.
(11) An interventional procedure system using a medical image, comprising: an image acquisition device for acquiring a medical image of a patient; a patient table supporting the patient; a robot base located at the side of the patient table; a robot arm mounted on the robot base; and a control unit having a display and setting an insertion path of the medical instrument.
(12) The interventional procedure system using a medical image, wherein the moving weight moves in accordance with a movement of the robot arm and is mounted in the lower portion of the inside of the robot base so as to be movable in both directions toward the patient table.
According to the guiding method and the interventional procedure system using medical images according to the present disclosure, the automation, accuracy, safety, and convenience of image-guided interventions are improved.
Table (620) Image acquisition device (600) Master console (310) Clutch (313)
Insertion depth gauge bar (560) Upper menu bar (551)
Claims (12)
An image acquisition device for acquiring a medical image of a patient;
A patient table supporting the patient;
A movable robot base located on the side of the patient table;
A robot arm mounted on one side of the robot base to allow the medical tool to enter into the image acquisition device; and
And a control unit for setting an insertion path of the medical instrument and having a display,
Wherein the robot base includes a moving weight mounted in the lower portion of the inside of the robot base so as to be movable in both directions toward the patient table.
The interventional procedure system using medical images, wherein the moving weight moves automatically according to the movement of the robot arm.
Wheels for movement;
A fixing device for fixing the position of the robot base; And
And a robot arm control unit.
The interventional procedure system using a medical image, wherein a part of the robot arm is radio-transparent.
An image acquisition device for acquiring a medical image of a patient;
A patient table supporting the patient;
A movable robot base located on the side of the patient table;
A robot arm mounted on one side of the robot base to allow the medical tool to enter into the image acquisition device; and
And a control unit for setting an insertion path of the medical instrument and having a display,
The robot arm,
A sliding part capable of moving in the direction of the image acquisition device from the robot base;
A first arm and a second arm connected to each other to determine the height of the medical instrument,
And a third arm rotatably mounted to the second arm and formed in the direction of the image acquisition device.
Wherein the sliding portion, the first arm, and the second arm are rotatably connected to each other at an end thereof, and the first arm is mounted with two driving unit covers in the opposite directions.
An interventional procedure system using a medical image, wherein an end effector enabling a pitching motion of the medical instrument is mounted on the third arm.
An end effector having a medical tool drive, at least a portion of which is radio-transparent, for inserting a medical tool.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020150127611A KR101758741B1 (en) | 2015-09-09 | 2015-09-09 | Guiding method of interventional procedure using medical images and system for interventional procedure for the same |
PCT/KR2016/010192 WO2017043926A1 (en) | 2015-09-09 | 2016-09-09 | Guiding method of interventional procedure using medical images, and system for interventional procedure therefor |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020150127611A KR101758741B1 (en) | 2015-09-09 | 2015-09-09 | Guiding method of interventional procedure using medical images and system for interventional procedure for the same |
Publications (2)
Publication Number | Publication Date |
---|---|
KR20170030690A KR20170030690A (en) | 2017-03-20 |
KR101758741B1 true KR101758741B1 (en) | 2017-08-11 |
Family
ID=58240946
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
KR1020150127611A KR101758741B1 (en) | 2015-09-09 | 2015-09-09 | Guiding method of interventional procedure using medical images and system for interventional procedure for the same |
Country Status (2)
Country | Link |
---|---|
KR (1) | KR101758741B1 (en) |
WO (1) | WO2017043926A1 (en) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR20210086871A (en) | 2019-12-31 | 2021-07-09 | 주식회사 코어라인소프트 | System and method of interventional procedure using medical images |
Families Citing this family (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10716627B2 (en) | 2017-05-03 | 2020-07-21 | Covidien Lp | Method and system for planning a surgical instrument path |
US11065069B2 (en) | 2017-05-10 | 2021-07-20 | Mako Surgical Corp. | Robotic spine surgery system and methods |
KR101855461B1 (en) * | 2017-10-24 | 2018-05-04 | (주)포위즈시스템 | Automatic needling system and method for minimally invasive surgery |
EP3754606A1 (en) * | 2019-06-17 | 2020-12-23 | Galgo Medical, SL | A computer implemented method, a system and computer programs for computing simultaneous rectilinear paths using medical images |
CN110464456B (en) * | 2019-09-11 | 2023-07-11 | 嘉兴莫比乌斯智能科技有限公司 | Automatic laser treatment robot |
WO2021142272A1 (en) | 2020-01-09 | 2021-07-15 | Canon U.S.A., Inc. | Enhanced planning and visualization with curved instrument pathway and its curved instrument |
CN113425412B (en) * | 2021-06-18 | 2023-07-28 | 上海交通大学 | Robot for interventional vascular operation |
CN114305613B (en) * | 2021-12-30 | 2024-01-30 | 武汉联影智融医疗科技有限公司 | Image-guided interventional puncture system |
CN114767031B (en) * | 2022-03-31 | 2024-03-08 | 常州朗合医疗器械有限公司 | Endoscope apparatus, position guidance apparatus, system, method, and computer-readable storage medium for endoscope |
CN114947691A (en) * | 2022-03-31 | 2022-08-30 | 常州朗合医疗器械有限公司 | Endoscope apparatus, position guide apparatus for endoscope, and medical bed |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2002165804A (en) * | 2000-09-22 | 2002-06-11 | Mitaka Koki Co Ltd | Medical stand device |
US20080208212A1 (en) * | 2007-02-23 | 2008-08-28 | Siemens Aktiengesellschaft | Arrangement for supporting a percutaneous intervention |
Family Cites Families (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR101108927B1 (en) * | 2009-03-24 | 2012-02-09 | 주식회사 이턴 | Surgical robot system using augmented reality and control method thereof |
KR101234618B1 (en) * | 2009-09-17 | 2013-02-25 | (주)미래컴퍼니 | Surgical robot |
KR101598773B1 (en) * | 2010-10-21 | 2016-03-15 | (주)미래컴퍼니 | Method and device for controlling/compensating movement of surgical robot |
KR20140035294A (en) * | 2012-09-13 | 2014-03-21 | (주)알에프메디컬 | Needle guiding system and ct image display apparatus |
KR101464330B1 (en) * | 2013-04-26 | 2014-11-24 | 서울대학교병원 | Method of comparing preoperative respiratory level with intraoperative respiratory level |
-
2015
- 2015-09-09 KR KR1020150127611A patent/KR101758741B1/en active IP Right Grant
-
2016
- 2016-09-09 WO PCT/KR2016/010192 patent/WO2017043926A1/en active Application Filing
Patent Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2002165804A (en) * | 2000-09-22 | 2002-06-11 | Mitaka Koki Co Ltd | Medical stand device |
US20080208212A1 (en) * | 2007-02-23 | 2008-08-28 | Siemens Aktiengesellschaft | Arrangement for supporting a percutaneous intervention |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR20210086871A (en) | 2019-12-31 | 2021-07-09 | 주식회사 코어라인소프트 | System and method of interventional procedure using medical images |
KR102467282B1 (en) * | 2019-12-31 | 2022-11-17 | 주식회사 코어라인소프트 | System and method of interventional procedure using medical images |
Also Published As
Publication number | Publication date |
---|---|
WO2017043926A1 (en) | 2017-03-16 |
KR20170030690A (en) | 2017-03-20 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
KR101758741B1 (en) | Guiding method of interventional procedure using medical images and system for interventional procedure for the same | |
US20220346886A1 (en) | Systems and methods of pose estimation and calibration of perspective imaging system in image guided surgery | |
CN108024838B (en) | System and method for using registered fluoroscopic images in image-guided surgery | |
US20220409308A1 (en) | Surgical robot platform | |
KR101758740B1 (en) | Guiding method of interventional procedure using medical images and system for interventional procedure for the same | |
EP3164051B1 (en) | System and program for providing distance and orientation feedback while navigating in 3d | |
CA2772679C (en) | Manual instrumented medical tool system | |
US11701492B2 (en) | Active distal tip drive | |
EP3831328A1 (en) | Method for maintaining localization of distal catheter tip to target during ventilation and/or cardiac cycles | |
WO2021198906A1 (en) | Target anatomical feature localization | |
CN113558735A (en) | Robot puncture positioning method and device for biliary tract puncture | |
US8467850B2 (en) | System and method to determine the position of a medical instrument | |
US20140343407A1 (en) | Methods for the assisted manipulation of an instrument, and associated assistive assembly | |
CN117615724A (en) | Medical instrument guidance system and associated methods | |
KR101635515B1 (en) | Medical mavigation apparatus | |
KR20170030688A (en) | Guiding method of interventional procedure using medical images and system for interventional procedure for the same | |
KR20210086871A (en) | System and method of interventional procedure using medical images | |
CN215874870U (en) | Robot puncture positioning device for biliary tract puncture | |
WO2023004303A1 (en) | Image guidance for medical procedures | |
CN117813631A (en) | System and method for depth-based measurement in three-dimensional views | |
CN117412724A (en) | System and method for evaluating breath hold during intra-procedural imaging |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
A201 | Request for examination | ||
N231 | Notification of change of applicant | ||
E902 | Notification of reason for refusal | ||
E701 | Decision to grant or registration of patent right |