WO2016056838A1 - Medical navigation device - Google Patents
Medical navigation device
- Publication number
- WO2016056838A1 (PCT/KR2015/010591)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- needle
- medical
- distance
- computer
- image
- Prior art date
Links
- 238000001574 biopsy Methods 0.000 claims description 75
- 238000003780 insertion Methods 0.000 claims description 59
- 230000037431 insertion Effects 0.000 claims description 58
- 238000000034 method Methods 0.000 claims description 32
- 238000002059 diagnostic imaging Methods 0.000 claims description 5
- 239000003550 marker Substances 0.000 claims description 5
- 238000001356 surgical procedure Methods 0.000 claims description 4
- 230000003213 activating effect Effects 0.000 claims description 3
- 230000000968 intestinal effect Effects 0.000 claims 1
- 206010028980 Neoplasm Diseases 0.000 description 28
- 210000004072 lung Anatomy 0.000 description 12
- 230000003902 lesion Effects 0.000 description 8
- 210000000038 chest Anatomy 0.000 description 3
- 238000010586 diagram Methods 0.000 description 3
- 238000002594 fluoroscopy Methods 0.000 description 3
- 238000002679 ablation Methods 0.000 description 2
- 210000003484 anatomy Anatomy 0.000 description 2
- 210000004204 blood vessel Anatomy 0.000 description 2
- 238000012545 processing Methods 0.000 description 2
- 230000029058 respiratory gaseous exchange Effects 0.000 description 2
- 230000011218 segmentation Effects 0.000 description 2
- 210000001519 tissue Anatomy 0.000 description 2
- 210000001015 abdomen Anatomy 0.000 description 1
- 230000003044 adaptive effect Effects 0.000 description 1
- 210000004100 adrenal gland Anatomy 0.000 description 1
- 230000000712 assembly Effects 0.000 description 1
- 238000000429 assembly Methods 0.000 description 1
- 230000004397 blinking Effects 0.000 description 1
- 238000004891 communication Methods 0.000 description 1
- 238000013170 computed tomography imaging Methods 0.000 description 1
- 238000007408 cone-beam computed tomography Methods 0.000 description 1
- 201000010099 disease Diseases 0.000 description 1
- 208000037265 diseases, disorders, signs and symptoms Diseases 0.000 description 1
- 239000000284 extract Substances 0.000 description 1
- 238000003384 imaging method Methods 0.000 description 1
- 238000013152 interventional procedure Methods 0.000 description 1
- 210000003734 kidney Anatomy 0.000 description 1
- 210000004185 liver Anatomy 0.000 description 1
- 210000001165 lymph node Anatomy 0.000 description 1
- 210000001370 mediastinum Anatomy 0.000 description 1
- 238000012544 monitoring process Methods 0.000 description 1
- 230000001613 neoplastic effect Effects 0.000 description 1
- 210000000056 organ Anatomy 0.000 description 1
- 210000000496 pancreas Anatomy 0.000 description 1
- 238000010827 pathological analysis Methods 0.000 description 1
- 230000002093 peripheral effect Effects 0.000 description 1
- 210000004303 peritoneum Anatomy 0.000 description 1
- 230000002685 pulmonary effect Effects 0.000 description 1
- 238000009877 rendering Methods 0.000 description 1
- 206010041232 sneezing Diseases 0.000 description 1
- 230000009469 supplementation Effects 0.000 description 1
Classifications
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/20—Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/04—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor combined with photographic or television appliances
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B10/00—Other methods or instruments for diagnosis, e.g. instruments for taking a cell sample, for biopsy, for vaccination diagnosis; Sex determination; Ovulation-period determination; Throat striking implements
- A61B10/02—Instruments for taking cell samples or for biopsy
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B17/00—Surgical instruments, devices or methods, e.g. tourniquets
- A61B17/28—Surgical forceps
- A61B17/29—Forceps for use in minimally invasive surgery
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B17/00—Surgical instruments, devices or methods, e.g. tourniquets
- A61B17/34—Trocars; Puncturing needles
- A61B17/3403—Needle locating or guiding means
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/70—Manipulators specially adapted for use in surgery
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01B—MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
- G01B11/00—Measuring arrangements characterised by the use of optical techniques
- G01B11/02—Measuring arrangements characterised by the use of optical techniques for measuring length, width or thickness
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C3/00—Measuring distances in line of sight; Optical rangefinders
- G01C3/02—Details
- G01C3/06—Use of electric means to obtain final indication
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B10/00—Other methods or instruments for diagnosis, e.g. instruments for taking a cell sample, for biopsy, for vaccination diagnosis; Sex determination; Ovulation-period determination; Throat striking implements
- A61B10/02—Instruments for taking cell samples or for biopsy
- A61B2010/0225—Instruments for taking cell samples or for biopsy for taking multiple samples
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/39—Markers, e.g. radio-opaque or breast lesions markers
- A61B2090/3983—Reference marker arrangements for use with image guided surgery
Definitions
- The present disclosure relates generally to a medical navigation apparatus, and more particularly to a medical navigation apparatus that intuitively shows, on a single screen, both the alignment of a medical tool with a target point and the distance between the medical tool and the target point.
- Medical imaging-based biopsy is an interventional procedure that minimizes damage to the surrounding normal tissue and extracts the samples necessary for the pathological diagnosis of neoplastic disease, such as the adrenal glands, pancreas and lymph nodes. It is widely applied to areas such as the peritoneum, lung mediastinum, spine and extremities. Medical imaging-based biopsies use high-resolution images to delicately localize lesions in three dimensions and to view biopsy needles that enter tissues, making it easier to detect small lesions.
- In a procedure room where an image-guided biopsy is performed, the insertion path of the biopsy needle may be guided by CT or C-arm fluoroscopy images. Because of issues such as radiation exposure, the insertion path is usually planned in advance on a diagnostic image. In planning the insertion path, the entry angle of the biopsy needle into the patient's body is important, and the path is planned by defining the entry angle and the insertion point.
- When the patient enters the procedure room and the procedure begins, the image acquisition device placed in the room (e.g., a fluoroscopy or CBCT device) is aligned with the planned path, that is, with the orientation in which the biopsy needle will be inserted.
- A navigation view is used to guide the biopsy needle accurately during the biopsy procedure. In a navigation view such as the surgeon's eye view shown in FIG. 1, when the biopsy needle pierces the entry point, the center point of the target is shown and the biopsy needle appears as a point at the insertion point. The target is displayed as a point with a circle drawn around it, and the view indicates at what angle and over what depth the needle must advance to reach that point according to the planned insertion path. In addition to the surgeon's eye view, the user interface or navigation screen provides two or three further views showing cross-sectional images perpendicular to the surgeon's eye view in which the biopsy needle is visible; the operator checks the needle direction in the surgeon's eye view and checks the needle position or the distance to the target in the additional views. Because several screens must be watched together, it is difficult to perceive the direction and the distance intuitively at the same time.
- FIG. 2 shows an example of a navigation screen for an ablation procedure disclosed in US 2013/0317363, in which whether the medical tool is accurately aimed at the target can be checked by whether the centers of two circles 453 and 454 coincide, and the distance between the target and the medical tool is indicated by a number below. With this approach, there is still no means for intuitively recognizing, at the same time, the direction and the distance that keep changing as the medical tool is inserted further into the body.
- A medical navigation device for guiding insertion of a needle-type medical tool, comprising: a computer that integrates surgical planning information into an operating-room image, the surgical planning information including a target point on a lesion, an insertion point, and an insertion path of the needle-type medical tool planned on a preoperative image containing the surgical target; positioning means for determining relative position information of the patient and the needle-type medical tool and providing it to the computer; and a user interface (UI) that, in cooperation with the computer, shows the entry point as seen along the insertion direction of the needle-type medical tool using the operating-room image into which the surgical planning information has been integrated, the UI including a navigation screen that displays the distance between the target point and the tip of the needle-type medical tool calculated by the computer from the relative position information, the distance intervals being indicated by a plurality of lines spaced apart about the insertion point.
- UI: user interface
- FIG. 1 is a view showing an example of a surgeon's Eye View
- FIG. 2 is a view showing an example of a navigation screen for the ablation procedure disclosed in US Patent Publication No. 2013/0317363;
- FIG. 3 is a view for explaining an example of a medical navigation device according to the present disclosure
- FIG. 4 is a view illustrating an example of a method of segmenting a tumor in a preoperative image and generating a surgical plan
- FIG. 5 is a view illustrating an example of a preoperative image in which a tumor and an insertion path are visualized
- FIG. 6 is a view for explaining an example of how the surgical plan is integrated into the operating room image
- FIG. 7 is a view for explaining an example of the positioning means for grasping the relative position information of the patient and the biopsy needle;
- FIG. 8 is a view for explaining an example of a user interface screen
- FIGS. 9 and 10 are views showing examples of screens that show both the direction and the depth of the biopsy needle in the user interface.
- The medical navigation device may be used to navigate a needle-insertion interventional robot.
- Needle-insertion interventional robots are used for biopsy and treatment in order to reduce radiation exposure and improve procedural accuracy, for example for lesions on the order of 1 cm in the abdomen or chest. A biopsy needle is a typical example of a needle-type medical tool.
- The medical navigation apparatus includes a computer 600 that processes or generates medical images, positioning means 400 that determines the relative position information of the patient 50 and the needle-type medical tool 111 and provides it to the computer 600, and a user interface 500 that, in cooperation with the computer 600, shows the entry point as seen along the insertion direction of the needle-type medical tool 111 using the operating-room image into which the surgical planning information has been integrated.
- The user interface 500 includes a navigation screen that displays the distance between the target point calculated by the computer 600 and the tip of the needle-type medical tool 111 using the relative position information of the patient 50 and the tool 111; on this navigation screen, the distance intervals are indicated by a plurality of lines spaced apart around the insertion point.
- The needle-type medical tool 111 is mounted on a slave robot 100 that operates in conjunction with the computer 600. The system may further include a master console 200 that controls the slave robot 100 in real time in cooperation with the user interface 500, an imaging apparatus 300 that images the position of the biopsy needle 111 inside the body, and a device 400 that monitors the position and posture of the slave robot 100, the patient 50, and peripheral devices.
- The medical navigation apparatus may be applied to biopsies of organs such as the lung, kidney, and liver, and its application to other sites is not excluded. This example is described with reference to the lung.
- As shown in FIG. 4, the preoperative image of the patient's lung is thresholded to segment the lesion 10 (e.g., a tumor) and to generate the surgical plan. For example, after volumetric chest CT images (hereinafter, lung images) are acquired, the lung images are segmented. As a result of the segmentation, anatomical structures contained in the lung image (e.g., blood vessels, ribs, airways, and the lung boundary) can be extracted as three-dimensional sets of voxels and stored as a lung mask, a vessel mask, a rib mask, an airway mask, and the like. The tumor 10 is segmented by a segmentation technique (e.g., adaptive thresholding) using an HU threshold appropriate for the tumor 10. FIG. 4 shows an example of an axial cross-section of a lung image in which the tumor 10 has been segmented.
- A preoperative image of the patient is loaded into the computer 600, and the operating-room image acquired in the procedure room is registered to the preoperative image by the computer 600. As a result of the registration, the surgical plan created on the preoperative image, including the insertion paths 82 and 84, the insertion point 41, and the target point on the tumor, is transferred to the operating-room image. This is described further below.
- The segmented tumor 10 may be generated as a three-dimensional image, so the image-processing software can show a cross-section of the tumor 10 in any required direction, for example in representative directions such as the axial, coronal, and sagittal views, and the surgical plan can be made on this basis.
- FIG. 5 illustrates an example of a preoperative image in which the tumor and the insertion path are visualized; the insertion path 82 is visualized in 3D passing between the ribs. The preoperative image is a 3D image, and the surgical plan may be generated in 3D through volume rendering as shown in FIG. 5. For example, the tumor 10 is segmented from its surroundings, the insertion path 82 is visualized in three dimensions, and a target point (e.g., the center point or an edge of the tumor) is marked on the tumor 10. Because the tumor 10 has little contrast, it is hardly visible on fluoroscopy CT, and it is generally rendered as a roughly circular region. Accordingly, unlike the case shown in FIG. 5, the tumor may not be visualized so as to be distinguished from its surroundings, and its location may instead be determined by the internal calculations of the computer 600.
- FIG. 6 illustrates an example of how the surgical plan is integrated into the operating-room image. An operating-room image is acquired in the procedure room, and the preoperative image and the operating-room image are registered so that the surgical plan, including the insertion path, is transferred to the operating-room image. Rigid registration, deformable registration, and similar methods may be used to register the medical images.
- The insertion path 82 may be modified through the user interface 500, and an insertion path that is inappropriate in view of breathing or patient movement may be removed. FIG. 6(a) is an example of a preoperative image, and FIG. 6(b) is the registered operating-room image to which the surgical plan has been transferred.
- To confirm the 3D-visualized insertion path 82 more reliably, the insertion path 251, the insertion point, and the target point (e.g., the center or an edge of the tumor) may be overlaid on multiplanar reconstruction (MPR) views such as the axial, coronal, and sagittal views (FIG. 6 illustrates the axial view).
- In this way, the biopsy needle 111 may be guided along the insertion path 82 confirmed on the MPR views to perform the procedure. For example, the finally confirmed insertion path is transmitted to the slave robot 100 or to the user interface 500 (e.g., the navigation device) using TCP/IP or a dedicated communication protocol. The biopsy needle 111 may of course be a single needle, but for biopsies of multiple spots it may be more effective to mount a plurality of needles on the slave robot 100 in a revolver arrangement and to biopsy each target point sequentially.
- FIG. 7 illustrates an example of positioning means for determining the relative position information of the patient and the biopsy needle; various means may be used to determine the relative positional relationship between the patient and the biopsy needle. For example, as shown in FIG. 7, there are provided a patient 960, a slave robot 911 carrying a biopsy needle 912, an infrared camera 991, infrared reflector assemblies 911, 913, and 914, a monitor 920, and a computer 940. The infrared camera 991 tracks the plurality of infrared reflectors 911 indicating the position of the patient 960 and the infrared reflectors or infrared emitters 913 provided at the end of the biopsy needle 912, so that the positions of the needle 912 and the patient 960 can be identified. A computer 940 for the overall operation of the master console and a monitor 920 are also provided; the computer 940 and the monitor 920 may correspond to the computer 600 and the user interface 500 described with reference to FIG. 1.
- When the relative positional relationship between the patient 960 and the needle 912 is used, the computer 940 also functions as the surgical navigation device. In accordance with the operator's commands at the master console, the biopsy needle 912 of the slave robot 911 is operated by the computer 940. The infrared reflector assembly 911 is fixed to the patient 960 to indicate the position of the patient 960, the infrared reflector assembly 913 is fixed to the biopsy needle 912 to indicate the position of the biopsy needle 912, and the infrared reflector assembly 914 is placed on the chest of the patient 960 to indicate patient movement such as breathing or sneezing. Although an infrared camera and infrared reflectors are used here as the position-sensing means, a magnetic field may be used instead, and any means capable of sensing position may be employed; for example, a magnetic sensor may be attached to the biopsy needle and its movement tracked with a camera.
- The infrared reflector assembly 911 may be used to indicate the position of the patient 960 and may also serve as the reference position of the entire system; it may be fixed to the patient 960 or to the operating table, or a separate infrared reflector assembly (not shown) attached to the operating table may serve as the reference position. In this way, the position of the biopsy needle 912 relative to the patient 960 can be determined.
- Unlike the above examples, the slave robot may itself know its own position. The slave robot holds the biopsy needle, knows its own coordinates within the procedure room, and can sense by itself how many millimeters the biopsy needle has advanced. The computer can therefore calculate the orientation and position of the biopsy needle in the space of the operating-room image.
- In addition, the current position of the biopsy needle can be imaged with fluoroscopy CT, and the computer can calculate that position in the registered operating-room image space.
- In terms of accuracy and safety, it is preferable that the positioning means use a plurality of these methods to determine the positional relationship rather than only one. By computing the relative positional relationship between the patient and the biopsy needle determined by one or more of these positioning means, the computer can calculate the distance between the biopsy needle 111 and the target point 11 on the tumor 10.
- In this example, the user interface screen displays a crosshair together with two small circles that show whether the biopsy needle 111 is aligned with the tumor 10, and also conveys a sense of depth so that the distance between the tumor and the tip of the biopsy needle can be grasped intuitively. For example, the distance between the needle tip and the target point is displayed with a plurality of curves spaced apart from one another about the insertion point. Because this distance keeps changing as the needle is advanced along the insertion path with the insertion angle, insertion point, and insertion depth determined in the surgical plan, it is preferable that the user interface screen indicate the depth sequentially. This is described further below.
- In the user interface, the upper screen contains a CT image (e.g., an image transmitted from the fluoroscopy device) and masks showing the various structures or lesions of the lung, together with buttons for carrying out the surgical plan and an information window showing the procedure information or type.
- The main screen 510 displays the registered operating-room image in 3D. MPR images generated by the computer from the operating-room image (e.g., 520, 530, 540) are displayed at the right, showing the tumor, the insertion path, the insertion point, the direction of the biopsy needle, and the like in directions such as the axial, coronal, and sagittal views. A camera installed at the end of the biopsy needle shows the insertion point on the skin, and when the centers of the two circles coincide, the angle of the biopsy needle is aimed at the tumor as planned. The operator issues commands to the computer through the master console, and the biopsy needle is inserted into the body by the slave robot operating in conjunction with the computer.
- The relative position of the biopsy needle and the patient is determined in the manner described with reference to FIG. 7, and a spiral curve 70 (see FIG. 9) centered on the insertion point indicates the insertion depth of the biopsy needle, that is, the distance between the needle tip and the target point on the lesion. The physician can therefore intuitively recognize both the angle and the depth of the biopsy needle from this single screen, which is very convenient.
- When infrared reflectors or magnetic markers are used as the means of determining relative position, the distance is calculated and displayed on the screen in real time, and, as shown in FIG. 10, the changing distance can be indicated by sequentially activating the corresponding portion of the spiral.
- Meanwhile, when the preoperative image is registered to the operating-room image transmitted from the fluoroscopy device, the location of the lesion is known in the space of the operating-room image, and the distance can also be obtained by identifying the biopsy needle visible in the operating-room image.
- In addition, the movement of the slave robot can be sensed by the slave robot itself; that is, a sensor in the slave robot can detect how far the needle has advanced. The distance information obtained in this way can further supplement the distance information described above.
- Furthermore, by identifying the biopsy needle in a plurality of operating-room images, the computer can estimate the insertion speed of the needle and, taking into account the time delay required to process the operating-room images and display them on the user interface, can estimate the current position of the needle tip and the corresponding distance. This calculated distance can likewise be used to supplement the distance information.
- FIGS. 9 and 10 show examples of user interface screens that present the direction and the depth of the biopsy needle together. Whether the biopsy needle is accurately aimed at the lesion can be seen from whether the centers of the two small circles inside coincide. In FIG. 9, a spiral curve 70 is displayed around the center, the outer windings indicating a greater distance from the lesion than the inner windings, and the depth may be marked on the spiral with numbers. As the needle is inserted, the distance keeps changing, and the portion of the spiral corresponding to the current distance is activated in turn (e.g., by a color change, a change in line thickness, or blinking), so that the operator can intuitively perceive the direction and the depth of the biopsy needle at the same time.
- In FIG. 10, a plurality of circles 75 concentric with the center 11 are shown, the outer circles indicating a greater distance from the lesion than the inner circles. The circle corresponding to the current distance is activated in sequence (e.g., by a color change, a change in line thickness, or blinking), so that the operator can intuitively check the direction and the depth of the biopsy needle together.
- According to the present example, the biopsy needle is advanced while the distance between the needle tip and the lesion is checked on the user interface screen described above; it may also be considered to take a final image with a CT or other medical imaging apparatus immediately before the lesion is reached and to use that image in biopsying the lesion.
- A medical navigation device for guiding insertion of a needle-type medical tool, the device comprising: a computer that integrates surgical planning information into an operating-room image, the surgical planning information including a target point on a lesion, an insertion point, and an insertion path of the needle-type medical tool planned on a preoperative image containing the surgical target; positioning means for determining relative position information of the patient and the needle-type medical tool and providing it to the computer; and a user interface (UI) that, in cooperation with the computer, shows the entry point as seen along the insertion direction of the needle-type medical tool using the operating-room image into which the surgical planning information has been integrated, the UI including a navigation screen that displays the distance between the target point and the tip of the needle-type medical tool calculated by the computer using the relative position information, the distance intervals being indicated by a plurality of lines spaced apart about the insertion point.
- The medical navigation device, wherein the plurality of lines form a spiral centered on the insertion point.
- The medical navigation device, wherein the plurality of lines are a plurality of circles concentric with the insertion point.
- The medical navigation device, wherein the navigation screen displays, inside the plurality of lines, two circles enclosing the insertion point, and the alignment of the needle-type medical tool with the insertion path is confirmed by making the centers of the two circles coincide.
- The medical navigation device, wherein the positioning means comprises markers for marking the needle-type medical tool and the patient, and a sensing device for sensing the markers.
- The medical navigation device, wherein the positioning means includes a medical imaging apparatus for capturing the operating-room image, and the computer calculates the location of the lesion in the space of the operating-room image by registering the operating-room images with the preoperative image and calculates the distance between the lesion and the needle-type medical tool appearing in the plurality of operating-room images.
- The medical navigation device, wherein the positioning means includes a sensor provided in the slave robot that carries the needle-type medical tool, the sensor detecting the position of the needle-type medical tool with respect to a reference position in the operating room.
- The medical navigation device, wherein the needle-type medical tool is a biopsy needle and the user interface displays the distance between the target point and the tip of the biopsy needle by sequentially activating the circle corresponding to the current distance.
- According to the present disclosure, the distance between the lesion and the tip of the needle-type medical tool is displayed with a sense of depth by sequentially activating portions of a spiral or of a plurality of circles, so that the insertion depth can be grasped intuitively and the safety and accuracy of the procedure can be further improved.
Landscapes
- Health & Medical Sciences (AREA)
- Life Sciences & Earth Sciences (AREA)
- Surgery (AREA)
- Engineering & Computer Science (AREA)
- Public Health (AREA)
- Veterinary Medicine (AREA)
- Biomedical Technology (AREA)
- Heart & Thoracic Surgery (AREA)
- Medical Informatics (AREA)
- Molecular Biology (AREA)
- Animal Behavior & Ethology (AREA)
- General Health & Medical Sciences (AREA)
- Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
- Physics & Mathematics (AREA)
- Pathology (AREA)
- Robotics (AREA)
- General Physics & Mathematics (AREA)
- Biophysics (AREA)
- Optics & Photonics (AREA)
- Radiology & Medical Imaging (AREA)
- Ophthalmology & Optometry (AREA)
- Electromagnetism (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- Apparatus For Radiation Diagnosis (AREA)
Abstract
The present disclosure relates to a medical navigation device for guiding the insertion of a needle-type medical tool. The device comprises: a computer that integrates operation plan information into an operating-room image, the plan information comprising a target point on a surgical target, an entry point, and an entry path of the needle-type medical tool, all planned with reference to a pre-operative image containing the surgical target; positioning means for determining the relative position of the patient and the needle-type medical tool and providing this information to the computer; and a UI that, in cooperation with the computer, shows the entry point as seen along the direction of entry of the needle-type medical tool using the operating-room image into which the plan information has been integrated, the UI comprising a navigation screen that displays the distance between the target point calculated by the computer and the tip of the needle-type medical tool using the relative position information, such that a plurality of lines arranged at intervals about the entry point indicate the distance intervals.
Description
The present disclosure relates generally to a medical navigation apparatus, and more particularly to a medical navigation apparatus that intuitively shows, on a single screen, both the alignment of a medical tool with a target point and the distance between the medical tool and the target point.
This section provides background information related to the present disclosure which is not necessarily prior art.
Medical-image-guided biopsy is an interventional procedure that minimizes damage to surrounding normal tissue while extracting the samples needed for the pathological diagnosis of neoplastic disease. It is widely applied to sites such as the retroperitoneum (e.g., the adrenal glands, pancreas, and lymph nodes), the lungs and mediastinum, the spine, and the extremities. Image-guided biopsy uses high-resolution images to precisely localize lesions in three dimensions and to visualize the biopsy needle as it enters the tissue, making small lesions easier to target.
In a procedure room where an image-guided biopsy is performed, the insertion path of the biopsy needle may be guided by CT or C-arm fluoroscopy images. Because of issues such as radiation exposure, the insertion path is usually planned in advance on a diagnostic image. In planning the insertion path, the entry angle of the biopsy needle into the patient's body is important, and the path is planned by defining the entry angle and the insertion point. When the patient enters the procedure room and the procedure begins, the image acquisition device placed in the room (e.g., a fluoroscopy or CBCT device) is aligned with the planned path, that is, with the orientation in which the biopsy needle will be inserted.
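The entry angle and insertion point fully determine the planned path. As a minimal illustration of that geometry (not taken from the patent), assuming the insertion point and target point are given in millimetres in a common image coordinate frame and that the z axis is used as the vertical reference:

```python
import numpy as np

def plan_entry(insertion_point, target_point):
    """Return the insertion depth, the unit direction of the planned path,
    and the entry angle measured against the vertical (z) axis."""
    p = np.asarray(insertion_point, dtype=float)
    t = np.asarray(target_point, dtype=float)
    path = t - p
    depth = float(np.linalg.norm(path))          # how far the needle must travel (mm)
    direction = path / depth                     # unit vector along the planned path
    entry_angle = float(np.degrees(np.arccos(abs(direction[2]))))
    return depth, direction, entry_angle

# Hypothetical points in image coordinates (mm):
depth, direction, angle = plan_entry([10.0, 40.0, 95.0], [35.0, 60.0, 120.0])
```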
A navigation view is used to guide the biopsy needle accurately during the biopsy procedure. For example, in a navigation view such as the surgeon's eye view shown in FIG. 1, when the biopsy needle pierces the entry point, the center point of the target is shown and the biopsy needle appears as a point at the insertion point. The target is displayed as a point with a circle drawn around it, and the view indicates at what angle and over what depth the needle must advance to reach that point according to the planned insertion path. In addition to the surgeon's eye view, the user interface or navigation screen provides two or three further views showing cross-sectional images perpendicular to the surgeon's eye view in which the biopsy needle is visible; the operator checks the needle direction in the surgeon's eye view and checks the needle position or the distance to the target in the additional views. Because several screens must be watched together, it is difficult to perceive the direction and the distance intuitively at the same time. The distance information may be displayed as a number, but this is still inconvenient for intuitively following the needle direction and the continuously changing distance.
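The alignment check in such a view amounts to projecting the needle position into the plane perpendicular to the planned path. A sketch of that projection, assuming all coordinates are expressed in a common tracker or image frame (the function and variable names are illustrative):

```python
import numpy as np

def view_offset(needle_tip, insertion_point, path_dir):
    """2D offset of the needle tip from the planned path, expressed in the
    plane perpendicular to the path through the insertion point.  A zero
    offset corresponds to the circles of the navigation view being concentric."""
    d = np.asarray(path_dir, dtype=float)
    d = d / np.linalg.norm(d)
    # build an orthonormal basis (u, v) spanning the view plane
    helper = np.array([1.0, 0.0, 0.0])
    if abs(np.dot(helper, d)) > 0.9:
        helper = np.array([0.0, 1.0, 0.0])
    u = np.cross(d, helper)
    u = u / np.linalg.norm(u)
    v = np.cross(d, u)
    offset = np.asarray(needle_tip, dtype=float) - np.asarray(insertion_point, dtype=float)
    return np.array([np.dot(offset, u), np.dot(offset, v)])
```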
FIG. 2 shows an example of a navigation screen for an ablation procedure disclosed in US 2013/0317363, in which whether the medical tool is accurately aimed at the target can be checked by whether the centers of two circles 453 and 454 coincide, and the distance between the target and the medical tool is indicated by a number below. With this approach, there is still no means for intuitively recognizing, at the same time, the direction and the distance that keep changing as the medical tool is inserted further into the body.
This will be described further in the detailed description of embodiments below.
This section provides a general summary of the disclosure and is not a comprehensive disclosure of its full scope or all of its features.
According to one aspect of the present disclosure, there is provided a medical navigation device for guiding insertion of a needle-type medical tool, comprising: a computer that integrates surgical planning information into an operating-room image, the surgical planning information including a target point on a lesion, an insertion point, and an insertion path of the needle-type medical tool planned on a preoperative image containing the surgical target; positioning means for determining relative position information of the patient and the needle-type medical tool and providing it to the computer; and a user interface (UI) that, in cooperation with the computer, shows the entry point as seen along the insertion direction of the needle-type medical tool using the operating-room image into which the surgical planning information has been integrated, the UI including a navigation screen that displays the distance between the target point and the tip of the needle-type medical tool calculated by the computer from the relative position information, the distance intervals being indicated by a plurality of lines spaced apart about the insertion point.
This will be described further in the detailed description of embodiments below.
FIG. 1 is a view showing an example of a surgeon's eye view;
FIG. 2 is a view showing an example of a navigation screen for an ablation procedure disclosed in US 2013/0317363;
FIG. 3 is a view illustrating an example of a medical navigation device according to the present disclosure;
FIG. 4 is a view illustrating an example of a method of segmenting a tumor in a preoperative image and generating a surgical plan;
FIG. 5 is a view illustrating an example of a preoperative image in which a tumor and an insertion path are visualized;
FIG. 6 is a view illustrating an example of how a surgical plan is integrated into an operating-room image;
FIG. 7 is a view illustrating an example of positioning means for determining relative position information of a patient and a biopsy needle;
FIG. 8 is a view illustrating an example of a user interface screen; and
FIGS. 9 and 10 are views showing examples of screens that show both the direction and the depth of the biopsy needle in the user interface.
The present disclosure will now be described in detail with reference to the accompanying drawings.
FIG. 3 illustrates an example of a medical navigation device according to the present disclosure. The medical navigation device may be used to navigate a needle-insertion interventional robot. Needle-insertion interventional robots are used for biopsy and treatment in order to reduce radiation exposure and improve procedural accuracy, for example for lesions on the order of 1 cm in the abdomen or chest. A biopsy needle is a typical example of a needle-type medical tool.
For example, the medical navigation apparatus includes a computer 600 that processes or generates medical images, positioning means 400 that determines the relative position information of the patient 50 and the needle-type medical tool 111 and provides it to the computer 600, and a user interface 500 that, in cooperation with the computer 600, shows the entry point as seen along the insertion direction of the needle-type medical tool 111 using the operating-room image into which the surgical planning information has been integrated. The user interface 500 includes a navigation screen that displays the distance between the target point calculated by the computer 600 and the tip of the needle-type medical tool 111 using the relative position information of the patient 50 and the tool 111; on this navigation screen, the distance intervals are indicated by a plurality of lines spaced apart around the insertion point.
The needle-type medical tool 111 is mounted on a slave robot 100 that operates in conjunction with the computer 600. The system may further include a master console 200 that controls the slave robot 100 in real time in cooperation with the user interface 500, an imaging apparatus 300 that images the position of the biopsy needle 111 inside the body, and a device 400 that monitors the position and posture of the slave robot 100, the patient 50, and peripheral devices.
FIG. 4 illustrates an example of a method of segmenting a tumor in a preoperative image and generating a surgical plan. The medical navigation apparatus may be applied to biopsies of organs such as the lung, kidney, and liver, and its application to other sites is not excluded. This example is described with reference to the lung.
As shown in FIG. 4, the preoperative image of the patient's lung is thresholded to segment the lesion 10 (e.g., a tumor) and to generate the surgical plan. For example, after volumetric chest CT images (hereinafter, lung images) are acquired, the lung images are segmented. As a result of the segmentation, anatomical structures contained in the lung image (e.g., blood vessels, ribs, airways, and the lung boundary) can be extracted as three-dimensional sets of voxels and stored as a lung mask, a vessel mask, a rib mask, an airway mask, and the like. The tumor 10 is segmented by a segmentation technique (e.g., adaptive thresholding) using an HU threshold appropriate for the tumor 10. FIG. 4 shows an example of an axial cross-section of a lung image in which the tumor 10 has been segmented.
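As a rough sketch of the kind of HU-threshold segmentation mentioned above (the HU window and seed voxel are hypothetical; a clinical system would use a more elaborate adaptive method):

```python
import numpy as np
from scipy import ndimage

def segment_tumor(ct_hu, hu_low, hu_high, seed_voxel):
    """Threshold a CT volume (Hounsfield units), remove speckle, and keep
    only the connected component containing a seed voxel picked inside
    the lesion (the seed is assumed to lie inside the thresholded mask)."""
    mask = (ct_hu >= hu_low) & (ct_hu <= hu_high)
    mask = ndimage.binary_opening(mask)          # suppress isolated voxels
    labels, _ = ndimage.label(mask)              # connected-component labelling
    return labels == labels[tuple(seed_voxel)]

# Hypothetical soft-tissue window around a lung nodule:
# tumor_mask = segment_tumor(ct_volume, hu_low=-50, hu_high=150,
#                            seed_voxel=(64, 200, 120))
```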
A preoperative image of the patient is loaded into the computer 600, and the operating-room image acquired in the procedure room is registered to the preoperative image by the computer 600. As a result of the registration, the surgical plan created on the preoperative image, including the insertion paths 82 and 84, the insertion point 41, and the target point on the tumor, is transferred to the operating-room image. This is described further below.
The segmented tumor 10 may be generated as a three-dimensional image. The image-processing software can therefore show a cross-section of the tumor in any required direction, for example in representative directions such as the axial, coronal, and sagittal views, and the surgical plan can be made on this basis.
FIG. 5 illustrates an example of a preoperative image in which the tumor and the insertion path are visualized; the insertion path 82 is visualized in 3D passing between the ribs. As described above, the center 11 of the tumor 10, the insertion path 82 to reach it, and the insertion point are added to the preoperative image to create the surgical plan. The preoperative image is a 3D image, and the surgical plan may be generated in 3D through volume rendering as shown in FIG. 5. For example, the tumor 10 is segmented from its surroundings, the insertion path 82 is visualized in three dimensions, and a target point (e.g., the center point or an edge of the tumor) is marked on the tumor 10. Because the tumor 10 has little contrast, it is hardly visible on fluoroscopy CT, and it is generally rendered as a roughly circular region. Accordingly, unlike the case shown in FIG. 5, the tumor may not be visualized so as to be distinguished from its surroundings, and its location may instead be determined by the internal calculations of the computer 600.
FIG. 6 illustrates an example of how the surgical plan is integrated into the operating-room image. An operating-room image is acquired in the procedure room, and the preoperative image and the operating-room image are registered so that the surgical plan, including the insertion path, is transferred to the operating-room image. Rigid registration, deformable registration, and similar methods may be used to register the medical images.
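Once the two images are registered, transferring the plan amounts to mapping the planned points through the recovered transform. A minimal sketch for the rigid case, assuming the registration yields a rotation R and translation t from pre-operative to operating-room coordinates (the names are illustrative):

```python
import numpy as np

def transfer_plan(points_pre, R, t):
    """Map planned points (insertion point, target point, path way-points)
    from pre-operative image coordinates to operating-room image
    coordinates with a rigid transform x' = R @ x + t."""
    pts = np.asarray(points_pre, dtype=float)    # shape (N, 3), in mm
    return pts @ np.asarray(R, dtype=float).T + np.asarray(t, dtype=float)

# Hypothetical transform obtained from the registration step:
R = np.eye(3)
t = np.array([2.0, -1.5, 0.8])
plan_or = transfer_plan([[10.0, 40.0, 95.0],     # insertion point
                         [35.0, 60.0, 120.0]],   # target point
                        R, t)
```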
The insertion path 82 may be modified through the user interface 500, and an insertion path that is inappropriate in view of breathing or patient movement may be removed. FIG. 6(a) is an example of a preoperative image, and FIG. 6(b) is the registered operating-room image to which the surgical plan has been transferred.
To confirm the 3D-visualized insertion path 82 more reliably, the insertion path 251, the insertion point, and the target point (e.g., the center or an edge of the tumor) may be overlaid on multiplanar reconstruction (MPR) views such as the axial, coronal, and sagittal views (FIG. 6 illustrates the axial view).
In this way, the biopsy needle 111 may be guided along the insertion path 82 confirmed on the MPR views to perform the procedure. For example, the finally confirmed insertion path is transmitted to the slave robot 100 or to the user interface 500 (e.g., the navigation device) using TCP/IP or a dedicated communication protocol. The biopsy needle 111 may of course be a single needle, but for biopsies of multiple spots it may be more effective to mount a plurality of needles on the slave robot 100 in a revolver arrangement and to biopsy each target point sequentially.
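The patent only states that the confirmed path is sent over TCP/IP or a dedicated protocol; the message layout below (a length-prefixed JSON message) is purely an assumed example of how such a transfer could look:

```python
import json
import socket

def send_plan(host, port, insertion_point, target_point, path_points):
    """Send the confirmed insertion path as a length-prefixed JSON message
    over TCP (hypothetical message format and field names)."""
    payload = json.dumps({
        "insertion_point": insertion_point,
        "target_point": target_point,
        "path": path_points,
    }).encode("utf-8")
    with socket.create_connection((host, port), timeout=5.0) as sock:
        sock.sendall(len(payload).to_bytes(4, "big") + payload)

# send_plan("192.168.0.10", 5000,
#           insertion_point=[10, 40, 95], target_point=[35, 60, 120],
#           path_points=[[10, 40, 95], [22, 50, 108], [35, 60, 120]])
```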
FIG. 7 illustrates an example of positioning means for determining the relative position information of the patient and the biopsy needle; various means may be used to determine the relative positional relationship between the patient and the biopsy needle. For example, as shown in FIG. 7, there are provided a patient 960, a slave robot 911 carrying a biopsy needle 912, an infrared camera 991, infrared reflector assemblies 911, 913, and 914, a monitor 920, and a computer 940. The infrared camera 991 tracks the plurality of infrared reflectors 911 indicating the position of the patient 960 and the infrared reflectors or infrared emitters 913 provided at the end of the biopsy needle 912, so that the positions of the needle 912 and the patient 960 can be identified. A computer 940 for the overall operation of the master console and a monitor 920 are also provided; the computer 940 and the monitor 920 may correspond to the computer 600 and the user interface 500 described with reference to FIG. 1.
When the relative positional relationship between the patient 960 and the needle 912 is used, the computer 940 also functions as the surgical navigation device. In accordance with the adjustments of the operator at the master 200 (see FIG. 1), the biopsy needle 912 of the slave robot 911 is operated by the computer 940. The infrared reflector assembly 911 is fixed to the patient 960 to indicate the position of the patient 960, the infrared reflector assembly 913 is fixed to the biopsy needle 912 to indicate the position of the biopsy needle 912, and the infrared reflector assembly 914 is placed on the chest of the patient 960 to indicate patient movement such as breathing or sneezing. Although an infrared camera and infrared reflectors are used here as the position-sensing means, a magnetic field may be used instead, and any means capable of sensing position may be employed; for example, a magnetic sensor may be attached to the biopsy needle and its movement tracked with a camera.
The infrared reflector assembly 911 may be used to indicate the position of the patient 960 and may also serve as the reference position of the entire system; it may be fixed to the patient 960 or to the operating table, or a separate infrared reflector assembly (not shown) attached to the operating table may serve as the reference position. In this way, the position of the biopsy needle 912 relative to the patient 960 can be determined.
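With an optical tracker of this kind, the needle position relative to the patient follows from the two tracked marker poses. A sketch assuming the tracker reports 4x4 homogeneous marker poses in camera coordinates and that the needle-tip offset in the needle-marker frame has been calibrated beforehand (all names are hypothetical):

```python
import numpy as np

def needle_tip_in_patient_frame(T_cam_patient, T_cam_needle, tip_offset):
    """Express the needle tip in the patient (reference marker) frame.
    T_cam_patient, T_cam_needle: 4x4 marker poses in camera coordinates.
    tip_offset: calibrated tip position in the needle-marker frame (mm)."""
    T_patient_needle = np.linalg.inv(np.asarray(T_cam_patient, dtype=float)) \
        @ np.asarray(T_cam_needle, dtype=float)
    tip_h = np.append(np.asarray(tip_offset, dtype=float), 1.0)  # homogeneous point
    return (T_patient_needle @ tip_h)[:3]
```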
Unlike the above examples, the slave robot may itself know its own position. For example, the slave robot holds the biopsy needle, knows its own coordinates within the procedure room, and can sense by itself how many millimeters the biopsy needle has advanced. The computer can therefore calculate the orientation and position of the biopsy needle in the space of the operating-room image.
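When the robot itself provides the position, the tip can be obtained from the robot's base pose and the encoder-reported advance along the needle axis. A sketch under the assumption that the base pose in the room is known as a 4x4 transform (the parameter names are illustrative):

```python
import numpy as np

def tip_from_encoder(T_room_robot, tip_home_robot, needle_axis_robot, advance_mm):
    """Needle tip in procedure-room coordinates from the robot's own data:
    its base pose in the room, the tip home position and needle axis in
    robot coordinates, and the encoder-reported advance in mm."""
    axis = np.asarray(needle_axis_robot, dtype=float)
    axis = axis / np.linalg.norm(axis)
    tip_robot = np.asarray(tip_home_robot, dtype=float) + advance_mm * axis
    tip_h = np.append(tip_robot, 1.0)            # homogeneous point
    return (np.asarray(T_room_robot, dtype=float) @ tip_h)[:3]
```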
In addition, the current position of the biopsy needle can be imaged with fluoroscopy CT, and the computer can calculate that position in the registered operating-room image space.
In terms of accuracy and safety, it is preferable that the positioning means use a plurality of these methods to determine the positional relationship rather than only one. By computing the relative positional relationship between the patient and the biopsy needle determined by one or more of these positioning means, the computer can calculate the distance between the biopsy needle 111 and the target point 11 on the tumor 10.
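The distance itself is a straightforward calculation once the tip and the target point are known in a common frame; combining several positioning means could, for instance, be done by averaging their tip estimates (one possible choice, not prescribed by the patent):

```python
import numpy as np

def tip_to_target_distance(tip_estimates, target_point):
    """Average the tip positions reported by several positioning means
    (optical markers, robot encoders, intra-operative imaging) and return
    the remaining distance from the fused tip to the target point, in mm."""
    tip = np.mean(np.asarray(tip_estimates, dtype=float), axis=0)
    return float(np.linalg.norm(np.asarray(target_point, dtype=float) - tip))

# d = tip_to_target_distance([[10.0, 20.0, 41.8], [10.2, 20.1, 42.0]],
#                            [10.0, 20.0, 80.0])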
In this example, the user interface screen displays a cross mark and two small circles showing the orientation, which indicate whether the biopsy needle 111 is aligned with the tumor 10, and it conveys depth so that the distance between the tumor and the tip of the biopsy needle can be grasped intuitively. For example, the distance between the tip of the biopsy needle and the target point is displayed, and a plurality of curves spaced apart from one another around the insertion point represent that distance with a sense of depth. Since this distance changes continuously as the needle is advanced along the insertion path, insertion angle, insertion point, and insertion distance determined in the surgical plan, it is preferable to display the depth sequentially on the user interface screen. This is described further below.

FIG. 8 illustrates an example of the user interface screen, on which a plurality of screens 510, 520, 530 and 540 are displayed. For example, the upper screen contains a CT image (e.g., an image transferred from fluoroscopy) together with masks showing the various structures and lesions of the lung. There are also buttons for carrying out the surgical plan and an information window displaying the procedure information or type.

The main screen 510 displays the operating-room image registered in 3D. MPR images generated by the computer from the operating-room image (e.g., 520, 530, 540) are displayed on the right side. For example, the tumor 10, the insertion path 82, the insertion point, and the direction of the biopsy needle are displayed in views such as the axial view, the coronal view 530, and the sagittal view 540. In addition, a camera mounted at the tip of the biopsy needle shows the insertion point on the skin, and when the centers of the two circles coincide, this indicates that the biopsy needle is angled toward the tumor as planned. The operator gives instructions to the computer through the master console, and the biopsy needle is inserted into the body by the motion of the slave robot linked to the computer.
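The two-circle check described above amounts to verifying that the needle axis and the planned insertion path point the same way. The following sketch shows one hypothetical way to express that check numerically; the tolerance value and the pixel-space test are assumptions, not part of the disclosure.

```python
import numpy as np

def alignment_error_deg(needle_direction: np.ndarray,
                        planned_direction: np.ndarray) -> float:
    """Angle in degrees between the needle axis and the planned insertion path."""
    a = needle_direction / np.linalg.norm(needle_direction)
    b = planned_direction / np.linalg.norm(planned_direction)
    return float(np.degrees(np.arccos(np.clip(np.dot(a, b), -1.0, 1.0))))

def circles_aligned(center_needle_px, center_target_px, tolerance_px=3.0) -> bool:
    """Treat the two circle centers as coincident when they fall within a tolerance."""
    dx = center_needle_px[0] - center_target_px[0]
    dy = center_needle_px[1] - center_target_px[1]
    return float(np.hypot(dx, dy)) <= tolerance_px

print(alignment_error_deg(np.array([0.05, 0.0, -1.0]), np.array([0.0, 0.0, -1.0])))
print(circles_aligned((256, 256), (258, 255)))
```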
The relative positions of the biopsy needle and the patient can be obtained in the manner described with reference to FIG. 7, and a spiral curve 70 (see FIG. 9) centered on the insertion point shows the inserted depth of the biopsy needle, that is, the distance between the tip of the biopsy needle and the target point on the lesion. The physician can therefore recognize the angle and the depth of the biopsy needle together and intuitively from this screen, which is very convenient.

When an infrared reflector or a magnetic marker is used as the means for determining the relative position, the distance is calculated and displayed on the screen in real time, and the changing distance can be indicated by sequentially activating the corresponding portions of the spiral, as shown in FIG. 10.

Meanwhile, when the preoperative image is registered to the operating-room image transferred from fluoroscopy, the location of the lesion is identified in the space of the operating-room image, and the distance can be obtained by identifying the biopsy needle appearing in the operating-room image. By comparing the distance obtained in this way with the distance obtained using the markers, errors can be reduced and the distance can be displayed with a sense of depth more accurately and safely.

In addition, the movement of the slave robot can be determined by the slave robot itself; that is, a sensor within the slave robot can detect how far the needle has advanced. The distance information obtained in this way can further supplement the distance information described above.

Meanwhile, the computer can also calculate the insertion speed by identifying the biopsy needle in a plurality of operating-room images, and by evaluating this speed together with the time delay incurred in processing the operating-room images and displaying them on the user interface, the current position of the needle tip and its distance to the target can be calculated. The distance calculated in this way can likewise be used as a supplement.
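A minimal sketch of that speed-and-delay reasoning is given below: the insertion speed is estimated from two consecutive images, and the displayed distance is extrapolated over the known display delay. The numbers and function names are assumed for illustration; the disclosure does not fix a particular formula.

```python
def needle_speed(d1_mm: float, d2_mm: float, dt_s: float) -> float:
    """Insertion speed estimated from the distances in two images taken dt_s apart."""
    return (d1_mm - d2_mm) / dt_s

def latency_compensated_distance(last_measured_distance_mm: float,
                                 speed_mm_per_s: float,
                                 display_delay_s: float) -> float:
    """Extrapolate the tip-to-target distance to the moment it is displayed.

    The last distance comes from the most recent operating-room image; the needle
    keeps advancing during image processing and display, so subtract that advance.
    """
    return max(0.0, last_measured_distance_mm - speed_mm_per_s * display_delay_s)

speed = needle_speed(42.0, 38.0, 1.0)                  # 4 mm/s between frames
print(latency_compensated_distance(38.0, speed, 0.5))  # distance at display time
```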
FIGS. 9 and 10 show examples of screens that display the direction and the depth of the biopsy needle together on the user interface. Whether the biopsy needle is aimed accurately at the lesion can be seen from whether the centers 11 of the two small inner circles coincide. A spiral curve 70 (see FIG. 9) is displayed around the center, and the outer turns of the spiral represent a greater distance from the lesion than the inner turns. The depth is marked numerically along the spiral. As the biopsy needle is advanced into the body, the distance changes continuously, and the portion of the spiral corresponding to the current distance is activated in sequence (e.g., by a color change, a change in line thickness, or blinking), so that the operator can intuitively check both the direction and the depth of the biopsy needle.

As an alternative, a plurality of circles 75 (see FIG. 10) concentric with the center 11 are displayed, the outer circles representing a greater distance from the lesion than the inner circles. As the biopsy needle is advanced into the body, the distance changes continuously, and the circle corresponding to the current distance is activated in sequence (e.g., by a color change, a change in line thickness, or blinking), so that the operator can intuitively check both the direction and the depth of the biopsy needle.
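The sequential activation in FIGS. 9 and 10 reduces to mapping the current tip-to-target distance onto one of a fixed number of spiral segments or concentric circles. The sketch below assumes an even spacing of those elements; the spacing, element count, and highlighting style are left open in the disclosure.

```python
def segment_to_activate(distance_mm: float,
                        max_depth_mm: float,
                        n_segments: int) -> int:
    """Index of the spiral segment (or concentric circle) to highlight.

    Index n_segments - 1 is the outermost element (farthest from the target);
    index 0 is reached when the needle tip arrives at the target point.
    """
    distance_mm = min(max(distance_mm, 0.0), max_depth_mm)
    step = max_depth_mm / n_segments
    return min(int(distance_mm / step), n_segments - 1)

# As the needle advances, successively inner elements are activated.
for d in (80.0, 55.0, 30.0, 10.0, 0.0):
    print(d, "mm ->", segment_to_activate(d, max_depth_mm=80.0, n_segments=8))
```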
This is therefore of great help in performing the procedure accurately and safely without having to consult several different screens. For example, the biopsy needle may be advanced while checking the distance between the needle tip and the lesion on the user interface screen described above until just before the tip reaches the lesion; an image may then be acquired with a CT or other medical imaging apparatus immediately in front of the lesion, and the lesion finally punctured with the biopsy needle.
Various embodiments of the present disclosure are described below.

(1) A medical navigation device for guiding the insertion of a needle-type medical tool, comprising: a computer that integrates surgical planning information into an operating-room image, the surgical planning information including a target point on the lesion, an insertion point, and an insertion path of the needle-type medical tool, as planned in a preoperative image that includes the lesion (surgical target); positioning means that determines relative position information of the patient and the needle-type medical tool and provides it to the computer; and a user interface (UI), linked to the computer, that shows the entry point in the insertion direction of the needle-type medical tool using the operating-room image into which the surgical planning information has been integrated, the user interface having a navigation screen that displays the distance between the target point and the tip of the needle-type medical tool, calculated by the computer using the relative position information, the distance being represented by a plurality of lines spaced apart from one another around the insertion point.

(2) The medical navigation device wherein the navigation screen visually distinguishes from the rest the part of the plurality of lines that corresponds to the distance reached by the tip of the needle-type medical tool.

(3) The medical navigation device wherein the plurality of lines form a spiral centered on the insertion point.

(4) The medical navigation device wherein the plurality of lines are a plurality of circles concentric with the insertion point.

(5) The medical navigation device wherein two circles surrounding the insertion point are displayed inside the plurality of lines on the navigation screen, and alignment of the needle-type medical tool with the insertion path is confirmed when the centers of the two circles coincide.

(6) The medical navigation device wherein the positioning means includes: markers that mark the needle-type medical tool and the patient; and a sensing device that detects the markers.

(7) The medical navigation device wherein the positioning means includes a medical imaging apparatus that captures the operating-room image, and the computer calculates the location of the lesion in the space of the operating-room image by registering a plurality of operating-room images with the preoperative image and calculates the distance between the lesion and the needle-type medical tool contained in the plurality of operating-room images.

(8) The medical navigation device wherein the positioning means includes a sensor provided on a slave robot equipped with the needle-type medical tool, the sensor sensing the position of the needle-type medical tool with respect to a reference position in the operating room.

(9) The medical navigation device wherein the needle-type medical tool is a biopsy needle, and the user interface displays the distance between the target point and the tip of the biopsy needle by sequentially activating the section of the spiral curve corresponding to that distance.

(10) The medical navigation device wherein the needle-type medical tool is a biopsy needle, and the user interface displays the distance between the target point and the tip of the biopsy needle by sequentially activating the circle corresponding to that distance.

According to one medical navigation device of the present disclosure, the direction of the needle-type medical tool with respect to the lesion and its distance from the lesion are displayed together on a single navigation screen, so there is no need to switch back and forth between different screens, which is convenient and further improves the safety and accuracy of the procedure.

According to another medical navigation device of the present disclosure, the distance between the lesion and the tip of the needle-type medical tool is displayed with a sense of depth by sequentially activating a spiral or a plurality of circles, so the inserted depth can be grasped intuitively, which is convenient and further improves the safety and accuracy of the procedure.
Claims (10)
- A medical navigation device for guiding the insertion of a needle-type medical tool, comprising: a computer that integrates surgical planning information into an operating-room image, the surgical planning information including a target point on the lesion, an insertion point, and an insertion path of the needle-type medical tool, as planned in a preoperative image that includes the lesion (surgical target); positioning means that determines relative position information of the patient and the needle-type medical tool and provides it to the computer; and a user interface (UI), linked to the computer, that shows the entry point in the insertion direction of the needle-type medical tool using the operating-room image into which the surgical planning information has been integrated, the user interface having a navigation screen that displays the distance between the target point and the tip of the needle-type medical tool, calculated by the computer using the relative position information, the distance being represented by a plurality of lines spaced apart from one another around the insertion point.
- The medical navigation device according to claim 1, wherein the navigation screen visually distinguishes from the rest the part of the plurality of lines that corresponds to the distance reached by the tip of the needle-type medical tool.
- The medical navigation device according to claim 1, wherein the plurality of lines form a spiral centered on the insertion point.
- The medical navigation device according to claim 1, wherein the plurality of lines are a plurality of circles concentric with the insertion point.
- The medical navigation device according to claim 1, wherein two circles surrounding the insertion point are displayed inside the plurality of lines on the navigation screen, and alignment of the needle-type medical tool with the insertion path is confirmed when the centers of the two circles coincide.
- The medical navigation device according to claim 1, wherein the positioning means comprises: markers that mark the needle-type medical tool and the patient; and a sensing device that detects the markers.
- The medical navigation device according to claim 1, wherein the positioning means comprises a medical imaging apparatus that captures the operating-room image, and the computer calculates the location of the lesion in the space of the operating-room image by registering a plurality of operating-room images with the preoperative image and calculates the distance between the lesion and the needle-type medical tool contained in the plurality of operating-room images.
- The medical navigation device according to claim 1, wherein the positioning means comprises a sensor provided on a slave robot equipped with the needle-type medical tool, the sensor sensing the position of the needle-type medical tool with respect to a reference position in the operating room.
- The medical navigation device according to claim 3, wherein the needle-type medical tool is a biopsy needle, and the user interface displays the distance between the target point and the tip of the biopsy needle by sequentially activating the section of the spiral curve corresponding to that distance.
- The medical navigation device according to claim 4, wherein the needle-type medical tool is a biopsy needle, and the user interface displays the distance between the target point and the tip of the biopsy needle by sequentially activating the circle corresponding to that distance.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR10-2014-0135719 | 2014-10-08 | ||
KR1020140135719A KR101635515B1 (en) | 2014-10-08 | 2014-10-08 | Medical navigation apparatus |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2016056838A1 true WO2016056838A1 (en) | 2016-04-14 |
Family
ID=55653383
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/KR2015/010591 WO2016056838A1 (en) | 2014-10-08 | 2015-10-07 | Medical navigation device |
Country Status (2)
Country | Link |
---|---|
KR (1) | KR101635515B1 (en) |
WO (1) | WO2016056838A1 (en) |
Families Citing this family (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR101837301B1 (en) | 2016-10-28 | 2018-03-12 | 경북대학교 산학협력단 | Surgical navigation system |
KR102102942B1 (en) * | 2018-07-31 | 2020-04-21 | 서울대학교산학협력단 | Device of providing 3d image registration and method thereof |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR19990029038A (en) * | 1995-07-16 | 1999-04-15 | 요아브 빨띠에리 | Free aiming of needle ceramic |
KR20100112310A (en) * | 2009-04-09 | 2010-10-19 | 의료법인 우리들의료재단 | Method and system for controlling microsurgery robot |
JP2011139734A (en) * | 2010-01-05 | 2011-07-21 | Hoya Corp | Endoscope apparatus |
KR20120041455A (en) * | 2010-10-21 | 2012-05-02 | 주식회사 이턴 | Method and device for controlling/compensating movement of surgical robot |
US20130317363A1 (en) * | 2012-05-22 | 2013-11-28 | Vivant Medical, Inc. | Planning System and Navigation System for an Ablation Procedure |
Family Cites Families (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR20010010260A (en) * | 1999-07-16 | 2001-02-05 | 윤한걸 | Preparation method of beancurd containing inorganic matter |
- 2014-10-08: KR KR1020140135719A (patent KR101635515B1), active, IP Right Grant
- 2015-10-07: WO PCT/KR2015/010591 (patent WO2016056838A1), active, Application Filing
Also Published As
Publication number | Publication date |
---|---|
KR101635515B1 (en) | 2016-07-04 |
KR20160042297A (en) | 2016-04-19 |
Legal Events
Date | Code | Title | Description
---|---|---|---
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 15849156; Country of ref document: EP; Kind code of ref document: A1
| NENP | Non-entry into the national phase | Ref country code: DE
| 122 | Ep: pct application non-entry in european phase | Ref document number: 15849156; Country of ref document: EP; Kind code of ref document: A1