WO2016056838A1 - Medical navigation device - Google Patents

Medical navigation device

Info

Publication number
WO2016056838A1
WO2016056838A1 (PCT/KR2015/010591)
Authority
WO
WIPO (PCT)
Prior art keywords
needle
medical
distance
computer
image
Prior art date
Application number
PCT/KR2015/010591
Other languages
English (en)
Korean (ko)
Inventor
이상민
김남국
Original Assignee
울산대학교산학협력단
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 울산대학교산학협력단
Publication of WO2016056838A1

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/20 Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B1/00 Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
    • A61B1/04 Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor combined with photographic or television appliances
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B10/00 Other methods or instruments for diagnosis, e.g. instruments for taking a cell sample, for biopsy, for vaccination diagnosis; Sex determination; Ovulation-period determination; Throat striking implements
    • A61B10/02 Instruments for taking cell samples or for biopsy
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B17/00 Surgical instruments, devices or methods, e.g. tourniquets
    • A61B17/28 Surgical forceps
    • A61B17/29 Forceps for use in minimally invasive surgery
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B17/00 Surgical instruments, devices or methods, e.g. tourniquets
    • A61B17/34 Trocars; Puncturing needles
    • A61B17/3403 Needle locating or guiding means
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/70 Manipulators specially adapted for use in surgery
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01B MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00 Measuring arrangements characterised by the use of optical techniques
    • G01B11/02 Measuring arrangements characterised by the use of optical techniques for measuring length, width or thickness
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C3/00 Measuring distances in line of sight; Optical rangefinders
    • G01C3/02 Details
    • G01C3/06 Use of electric means to obtain final indication
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B10/00 Other methods or instruments for diagnosis, e.g. instruments for taking a cell sample, for biopsy, for vaccination diagnosis; Sex determination; Ovulation-period determination; Throat striking implements
    • A61B10/02 Instruments for taking cell samples or for biopsy
    • A61B2010/0225 Instruments for taking cell samples or for biopsy for taking multiple samples
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00 Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/39 Markers, e.g. radio-opaque or breast lesions markers
    • A61B2090/3983 Reference marker arrangements for use with image guided surgery

Definitions

  • The present disclosure relates generally to a medical navigation apparatus, and more particularly to a medical navigation apparatus that intuitively shows, on a single screen, both the alignment of a medical tool and the distance between the medical tool and the target point.
  • Medical imaging-based biopsy is an interventional procedure that minimizes damage to surrounding normal tissue while extracting the samples needed for the pathological diagnosis of neoplastic disease in organs such as the adrenal glands, pancreas, and lymph nodes. It is widely applied to regions such as the peritoneum, lung, mediastinum, spine, and extremities. Medical imaging-based biopsies use high-resolution images to precisely localize lesions in three dimensions and to visualize the biopsy needle as it enters the tissue, making it easier to sample small lesions.
  • The insertion path of the biopsy needle may be guided by a CT or C-arm fluoroscopy image during a medical image-based biopsy procedure. Because of problems such as radiation exposure, the insertion path is usually planned in advance on a diagnostic image. In planning the insertion path, the entry angle of the biopsy needle into the patient's body is important, and the path is planned by defining the entry angle and the insertion point.
  • The image acquisition device (e.g., a fluoroscopy or CBCT device) placed in the procedure room is aligned with the planned path, i.e., the orientation in which the biopsy needle will be inserted.
  • A navigation view is used to accurately guide the biopsy needle during the biopsy procedure.
  • One example is the surgeon's Eye View shown in FIG. 1.
  • The center point of the target is shown, and the biopsy needle is displayed relative to the insertion point.
  • The target is displayed as a point, and a circle is drawn around that point.
  • The user interface or navigation screen also shows a cross-sectional image perpendicular to the surgeon's Eye View, together with two or three further views of the biopsy needle, and the operator performs the procedure while watching the surgeon's Eye View.
  • FIG. 2 is a diagram illustrating an example of a navigation screen for an ablation procedure disclosed in US 2013/0317363, in which whether the medical tool is accurately aimed at the target can be checked by whether the centers of two circles 453 and 454 coincide. The distance between the target and the medical instrument is indicated by a number below. This approach, however, lacks a means of intuitively conveying the distance and direction, which change continuously as the medical tool is inserted into the human body.
  • Disclosed herein is a medical navigation device for guiding insertion of a needle-shaped medical tool, comprising: a computer that integrates surgical planning information into an operating room image, the planning information including a target point on a planned lesion, an insertion point, and an insertion path of the needle-shaped medical tool, all defined on a preoperative image including the surgical target; positioning means for identifying the relative position information of the patient and the needle-shaped medical tool and providing it to the computer; and a user interface (UI) that cooperates with the computer to show the entry point in the insertion direction of the needle-shaped medical tool using the operating room image into which the surgical planning information has been integrated, the UI including a navigation screen that displays the distance between the target point calculated by the computer and the tip of the needle-shaped medical tool using the relative position information, the distance being indicated by a plurality of lines spaced about the insertion point.
  • UI: user interface
  • FIG. 1 is a view showing an example of a surgeon's Eye View
  • FIG. 2 is a view showing an example of a navigation screen for the ablation procedure disclosed in US Patent Publication No. 2013/0317363;
  • FIG. 3 is a view for explaining an example of a medical navigation device according to the present disclosure
  • FIG. 4 is a view illustrating an example of a method of dividing a tumor and generating a surgical plan in a preoperative image
  • FIG. 5 is a view illustrating an example of a preoperative image in which a tumor and an insertion path are visualized
  • FIG. 6 is a view for explaining an example of how the surgical plan is integrated into the operating room image
  • FIG. 7 is a view for explaining an example of the positioning means for grasping the relative position information of the patient and the biopsy needle;
  • FIG. 8 is a view for explaining an example of a user interface screen
  • FIGS. 9 and 10 are views showing examples of a screen that shows both the direction and the depth of the biopsy needle in the user interface screen.
  • the medical navigation device may be used to navigate the needle insertion interventional robot.
  • Needle-insertion interventional robots are used for biopsy and treatment to reduce radiation exposure and improve procedural accuracy. Such robots can be used for biopsy and treatment of lesions on the order of 1 cm in the abdomen, chest, and similar regions. Examples of needle-shaped medical instruments include biopsy needles.
  • The medical navigation apparatus includes a computer 600 for processing or generating medical images; positioning means 400 for identifying the relative position information of the patient 50 and the needle-shaped medical tool 111 and providing it to the computer 600; and a user interface 500 that cooperates with the computer 600 to show the entry point in the insertion direction of the needle-shaped medical tool 111 using an operating room image into which the surgical planning information has been integrated.
  • The user interface 500 displays the distance between the target point calculated by the computer 600 and the tip of the needle-shaped medical tool 111, using the relative position information of the patient 50 and the tool 111.
  • The navigation screen indicates this distance with a plurality of lines spaced around the insertion point.
  • The needle-shaped medical tool 111 is provided on a slave robot 100 that operates in conjunction with the computer 600. The system may further include a master console 200 for controlling the slave robot 100 in real time in cooperation with the user interface 500, an image capturing apparatus 300 for imaging the position of the biopsy needle 111 inside the human body, and a device 400 for monitoring the position and posture of the slave robot 100, the patient 50, and peripheral devices.
  • The medical navigation apparatus may be applied to a biopsy of an organ such as the lung, kidney, or liver; application to other sites is not excluded. In this example, the lung is described.
  • In the preoperative image, the patient's lung is thresholded to segment the lesion 10 (e.g., a tumor) and generate a surgical plan.
  • The lung image is segmented to prepare a divided lung image.
  • Anatomical structures (e.g., blood vessels, ribs, airways, lung boundaries) are segmented as well.
  • These anatomical structures may be stored as a lung mask, a vessel mask, a rib mask, an airway mask, and the like.
  • The tumor 10 is segmented by a segmentation technique (for example, adaptive thresholding) using the HU values appropriate for the tumor 10. FIG. 4 shows an example of an axial cross-section of a lung image in which the tumor 10 has been segmented.
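As a rough illustration of the segmentation step above, a HU-band threshold can be sketched as follows; the function name, the fixed band, and the toy values are illustrative assumptions, not the patent's implementation (which refers to an adaptive threshold):

```python
import numpy as np

def segment_tumor(ct_hu, lower=-20.0, upper=80.0):
    """Binary mask of voxels whose HU value falls inside an assumed tumor band.

    A real pipeline would adapt the band per case and post-process the mask
    with morphology and connected-component selection."""
    return (ct_hu >= lower) & (ct_hu <= upper)

# Toy "volume": air (-1000 HU), soft tissue (40 HU), bone (700 HU), soft tissue (35 HU)
ct = np.array([-1000.0, 40.0, 700.0, 35.0])
mask = segment_tumor(ct)  # only the soft-tissue voxels fall in the band
```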
  • The computer 600 is loaded with a preoperative image of the patient, and the operating room image acquired in the operating room is registered to the preoperative image by the computer 600. As a result of the registration, the surgical plan made on the preoperative image, including the insertion paths 82 and 84, the insertion point 41, and the target point on the tumor, is transferred to the operating room image. This is described further below.
  • the divided tumor 10 may be generated as a 3D image.
  • The image-processing software can display a cross-section of the tumor in any required direction; for example, the tumor 10 can be viewed in representative directions such as the axial view, coronal view, and sagittal view, and a surgical plan can be made on this basis.
  • FIG. 5 is a diagram illustrating an example of a preoperative image in which the tumor and the insertion path are visualized; the insertion path 82 is visualized in 3D, passing between the ribs.
  • the preoperative image is a 3D image, and as shown in FIG. 5 through volume rendering, the surgical plan may be generated in 3D.
  • The tumor 10 is segmented from its surroundings, the insertion path 82 is visualized in three dimensions, and the tumor 10 is marked with a target point (e.g., the center point or edge of the tumor).
  • The tumor 10 often has little contrast and is therefore invisible on fluoroscopy, and it is generally represented as a roughly circular shape. Accordingly, unlike the case shown in FIG. 5, the tumor may not be visualized distinctly from its surroundings, and its location may instead be determined by the internal calculation of the computer 600.
  • FIG. 6 is a view illustrating an example of a method of integrating a surgical plan into an operating room image.
  • An operating room image is acquired in the operating room, and the preoperative image and the operating room image are registered so that the surgical plan, including the insertion path, is transferred into the operating room image.
  • Rigid registration, deformable registration, and similar methods may be used.
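The transfer of the surgical plan described above can be sketched under the simplifying assumption that the registration result is a single rigid 4x4 transform; the helper names and the 10 mm example offset are hypothetical:

```python
import numpy as np

def rigid_transform(R, t):
    """Assemble a 4x4 homogeneous transform from rotation R (3x3) and translation t (3,)."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

def transfer_plan(T_or_from_pre, points_pre):
    """Map planned points (N, 3) from preoperative space into operating-room space."""
    pts = np.hstack([points_pre, np.ones((len(points_pre), 1))])
    return (T_or_from_pre @ pts.T).T[:, :3]

# Pretend the registration found a pure 10 mm shift along x.
T = rigid_transform(np.eye(3), np.array([10.0, 0.0, 0.0]))
plan = np.array([[0.0, 0.0, 0.0],    # insertion point
                 [0.0, 0.0, 50.0]])  # target point on the tumor
plan_or = transfer_plan(T, plan)     # plan expressed in operating-room space
```

A deformable registration would replace the single matrix with a displacement field, but the transfer of the planned points follows the same pattern.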
  • the insertion path 82 may be modified through the user interface 500, and an inappropriate insertion path may be removed in consideration of breathing or movement.
  • FIG. 6(a) is an example of a preoperative image.
  • FIG. 6(b) is an image in which the operating room image and the preoperative image have been registered and the surgical plan transferred.
  • The insertion path 251, the insertion point, and the target point may be overlaid on the MPR views (e.g., axial, coronal, and sagittal; FIG. 6 illustrates the axial view).
  • the biopsy needle 111 may be guided along the insertion path 82 identified on the MPR to perform the procedure.
  • The finally confirmed insertion path is transmitted to the slave robot 100 or the user interface 500 (e.g., a navigation device) using TCP/IP or a dedicated communication protocol.
  • The biopsy needle 111 may of course be a single-needle type, but to biopsy multiple spots it may be more effective to mount a plurality of needles of a revolver type on the slave robot 100 and biopsy each target point sequentially.
  • FIG. 7 is a view for explaining an example of a positioning means for grasping relative position information of a patient and a biopsy needle, and various methods may be used as the positioning means for grasping the relative positional relationship between the patient and the biopsy needle.
  • The system includes the patient 960, a slave robot 911 holding a biopsy needle 912, an infrared camera 991, infrared reflector assemblies 911, 913, and 914, a monitor 920, and a computer 940.
  • The infrared camera 991 detects the plurality of infrared reflectors 911, which indicate the position of the patient 960, and the infrared reflectors or infrared emitters 913 provided at the end of the biopsy needle 912.
  • In this way, the location of the patient 960 can be identified.
  • a computer 940 is provided for overall operation of the master console, and a monitor 920 is also provided.
  • The computer 940 and the monitor 920 may correspond to the computer 600 and the user interface 500 described above with reference to FIG. 3.
  • the computer 940 also functions as a surgical navigation device.
  • the biopsy needle 912 of the slave robot 911 is operated by the computer 940.
  • the infrared reflector assembly 911 is fixed to the patient 960 to indicate the position of the patient 960
  • the infrared reflector assembly 913 is fixed to the biopsy needle 912 to indicate the position of the biopsy needle 912
  • An infrared reflector assembly 914 is positioned on the chest of the patient 960 to indicate patient movement, such as breathing, sneezing of the patient.
  • In this example an infrared camera and infrared reflectors are used, but a magnetic field may be used instead; any means capable of sensing position can be employed. For example, a magnetic sensor may be attached to the biopsy needle, or a camera may track how far the needle moves.
  • The infrared reflector assembly 911 indicates the location information of the patient 960 and may function as the reference position of the entire system. It may be fixed to the patient 960 or to the operating table, or an additional infrared reflector assembly (not shown) on the operating table may serve as the reference position. From these measurements, the location of the biopsy needle 912 relative to the patient 960 can be determined.
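The frame arithmetic implied above, expressing the tracked needle pose relative to the patient reference marker, can be sketched as follows; the helper names and numbers are illustrative assumptions:

```python
import numpy as np

def pose(R, t):
    """4x4 homogeneous pose from rotation R (3x3) and translation t (3,)."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

def needle_in_patient_frame(T_cam_patient, T_cam_needle):
    """Re-express the needle pose, measured in the infrared camera's frame,
    in the frame of the patient reference marker."""
    return np.linalg.inv(T_cam_patient) @ T_cam_needle

# Camera sees the patient marker at x = 100 mm and the needle marker at x = 130 mm.
T_cam_patient = pose(np.eye(3), np.array([100.0, 0.0, 0.0]))
T_cam_needle = pose(np.eye(3), np.array([130.0, 0.0, 0.0]))
T_pn = needle_in_patient_frame(T_cam_patient, T_cam_needle)
# The needle then sits 30 mm from the patient marker along x.
```

Because only the relative transform is used, the result is unaffected by where the camera itself stands, which is why the patient (or table) marker can serve as the system reference.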
  • the slave robot itself knows its position.
  • The slave robot holds the biopsy needle and can know its own coordinates within the procedure room; it can also detect by itself how many millimeters the biopsy needle has moved. Therefore, the computer can calculate the orientation and position of the biopsy needle in the space of the procedure image.
  • Alternatively, by imaging with fluoroscopy, the computer can calculate the current position of the biopsy needle in the registered operating room image space.
  • The positioning means may use several of these methods together to determine the positional relationship, rather than relying on only one.
  • The distance between the biopsy needle 111 and the target point 11 on the tumor 10 can then be calculated by the computer from the relative positional relationship of the patient and the biopsy needle identified by one or more of these positioning means.
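Once tip and target are expressed in the same registered space, the distance computation above reduces to a Euclidean norm; a minimal sketch with illustrative coordinates:

```python
import math

def tip_target_distance(tip, target):
    """Euclidean distance between the needle tip and the target point,
    both given in the registered operating-room image space (mm)."""
    return math.dist(tip, target)

# 3-4-5 triangle: a tip offset 3 mm in x and 4 mm in y is 5 mm from the target.
d = tip_target_distance((0.0, 0.0, 0.0), (3.0, 4.0, 0.0))  # 5.0
```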
  • The user interface screen displays two small circles showing the orientation, together with a crosshair, so that the user can see whether the biopsy needle 111 is aligned with the tumor 10, and it provides a sense of depth so that the distance between the tumor and the tip of the biopsy needle can be grasped intuitively.
  • The distance between the tip of the biopsy needle and the target point of the tumor is displayed with a sense of depth by a plurality of curves spaced apart from each other about the insertion point. Because this distance changes continuously as the needle is inserted along the insertion path at the insertion angle, insertion point, and insertion distance determined in the surgical plan, it is preferable to display the depth sequentially on the user interface screen. This is described further below.
  • The upper screen contains a CT image (e.g., an image transmitted from a fluoroscopy device) and a mask showing various structures or lesions of the lung.
  • The main screen 510 displays the 3D-registered operating room image.
  • MPR images (e.g., 520, 530, 540) display the direction of the biopsy needle and the like.
  • A camera installed at the end of the biopsy needle shows the insertion point on the skin; when the centers of the two circles coincide, the biopsy needle is angled toward the tumor as planned.
  • the operator instructs the computer through the master console and the biopsy needle is inserted into the human body by the operation of the slave robot linked to the computer.
  • The relative position of the biopsy needle and the patient can be obtained as described for FIG. 7. The spiral curve 70 (see FIG. 9) centered on the insertion point shows the stab depth of the biopsy needle, i.e., the distance between the tip of the biopsy needle and the target in the affected part. This allows the doctor to intuitively recognize both the angle and the depth of the biopsy needle from this one screen.
  • The distance is calculated and displayed on the screen in real time.
  • The location of the affected part is identified in the space of the operating room image, and the distance can be obtained by identifying the biopsy needle shown in the operating room image.
  • The movement of the slave robot can also be detected by the robot itself: a sensor can measure how far the needle has advanced within the slave robot. The distance information obtained in this way can supplement the distance information described above.
  • The computer can also estimate the advance speed of the biopsy needle by identifying it in a plurality of operating room images, taking into account the acquisition interval and the delay of processing the images for display on the user interface; from the evaluated speed, the current position of the needle tip and its distance to the target can be calculated. This calculated distance can likewise be used as a supplement.
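The speed-based supplement above can be sketched as follows; the function names, the simple constant-speed latency model, and the numbers are illustrative assumptions, not the patent's method:

```python
import math

def needle_speed(tip_prev, tip_curr, dt_s):
    """Advance speed (mm/s) from tip positions identified in two
    consecutive operating-room images acquired dt_s seconds apart."""
    return math.dist(tip_prev, tip_curr) / dt_s

def extrapolate_tip(tip_curr, unit_dir, speed, latency_s):
    """Compensate acquisition/processing delay by extrapolating the tip
    along the (unit) insertion direction at the estimated speed."""
    return tuple(c + speed * latency_s * u for c, u in zip(tip_curr, unit_dir))

# Needle advanced 2 mm along z between frames taken 1 s apart.
v = needle_speed((0.0, 0.0, 0.0), (0.0, 0.0, 2.0), dt_s=1.0)
# With 0.5 s of display latency, the tip is predicted 1 mm past the last frame.
tip_now = extrapolate_tip((0.0, 0.0, 2.0), (0.0, 0.0, 1.0), v, 0.5)
```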
  • FIGS. 9 and 10 are diagrams showing examples of the screen presenting the direction and depth of the biopsy needle in the user interface; whether the biopsy needle is accurately aimed at the affected part can be checked by whether the centers of the two small circles inside coincide.
  • A spiral curve 70 (see FIG. 9) is displayed around the center, with outer turns of the spiral indicating a greater distance from the affected part than inner turns. The turns of the spiral are labeled with depth values.
  • As the distance changes continuously, the portion of the spiral corresponding to the current distance is activated in turn (e.g., by a color change, line-thickness change, or blinking), allowing the operator to intuitively grasp the direction and depth of the biopsy needle together.
  • Alternatively, a plurality of circles 75 concentric with the center 11 are shown, with outer circles indicating a greater distance from the affected part than inner circles.
  • The circle corresponding to the current distance is activated sequentially (e.g., by a color change, line-thickness change, or blinking), so that the operator can intuitively check the direction and depth of the biopsy needle together.
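The sequential activation described above amounts to mapping the current distance to a ring index; a minimal sketch, assuming evenly spaced rings (the spacing and ring count are hypothetical):

```python
def active_ring(distance_mm, ring_spacing_mm=10.0, n_rings=5):
    """Index of the circle (or spiral turn) to highlight for the current
    tip-to-target distance; 0 is the innermost ring, and None means the
    tip is still beyond the outermost ring."""
    if distance_mm < 0:
        raise ValueError("distance must be non-negative")
    idx = int(distance_mm // ring_spacing_mm)
    return idx if idx < n_rings else None

# With 10 mm spacing: 0-10 mm activates ring 0, 10-20 mm ring 1, and so on,
# so the highlight walks inward as the needle approaches the target.
```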
  • The biopsy needle is advanced while the distance between its tip and the affected part is checked on the user interface screen described above; immediately before the affected part is reached, a final image may be taken with a CT or other medical imaging apparatus and used to biopsy the affected area.
  • A medical navigation device for guiding insertion of a needle-shaped medical tool, comprising: a computer that integrates surgical planning information into an operating room image, the planning information including a target point on a planned lesion, an insertion point, and an insertion path of the needle-shaped medical tool, defined on a preoperative image including the surgical target; positioning means for identifying the relative position information of the patient and the needle-shaped medical tool and providing it to the computer; and a user interface (UI) that cooperates with the computer to show the entry point in the insertion direction of the needle-shaped medical tool using the operating room image into which the surgical planning information has been integrated, the UI including a navigation screen that displays the distance between the target point calculated by the computer and the tip of the needle-shaped medical tool using the relative position information, the distance being indicated by a plurality of lines spaced about the insertion point.
  • The medical navigation device, wherein the plurality of lines form a spiral centered on the insertion point.
  • The medical navigation device, wherein the plurality of lines are a plurality of circles concentric with the insertion point.
  • The medical navigation device, wherein the navigation screen displays two circles enclosing the insertion point inside the plurality of lines, and the alignment of the needle-shaped medical tool with the insertion path is confirmed by making the centers of the two circles coincide.
  • The medical navigation device, wherein the positioning means comprises: markers attached to the needle-shaped medical tool and the patient; and a sensing device for sensing the markers.
  • The medical navigation device, wherein the positioning means includes a medical imaging apparatus for capturing the operating room image, and the computer calculates the location of the affected area in the space of the operating room image by registering a plurality of operating room images with the preoperative image, and calculates the distance between the needle-shaped medical tool and the lesion included in the plurality of operating room images.
  • The medical navigation device, wherein the positioning means includes a sensor provided in the slave robot equipped with the needle-shaped medical tool, the sensor detecting the position of the needle-shaped medical tool with respect to a reference position of the operating room.
  • The medical navigation device, wherein the needle-shaped medical instrument is a biopsy needle, and the user interface sequentially activates the circle corresponding to the current distance to display the distance between the target point and the tip of the biopsy needle.
  • Because the distance between the affected part and the tip of the needle-shaped medical tool is displayed with a sense of depth by sequentially activating a spiral or a plurality of circles, the stab depth can be grasped intuitively, and the safety and accuracy of the procedure can be further improved.

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Surgery (AREA)
  • Engineering & Computer Science (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Medical Informatics (AREA)
  • Molecular Biology (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Physics & Mathematics (AREA)
  • Pathology (AREA)
  • Robotics (AREA)
  • General Physics & Mathematics (AREA)
  • Biophysics (AREA)
  • Optics & Photonics (AREA)
  • Radiology & Medical Imaging (AREA)
  • Ophthalmology & Optometry (AREA)
  • Electromagnetism (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Apparatus For Radiation Diagnosis (AREA)

Abstract

The present invention relates to a medical navigation device for guiding the entry of a needle-type medical tool, the device comprising: a computer for integrating an operating room image with surgical plan information, the plan information including a target point located on a surgical target, an entry point, and an entry path of the needle-type medical tool, all planned with reference to a preoperative image containing the surgical target; position-acquisition means for acquiring information on the relative position between the patient and the needle-type medical tool and inputting this information to the computer; and a user interface (UI) that interoperates with the computer and presents the entry point in the entry direction of the needle-type medical tool using the operating room image integrated with the plan information, the UI comprising a navigation screen for displaying the distance between a target point calculated by the computer and the tip of the needle-type medical tool using the relative-position information, such that a plurality of lines arranged at intervals around the entry point indicate the range of the distance.
PCT/KR2015/010591 2014-10-08 2015-10-07 Medical navigation device WO2016056838A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR1020140135719A KR101635515B1 (ko) 2014-10-08 2014-10-08 Medical navigation device
KR10-2014-0135719 2014-10-08

Publications (1)

Publication Number Publication Date
WO2016056838A1 true WO2016056838A1 (fr) 2016-04-14

Family

ID=55653383

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2015/010591 WO2016056838A1 (fr) 2014-10-08 2015-10-07 Medical navigation device

Country Status (2)

Country Link
KR (1) KR101635515B1 (fr)
WO (1) WO2016056838A1 (fr)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101837301B1 (ko) * 2016-10-28 2018-03-12 경북대학교 산학협력단 Surgical navigation system
KR102102942B1 (ko) 2018-07-31 2020-04-21 서울대학교산학협력단 Apparatus for providing 3D image registration and method thereof

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR19990029038A (ko) * 1995-07-16 1999-04-15 요아브 빨띠에리 Free aiming of a needle guide
KR20100112310A (ko) * 2009-04-09 2010-10-19 의료법인 우리들의료재단 Method and system for controlling a surgical robot
JP2011139734A (ja) * 2010-01-05 2011-07-21 Hoya Corp Endoscope apparatus
KR20120041455A (ko) * 2010-10-21 2012-05-02 주식회사 이턴 Method and device for controlling and compensating movement of a surgical robot
US20130317363A1 (en) * 2012-05-22 2013-11-28 Vivant Medical, Inc. Planning System and Navigation System for an Ablation Procedure

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20010010260A (ko) * 1999-07-16 2001-02-05 윤한걸 Method for producing tofu


Also Published As

Publication number Publication date
KR20160042297A (ko) 2016-04-19
KR101635515B1 (ko) 2016-07-04

Similar Documents

Publication Publication Date Title
US20220346886A1 (en) Systems and methods of pose estimation and calibration of perspective imaging system in image guided surgery
KR102420386B1 Graphical user interface for displaying guidance information in a plurality of modes during an image-guided procedure
JP6710643B2 Systems and methods for navigating inside the lungs
EP3289964B1 Systems for providing proximity awareness of pleural boundaries, vascular structures, and other critical intrathoracic structures during electromagnetic navigation bronchoscopy
EP3964161B1 Computed-tomography-enhanced fluoroscopy system, device, and method of use thereof
JP6404713B2 Systems and methods for guided injection during endoscopic surgery
KR20190015580A Graphical user interface for displaying guidance information during an image-guided procedure
WO2017043926A1 Method for guiding an interventional procedure using medical images, and system for the interventional procedure therefor
US20150305612A1 (en) Apparatuses and methods for registering a real-time image feed from an imaging device to a steerable catheter
JP2017528175A Systems and methods for providing distance and orientation feedback during 3D navigation
WO2020186198A1 Guidance and tracking system for templated and targeted biopsy and treatment
EP3133995A2 Apparatuses and methods for endobronchial navigation to the location of a target tissue, confirming the location of the target tissue, and percutaneous interception of the target tissue
EP3530221B1 System for performing a percutaneous navigation procedure
WO2011058516A1 Systems and methods for planning and performing percutaneous needle procedures
WO2017043924A1 Method for guiding an interventional procedure using medical images, and interventional procedure system therefor
EP3500159B1 System for using soft point features to predict breathing cycles and improve end registration
WO2016060308A1 Needle-insertion-type robot for surgical intervention
EP3783568A2 Systems and methods of fluoro-CT imaging for initial registration
WO2016056838A1 Medical navigation device
KR102467282B1 System and method for interventional procedures using medical images
KR20170030688A Method for guiding an interventional procedure using medical images, and interventional procedure system therefor
JP2023507434A Selecting a cursor position on a medical image by direction from the distal end of a probe
Rai et al. Fluoroscopic image-guided intervention system for transbronchial localization
WO2014175608A1 Method for comparing a preoperative respiratory level with an intraoperative respiratory level

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 15849156

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 15849156

Country of ref document: EP

Kind code of ref document: A1