WO2015030455A1 - Method of operating a surgical navigation system, and surgical navigation system - Google Patents
Method of operating a surgical navigation system, and surgical navigation system
- Publication number
- WO2015030455A1 (PCT/KR2014/007909)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- organ
- navigation system
- image
- surgical navigation
- model
- Prior art date
Classifications
- G—PHYSICS; G06—COMPUTING, CALCULATING OR COUNTING; G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T19/00—Manipulating 3D models or images for computer graphics; G06T19/003—Navigation within 3D models or images
- G06T17/00—Three dimensional [3D] modelling, e.g. data description of 3D objects
- G06T19/006—Mixed reality
- G06T1/00—General purpose image data processing; G06T1/0007—Image acquisition
- G06T3/00—Geometric image transformations in the plane of the image; G06T3/02—Affine transformations
- G06T7/00—Image analysis; G06T7/10—Segmentation; Edge detection; G06T7/11—Region-based segmentation
- G06T2200/04—Indexing scheme involving 3D image data
- G06T2200/08—Indexing scheme involving all processing steps from image acquisition to 3D model generation
- G06T2207/10028—Range image; Depth image; 3D point clouds
- G06T2207/20128—Atlas-based segmentation
- G06T2207/30004—Biomedical image processing
- G06T2210/41—Medical
Definitions
- A two-dimensional organ image shaped through augmented reality is rendered in three dimensions and converted into a virtual organ model, allowing a user to determine the depth between virtual organ models as well as the depth relationship between a virtual organ model and a surgical tool.
- The present invention relates to a method of operating a surgical navigation system, and to a surgical navigation system, with which such depth relationships can be recognized more easily.
- A surgical navigation system using augmented reality overlays virtual objects on an image of the actual patient captured by a camera to display anatomical structures that cannot be photographed directly, such as structures beneath the skin or inside an organ. It can thereby help prevent damage to critical structures and avoid incisions in unnecessary areas.
- However, a user performing surgery with an augmented reality surgical navigation system may have difficulty perceiving depth. If the user cannot accurately judge the depth of an organ relative to the surgical tool rendered by the navigation system and the patient, there is a risk of damaging unintended areas.
- In augmented reality, virtual objects in three-dimensional space are projected onto a two-dimensional screen, so it can be difficult to determine the depth relationship between the actual patient and several virtual objects; when the virtual objects are rendered translucent, identifying the depth relationships among the virtual objects themselves is also difficult.
- HMD: head-mounted display
- When a selection signal is input to a two-dimensional organ image shaped through augmented reality, the image is rendered in three dimensions around the point at which the selection signal is input and converted into a virtual organ model. It is an object of the present invention to provide a method and system for operating a surgical navigation system that allow the user to move the viewpoint of a virtual camera directly in virtual reality and thereby easily grasp the depth between virtual organ models.
- It is a further object of the present invention to provide a method and system for operating a surgical navigation system that display the positional relationship (direction and distance) between the virtual organ model and the surgical tool in real time, so that the user can more accurately recognize their depth relationship in virtual reality.
- To achieve the above objects, the method of operating a surgical navigation system includes: identifying an object from a human body image captured by a camera; shaping a two-dimensional organ image of the object using augmented reality; and rendering the organ image in three dimensions to create a virtual organ model.
- The surgical navigation system includes an object identification unit for identifying the object from the human body image captured by the camera, an image shaping unit for shaping a two-dimensional organ image of the object using augmented reality, and an organ model creation unit for rendering the organ image in three dimensions to create a virtual organ model.
- According to the present invention, when a selection signal is input to the two-dimensional organ image shaped through augmented reality, the image is rendered in three dimensions around the selected point and converted into a virtual organ model, and the user can move the viewpoint of the virtual camera directly in virtual reality, so that the depth between virtual organ models can be easily understood.
- In addition, by displaying the positional relationship (direction and distance) between the virtual organ model and the surgical tool on the screen in real time, the user can more accurately recognize the depth relationship between the virtual organ model and the surgical tool in virtual reality.
- FIG. 1 is a view showing a specific configuration of a surgical navigation system according to an embodiment of the present invention.
- FIG. 2 is a diagram showing an example of a two-dimensional organ image and a virtual organ model.
- FIG. 3 is a diagram illustrating an example of switching from augmented reality to virtual reality according to an input of a selection signal.
- FIG. 4 is a diagram illustrating an example of enlarging a virtual organ model in virtual reality according to an input of a selection signal.
- FIG. 5 is a flowchart illustrating a method for operating a surgical navigation system according to an embodiment of the present invention.
- FIG. 1 is a view showing a specific configuration of a surgical navigation system according to an embodiment of the present invention.
- The surgical navigation system 100 of the present invention may include an object identification unit 110, an image shaping unit 120, and an organ model creation unit 130.
- The surgical navigation system 100 may further include a direction/distance display unit 140.
- the object identification unit 110 identifies the object from the human body image captured by the camera.
- The object may refer to an area in the human body image where an organ of the patient, for example the 'brain', 'heart', or 'stomach', is located or is estimated to be located.
- The surgical navigation system 100 of the present invention may maintain the general shape of each organ, such as the 'brain', 'heart', and 'stomach', as a normal (reference) model.
- The object identification unit 110 may project the human body image onto the normal model of an organ and identify an object in the human body image that overlaps the normal model within a predetermined range.
- For example, the object identification unit 110 may project the human body image onto the normal model for the organ 'brain', the normal model for the organ 'heart', the normal model for the organ 'stomach', and so on, and identify each object in the human body image that overlaps a normal model by '70%' or more, as sketched below.
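- The following is a minimal sketch of the overlap test described above, assuming NumPy and precomputed binary masks for the candidate region and for each organ's projected normal model; the overlap_ratio() and identify_organs() helpers are illustrative names, not part of the patent's disclosure.

```python
# Minimal sketch (assumption: binary masks already exist; mask generation is not shown).
import numpy as np

def overlap_ratio(candidate_mask: np.ndarray, model_mask: np.ndarray) -> float:
    """Fraction of the projected normal-model area covered by the candidate region."""
    model_area = model_mask.sum()
    if model_area == 0:
        return 0.0
    return float(np.logical_and(candidate_mask, model_mask).sum() / model_area)

def identify_organs(candidate_mask: np.ndarray, model_masks: dict, threshold: float = 0.70):
    """Return the organ names whose projected normal model overlaps the candidate by >= threshold."""
    return [name for name, mask in model_masks.items()
            if overlap_ratio(candidate_mask, mask) >= threshold]
```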
- the image shaping unit 120 shapes a two-dimensional organ image of the object by using augmented reality.
- The surgical navigation system 100 of the present invention may maintain DICOM (Digital Imaging and Communications in Medicine) images of the patient whose human body image is captured by the camera.
- The DICOM images may collectively refer to images of the patient taken in various directions (e.g., front, side, cross section, etc.) using medical equipment such as CT, MRI, and X-ray.
- The image shaping unit 120 may shape the organ image by two-dimensionally rendering, in planar form, the DICOM images of the normal model associated with the identified object. For example, when an object corresponding to the organ 'brain' is identified in the human body image, the image shaping unit 120 renders, in two dimensions, each of a plurality of DICOM images of the patient's 'brain' taken in various directions and visualizes the two-dimensional organ image of the 'brain' in the region of the identified object in the human body image, as sketched below.
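- A minimal sketch of preparing the patient's DICOM slices as planar 2D images, assuming the pydicom package is available; the directory path and the load_slice_planes() helper are illustrative assumptions.

```python
# Minimal sketch: read DICOM slices, sort them by position, and normalise each
# slice to an 8-bit planar image suitable for overlay in the augmented reality view.
import glob
import numpy as np
import pydicom

def load_slice_planes(dicom_dir: str):
    slices = [pydicom.dcmread(path) for path in glob.glob(f"{dicom_dir}/*.dcm")]
    slices.sort(key=lambda s: float(s.ImagePositionPatient[2]))   # sort along the scan axis
    planes = []
    for s in slices:
        pixels = s.pixel_array.astype(np.float32)
        pixels = (pixels - pixels.min()) / max(float(np.ptp(pixels)), 1e-6) * 255.0
        planes.append(pixels.astype(np.uint8))                    # one planar texture per slice
    return planes

brain_planes = load_slice_planes("dicom/brain")   # e.g. the patient's 'brain' series (assumed path)
```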
- The organ model creation unit 130 renders the organ image in three dimensions to create a virtual organ model.
- The organ model creation unit 130 may receive a selection signal, including a touch signal and a click signal, for the organ image, and perform the 3D rendering around the point in the organ image at which the selection signal is input.
- In this case, the organ model creation unit 130 can seamlessly switch the screen from two-dimensional augmented reality to three-dimensional virtual reality without moving the screen (that is, while maintaining the position and focus of the virtual camera corresponding to the real camera).
- The organ model creation unit 130 may express the depth of the organ model by using the perspective that arises between the planar organ images.
- In addition, the organ model creation unit 130 may perform at least one of rotation, enlargement, and reduction of the organ model in response to the input of a selection signal, including a touch signal and a click signal, on the organ model, so that the positional relationship and sense of depth between organ models can be easily conveyed on the screen converted to three-dimensional virtual reality. A minimal sketch of this mode switch and model control appears below.
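- A minimal sketch of the screen switch and model control described above, assuming a simple scene-state object; the SceneState class and handler names are hypothetical stand-ins for the renderer's actual interfaces.

```python
# Minimal sketch: the virtual camera pose is left untouched while the render mode and
# the camera's focal target change, and touch/click input rotates or scales the model.
from dataclasses import dataclass, field
import numpy as np

@dataclass
class SceneState:
    camera_position: np.ndarray = field(default_factory=lambda: np.array([0.0, 0.0, 500.0]))
    focal_target: np.ndarray = field(default_factory=lambda: np.zeros(3))
    mode: str = "AR"                       # "AR": 2D planes over video, "VR": 3D organ model
    model_rotation_deg: float = 0.0
    model_scale: float = 1.0

def on_select(state: SceneState, selected_point: np.ndarray) -> None:
    """Switch to VR around the selected point without moving the virtual camera."""
    state.focal_target = selected_point    # refocus only; camera_position is preserved
    state.mode = "VR"

def on_drag(state: SceneState, delta_deg: float, zoom: float) -> None:
    """Rotate/scale the organ model in response to touch or click input."""
    state.model_rotation_deg = (state.model_rotation_deg + delta_deg) % 360.0
    state.model_scale = float(np.clip(state.model_scale * zoom, 0.25, 4.0))
```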
- In this way, when a selection signal is input to the two-dimensional organ image shaped through augmented reality, the image is converted into a virtual organ model by three-dimensional rendering around the selected point, and a method and system for operating a surgical navigation system can be provided in which the user moves the viewpoint of the virtual camera directly in virtual reality to easily grasp the depth between virtual organ models.
- So that the user can more accurately recognize not only the depth between virtual organ models but also the depth relationship between the virtual organ model and the surgical tool in virtual reality, the surgical navigation system 100 of the present invention may further include the direction/distance display unit 140.
- The direction/distance display unit 140 acquires, using a nearest neighbor search technique, the position of the point on the surface of the organ model closest to the tip of the surgical tool, calculates the distance between the acquired point and the surgical tool, and displays it on the screen together with the organ model.
- In addition, the direction/distance display unit 140 may calculate and display the entry direction of the surgical tool approaching the organ model, and may display graded warnings according to the calculated distance, as sketched below.
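- A minimal sketch of the entry-direction and graded-warning logic, assuming positions in millimetres; the threshold values and function names are illustrative assumptions, not values given in the patent.

```python
# Minimal sketch: the entry direction is the unit vector from the tool tip to the
# closest surface point, and warnings are graded by the remaining distance.
import numpy as np

WARNING_LEVELS = [(5.0, "DANGER"), (15.0, "CAUTION"), (30.0, "NOTICE")]  # mm thresholds (assumed)

def entry_direction_and_warning(tool_tip: np.ndarray, closest_point: np.ndarray):
    offset = closest_point - tool_tip
    distance = float(np.linalg.norm(offset))
    direction = offset / distance if distance > 0 else np.zeros(3)
    warning = next((label for limit, label in WARNING_LEVELS if distance <= limit), None)
    return direction, distance, warning

direction, distance, warning = entry_direction_and_warning(
    np.array([50.0, 40.0, 30.0]), np.array([47.0, 44.0, 28.0]))
print(direction, f"{distance:.1f} mm", warning)   # ~5.4 mm -> "CAUTION"
```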
- Because the positional relationship (direction and distance) between the virtual organ model and the surgical tool is displayed on the screen in real time, the user can more accurately perceive the depth relationship between the virtual organ model and the surgical tool in virtual reality. In other words, by showing the user the shortest distance between the virtual organ model and the surgical instrument, the present invention can provide intuitive depth perception and contribute to improving the safety of the procedure.
- The surgical navigation system 100 may implement augmented reality by representing, in a virtual space, a virtual camera and a virtual patient or organ corresponding to the real camera and patient, and projecting the image acquired from the camera behind the virtual objects so that the virtual organ and the actual organ overlap accurately.
- For this, the intrinsic parameters of the camera and the positional relationship between the patient and the camera must be known.
- To obtain them, Zhengyou Zhang's camera calibration method can be used.
- The intrinsic parameters may be calculated by photographing a chessboard calibration target 50 times at different positions; a minimal calibration sketch is shown below.
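- A minimal sketch of Zhang-style intrinsic calibration with OpenCV, assuming roughly 50 chessboard photographs on disk; the board geometry, square size, and file pattern are illustrative assumptions.

```python
# Minimal sketch: detect chessboard corners in each calibration image and estimate
# the intrinsic matrix K and distortion coefficients with cv2.calibrateCamera.
import glob
import cv2
import numpy as np

BOARD = (9, 6)          # inner corners per row/column (assumed)
SQUARE_MM = 25.0        # physical square size in mm (assumed)

# 3D coordinates of the chessboard corners in the board's own coordinate frame
objp = np.zeros((BOARD[0] * BOARD[1], 3), np.float32)
objp[:, :2] = np.mgrid[0:BOARD[0], 0:BOARD[1]].T.reshape(-1, 2) * SQUARE_MM

obj_points, img_points = [], []
for path in glob.glob("calib/*.png"):        # ~50 chessboard shots (assumed location)
    gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    found, corners = cv2.findChessboardCorners(gray, BOARD)
    if found:
        corners = cv2.cornerSubPix(
            gray, corners, (11, 11), (-1, -1),
            (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 30, 1e-3))
        obj_points.append(objp)
        img_points.append(corners)

# Intrinsic matrix K (fx, fy, cx, cy) and lens distortion coefficients
rms, K, dist, rvecs, tvecs = cv2.calibrateCamera(
    obj_points, img_points, gray.shape[::-1], None, None)
print("reprojection RMS:", rms)
print("intrinsics K:\n", K)
```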
- the positional relationship between the camera and the patient can be tracked in real time using an optical position tracker (Polaris Spectra, NDI, Waterloo, Canada, etc.) after attaching passive markers to the patient and the camera body, respectively.
- The virtual space may be configured using the camera's intrinsic parameters and the positional relationship between the camera and the patient.
- The field of view (FOV) of the virtual camera can be calculated from the actual camera's CCD size and intrinsic parameters, and the size of the virtual screen for augmented reality can likewise be calculated from the camera's parameters and the positional relationship between the camera and the patient. A minimal sketch of the FOV computation is shown below.
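- A minimal sketch of deriving the virtual camera's vertical field of view from the real camera's parameters; the sensor (CCD) size and focal length values used in the example are assumptions for illustration only.

```python
# Minimal sketch: vertical FOV from either the intrinsic focal length in pixels
# or the physical CCD height and lens focal length.
import math

def fov_from_intrinsics(image_height_px: float, fy_px: float) -> float:
    """Vertical FOV (degrees) from the intrinsic focal length in pixels."""
    return math.degrees(2.0 * math.atan(image_height_px / (2.0 * fy_px)))

def fov_from_sensor(sensor_height_mm: float, focal_length_mm: float) -> float:
    """Vertical FOV (degrees) from the physical CCD height and focal length."""
    return math.degrees(2.0 * math.atan(sensor_height_mm / (2.0 * focal_length_mm)))

# Example with assumed values: a 1080-pixel-high image with fy = 1400 px,
# or a 4.8 mm CCD height with a 6 mm lens.
print(fov_from_intrinsics(1080, 1400))   # ~42.2 degrees
print(fov_from_sensor(4.8, 6.0))         # ~43.6 degrees
```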
- the surgical navigation system 100 of the present invention may enable intuitive depth recognition using the advantages of augmented reality and virtual reality.
- For example, the surgical navigation system 100 may redirect the focal point to the virtual organ without moving the position of the virtual camera, so that the position of the organ being observed on the augmented reality screen does not change and the view shifts naturally to the virtual reality environment.
- In addition, the surgical navigation system 100 may visualize the DICOM images on the augmented reality screen so that depth can be perceived in augmented reality, and may aid depth perception by allowing comparison between the cross-sectional image of the virtual object and that of the patient.
- The surgical navigation system 100 may also address the depth perception problem by tracking and visualizing, in real time, the distance and direction between the tip of the surgical tool and the surface of the organ located closest to it. For example, the surgical navigation system 100 may register the surface data of the target organ in a KD-tree data structure to speed up the search, acquire the position of the point on the surface of the virtual object closest to the tip of the surgical tool using a nearest neighbor search, and display it on the screen. In addition, the surgical navigation system 100 may display graded warnings to the user as the surgical tool comes close. A minimal KD-tree lookup sketch appears below.
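- A minimal sketch of the KD-tree nearest-neighbour lookup described above, using SciPy's cKDTree; the placeholder surface vertices and tool-tip coordinates are illustrative assumptions.

```python
# Minimal sketch: the organ surface vertices are registered in a KD-tree once and
# queried with the tracked tool tip each frame.
import numpy as np
from scipy.spatial import cKDTree

# Surface vertices of the target organ model (N x 3), e.g. taken from its mesh
organ_surface = np.random.rand(5000, 3) * 100.0      # placeholder vertices (mm)
surface_tree = cKDTree(organ_surface)                 # built once, reused each frame

def closest_surface_point(tool_tip_mm: np.ndarray):
    """Return (distance_mm, closest_point) between the tool tip and the organ surface."""
    distance, index = surface_tree.query(tool_tip_mm)
    return distance, organ_surface[index]

tool_tip = np.array([50.0, 40.0, 30.0])               # tracked tool tip (assumed)
dist_mm, point = closest_surface_point(tool_tip)
print(f"closest surface point {point}, distance {dist_mm:.1f} mm")
```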
- According to the present invention, it is possible to move naturally from augmented reality to virtual reality without moving the screen, so that the user can easily determine the depth between objects by moving the viewpoint of the virtual camera directly in virtual reality.
- Thus, a surgical navigation system that conveys depth without burdening the user can be provided.
- FIG. 2 is a diagram showing an example of a two-dimensional organ image (i) and a virtual organ model (ii).
- Referring to FIG. 2, the surgical navigation system of the present invention may shape an organ image by two-dimensionally rendering, in planar form, the DICOM images of the normal model associated with the identified object. That is, having identified the object corresponding to the organ 'brain' in the human body image, the surgical navigation system two-dimensionally renders each of the plurality of DICOM images of the patient's 'brain' taken in various directions and visualizes the two-dimensional organ image of the 'brain' in the area of the identified object, so that a sense of depth can be conveyed using the DICOM images.
- When the selection signal is input, the surgical navigation system switches the screen from 2D augmented reality to 3D virtual reality, renders the organ image in three dimensions around the point at which the selection signal was input, and creates a virtual organ model on the virtual reality screen, as shown in (ii) of FIG. 2.
- the surgical navigation system can deliver accurate depth information by tracking and visualizing in real time the minimum distance and direction between the end of the surgical tool and the surface of a particular organ located closest to it.
- FIG. 3 is a diagram illustrating an example of switching from augmented reality (i) to virtual reality (ii) according to an input of a selection signal.
- Referring to FIG. 3, the surgical navigation system of the present invention may maintain focus without moving the screen and smoothly switch the screen from 2D augmented reality to 3D virtual reality. That is, the surgical navigation system may convert the organ image into a virtual organ model by three-dimensional rendering around the point in the organ image at which a selection signal is input, as shown in (ii) of FIG. 3. In this case, the surgical navigation system may express the depth of the virtual organ model by using the perspective that arises between the planar organ images.
- FIG. 4 is a diagram illustrating an example of enlarging a virtual organ model in virtual reality according to an input of a selection signal.
- Referring to FIG. 4, the surgical navigation system of the present invention, in response to the input of a selection signal, including a touch signal and a click signal, on the virtual organ model in the virtual reality shown in FIG. 3, performs at least one of rotation, enlargement, and reduction of the organ model, for example enlarging it as shown in FIG. 4.
- In this way, the positional relationship and the sense of depth between the organ models can be easily expressed.
- Accordingly, the user may easily determine the depth of the virtual organ model by moving the viewpoint of the virtual camera directly in virtual reality.
- The workflow of the surgical navigation system 100 according to an embodiment of the present invention will now be described in detail with reference to FIG. 5.
- FIG. 5 is a flowchart illustrating a method for operating a surgical navigation system according to an embodiment of the present invention.
- the surgical navigation system operating method according to the present embodiment may be performed by the above-described surgical navigation system 100.
- the surgical navigation system 100 identifies the object from the human body image captured by the camera (510).
- For example, the surgical navigation system 100 may project the human body image onto the normal model for the organ 'brain', the normal model for the organ 'heart', the normal model for the organ 'stomach', and so on, and identify each object in the human body image that overlaps a normal model by '70%' or more.
- the surgical navigation system 100 uses augmented reality to shape a two-dimensional organ image of the object (520).
- For example, the surgical navigation system 100 two-dimensionally renders each of a plurality of DICOM images of the patient's 'brain' taken in various directions, and visualizes the 2D organ image of the 'brain' in the region of the identified object in the human body image.
- the DICOM image may collectively refer to an image obtained by photographing a patient in various directions (eg, front, side, cross section, etc.) using medical equipment such as CT, MRI, and X-ray.
- The surgical navigation system 100 renders the organ image in three dimensions to create a virtual organ model (530), and, in response to the input of a selection signal, including a touch signal and a click signal, on the organ model, performs at least one of rotation, enlargement, and reduction of the organ model.
- In this case, the surgical navigation system 100 can switch the screen smoothly from two-dimensional augmented reality to three-dimensional virtual reality without moving the screen (that is, while maintaining the position and focus of the virtual camera corresponding to the real camera).
- The surgical navigation system 100 may also express a sense of depth for the organ model by using the perspective that arises between the planar organ images.
- The surgical navigation system 100 calculates the distance between the surgical tool and the point on the surface of the organ model closest to the tip of the surgical tool and displays it on the screen together with the organ model (550), and calculates and displays the entry direction of the surgical tool approaching the organ model (560).
- That is, the surgical navigation system 100 acquires, using a nearest neighbor search technique, the position of the point on the surface of the organ model closest to the tip of the surgical tool, calculates the distance between the acquired point and the surgical tool as well as the entry direction of the surgical tool approaching the organ model, and displays them on the screen together with the organ model.
- In this case, the surgical navigation system 100 may display graded warnings according to the calculated distance so that the user can recognize the proximity.
- Because the positional relationship (direction and distance) between the virtual organ model and the surgical tool is displayed on the screen in real time, the user can more accurately perceive the depth relationship between the virtual organ model and the surgical tool in virtual reality. In other words, by showing the user the shortest distance between the virtual organ model and the surgical instrument, the present invention can provide intuitive depth perception and contribute to improving the safety of the procedure.
- the method according to an embodiment of the present invention can be implemented in the form of program instructions that can be executed by various computer means and recorded in a computer readable medium.
- the computer readable medium may include program instructions, data files, data structures, etc. alone or in combination.
- the program instructions recorded on the media may be those specially designed and constructed for the purposes of the embodiments, or they may be of the kind well-known and available to those having skill in the computer software arts.
- Examples of computer-readable recording media include magnetic media such as hard disks, floppy disks, and magnetic tape; optical media such as CD-ROMs and DVDs; magneto-optical media such as floptical disks; and hardware devices specially configured to store and execute program instructions, such as ROM, RAM, and flash memory.
- Examples of program instructions include not only machine code generated by a compiler, but also high-level language code that can be executed by a computer using an interpreter or the like.
- the hardware device described above may be configured to operate as one or more software modules to perform the operations of the embodiments, and vice versa.
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Software Systems (AREA)
- Computer Graphics (AREA)
- General Engineering & Computer Science (AREA)
- Computer Hardware Design (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- Geometry (AREA)
- Processing Or Creating Images (AREA)
- Apparatus For Radiation Diagnosis (AREA)
Claims (13)
- A method of operating a surgical navigation system, comprising: identifying an object from a human body image captured by a camera; shaping a two-dimensional organ image of the object using augmented reality; and rendering the organ image in three dimensions to create a virtual organ model.
- The method of claim 1, wherein the identifying of the object comprises: projecting the human body image onto a normal model of an organ, and identifying an object in the human body image that overlaps the normal model within a predetermined range.
- The method of claim 1, wherein the shaping of the two-dimensional organ image of the object comprises: shaping the organ image by two-dimensionally rendering, in planar form, a DICOM (Digital Imaging and Communications in Medicine) image of a normal model associated with the identification of the object.
- The method of claim 3, wherein the creating of the virtual organ model comprises: expressing a sense of depth for the organ model by using the perspective arising between the planar organ images.
- The method of claim 1, wherein the creating of the virtual organ model comprises: receiving a selection signal, including a touch signal and a click signal, for the organ image; and performing the three-dimensional rendering with respect to the point in the organ image at which the selection signal is input.
- The method of claim 1, further comprising: performing at least one of rotation, enlargement, and reduction of the organ model in response to an input of a selection signal, including a touch signal and a click signal, for the organ model.
- The method of claim 1, further comprising: acquiring, using a nearest neighbor search technique, the position of a point on the surface of the organ model that is relatively close to the tip of a surgical tool; and calculating the distance between the acquired point and the surgical tool and displaying it on the screen together with the organ model.
- The method of claim 7, further comprising: calculating and displaying an entry direction of the surgical tool approaching the organ model.
- The method of claim 7, further comprising: displaying graded warnings according to the calculated distance.
- A surgical navigation system comprising: an object identification unit for identifying an object from a human body image captured by a camera; an image shaping unit for shaping a two-dimensional organ image of the object using augmented reality; and an organ model creation unit for rendering the organ image in three dimensions to create a virtual organ model.
- The system of claim 10, wherein the organ model creation unit performs at least one of rotation, enlargement, and reduction of the organ model in response to an input of a selection signal, including a touch signal and a click signal, for the organ model.
- The system of claim 10, further comprising a direction/distance display unit for acquiring, using a nearest neighbor search technique, the position of a point on the surface of the organ model that is relatively close to the tip of a surgical tool, calculating the distance between the acquired point and the surgical tool, and displaying it on the screen together with the organ model.
- The system of claim 12, wherein the direction/distance display unit calculates and displays an entry direction of the surgical tool approaching the organ model.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2016538846A JP2016533832A (ja) | 2013-08-26 | 2014-08-26 | 手術ナビゲーションシステム運用方法及び手術ナビゲーションシステム |
US14/441,398 US20160163105A1 (en) | 2013-08-26 | 2014-08-26 | Method of operating a surgical navigation system and a system using the same |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020130100945A KR101536115B1 (ko) | 2013-08-26 | 2013-08-26 | 수술 내비게이션 시스템 운용 방법 및 수술 내비게이션 시스템 |
KR10-2013-0100945 | 2013-08-26 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2015030455A1 true WO2015030455A1 (ko) | 2015-03-05 |
Family
ID=52586929
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/KR2014/007909 WO2015030455A1 (ko) | 2013-08-26 | 2014-08-26 | 수술 내비게이션 시스템 운용 방법 및 수술 내비게이션 시스템 |
Country Status (4)
Country | Link |
---|---|
US (1) | US20160163105A1 (ko) |
JP (1) | JP2016533832A (ko) |
KR (1) | KR101536115B1 (ko) |
WO (1) | WO2015030455A1 (ko) |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN105045886A (zh) * | 2015-07-23 | 2015-11-11 | 青岛海信医疗设备股份有限公司 | 一种dicom图像的导入方法 |
US10499997B2 (en) | 2017-01-03 | 2019-12-10 | Mako Surgical Corp. | Systems and methods for surgical navigation |
WO2023234492A1 (ko) * | 2022-05-30 | 2023-12-07 | (주)휴톰 | 환자 맞춤형 3d 수술 시뮬레이션을 제공하는 방법, 장치 및 프로그램 |
Families Citing this family (29)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10154239B2 (en) | 2014-12-30 | 2018-12-11 | Onpoint Medical, Inc. | Image-guided surgery with surface reconstruction and augmented reality visualization |
US10013808B2 (en) | 2015-02-03 | 2018-07-03 | Globus Medical, Inc. | Surgeon head-mounted display apparatuses |
CN111329553B (zh) | 2016-03-12 | 2021-05-04 | P·K·朗 | 用于手术的装置与方法 |
US10748339B2 (en) | 2016-06-03 | 2020-08-18 | A Big Chunk Of Mud Llc | System and method for implementing computer-simulated reality interactions between users and publications |
CN108268120B (zh) | 2016-12-29 | 2020-07-28 | 同方威视技术股份有限公司 | 基于vr或ar的图像数据处理方法、设备和安检系统 |
CA3049662A1 (en) | 2017-01-16 | 2018-07-19 | Philipp K. Lang | Optical guidance for surgical, medical, and dental procedures |
JP7200213B2 (ja) | 2017-03-22 | 2023-01-06 | ア ビッグ チャンク オブ マッド リミテッド ライアビリティ カンパニー | 一体型ヘッドマウントディスプレイを含むコンバーチブルサッチェル |
WO2019051464A1 (en) | 2017-09-11 | 2019-03-14 | Lang Philipp K | INCREASED REALITY DISPLAY FOR VASCULAR AND OTHER INTERVENTIONS, COMPENSATION FOR CARDIAC AND RESPIRATORY MOVEMENT |
KR102056930B1 (ko) | 2017-11-21 | 2019-12-17 | 경희대학교 산학협력단 | 증강현실 기술을 이용한 척추 수술 네비게이션 시스템 및 방법 |
TWI642404B (zh) * | 2017-12-06 | 2018-12-01 | 奇美醫療財團法人奇美醫院 | Bone surgery navigation system and image navigation method for bone surgery |
KR102082290B1 (ko) * | 2017-12-06 | 2020-02-27 | 조선대학교산학협력단 | 저장 매체에 저장된 수술 네비게이션 컴퓨터 프로그램, 그 프로그램이 저장된 스마트 기기 및 수술 네비게이션 시스템 |
WO2019148154A1 (en) | 2018-01-29 | 2019-08-01 | Lang Philipp K | Augmented reality guidance for orthopedic and other surgical procedures |
US20190254753A1 (en) | 2018-02-19 | 2019-08-22 | Globus Medical, Inc. | Augmented reality navigation systems for use with robotic surgical systems and methods of their use |
CN108459802B (zh) * | 2018-02-28 | 2020-11-20 | 北京航星机器制造有限公司 | 一种触控显示终端交互方法和装置 |
US11553969B1 (en) | 2019-02-14 | 2023-01-17 | Onpoint Medical, Inc. | System for computation of object coordinates accounting for movement of a surgical site for spinal and other procedures |
US11857378B1 (en) | 2019-02-14 | 2024-01-02 | Onpoint Medical, Inc. | Systems for adjusting and tracking head mounted displays during surgery including with surgical helmets |
US11992373B2 (en) | 2019-12-10 | 2024-05-28 | Globus Medical, Inc | Augmented reality headset with varied opacity for navigated robotic surgery |
US11464581B2 (en) | 2020-01-28 | 2022-10-11 | Globus Medical, Inc. | Pose measurement chaining for extended reality surgical navigation in visible and near infrared spectrums |
US11382699B2 (en) | 2020-02-10 | 2022-07-12 | Globus Medical Inc. | Extended reality visualization of optical tool tracking volume for computer assisted navigation in surgery |
US11207150B2 (en) | 2020-02-19 | 2021-12-28 | Globus Medical, Inc. | Displaying a virtual model of a planned instrument attachment to ensure correct selection of physical instrument attachment |
US11607277B2 (en) | 2020-04-29 | 2023-03-21 | Globus Medical, Inc. | Registration of surgical tool with reference array tracked by cameras of an extended reality headset for assisted navigation during surgery |
US11382700B2 (en) | 2020-05-08 | 2022-07-12 | Globus Medical Inc. | Extended reality headset tool tracking and control |
US11153555B1 (en) | 2020-05-08 | 2021-10-19 | Globus Medical Inc. | Extended reality headset camera system for computer assisted navigation in surgery |
US11510750B2 (en) | 2020-05-08 | 2022-11-29 | Globus Medical, Inc. | Leveraging two-dimensional digital imaging and communication in medicine imagery in three-dimensional extended reality applications |
US11737831B2 (en) | 2020-09-02 | 2023-08-29 | Globus Medical Inc. | Surgical object tracking template generation for computer assisted navigation during surgical procedure |
US11786206B2 (en) | 2021-03-10 | 2023-10-17 | Onpoint Medical, Inc. | Augmented reality guidance for imaging systems |
CN113761776B (zh) * | 2021-08-24 | 2023-03-14 | 中国人民解放军总医院第一医学中心 | 基于增强现实的心脏出血与止血模型的仿真系统和方法 |
WO2023126752A1 (en) * | 2021-12-31 | 2023-07-06 | Auris Health, Inc. | Three-dimensional instrument pose estimation |
KR20240070209A (ko) * | 2022-11-14 | 2024-05-21 | 주식회사 딥파인 | 증강콘텐츠 변환을 위한 영상처리 시스템 |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR20100106834A (ko) * | 2009-03-24 | 2010-10-04 | 주식회사 이턴 | 증강현실을 이용한 수술 로봇 시스템 및 그 제어 방법 |
KR101288167B1 (ko) * | 2011-10-11 | 2013-07-18 | (주)지씨에스그룹 | 의료 영상 표준 규격의 의료 영상 처리 장치 및 방법 |
Family Cites Families (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH10334220A (ja) * | 1997-05-29 | 1998-12-18 | Hitachi Medical Corp | 手術支援ナビゲーション方法 |
US20020082498A1 (en) * | 2000-10-05 | 2002-06-27 | Siemens Corporate Research, Inc. | Intra-operative image-guided neurosurgery with augmented reality visualization |
JP2003079637A (ja) * | 2001-09-13 | 2003-03-18 | Hitachi Medical Corp | 手術ナビゲーションシステム |
GB0507204D0 (en) * | 2005-04-08 | 2005-05-18 | Leuven K U Res & Dev | Maxillofacial and plastic surgery |
KR101161242B1 (ko) * | 2010-02-17 | 2012-07-02 | 전남대학교산학협력단 | 최소 침습 수술을 위한 영상 유도 튜블라 매니퓰레이터 수술 로봇 시스템 |
WO2011118208A1 (ja) * | 2010-03-24 | 2011-09-29 | パナソニック株式会社 | 切削シミュレーション装置 |
JP2013202313A (ja) * | 2012-03-29 | 2013-10-07 | Panasonic Corp | 手術支援装置および手術支援プログラム |
US20130316318A1 (en) * | 2012-05-22 | 2013-11-28 | Vivant Medical, Inc. | Treatment Planning System |
- 2013
- 2013-08-26 KR KR1020130100945A patent/KR101536115B1/ko active IP Right Grant
- 2014
- 2014-08-26 JP JP2016538846A patent/JP2016533832A/ja active Pending
- 2014-08-26 US US14/441,398 patent/US20160163105A1/en not_active Abandoned
- 2014-08-26 WO PCT/KR2014/007909 patent/WO2015030455A1/ko active Application Filing
Patent Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR20100106834A (ko) * | 2009-03-24 | 2010-10-04 | 주식회사 이턴 | 증강현실을 이용한 수술 로봇 시스템 및 그 제어 방법 |
KR101288167B1 (ko) * | 2011-10-11 | 2013-07-18 | (주)지씨에스그룹 | 의료 영상 표준 규격의 의료 영상 처리 장치 및 방법 |
Non-Patent Citations (2)
Title |
---|
CHOI, HYUN SEOK ET AL.: "Augmented reality navigation system for ear surgery", KOREAN SOCIETY OF MEDICAL ROBOTICS, THE 5TH SYMPOSIUM ON MEDICAL ROBOTICS, May 2013 (2013-05-01) * |
MOON, JIN-KI ET AL.: "Development of Immersive Augmented Reality interface for Minimally Invasive Surgery", THE JOURNAL OF KOREA ROBOTICS SOCIETY, March 2008 (2008-03-01) * |
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN105045886A (zh) * | 2015-07-23 | 2015-11-11 | 青岛海信医疗设备股份有限公司 | 一种dicom图像的导入方法 |
US10499997B2 (en) | 2017-01-03 | 2019-12-10 | Mako Surgical Corp. | Systems and methods for surgical navigation |
US11707330B2 (en) | 2017-01-03 | 2023-07-25 | Mako Surgical Corp. | Systems and methods for surgical navigation |
WO2023234492A1 (ko) * | 2022-05-30 | 2023-12-07 | (주)휴톰 | 환자 맞춤형 3d 수술 시뮬레이션을 제공하는 방법, 장치 및 프로그램 |
Also Published As
Publication number | Publication date |
---|---|
US20160163105A1 (en) | 2016-06-09 |
JP2016533832A (ja) | 2016-11-04 |
KR20150024029A (ko) | 2015-03-06 |
KR101536115B1 (ko) | 2015-07-14 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2015030455A1 (ko) | 수술 내비게이션 시스템 운용 방법 및 수술 내비게이션 시스템 | |
US7774044B2 (en) | System and method for augmented reality navigation in a medical intervention procedure | |
JP5551957B2 (ja) | 投影画像生成装置およびその作動方法、並びに投影画像生成プログラム | |
CN102821671B (zh) | 内窥镜观察支持系统和设备 | |
EP2548495B1 (en) | System and program for supporting endoscopic observation | |
JP6972163B2 (ja) | 奥行き知覚を高める仮想陰影 | |
US10506991B2 (en) | Displaying position and optical axis of an endoscope in an anatomical image | |
EP2829218B1 (en) | Image completion system for in-image cutoff region, image processing device, and program therefor | |
CN114145846B (zh) | 基于增强现实辅助的手术导航方法及系统 | |
EP3395282A1 (en) | Endoscopic view of invasive procedures in narrow passages | |
CN103356155A (zh) | 虚拟内窥镜辅助的腔体病灶检查系统 | |
JP5961504B2 (ja) | 仮想内視鏡画像生成装置およびその作動方法並びにプログラム | |
US20230114385A1 (en) | Mri-based augmented reality assisted real-time surgery simulation and navigation | |
WO2014050019A1 (ja) | 仮想内視鏡画像生成装置および方法並びにプログラム | |
Liu et al. | Toward intraoperative image-guided transoral robotic surgery | |
Kumar et al. | Stereoscopic visualization of laparoscope image using depth information from 3D model | |
EP3075342B1 (en) | Microscope image processing device and medical microscope system | |
US10951837B2 (en) | Generating a stereoscopic representation | |
Wang et al. | Real-time marker-free patient registration and image-based navigation using stereovision for dental surgery | |
US20220175485A1 (en) | Method for operating a visualization system in a surgical application, and visualization system for a surgical application | |
Bichlmeier et al. | Virtual window for improved depth perception in medical AR | |
Habert et al. | Multi-layer visualization for medical mixed reality | |
CN210301211U (zh) | 用于导航到靶标的系统 | |
Kumar et al. | Stereoscopic augmented reality for single camera endoscope using optical tracker: a study on phantom | |
Eom et al. | Did You Do Well? Real-Time Personalized Feedback on Catheter Placement in Augmented Reality-Assisted Neurosurgical Training |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 14840987 Country of ref document: EP Kind code of ref document: A1 |
|
WWE | Wipo information: entry into national phase |
Ref document number: 14441398 Country of ref document: US |
|
ENP | Entry into the national phase |
Ref document number: 2016538846 Country of ref document: JP Kind code of ref document: A |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 14840987 Country of ref document: EP Kind code of ref document: A1 |