JP2009273521A - Navigation system for arthroscopical surgery - Google Patents
- Publication number
- JP2009273521A
- Authority
- JP
- Japan
- Prior art keywords
- surgery
- bone
- arthroscope
- navigation system
- arthroscopical
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Abstract
Description
The present invention relates to an arthroscopic surgery navigation system.
The arthroscope is an endoscope used in the orthopedic field for the treatment of bone and joint diseases, and was developed in Japan. Arthroscopy was conventionally performed mainly on the knee joint, but its use has spread with the rise of minimally invasive surgery (MIS); it is now applied not only to the knee but also to the shoulder, ankle, hip, elbow, wrist, and finger joints, as well as to nerves and soft tissues outside the joints, such as the spine and the Achilles tendon. In the knee joint in particular, more than 60% of all operations are performed arthroscopically.
Meanwhile, in recent years, various computer-based navigation and robotic technologies have been introduced into orthopedic surgery; in artificial joint surgery and spine surgery, accurate surgical technique has become possible through three-dimensional preoperative planning and the use of navigation during surgery. However, because arthroscopic surgery is performed while viewing images obtained through a camera, the registration of intraoperative and preoperative shapes used in general navigation cannot be applied, and the preoperative plan cannot be reproduced during surgery. This is a major problem for navigation in current arthroscopic surgery.
Non-Patent Document 1 discloses an overlay system for knee joint surgery, but that system superimposes, through a half mirror, shapes that can be viewed directly in their entirety; it does not superimpose images obtained through an arthroscope.
Accordingly, an object of the present invention is to provide an arthroscopic surgery navigation system that can reproduce a preoperative plan during arthroscopic surgery.
The arthroscopic surgery navigation system of the present invention creates a virtual endoscopic image of an object based on (a) the intrinsic and extrinsic parameters of the arthroscope, (b) information on a three-dimensional model of the object obtained by measurement in advance, and (c) information on the position and orientation of the object and its positional relationship to the arthroscope, obtained by tracking markers attached to the object and to the arthroscope; this virtual endoscopic image is superimposed on the actual endoscopic image and displayed in real time.
According to the arthroscopic surgery navigation system of the present invention, by superimposing the virtual endoscopic image on the actual endoscopic image and displaying them in real time, a preoperative plan can be reproduced accurately and instantaneously during arthroscopic surgery.
An embodiment of the arthroscopic surgery navigation system of the present invention is described below with reference to the attached drawings. The present invention is not limited to the following embodiment, and various modifications are possible.
In FIG. 1, a schematic diagram of an embodiment of the arthroscopic surgery navigation system of the present invention, reference numeral 1 denotes a bone as the object, in this embodiment a femur. The three-dimensional bone model 2 is a 3D model built from data obtained by imaging the bone 1 in advance with X-ray CT (computed tomography). Reference numeral 3 denotes an arthroscope; infrared-reflective markers 4 and 5 are attached to the bone 1 and the arthroscope 3, respectively. The positions of the markers 4 and 5 are tracked by a motion-capture camera 6. A computing unit (not shown) then creates a virtual endoscopic image of the bone 1 based on the intrinsic and extrinsic parameters of the arthroscope 3, the information of the 3D bone model 2 obtained by measurement in advance, and the position and orientation of the bone 1 and its positional relationship to the arthroscope 3 obtained by tracking the markers 4 and 5; the virtual image is superimposed on the actual endoscopic image 7 and displayed in real time as an overlay. The computing unit can be implemented on a personal computer.
The operation of the arthroscopic surgery navigation system of this embodiment is described below.
First, the camera of the arthroscope 3 is calibrated. This allows the intrinsic parameters of the arthroscope 3 (focal length, image distortion) and its extrinsic parameters (position and orientation of the camera with respect to the world coordinate system) to be estimated. In a basic verification experiment using steel-ball markers and a coordinate measuring machine, the calibration accuracy was 0.23 ± 0.02 mm.
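The patent reports the calibration accuracy but does not disclose the algorithm. As a hedged sketch of how intrinsic and extrinsic parameters can be estimated from known 3D-2D correspondences (such as steel-ball markers measured with a coordinate measuring machine), the classical Direct Linear Transform (DLT) recovers the 3×4 camera projection matrix; all names and numeric values below are illustrative, not taken from the patent:

```python
import numpy as np

def estimate_projection_matrix(X, x):
    """Direct Linear Transform: estimate the 3x4 projection matrix P
    from n >= 6 correspondences between 3D points X (n,3) and their
    2D image observations x (n,2)."""
    rows = []
    for (Xw, Yw, Zw), (u, v) in zip(X, x):
        rows.append([Xw, Yw, Zw, 1, 0, 0, 0, 0, -u*Xw, -u*Yw, -u*Zw, -u])
        rows.append([0, 0, 0, 0, Xw, Yw, Zw, 1, -v*Xw, -v*Yw, -v*Zw, -v])
    A = np.asarray(rows)
    # The solution is the right singular vector of A with the smallest
    # singular value (P is determined up to scale).
    _, _, Vt = np.linalg.svd(A)
    return Vt[-1].reshape(3, 4)

def reproject(P, X):
    """Project 3D points with P and dehomogenize to pixel coordinates."""
    Xh = np.hstack([X, np.ones((X.shape[0], 1))])
    xh = (P @ Xh.T).T
    return xh[:, :2] / xh[:, 2:3]
```

With noisy real measurements, the DLT estimate is typically refined by nonlinear minimization of the reprojection error, and lens distortion (explicitly mentioned in the patent) is modeled separately.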
Next, the bone 1 is imaged by X-ray CT to create the three-dimensional bone model 2. The model can be created from the CT data using dedicated software. The anatomical coordinate system of the 3D bone model 2 can be determined by fitting spheres to the femoral head, the medial posterior condyle, and the lateral posterior condyle in the model, and setting the coordinate origin and axes based on their centers.
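The sphere-approximation step can be sketched as follows. This is an illustrative implementation, not the patent's own: an algebraic least-squares sphere fit to surface samples of the femoral head or a condyle, plus one plausible convention (an assumption, since the patent does not specify the axes) for building the anatomical frame from the three fitted centers:

```python
import numpy as np

def fit_sphere(points):
    """Algebraic least-squares sphere fit.
    Uses |p|^2 = 2 p.c + (r^2 - |c|^2), which is linear in the
    unknowns (c, r^2 - |c|^2). Returns (center, radius)."""
    p = np.asarray(points, dtype=float)
    A = np.hstack([2 * p, np.ones((p.shape[0], 1))])
    b = (p ** 2).sum(axis=1)
    sol, *_ = np.linalg.lstsq(A, b, rcond=None)
    center = sol[:3]
    radius = np.sqrt(sol[3] + center @ center)
    return center, radius

def anatomical_frame(head_c, med_c, lat_c):
    """One assumed femoral frame convention: origin at the midpoint of
    the posterior condyle centers, z toward the femoral head, x along
    the condylar axis (orthogonalized), y completing a right-handed
    set. Returns (origin, R) with the axes as columns of R."""
    origin = (med_c + lat_c) / 2
    z = head_c - origin
    z = z / np.linalg.norm(z)
    x = lat_c - med_c
    x = x - (x @ z) * z
    x = x / np.linalg.norm(x)
    y = np.cross(z, x)
    return origin, np.column_stack([x, y, z])
```

In practice the surface samples would be picked on the segmented CT model; with noise-free points on a true sphere the algebraic fit is exact.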
The 3D bone model 2 is then used to draw up a preoperative plan that consolidates information such as the exact position and size of the lesion and the optimal placement and orientation of the instruments to be used in the operation, and to simulate the surgery.
Thereafter, infrared-reflective markers 4 and 5 are attached to the bone 1 and the arthroscope 3, respectively; the positions of the markers 4 and 5 are tracked by the motion-capture camera 6 while, in real time, the endoscopic image 7 of the bone 1 during surgery and the 3D bone model 2 are superimposed.
Specifically, biplane fluoroscopic X-ray images are taken of the bone 1 with the infrared-reflective marker 4 fixed to it, the 3D bone model 2 is loaded into the corresponding virtual space, and a virtual projection of the model is matched to the fluoroscopic images using a 2D/3D registration method. This yields the position and orientation of the 3D bone model 2 in the virtual space. Next, the position of the marker 4 in the same virtual space is estimated from the projection images, a marker coordinate system is set on it, and finally the coordinate transformation from the anatomical coordinate system of the bone 1 to the marker coordinate system is obtained. The arthroscope 3, likewise fitted with infrared-reflective marker 5, is tracked by the motion-capture camera 6; the positional relationships among the arthroscope 3, the femur, and the tibia are measured continuously by marker tracking, and the virtual endoscopic image of the 3D bone model 2 is superimposed on the actual endoscopic image 7 and displayed on a monitor in real time. The registration accuracy was verified to be 0.5 degrees or less for the rotational components of the knee joint (flexion/extension, varus/valgus, internal/external rotation) and 0.5 mm or less for the translational components (medial/lateral, anterior/posterior, and depth directions).
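Once these transformations are known, the overlay reduces to composing rigid transforms along the chain anatomical frame → bone marker → world (tracking) → scope camera, then projecting model points with the calibrated intrinsics. A minimal sketch, where the transform names and conventions are assumptions for illustration rather than the patent's notation:

```python
import numpy as np

def rigid(R, t):
    """Assemble a 4x4 homogeneous rigid transform from R (3x3), t (3,)."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

def project_points(K, T_cam_from_world, pts_world):
    """Pinhole projection of (n,3) world points into pixel coordinates."""
    ph = np.hstack([pts_world, np.ones((len(pts_world), 1))])
    pc = (T_cam_from_world @ ph.T)[:3]          # points in camera frame
    uv = K @ pc
    return (uv[:2] / uv[2]).T                   # dehomogenize

def overlay_points(K, T_cam_world, T_world_marker, T_marker_anat, pts_anat):
    """Map 3D model points from the anatomical frame into the live
    endoscopic image: anatomical -> bone marker -> world -> camera."""
    T = T_cam_world @ T_world_marker @ T_marker_anat
    return project_points(K, T, pts_anat)
```

In the running system, `T_cam_world` and `T_world_marker` would be updated every frame from the motion-capture tracking, while `T_marker_anat` is fixed once by the 2D/3D registration step described above.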
According to the arthroscopic surgery navigation system of this embodiment, the virtual endoscopic image can be superimposed on the actual endoscopic image 7 and displayed in real time with high accuracy. A preoperative plan can therefore be reproduced accurately and instantaneously during arthroscopic surgery; as a result, operation time can be shortened and better clinical results can be obtained.
1 Bone (object)
2 Three-dimensional bone model (3D model)
3 Arthroscope
4, 5 Infrared-reflective markers (markers)
7 Endoscopic image
Claims (1)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2008125241A JP2009273521A (en) | 2008-05-12 | 2008-05-12 | Navigation system for arthroscopical surgery |
Publications (1)
Publication Number | Publication Date |
---|---|
JP2009273521A true JP2009273521A (en) | 2009-11-26 |
Family
ID=41439515
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
JP2008125241A Pending JP2009273521A (en) | 2008-05-12 | 2008-05-12 | Navigation system for arthroscopical surgery |
Country Status (1)
Country | Link |
---|---|
JP (1) | JP2009273521A (en) |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH10143652A (en) * | 1996-11-14 | 1998-05-29 | Toshiba Iyou Syst Eng Kk | Medical image composing device |
JP2002510230A (en) * | 1997-06-27 | 2002-04-02 | ザ・ボード・オブ・トラスティーズ・オブ・ザ・リーランド・スタンフォード・ジュニア・ユニバーシティ | Stereoscopic image navigation method and apparatus |
JP2002538884A (en) * | 1999-03-17 | 2002-11-19 | ジンテーズ アクチエンゲゼルシャフト クール | Imaging and planning equipment for ligament graft placement |
Cited By (18)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2009291342A (en) * | 2008-06-04 | 2009-12-17 | Univ Of Tokyo | Surgery assisting apparatus |
US9320421B2 (en) | 2010-06-16 | 2016-04-26 | Smith & Nephew, Inc. | Method of determination of access areas from 3D patient images |
US8923584B2 (en) | 2010-06-16 | 2014-12-30 | A2 Surgical | Method and system of automatic determination of geometric elements characterizing a bone deformation from 3D image |
US8965108B2 (en) | 2010-06-16 | 2015-02-24 | A2 Surgical | Method and system of automatic determination of geometric elements from a 3D medical image of a bone |
US9020223B2 (en) | 2010-06-16 | 2015-04-28 | A2 Surgical | Method for determining bone resection on a deformed bone surface from few parameters |
US9122670B2 (en) | 2010-06-16 | 2015-09-01 | A2 Surgical | Method for determining articular bone deformity resection using motion patterns |
US9314309B2 (en) | 2013-03-13 | 2016-04-19 | Samsung Electronics Co., Ltd. | Surgical robot and method of controlling the same |
US11166767B2 (en) | 2014-07-07 | 2021-11-09 | Smith & Nephew, Inc. | Alignment precision |
JP2017524445A (en) * | 2014-07-07 | 2017-08-31 | スミス アンド ネフュー インコーポレイテッド | Positioning accuracy |
JP2020036944A (en) * | 2014-07-07 | 2020-03-12 | スミス アンド ネフュー インコーポレイテッド | Alignment precision |
US11395604B2 (en) | 2014-08-28 | 2022-07-26 | DePuy Synthes Products, Inc. | Systems and methods for intraoperatively measuring anatomical orientation |
CN104523218A (en) * | 2014-12-19 | 2015-04-22 | 王宝鹏 | Wireless arthroscopy system |
US11563345B2 (en) | 2015-12-30 | 2023-01-24 | Depuy Synthes Products, Inc | Systems and methods for wirelessly powering or communicating with sterile-packed devices |
US11660149B2 (en) | 2015-12-30 | 2023-05-30 | DePuy Synthes Products, Inc. | Method and apparatus for intraoperative measurements of anatomical orientation |
US11464596B2 (en) | 2016-02-12 | 2022-10-11 | Medos International Sarl | Systems and methods for intraoperatively measuring anatomical orientation |
CN106405873A (en) * | 2016-06-13 | 2017-02-15 | 金英子 | Navigation system |
JP2020518315A (en) * | 2017-03-31 | 2020-06-25 | デピュイ・シンセス・プロダクツ・インコーポレイテッド | System, apparatus, and method for improving surgical accuracy using an inertial measurement device |
JP7204663B2 (en) | 2017-03-31 | 2023-01-16 | デピュイ・シンセス・プロダクツ・インコーポレイテッド | Systems, apparatus, and methods for improving surgical accuracy using inertial measurement devices |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
2011-04-01 | A621 | Written request for application examination | JAPANESE INTERMEDIATE CODE: A621 |
2012-11-16 | A977 | Report on retrieval | JAPANESE INTERMEDIATE CODE: A971007 |
2012-11-26 | A131 | Notification of reasons for refusal | JAPANESE INTERMEDIATE CODE: A131 |
2013-04-02 | A02 | Decision of refusal | JAPANESE INTERMEDIATE CODE: A02 |