TW200807309A - Method and system for reconstructing 3-D endoscopic images - Google Patents

Method and system for reconstructing 3-D endoscopic images

Info

Publication number
TW200807309A
TW200807309A (application TW95127782A)
Authority
TW
Taiwan
Prior art keywords
feature
matrix
image
value
dimensional
Prior art date
Application number
TW95127782A
Other languages
Chinese (zh)
Other versions
TWI323425B (en)
Inventor
Yung-Nien Sun
Chia-Hsiang Wu
Yi-Chiao Chen
Chien-Chen Chang
Original Assignee
Univ Nat Cheng Kung
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Univ Nat Cheng Kung filed Critical Univ Nat Cheng Kung
Priority to TW95127782A priority Critical patent/TWI323425B/en
Publication of TW200807309A publication Critical patent/TW200807309A/en
Application granted granted Critical
Publication of TWI323425B publication Critical patent/TWI323425B/en

Links

Abstract

A method and a system for reconstructing three-dimensional (3-D) endoscopic images are disclosed. The present invention uses the projection relation between an endoscopic camera and a lesion area, combined with a 3-D tracker fixed on a surgical instrument, to reconstruct the 3-D structure of the observed area. After the endoscopic image distortion is corrected, feature extraction and tracking steps are performed, and the shape of the observed area is then reconstructed by matrix factorization, exploiting the fact that the image projection points are the product of the camera projection matrix and the 3-D coordinates. The present invention adds a depth dimension to the original 2-D endoscopic images to compensate for the loss of depth in conventional endoscopic systems.

Description

[Technical Field of the Invention]
The present invention relates to a method for reconstructing three-dimensional (3-D) images, and more particularly to a method for reconstructing 3-D images from an endoscope.

[Prior Art]
During surgery, the physician must confirm the relative position of the surgical instrument and the surrounding tissue to ensure the quality of the operation. In endoscopic surgery, however, the physician can only observe the internal anatomy through the two-dimensional (2-D) images provided by the endoscope, and cannot immediately perceive object sizes, distances, and spatial relationships, which increases the risk of the procedure. To assist physicians in diagnosis and treatment, related techniques have been published, for example, systems based on computed tomography (CT) images applied to the detection of colorectal diseases, or systems that fix a 3-D tracker on the endoscope and register the 2-D endoscopic images with a brain model constructed from CT images so that the physician can view the 3-D structure of the observed region.

Existing virtual endoscopy systems are mostly based on CT or magnetic resonance images, which must be acquired by specialists on expensive equipment. They incur costs in time, manpower, and material, and expose the patient to radiation. Current surgical-assistance systems that combine a 3-D tracker likewise rely on such pre-operatively acquired medical images and suffer from the same cost and radiation issues. A method and apparatus are therefore needed that reconstruct a normally scaled 3-D model and provide, in real time, the motion of the surgical tool inside the body and the 3-D geometric model of the lesion area, so that the physician can navigate a virtual environment as an aid to clinical diagnosis and treatment.

[Summary of the Invention]
An object of the present invention is to provide a 3-D reconstruction method and system that build a 3-D model of normal scale, giving the physician accurate 3-D information so as to reduce surgical risk, save medical cost, and lower the probability of radiation injury to the patient. The present invention can also reconstruct a pre-recorded endoscopic image sequence together with several real 3-D coordinates into a normally scaled 3-D model for surgical or therapeutic evaluation and for education and training.

According to a preferred embodiment of the present invention, the method for reconstructing 3-D images comprises at least the following steps. A 2-D image is obtained by an endoscope, and a feature tracking and extraction step is performed on the 2-D image to obtain its feature points. An unreliable-trajectory filtering step is applied to the feature points to remove points that were mistracked during the feature tracking and extraction step. An affine reconstruction step is performed on the scale measurement matrix (W) formed by the trajectories of the feature points to construct a distorted 3-D model. A 3-D model correction step based on Euclidean reconstruction is then applied to the distorted model to construct a scale-abnormal 3-D model and to obtain a first motion parameter matrix (M) and a first coordinate value (S). A sensor is used to arbitrarily select a sensing point in space to obtain a second coordinate value. The sensor is fixed on a surgical tool, and the surgical tool is pointed at the sensing point so that the relative positions of the tracker host, the sensor, and the sensing point are fixed; a first conversion matrix provided by the tracker device is thereby obtained. A third coordinate value is computed from the first conversion matrix and the second coordinate value. With the surgical tool, at least four corresponding feature points are arbitrarily selected among the feature points to obtain fourth coordinate values and second conversion matrices, and the fourth coordinate values are converted into fifth coordinate values by using the second conversion matrices and the third coordinate value. The corresponding feature points are located in the scale-abnormal 3-D model to obtain sixth coordinate values. A comparison and calculation step on the sixth and fifth coordinate values yields a third conversion matrix (G). Using G, the first coordinate value (S) is converted into a seventh coordinate value (Sw), and using the inverse of G, the first motion parameter matrix (M) is converted into a second motion parameter matrix (Mw), so that a world-coordinate 3-D model is obtained. The precision factors in the scale measurement matrix (W) are computed from the world-coordinate endoscope motion parameter matrix (Mw) and the feature-point world coordinates (Sw), and the precision factors and W are updated. The precision factors before and after the update are then compared: if the difference is smaller than a preset threshold, the 3-D image of the world-coordinate 3-D model is obtained; if the difference is larger than the threshold, the updated W is fed back into the affine reconstruction step, and the loop is repeated until the difference between the precision factors before and after the update falls below the threshold.

According to another preferred embodiment of the present invention, additional feature points are arbitrarily selected with the sensor in a region to be observed and projected onto the 2-D images to obtain additional feature-point trajectories. These trajectories are merged into the scale measurement matrix (W) to form an augmented matrix (augW), and the distorted 3-D model is constructed from the augmented matrix by affine reconstruction.

According to still another preferred embodiment, a wide-angle image correction step is added before the feature tracking and extraction step to correct the image deformation produced by the wide-angle lens. The wide-angle image correction step comprises at least: performing intensity normalization on the 2-D image; binarizing the intensity-normalized image with a high-pass filter and thresholding to separate foreground from background, and extracting the centroids of the patterns in the image; filling the holes caused by binarization with the closing operation of morphology in image processing; and correcting the deformed image from the straight lines formed by the centroids by a method that computes linearity errors.

According to still another preferred embodiment, the feature tracking and extraction step is performed with the Kanade-Lucas-Tomasi (KLT) feature tracking method and with edge-feature extraction.

According to still another preferred embodiment, the unreliable-trajectory filtering step comprises at least: locating a feature point under test in a first, a second, and a third 2-D image to obtain a first, a second, and a third feature-point position; taking the vector from the first to the second position as a first vector and the vector from the second to the third position as a second vector; the angle between the second vector and the first vector is a test angle, and if the test angle exceeds a preset threshold, the feature point under test is filtered out.

[Embodiments]
Please refer to Fig. 1, which is a schematic diagram of the system for reconstructing 3-D endoscopic images of the present invention. The system of the present invention is composed of a tracker device 20 and an endoscope device 30. The tracker device 20 comprises a transmitter 21, a tracker host 22, a computer 24, and a sensor 26 mounted on a surgical tool 28. The endoscope device comprises an endoscope 34 and an endoscope host 32. The tracker device 20 may, for example, be the miniBIRD three-dimensional tracker produced by Ascension Technology, Corp. (VA, USA).

When the system of the present invention is applied, the physician observes the internal anatomy during endoscopic surgery through the endoscope 34 inserted into the body and treats the lesion with the surgical tool 28 on which the sensor 26 is fixed. The tracker host 22 captures, through the sensor 26, the electromagnetic signals emitted by the transmitter 21 and computes the position and orientation of the sensor; because the tracker host 22 is connected directly to the computer 24, the position of the surgical tool 28 in 3-D space is available in real time. Under this architecture, the physician records an image sequence of the region to be observed with the endoscope 34. The system then uses this image sequence, together with several 3-D coordinates of the observed region measured by the tracker device 20, to reconstruct the 3-D structure of the observed region while simultaneously providing the pose of the surgical tool.

To build the 3-D model of the body cavity, the method for reconstructing 3-D endoscopic images of the present invention includes image deformation correction, feature-point extraction and tracking, and a factorization method combined with 3-D coordinates. Please refer to Fig. 2A, which is a flow chart of the method for reconstructing 3-D endoscopic images according to a preferred embodiment of the present invention.

First, the image deformation correction step 100 is performed. An endoscope is usually equipped with a wide-angle lens, so the 2-D images of the observed region acquired through the endoscope suffer from geometric distortion: a straight line is imaged as a curve, and the effect is most pronounced near the image border. Before 3-D reconstruction, a wide-angle image correction step must therefore be applied to remove the deformation introduced by the wide-angle lens. A calibration board printed with equally spaced dots is prepared, and the image is first pre-processed. Because the endoscope uses only a single light source fixed beside the lens, and the wide-angle lens scatters the light, the region near the image center is usually brighter than the periphery, so intensity normalization must be applied to the acquired calibration images.

The intensity normalization is defined as follows. Let the 2-D image be of size W x H, let I(i,j) be the intensity at point (i,j), and let I_avg be the average intensity of the whole image. For each point (i,j), a window of size m x m centered at (i,j) is defined, and the normalized intensity I'(i,j) is

    I'(i,j) = I(i,j) * I_avg / [ (1/(m*m)) * sum_{u=-m/2..m/2} sum_{v=-m/2..m/2} I(i+u, j+v) ]        (1)
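As an illustrative, non-normative rendering of equation (1), the following NumPy sketch divides each pixel by the mean of its m x m neighborhood and rescales by the global mean; the function name and the use of scipy's uniform_filter are implementation choices, not details taken from the patent.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def normalize_intensity(image, m=15):
    """Flatten uneven endoscope illumination, following equation (1)."""
    img = image.astype(np.float64)
    # (1/m^2) * sum of intensities over the m x m window centered at each pixel
    local_mean = uniform_filter(img, size=m, mode="reflect")
    local_mean = np.maximum(local_mean, 1e-6)   # avoid division by zero in dark regions
    return img * img.mean() / local_mean
```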

Next, the image is binarized with a high-pass filter and thresholding so that every pixel of the 2-D image is classified as foreground or background, and the holes are filled with the closing operation of morphology in image processing. The centroid of each dot on the calibration image is then taken as a correction point, and from the straight lines formed by these correction points the deformed image is corrected by the linearity-error method of C. Brauer-Burchardt, "Automatic Correction of Weak Radial Lens Distortion in Single Views of Urban Scenes Using Vanishing Points." The deformation correction transform is first defined as

    r' = r (1 + d2*r^2 + d4*r^4)        (2)

where d2 and d4 are unknown parameters, and r and r' are the distances of an image point from the image center before and after correction, respectively. Correction points lying on the same line of the board are then selected; their connecting line is originally straight but becomes a curve after endoscopic imaging. Every three consecutive correction points form a group, the area of the triangle they span is used as the error function, and the deformation parameters are taken as the values that minimize the sum of the triangle areas over all points.

Next, the feature extraction and tracking step 202 is performed on the corrected 2-D images; the quality of this step directly affects the reconstruction result. Please refer to Fig. 2B, which is a flow chart of the feature extraction and tracking step according to a preferred embodiment of the present invention. First, endoscopic image pre-processing 202a is performed. The human tissue captured by the endoscope may show little color or texture contrast, which is unfavorable for feature extraction and tracking, so histogram equalization is applied to enhance image details. In addition, because of the arrangement of the light source and the lens, uneven illumination often occurs during endoscopic imaging, most commonly a darker image border. After histogram equalization the details are enhanced, but the illumination unevenness also becomes more severe; the intensity normalization technique described above for image deformation correction is therefore used to overcome this drawback, so that the image features are enhanced while the uneven brightness distribution is removed.
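For illustration, a minimal sketch of the radial correction of equation (2) applied to point coordinates (for example, the dot centroids), together with the triangle-area collinearity error used to estimate d2 and d4, is given below; the function names and the assumption that the points of each board line are already grouped are not taken from the patent.

```python
import numpy as np

def correct_points(points, center, d2, d4):
    """Apply r' = r (1 + d2 r^2 + d4 r^4) to 2-D point coordinates (equation (2))."""
    p = np.asarray(points, dtype=np.float64) - np.asarray(center, dtype=np.float64)
    r = np.linalg.norm(p, axis=1, keepdims=True)
    scale = 1.0 + d2 * r**2 + d4 * r**4
    return p * scale + np.asarray(center, dtype=np.float64)

def collinearity_error(params, lines, center):
    """Sum of triangle areas over consecutive corrected triples on each board line."""
    d2, d4 = params
    total = 0.0
    for pts in lines:                       # each entry: the dots of one straight board line
        c = correct_points(pts, center, d2, d4)
        a, b, d = c[:-2], c[1:-1], c[2:]
        # area of each triangle (a, b, d) via the 2-D cross product
        cross = (b[:, 0] - a[:, 0]) * (d[:, 1] - a[:, 1]) - (b[:, 1] - a[:, 1]) * (d[:, 0] - a[:, 0])
        total += np.abs(cross).sum() / 2.0
    return total
```

Minimizing collinearity_error over (d2, d4), for instance with a generic optimizer, yields the deformation parameters.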

Corner and edge feature detection 202b is then performed to extract the feature points of the 2-D images. The present invention uses the Kanade-Lucas-Tomasi (KLT) feature tracking algorithm to extract corner features; however, when it is applied to endoscopic image sequences, whose content is physiological tissue, obvious corner features are hard to find. The present invention therefore adds edge-feature detection to increase the number of features effectively. The KLT feature tracking algorithm is described first; it consists of two parts, corner extraction and feature tracking.

(1) Corner extraction. The first image of the sequence is taken and converted to gray scale, and a Gaussian smoothing low-pass filter is applied once to remove noise.

The horizontal and vertical intensity gradient images Ix and Iy are then computed, and for every image point (i,j) a covariance matrix C(i,j) is built:

    C(i,j) = | sum_W g_s( Ix(q)^2 )       sum_W g_s( Ix(q) Iy(q) ) |
             | sum_W g_s( Ix(q) Iy(q) )   sum_W g_s( Iy(q)^2 )     |        (3)

where W is a sub-window centered at (i,j), q denotes the points inside W, and g_s is a Gaussian filter. The two eigenvalues e1 and e2 of C(i,j) are then computed; if the smaller eigenvalue is larger than a preset threshold, the point (i,j) is a corner feature. With this criterion, good initial feature points are found in the first image.

(2) Feature tracking. From the initial feature points found in the previous step, their positions in the next image are computed. Suppose there are two images I and J, where the feature points of image I are known and are to be tracked to their corresponding points in image J. For a feature point X of image I that moves by a displacement D = (dx, dy), the tracking error e can be defined as

    e = integral over W of [ J(X + D) - I(X) ]^2 w(X) dX                    (4)

where W is a search window centered at the feature point, defined as a region of appropriate size, and w(X) assigns a weight to each point of the window, usually set to 1 or distributed as a Gaussian. If the displacement is split equally between the two images, the expression becomes

    e = integral over W of [ J(X + D/2) - I(X - D/2) ]^2 w(X) dX            (5)
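A rough illustration of the corner criterion of equation (3) is given below: the sketch builds the Gaussian-smoothed structure tensor and keeps points whose smaller eigenvalue exceeds a threshold. The choice of Sobel derivatives, the Gaussian weighting, and the threshold value are assumptions, not details specified by the patent.

```python
import numpy as np
from scipy.ndimage import gaussian_filter, sobel

def detect_corners(gray, sigma=2.0, threshold=1e4):
    """Return (row, col) positions whose smaller eigenvalue of C(i,j) exceeds `threshold`."""
    img = gray.astype(np.float64)
    ix = sobel(img, axis=1)          # horizontal gradient Ix
    iy = sobel(img, axis=0)          # vertical gradient Iy
    # Gaussian-weighted sums over the sub-window around every pixel (equation (3))
    ixx = gaussian_filter(ix * ix, sigma)
    ixy = gaussian_filter(ix * iy, sigma)
    iyy = gaussian_filter(iy * iy, sigma)
    # smaller eigenvalue of the symmetric 2x2 matrix [[ixx, ixy], [ixy, iyy]]
    e_min = (ixx + iyy) / 2.0 - np.sqrt(((ixx - iyy) / 2.0) ** 2 + ixy ** 2)
    rows, cols = np.nonzero(e_min > threshold)
    return list(zip(rows.tolist(), cols.tolist()))
```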

This error function is solved for the displacement that minimizes the tracking error by a Newton-type iteration, D(k+1) = D(k) + d, where d is the update computed at each step; the tracking error is updated at each iteration until convergence, k being the iteration count. At convergence the tracking error is minimal, and through the displacement D the position of the feature point in the next image is obtained. In the same manner, the feature points of the second image are used to find the corresponding points in the third image, and the procedure is repeated up to the last image, yielding the feature-point trajectories over the whole image sequence.

The edge-feature detection of step 202b is described next. After the image pre-processing is completed, an optimal threshold is computed to divide the image into foreground and background. The threshold actually used by the present invention is (1 - k) times the optimal threshold, where k is an empirically defined parameter, in order to avoid under-sampling. In addition, if more than half of the neighborhood of a foreground pixel belongs to the background, that pixel is also marked as background. The set of foreground pixels obtained by the above steps defines a feature mask. The image is then convolved once with a high-pass filter; a pixel whose high-pass response is larger than a predefined threshold and which lies inside the feature mask is marked as an edge candidate. The candidates are ordered by their high-pass response and placed in a stack so that the pixel with the largest response is at the top and the one with the smallest response at the bottom. One point at a time is popped from the stack, and its neighborhood (a region of user-defined size) is checked for existing edge feature points; if there is none, the point is taken as an edge feature. This pop, check-neighborhood, mark-edge-point procedure is repeated until the stack is empty.
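The candidate-selection loop just described can be sketched as follows. This is a minimal illustration assuming the candidates and their high-pass responses are already available; the rectangular distance test used as the "neighborhood already contains an edge feature" check, and the parameter names, are assumptions rather than details given in the patent.

```python
def select_edge_features(candidates, radius=5):
    """Greedy selection of edge features from high-pass candidates.

    candidates : list of (response, row, col) tuples, in any order
    radius     : half-size of the user-defined neighborhood
    """
    # Sort ascending so that pop() always removes the strongest remaining response,
    # i.e. the pixel with the largest value sits on top of the stack.
    stack = sorted(candidates)
    selected = []
    while stack:
        _, r, c = stack.pop()
        # Accept the point only if no edge feature already lies in its neighborhood.
        if all(abs(r - sr) > radius or abs(c - sc) > radius for sr, sc in selected):
            selected.append((r, c))
    return selected
```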

After step 202b is completed, KLT feature tracking is performed with the feature tracking method of part (2) above (step 202c). Step 202d is then performed to filter unreliable trajectories from the feature points. Please refer to Fig. 2C, which is a schematic diagram of the unreliable-trajectory filtering step according to a preferred embodiment of the present invention. P(i-s), P(i), and P(i+s) denote the positions of a feature point under test in the (i-s)-th, i-th, and (i+s)-th images, where s is an integer; V_AB is the vector from P(i-s) to P(i), and V_BC is the vector from P(i) to P(i+s). If the angle theta between V_AB and V_BC is small, the trajectory of the feature point through the (i-s)-th, i-th, and (i+s)-th images is reliable; if theta is too large, the feature point changes direction too abruptly, the trajectory cannot be trusted, and the point must be rejected. The acceptable range of the angle theta is determined by a predefined threshold.
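The trajectory test of step 202d can be sketched as follows; the time step s and the angle threshold are left to predefined values in the patent, so the defaults below are only placeholders.

```python
import numpy as np

def is_reliable(track, s=2, max_angle_deg=60.0):
    """Return False if the direction of a feature trajectory changes too abruptly.

    track : (F, 2) array with the image position of one feature in each of F frames
    """
    track = np.asarray(track, dtype=np.float64)
    for i in range(s, len(track) - s):
        v_ab = track[i] - track[i - s]
        v_bc = track[i + s] - track[i]
        n = np.linalg.norm(v_ab) * np.linalg.norm(v_bc)
        if n < 1e-9:
            continue                      # negligible motion, nothing to test
        cos_theta = np.clip(np.dot(v_ab, v_bc) / n, -1.0, 1.0)
        if np.degrees(np.arccos(cos_theta)) > max_angle_deg:
            return False
    return True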

Please refer again to Fig. 2A. After step 202 (202d) is completed, step 204 converts the 2-D coordinates of the feature trajectories, expressed in the image coordinate system, into their values in the normalized camera coordinate system by means of the internal parameter (calibration) matrix of the endoscope, and the scale measurement matrix W is built from them:

    W = | u_11  ...  u_1P |
        | v_11  ...  v_1P |
        |  ...        ... |   , a 2f x P matrix,                            (6)
        | u_f1  ...  u_fP |
        | v_f1  ...  v_fP |

where u_fi = (u'_fi - c_u(f)) (1 + z_fi) and v_fi = (v'_fi - c_v(f)) (1 + z_fi); (u'_fi, v'_fi) are the coordinates of the i-th extracted and tracked feature point in the f-th image, (c_u(f), c_v(f)) is the projection of the centroid of the feature-point set onto the f-th image, which can be computed as the centroid of the feature points of that image, and f and P are the numbers of images and feature points, respectively. Every entry of the matrix has a corresponding z_fi, the precision factor of the reconstructed 3-D model, which is approached iteratively; when W is built for the first time, z is set to zero.

The affine reconstruction step 206 is then performed on the scale measurement matrix W to construct a 3-D model. The affine reconstruction uses singular value decomposition to factor W into the product of two rank-three matrices, W = M^ * S^, where M^ is the affine motion matrix of the endoscope and S^ is the affine shape matrix, that is, the affine 3-D coordinates of the feature points. The 3-D model established by affine reconstruction is distorted, so a Euclidean reconstruction must follow to correct it.

The Euclidean reconstruction step 208 corrects the distorted 3-D model. Since W = M^ * S^ = (M^ H)(H^-1 S^), a matrix H must be computed that converts the affine motion parameters M^ of the endoscope and the affine 3-D coordinates S^ of the feature points into the Euclidean motion matrix M (the first motion parameter matrix) and the Euclidean shape matrix S (the first coordinate value) of the decomposition coordinate system, i.e., M = M^ * H and S = H^-1 * S^. The decomposition coordinate system is the Euclidean coordinate system whose origin is the centroid of the feature points, and S is the set of feature-point coordinates in that system. The matrix H is computed with the metric constraints in the manner proposed by Aanaes et al. in "Robust Factorization"; the above steps constitute the Euclidean reconstruction. The scale of the 3-D model constructed by Euclidean reconstruction is abnormal with respect to the real-world model; the present invention therefore provides a factorization method combined with 3-D coordinates so that the scale-abnormal 3-D model produced by the Euclidean reconstruction can recover its normal scale.
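A minimal sketch of the rank-three factorization of step 206 is shown below (the metric upgrade of step 208 via the matrix H is omitted); the square-root split of the singular values between motion and shape is one common convention, not a requirement of the patent.

```python
import numpy as np

def affine_factorization(W):
    """Factor the 2f x P measurement matrix into rank-3 motion and shape parts.

    Returns (M_hat, S_hat) with W ~= M_hat @ S_hat, M_hat: 2f x 3, S_hat: 3 x P.
    """
    U, s, Vt = np.linalg.svd(W, full_matrices=False)
    U3, s3, Vt3 = U[:, :3], s[:3], Vt[:3, :]     # keep the three largest singular values
    M_hat = U3 * np.sqrt(s3)                     # affine motion matrix (2f x 3)
    S_hat = np.sqrt(s3)[:, None] * Vt3           # affine shape matrix (3 x P)
    return M_hat, S_hat
```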
Please continue to refer to Fig. 1 and Fig. 2A. After step 202 (202d) is completed, step 210 is performed: with the surgical tool 28 on which the sensor 26 is mounted, at least four corresponding feature points, corresponding to feature points of the 2-D images, are selected in the observation region, and the world coordinates of these corresponding feature points are obtained through a conversion step. The origin of the world coordinate system in which these world coordinates are expressed is the origin defined by the tracker device 20 (in this embodiment, the transmitter 21 serves as this origin). Because the sensor 26 is fixed on the handle end of the surgical tool 28, the coordinate of the point touched by the tip of the surgical tool 28 is not the same as the world coordinate sensed by the sensor; to obtain the world coordinates of the corresponding feature points, the conversion between the coordinate touched by the tip of the surgical tool 28 and the world coordinate sensed by the sensor must first be found. The coordinate system in which the tip-selected coordinate is expressed is called the sensor coordinate system, which is the world coordinate system with its origin moved to the sensor 26. Assume the conversion between the sensor coordinate system and the world coordinate system is P_world = R * P_sensor + T, where P_sensor is the sensor-frame vector of a sensing point arbitrarily selected in space with the surgical tool 28 (the vector from the origin of the sensor coordinate system to the sensing point), P_world is the world-coordinate vector of that sensing point (the vector from the origin of the world coordinate system to the sensing point), and R and T form the sensor coordinate conversion matrix that represents the conversion between the two systems. Because the tracker device 20 reports the 3-D position T and the rotation matrix R of the sensor 26, this sensor coordinate conversion matrix is available; and because the sensor 26 is fixed on the handle end of the surgical tool 28, the position of the tool tip relative to the sensor 26 is fixed, so for any point in space touched by the tip, its sensor-frame coordinate P_sensor is the same fixed value. It follows that once this fixed P_sensor is determined, any sensing point touched by the tool tip can be expressed in the world coordinate system through the sensor coordinate conversion matrix provided by the tracker system.

To determine P_sensor, a sensing point is arbitrarily selected in space and its world coordinate (the second coordinate value) is obtained with the sensor 26; the tool tip is then placed on this sensing point, and from the sensor coordinate conversion matrix of the sensor 26 at that moment, provided by the tracker system (the first conversion matrix), the fixed sensor-frame coordinate of the tool tip (the third coordinate value) is computed. With the sensor coordinate conversion matrix recorded for each individual corresponding feature point (the second conversion matrices) and the fixed tip coordinate (the third coordinate value), the sensor-frame coordinates of the corresponding feature points touched by the surgical tool 28 (the fourth coordinate values) are converted into their world coordinates (the fifth coordinate values).

In step 212, the corresponding feature points are located in the scale-abnormal 3-D model to obtain their coordinates in the decomposition coordinate system (the sixth coordinate values). In step 214, the world coordinates and the decomposition-frame coordinates of the corresponding feature points are compared to compute a three-by-three coordinate system conversion matrix G (the third conversion matrix), which converts the feature points S and the endoscope motion parameter matrix M computed by the Euclidean reconstruction from the decomposition coordinate system to the world coordinate system, Sw = G * S and Mw = M * G^-1, so that the product of Mw and Sw still equals the scale measurement matrix W. Sw is the set of feature-point coordinates in the world coordinate system (the seventh coordinate values), which has the same scale as the real-world object, and Mw is the endoscope motion parameter matrix in the world coordinate system (the second motion parameter matrix). To find G, assume that a feature point q_i in the decomposition coordinate system can be converted to the world coordinate system by a translation T_a followed by multiplication by a matrix G_a, where q_i is the decomposition-frame coordinate of the i-th feature point, q_wi is its world coordinate, G_a is a 3x3 matrix, and T_a is a 3x1 vector. After the Euclidean reconstruction, the world coordinates of the corresponding feature points obtained with the 3-D tracker and their estimated decomposition-frame coordinates are substituted into this relation, giving a system of simultaneous equations from which G_a and T_a are computed by least squares. The matrix G_a is then taken as the coordinate system conversion matrix G, and the translation T_a is compensated back after the whole reconstruction is completed. Strictly speaking, this is an affine transformation: besides converting coordinates it also corrects the shape toward the actual shape, under the assumption that the shape correction differs little from point to point so that a single global transform can represent it; if the points are too scattered in space, this assumption may not hold. If the rank of the computed coordinate system conversion matrix is not three, trajectories containing tracking errors have not been filtered out, and the unreliable-trajectory filtering step (202d) must be performed again.
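A minimal sketch of the least-squares fit of step 214 is given below, assuming the decomposition-frame coordinates and the tracker-measured world coordinates of the digitized points are already paired; the homogeneous formulation [q_i, 1] * [G_a; T_a] = q_wi is one way to set the system up and is not mandated by the patent.

```python
import numpy as np

def fit_world_transform(q_dec, q_world):
    """Least-squares fit of q_world_i ~= Ga @ q_dec_i + Ta over paired points.

    q_dec, q_world : (N, 3) arrays with N >= 4
    Returns (Ga, Ta) with Ga: 3x3 and Ta: 3-vector.
    """
    q_dec = np.asarray(q_dec, dtype=np.float64)
    q_world = np.asarray(q_world, dtype=np.float64)
    A = np.hstack([q_dec, np.ones((len(q_dec), 1))])    # N x 4 rows [q_i^T, 1]
    X, *_ = np.linalg.lstsq(A, q_world, rcond=None)     # 4 x 3 solution
    Ga = X[:3, :].T                                      # 3 x 3 conversion matrix
    Ta = X[3, :]                                         # translation, compensated later
    if np.linalg.matrix_rank(Ga) != 3:
        raise ValueError("rank of G is not 3; unreliable tracks may remain (redo step 202d)")
    return Ga, Ta
```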
In step 216, the matrix Sw collecting the world coordinates of the feature points is expressed as

    Sw(:, i) = q_wi - T_a    if q_wi exists (the i-th point was digitized with the tracker),
               G * s_i       otherwise                                                      (7)

The world-frame feature points Sw and the world-frame endoscope motion parameter matrix Mw are then used to compute each precision factor z_fi of the scale measurement matrix W, and the precision factors and W are updated with the computed values. In addition, since W = M * S = (-M) * (-S), there are two sets of precision factors, corresponding to (M, S) and (-M, -S) respectively; the set with the smaller back-projection error is taken.

In step 218, if the difference between the updated precision factors and their previous values is smaller than the preset threshold, the accuracy of the world-coordinate 3-D model is considered sufficient, and the 3-D image of the world-coordinate 3-D model, which has normal scale, is obtained. If the difference is larger than the preset threshold, the flow returns to step 206 with the updated scale measurement matrix W, the affine reconstruction step is performed again, and the loop is repeated until the difference between the precision factors before and after the update is smaller than the preset threshold. This repeated updating of the world-coordinate 3-D model is called the iterative method; note that step 210 needs to be performed only once, because the world coordinates of the corresponding feature points required by step 212 are the same in every iteration.

During surgery, the lesion may be occluded by other tissue or lie outside the imaged field, so the feature points of the observed region can be unevenly distributed, and the reconstructed 3-D model may lack feature points in important regions to describe their geometry. On the other hand, when the 3-D coordinates of feature points are digitized with the surgical tool 28, the tool must be placed accurately on the feature points to avoid errors; if arbitrary points are allowed instead, not restricted to previously extracted feature points, the efficiency of point selection is greatly increased, geometric detail is added, and the reconstruction accuracy improves. Such arbitrarily selected additional feature points are incorporated into the reconstruction as follows. After the 3-D reconstruction, the additional feature points are projected back into the 2-D images with the world-frame endoscope motion parameter matrix Mw to form new trajectories, which are merged with the original trajectories to obtain an augmented scale measurement matrix

    augW = | u_11 ... u_1P   a_11 ... a_1A |
           | v_11 ... v_1P   b_11 ... b_1A |
           |  ...       ...    ...     ... |   , a 2f x (P + A) matrix,                     (8)
           | u_f1 ... u_fP   a_f1 ... a_fA |
           | v_f1 ... v_fP   b_f1 ... b_fA |

where A is the number of additional feature points and (a_fi, b_fi) are the entries of the reprojected additional trajectories. The augmented scale measurement matrix augW is then processed by the affine reconstruction, Euclidean reconstruction, and iterative method described above to reconstruct the 3-D model, which improves the accuracy of the reconstructed model.

The preferred embodiments, objects, and features of the present invention have been described; it will be apparent to those skilled in the art that many modifications, changes, and substitutions can be made without departing from the spirit of the invention disclosed herein and the scope of the claims set forth below.

[Brief Description of the Drawings]
To make the above and other objects, features, and advantages of the present invention more apparent, a preferred embodiment is described below in conjunction with the accompanying drawings, in which:
Fig. 1 is a schematic diagram of the system for reconstructing 3-D endoscopic images of the present invention;
Fig. 2A is a flow chart of the method for reconstructing 3-D endoscopic images according to a preferred embodiment of the present invention;
Fig. 2B is a flow chart of the feature extraction and tracking step according to a preferred embodiment of the present invention; and
Fig. 2C is a schematic diagram of the unreliable-trajectory filtering step according to a preferred embodiment of the present invention.

[Description of Main Reference Numerals]
20: tracker device; 21: transmitter; 22: tracker host; 24: computer; 26: sensor; 28: surgical tool; 30: endoscope device; 32: endoscope host; 34: endoscope; 100: image deformation correction; 202: feature extraction and tracking; 202a: endoscopic image pre-processing; 202b: corner and edge feature detection; 202c: feature tracking; 202d: filtering unreliable trajectories; 204: building the scale measurement matrix; 206: affine reconstruction; 208: Euclidean reconstruction; 210: obtaining the world coordinates of the corresponding feature points; 212: obtaining the decomposition-frame coordinates of the corresponding feature points; 214: computing the coordinate system conversion matrix, the world-frame feature points, and the world-frame endoscope motion parameter matrix; 216: updating each precision factor of the scale measurement matrix and updating the matrix with them; 218: checking whether the difference between the updated and previous precision factors is smaller than the preset threshold.

Claims (1)

1. A method for reconstructing three-dimensional (3-D) images, comprising at least:
   obtaining a two-dimensional (2-D) image by an endoscope;
   performing a feature tracking and extraction step on the 2-D image to obtain a plurality of feature points of the 2-D image;
   performing an unreliable-trajectory filtering step on the feature points to filter out feature points mistracked in the feature tracking and extraction step;
   performing an affine reconstruction step on a scale measurement matrix (W) formed by the trajectories of the feature points, so as to construct a distorted 3-D model;
   performing a 3-D model correction step on the distorted 3-D model by Euclidean reconstruction, so as to construct a scale-abnormal 3-D model and obtain a first motion parameter matrix (M) and a first coordinate value (S);
   arbitrarily selecting a sensing point in space with a sensor to obtain a second coordinate value;
   fixing the sensor on a surgical tool and pointing at the sensing point with the surgical tool, so as to fix the relative positional relationship among a tracker host, the sensor, and the sensing point, and obtaining a first conversion matrix provided by a tracker device;
   calculating a third coordinate value from the first conversion matrix and the second coordinate value;
   arbitrarily selecting, with the surgical tool, at least four corresponding feature points among the feature points to obtain a plurality of fourth coordinate values and a plurality of second conversion matrices;
   converting the fourth coordinate values into a plurality of fifth coordinate values by using the second conversion matrices and the third coordinate value;
   locating the corresponding feature points in the scale-abnormal 3-D model to obtain a plurality of sixth coordinate values;
   performing a comparison and calculation step on the sixth coordinate values and the fifth coordinate values to obtain a third conversion matrix (G);
   converting the first coordinate value (S) into a seventh coordinate value (Sw) by using the third conversion matrix (G), and converting the first motion parameter matrix (M) into a second motion parameter matrix (Mw) by using the inverse of the third conversion matrix (G), so as to obtain a world-coordinate 3-D model;
   calculating a plurality of precision factors of the scale measurement matrix (W) from the second motion parameter matrix (Mw) and the seventh coordinate value (Sw), and updating the precision factors and the scale measurement matrix (W); and
   comparing the values of the precision factors before and after the update: if the difference is smaller than a preset threshold, obtaining the 3-D image of the world-coordinate 3-D model; if the difference is larger than the preset threshold, substituting the updated scale measurement matrix (W) back into the affine reconstruction step until the difference between the values of the precision factors before and after the update is smaller than the preset threshold.

2. The method of claim 1, further comprising at least:
   arbitrarily selecting a plurality of additional feature points with the sensor in a region to be observed, and projecting the additional feature points onto the 2-D images to obtain a plurality of additional feature-point trajectories; and
   merging the additional feature-point trajectories into the scale measurement matrix (W) so that the scale measurement matrix (W) becomes an augmented feature-point matrix (augW), and constructing the distorted 3-D model from the augmented matrix (augW) by affine reconstruction.

3. The method of claim 1, further comprising at least:
   adding, before the feature tracking and extraction step, a wide-angle image correction step to correct the image deformation produced by the wide-angle lens, the wide-angle image correction step comprising at least:
   performing intensity normalization on the 2-D image;
   binarizing the intensity-normalized 2-D image with a high-pass filter and thresholding to distinguish foreground from background, and extracting a plurality of centroids of the patterns on the 2-D image;
   filling the holes caused by the binarization of the binarized 2-D image with the closing operation of morphology in image processing; and
   correcting the deformed image from the plurality of straight lines formed by the centroids by a method of calculating linearity errors.

4. The method of claim 1, wherein the feature tracking and extraction step is performed with the Kanade-Lucas-Tomasi (KLT) feature tracking method and the edge-feature extraction method.

5. The method of claim 1, wherein the feature tracking and extraction step is performed with one of the Kanade-Lucas-Tomasi (KLT) feature tracking method and the edge-feature tracking method.

6. The method of claim 1, wherein the unreliable-trajectory filtering step comprises at least:
   comparing a feature point under test in a first 2-D image, a second 2-D image, and a third 2-D image to obtain a first feature-point position in the first 2-D image, a second feature-point position, and a third feature-point position;
   taking the vector from the first feature-point position to the second feature-point position as a first vector;
   taking the vector from the second feature-point position to the third feature-point position as a second vector; and
   taking the angle between the second vector and the first vector as a test angle, and filtering out the feature point under test if the value of the test angle is larger than a preset threshold.

7. The method of claim 1, wherein the method of calculating linearity errors is the linearity-error method of C. Brauer-Burchardt.

8. The method of claim 1, wherein the third conversion matrix (G) is a three-by-three matrix.
The method described in claim 1 includes at least: Before the step, adding an image deformation correction step 'to correct the image deformation correction step generated by the wide-angle lens comprises at least: performing brightness normalization processing on the one-dimensional shirt image; The two-dimensional image subjected to the brightness normalization processing is binarized by a high-pass chopper and a threshold division method to distinguish the foreground and the background, and the plurality of centers of gravity of the respective patterns on the two-dimensional image are taken out; The two-dimensional image subjected to the binarization processing fills the gap caused by the binarization processing by the closed processing (cl〇sing) operation of the image processing medium type (M0rph0l0gy); and the plurality of straight lines formed by the center of gravity 'Correction of the deformed image by calculating the linearity error. 4. The method of claim i, wherein the feature reclamation step is performed by Kanade-LuCas-Tomasi (KLT) feature tracking method and Edge feature extraction method. 5. The method of claim 1, wherein the feature 24 200807309 acquisition step is one of a Kanade-LUcas-T〇masi (KLT) feature tracking method and an edge feature tracking method. Conducted. 6. The method of claim 2, wherein the unreliable step of at least comprises: comparing and comparing one of the feature points to be measured in a _ 7A ^ a shirt, a brother two two Recording and m-dimensional image t, so that the feature point is located at a first feature point of the first two-dimensional image, Wl·^m^^, a charge point position, and a third feature point position; The point position to the second feature point position vector is a first vector; ^ (3) takes the second feature point position to the third special I point position represents a second vector; and ▲ takes the The angle between the second vector and the first vector is a test angle, and if the value of the test angle is greater than the _pre-set threshold, the feature point to be tested is removed. 7. The method of claim 1, wherein the method of linearity error is a method of C. Brauer-Burxhardt. Calculating the linear error 8 · As described in the scope of claim i, the conversion matrix (G) is a three-by-three matrix. ,,μ弟 25
TW95127782A 2006-07-28 2006-07-28 Method and system for reconstructing 3-d endoscopic images TWI323425B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
TW95127782A TWI323425B (en) 2006-07-28 2006-07-28 Method and system for reconstructing 3-d endoscopic images

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
TW95127782A TWI323425B (en) 2006-07-28 2006-07-28 Method and system for reconstructing 3-d endoscopic images

Publications (2)

Publication Number Publication Date
TW200807309A true TW200807309A (en) 2008-02-01
TWI323425B TWI323425B (en) 2010-04-11

Family

ID=44766620

Family Applications (1)

Application Number Title Priority Date Filing Date
TW95127782A TWI323425B (en) 2006-07-28 2006-07-28 Method and system for reconstructing 3-d endoscopic images

Country Status (1)

Country Link
TW (1) TWI323425B (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TWI678679B (en) * 2018-07-09 2019-12-01 財團法人資訊工業策進會 Space coordinate converting server and method thereof
TWI811162B (en) * 2022-11-22 2023-08-01 達擎股份有限公司 Video data conversion device

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2012075631A1 (en) 2010-12-08 2012-06-14 Industrial Technology Research Institute Methods for generating stereoscopic views from monoscopic endoscope images and systems using the same
TWI711010B (en) * 2019-11-29 2020-11-21 財團法人成大研究發展基金會 Geometric camera calibration system and method
TWI764593B (en) * 2021-02-26 2022-05-11 國立清華大學 Method of constructing three-dimensional model by tomographic reconstruction technology

Also Published As

Publication number Publication date
TWI323425B (en) 2010-04-11

Similar Documents

Publication Publication Date Title
US10198872B2 (en) 3D reconstruction and registration of endoscopic data
JP5797352B1 (en) Method for tracking a three-dimensional object
Maier-Hein et al. Comparative validation of single-shot optical techniques for laparoscopic 3-D surface reconstruction
CN101229080B (en) Registration of images of an organ using anatomical features outside the organ
Guan et al. A review of point feature based medical image registration
Hasan et al. Detection, segmentation, and 3D pose estimation of surgical tools using convolutional neural networks and algebraic geometry
KR101572487B1 (en) System and Method For Non-Invasive Patient-Image Registration
CN111260786A (en) Intelligent ultrasonic multi-mode navigation system and method
Yang et al. Automatic 3-D imaging and measurement of human spines with a robotic ultrasound system
CN106651827A (en) Fundus image registering method based on SIFT characteristics
Wu et al. Three-dimensional modeling from endoscopic video using geometric constraints via feature positioning
KR20130026041A (en) Method and apparatus for creating medical image using partial medical image
US10390892B2 (en) System and methods for updating patient registration during surface trace acquisition
Wein et al. Integrating diagnostic B-mode ultrasonography into CT-based radiation treatment planning
Péntek et al. Image-based 3D surface approximation of the bladder using structure-from-motion for enhanced cystoscopy based on phantom data
TW200807309A (en) Method and system for reconstructing 3-D endoscopic images
Cheema et al. Image-aligned dynamic liver reconstruction using intra-operative field of views for minimal invasive surgery
Alam et al. Evaluation of medical image registration techniques based on nature and domain of the transformation
CN111588467A (en) Method for converting three-dimensional space coordinates into two-dimensional image coordinates based on medical images
CN116997928A (en) Method and apparatus for generating anatomical model using diagnostic image
CN111080676B (en) Method for tracking endoscope image sequence feature points through online classification
CN116883471A (en) Line structured light contact-point-free cloud registration method for chest and abdomen percutaneous puncture
CN116152235A (en) Cross-modal synthesis method for medical image from CT (computed tomography) to PET (positron emission tomography) of lung cancer
CN114283179A (en) Real-time fracture far-near end space pose acquisition and registration system based on ultrasonic images
Sun Ultrasound probe localization by tracking skin features

Legal Events

Date Code Title Description
MM4A Annulment or lapse of patent due to non-payment of fees