TW201250628A - System and method of revising depth of a 3D image pair - Google Patents

System and method of revising depth of a 3D image pair

Info

Publication number
TW201250628A
Authority
TW
Taiwan
Prior art keywords
depth
image
pixels
value
map
Prior art date
Application number
TW100120089A
Other languages
Chinese (zh)
Other versions
TWI514325B (en)
Inventor
Liang-Gee Chen
Chien Wu
Chung-Te Li
Yen-Chieh Lai
Chao-Chung Cheng
Ling-Hsiu Huang
Original Assignee
Univ Nat Taiwan
Himax Tech Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Univ Nat Taiwan, Himax Tech Ltd filed Critical Univ Nat Taiwan
Priority to TW100120089A priority Critical patent/TWI514325B/en
Publication of TW201250628A publication Critical patent/TW201250628A/en
Application granted granted Critical
Publication of TWI514325B publication Critical patent/TWI514325B/en

Links

Landscapes

  • Testing, Inspecting, Measuring Of Stereoscopic Televisions And Televisions (AREA)
  • Processing Or Creating Images (AREA)

Abstract

A method of revising the depth of a three-dimensional (3D) image pair is disclosed. The method comprises the following steps. First, at least one initial depth map associated with one image of the 3D image pair, generated by a stereo matching technique, is received, wherein the image comprises a plurality of pixels and the initial depth map carries an initial depth value of each pixel. Next, the inconsistency among the pixels of the image is detected to estimate a reliable map. Finally, the initial depth values are interpolated according to the reliable map and the neighboring pixels, so as to generate a revised depth map by revising the initial depth values.

Description

[Technical Field of the Invention]
[0001] The present invention relates to digital image processing, and more particularly to a system and method of revising the depth of a three-dimensional (3D) image pair.

[Prior Art]
[0002] When a three-dimensional object is projected onto a two-dimensional image plane by a camera, much information is lost, particularly three-dimensional depth information. A 3D imaging system can convey 3D information to a viewer by recording 3D visual information or by creating an illusion of depth. Although 3D imaging techniques have a history of more than a century, the recent development of high-resolution, low-priced displays (such as liquid crystal displays) has made 3D displays far more practical and widespread.

[0003] The first figure shows a block diagram of a conventional 3D imaging system 1, in which a depth generator 11 generates depth information from a two-dimensional input image. The depth information is processed by a depth-image-based rendering (DIBR) unit 12 to produce a left (L) image 14A and a right (R) image 14B, which are displayed for viewing. The second figure shows a block diagram of another conventional 3D imaging system 2. The 3D imaging system 2 uses two cameras to capture a left image 20A and a right image 20B of a target, and a depth generator 21 obtains left and right depth information from the stereo image pair (the left image 20A and the right image 20B) using a stereo matching technique such as a block matching technique.
The depth image imager (DIBR) 22 then generates, from the resulting left and right depth information and the pairing relationship of the left and right images 20A, 20B, at least two images with different viewing angles for the viewer (i.e., at least one left image 24A and at least one right image 24B).

[0004] For the conventional 3D imaging system 2, however, stereo images still impose some basic limitations: for example, occluded portions of the images, or errors in the setting parameters of the two cameras, affect the generated depth information. Considering only the pairing relationship of the stereo image pair may therefore leave some pixels of the image with incorrect depth information. Since conventional 3D imaging systems cannot effectively display 3D images or video, a novel depth revision system and method for 3D images is needed that faithfully and simply reproduces or approximates a stereoscopic presentation.

[Summary of the Invention]
[0005] In view of the above, one object of the embodiments of the present invention is to provide a depth revision system and method for a 3D image pair, so as to improve the quality of 3D images or video.

[0006] The invention discloses a depth revision system for a three-dimensional (3D) image pair, comprising a depth generator and a depth revisor. The depth generator generates at least one initial depth map associated with one image of the 3D image pair, wherein the image comprises a plurality of pixels and the initial depth map records an initial depth value of each pixel. The depth revisor comprises an inconsistence detection unit and an interpolation unit. The inconsistence detection unit detects the inconsistency among the pixels of the image and estimates a reliable map according to the detected inconsistency.
The interpolation unit interpolates the initial depth values according to the reliable map and the neighboring pixels, so as to generate a revised depth map by revising the initial depth values.

[0007] The invention further discloses a depth revision method for a 3D image pair, comprising the following steps. First, at least one initial depth map associated with one image of the 3D image pair is received, wherein the image comprises a plurality of pixels and the initial depth map records an initial depth value of each pixel. Next, the inconsistency among the pixels of the image is detected to estimate a reliable map. Finally, the initial depth values are interpolated according to the reliable map and the neighboring pixels, so as to generate a revised depth map by revising the initial depth values.

[Embodiment]
[0008] Please refer to the third figure, which shows a block diagram of a depth revision system for a 3D image pair according to an embodiment of the present invention. A 3D image pair is also called a stereoscopic image pair. The depth revision system 3 comprises a depth generator 31, a depth revisor 32, and a depth-image-based rendering (DIBR) unit 33. The depth generator 31 receives a 3D image pair that can be displayed on a 3D imaging system, such as a left (L) image 30A and a right (R) image 30B, to generate at least one depth map. For example, the depth generator 31 generates a left depth map and a right depth map, corresponding to the original left image 30A and right image 30B respectively, based on a stereo matching technique.

[0009] For convenience of explanation, a single image is taken as an example below; please also refer to the fourth figure.
The depth generator 31 generates an initial depth map 43 from an image 41 (e.g., the left image 30A or the right image 30B of the 3D image pair) based on a stereo matching technique such as a block matching technique. The image 41 comprises a plurality of pixels, and the initial depth map 43 records a corresponding initial depth value for each pixel or block. For example, the depth value of an object near the viewer is larger than that of an object far from the viewer; accordingly, in the depth map image, objects near the viewer appear brighter than objects far from the viewer. As can be seen from the figure, the depth information of the initial depth map 43 contains errors, particularly in the occlusion region 411 and at object boundaries of the image 41.

[0010] The depth revisor 32 comprises an inconsistence detection unit 321 and an interpolation unit 323.

[0011] The inconsistence detection unit 321 detects the inconsistency among the pixels of the image 41 by two-directional occlusion detection techniques, and computes for each pixel a cost value representing its inconsistency with neighboring pixels. Specifically, the two-directional occlusion detection techniques include Left-Right-checking or Right-Left-checking techniques, used to locate occluded or boundary positions in the image 41. Two-directional occlusion detection may be implemented with conventional techniques, for example as disclosed in "Detecting binocular half-occlusions: empirical comparisons of five approaches" (Pattern Analysis and Machine Intelligence). The inconsistence detection unit 321 then estimates, from the cost value of each pixel, a reliable map 45 that records a reliable value for each pixel.
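The Left-Right-checking step described in paragraph [0011] can be sketched as follows. This is a minimal illustration, assuming dense disparity maps are available for both views; the function name and the threshold value are illustrative and not part of the disclosure:

```python
import numpy as np

def reliability_map(disp_left, disp_right, threshold=1.0):
    """Left-Right checking: a pixel is trusted (1) when the disparity of the
    left view agrees with the disparity found at its matching position in the
    right view; otherwise it is marked unreliable (0), as around occlusions."""
    h, w = disp_left.shape
    reliable = np.zeros((h, w), dtype=np.uint8)
    xs = np.arange(w)
    for y in range(h):
        # where each left-view pixel lands in the right view
        x_r = (xs - np.round(disp_left[y]).astype(int)).clip(0, w - 1)
        cost = np.abs(disp_left[y] - disp_right[y, x_r])
        reliable[y] = (cost < threshold).astype(np.uint8)
    return reliable
```

A pixel near an occlusion typically has no consistent counterpart in the other view, so its cost value exceeds the threshold and it receives a reliable value of 0.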
Specifically, the inconsistence detection unit 321 predefines a threshold to classify the cost values: a pixel whose cost value is less than the threshold is regarded as belonging to a reliable region, while a pixel whose cost value is greater than the threshold is regarded as belonging to an unreliable region. The reliable value of a pixel in the reliable region is set to 1, and the reliable value of a pixel in the unreliable region is set to 0, as in the black regions of the reliable map 45.

[0012] The interpolation unit 323 interpolates the initial depth values according to the reliable map 45 and the neighboring pixels. Specifically, the interpolation unit 323 comprises a trilateral filter. The spatial and luminance similarity B(.) between pixels may be computed by formula (1), from which the interpolation unit 323 computes the revised depth value; in addition, the interpolation unit 323 takes the reliable map 45 into account to exclude the depth information of the unreliable region, as in formula (2). Here R(x, y) denotes the revised depth value, S denotes the neighboring pixels of a pixel in the unreliable region, D(x, y) denotes the initial depth value, I(x, y) denotes the luminance value, and M(x, y) denotes the reliable value:

B(x, y, x_i, y_i) = exp( -((x_i - x)^2 + (y_i - y)^2) / (2*sigma_s^2) - (I(x_i, y_i) - I(x, y))^2 / (2*sigma_r^2) )    (1)

R(x, y) = [ sum over (x_i, y_i) in S of M(x_i, y_i) * B(x, y, x_i, y_i) * D(x_i, y_i) ] / [ sum over (x_i, y_i) in S of M(x_i, y_i) * B(x, y, x_i, y_i) ]    (2)

The interpolation unit 323 thus generates a revised depth map 47 by revising the initial depth values D(x, y). Experimental results show that the depth quality of the revised depth map 47 is substantially improved, particularly in occlusion regions and at object boundaries.
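The trilateral interpolation of paragraph [0012] amounts to a weighted average over trusted neighbours, weighted by spatial distance and luminance similarity. The sketch below is one possible reading; the window radius and the two sigma values are chosen arbitrarily for illustration:

```python
import numpy as np

def revise_depth(depth, luma, reliable, radius=3, sigma_s=2.0, sigma_r=10.0):
    """Replace the depth of each unreliable pixel by a weighted average of
    its trusted neighbours, weighting by spatial distance and luminance
    similarity (in the spirit of formulas (1) and (2))."""
    revised = depth.astype(np.float64).copy()
    h, w = depth.shape
    for y, x in zip(*np.where(reliable == 0)):
        y0, y1 = max(0, y - radius), min(h, y + radius + 1)
        x0, x1 = max(0, x - radius), min(w, x + radius + 1)
        yy, xx = np.mgrid[y0:y1, x0:x1]
        m = reliable[y0:y1, x0:x1].astype(bool)   # only trusted neighbours contribute
        if not m.any():
            continue
        b = np.exp(-((xx - x) ** 2 + (yy - y) ** 2) / (2 * sigma_s ** 2)
                   - (luma[y0:y1, x0:x1] - luma[y, x]) ** 2 / (2 * sigma_r ** 2))
        wgt = b[m]
        revised[y, x] = (wgt * depth[y0:y1, x0:x1][m]).sum() / wgt.sum()
    return revised
```

Because the reliable map masks the weights, depth values from the unreliable region never contribute to the average, which is what lets the filter repair occlusion regions and object boundaries.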
[0013] The DIBR unit 33 then generates, from the revised depth map 47 and the original left image 30A and right image 30B, at least one revised left (L') image 34A and at least one revised right (R') image 34B, which are displayed for the viewer. The DIBR unit 33 may be implemented with conventional techniques, for example as disclosed in "A 3D-TV Approach Using Depth-Image-Based Rendering (DIBR)".

[0014] The fifth figure shows a flow chart of the depth revision method for a 3D image pair according to an embodiment of the present invention. First, in step S501, the initial depth map 43 is received from the depth generator 31. Next, in step S503, the inconsistence detection unit 321 detects the inconsistency among the pixels of the image 41 (the left image 30A or the right image 30B of the 3D image pair), and determines whether the cost value of each pixel is greater than the predefined threshold (step S505).

[0015] A pixel whose cost value is greater than the threshold is regarded as belonging to the unreliable region, and its reliable value is set to 0 (step S507); a pixel whose cost value is less than the threshold is regarded as belonging to the reliable region, and its reliable value is set to 1 (step S509). Finally, the interpolation unit 323 interpolates the initial depth values according to the reliable map 45 and the neighboring pixels (step S511), and the DIBR unit 33 generates and displays the revised left (L') image 34A and the revised right (R') image 34B according to the revised depth map 47 (step S513).

[0016] According to the above embodiments, the invention proposes a depth interpolation algorithm for depth post-processing, so as to strengthen the depth information of occlusion regions and revise unreliable depth information. The above description is only a preferred embodiment of the present invention.
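The rendering step (S513) performed by the DIBR unit can be illustrated by a deliberately naive horizontal warp. Real DIBR implementations, such as the one cited above, must also resolve occlusion ordering and fill disocclusion holes, both omitted here, and the depth-to-disparity mapping below is an assumption:

```python
import numpy as np

def dibr_shift(image, depth, max_disp=8):
    """Warp a single view horizontally by a disparity proportional to its
    depth value; nearer (larger-depth-value) pixels shift further."""
    h, w = image.shape[:2]
    disp = (depth.astype(np.float64) / max(depth.max(), 1) * max_disp).round().astype(int)
    out = np.zeros_like(image)
    filled = np.zeros((h, w), dtype=bool)
    for y in range(h):
        for x in range(w):
            xt = x + disp[y, x]
            if 0 <= xt < w:
                out[y, xt] = image[y, x]   # no occlusion ordering: later writes win
                filled[y, xt] = True
    return out, filled
```

Pixels with larger depth values (nearer the viewer) are shifted further, producing the parallax between the synthesized left and right views; positions left unfilled correspond to disocclusion holes that a full renderer would inpaint.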
It is not intended to limit the scope of the claims of the invention; all equivalent changes or modifications that do not depart from the spirit disclosed by the invention shall be included in the scope of the claims below.

[Brief Description of the Drawings]
[0017] The first figure shows a block diagram of a conventional 3D imaging system.
The second figure shows a block diagram of another conventional 3D imaging system.
The third figure shows a block diagram of a depth revision system for a 3D image pair according to an embodiment of the present invention.
The fourth figure illustrates an image and its associated initial depth map, reliable map, and revised depth map according to an embodiment of the present invention.
The fifth figure shows a flow chart of a depth revision method for a 3D image pair according to an embodiment of the present invention.

[Description of Main Element Symbols]
[0018]
Prior art:
1 3D imaging system
11 depth generator
12 depth image imager (DIBR)
14A left image
14B right image
2 3D imaging system
20A left image
20B right image
21 depth generator
22 depth image imager (DIBR)
24A left image
24B right image
Present invention:
3 depth revision system for a 3D image pair
30A left image
30B right image
31 depth generator
32 depth revisor
321 inconsistence detection unit
323 interpolation unit
33 depth-image-based rendering (DIBR) unit
34A revised left image
34B revised right image
41 image
411 occlusion region
43 initial depth map
45 reliable map
47 revised depth map
S501-S513 steps

Claims (17)

VII. Scope of the Patent Application:
1. A depth revision system for a three-dimensional (3D) image pair, comprising:
a depth generator for generating, based on a stereo matching technique, at least one initial depth map associated with one image of the 3D image pair, wherein the image comprises a plurality of pixels and the initial depth map records an initial depth value of each of the pixels; and
a depth revisor, comprising:
an inconsistence detection unit for detecting, based on two-directional occlusion detection techniques, the inconsistency among the pixels of the image, and estimating a reliable map according to the detected inconsistency; and
an interpolation unit for interpolating the initial depth values according to the reliable map and the neighboring pixels, so as to generate a revised depth map by revising the initial depth values.
2. The depth revision system of claim 1, wherein the inconsistence detection unit computes for each of the pixels a cost value representing its inconsistency with the neighboring pixels.
3. The depth revision system of claim 2, wherein the reliable map comprises a reliable region and an unreliable region, the reliable region comprising the pixels whose cost values are less than a predetermined threshold, and the unreliable region comprising the pixels whose cost values are greater than the predetermined threshold.
4. The depth revision system of claim 3, wherein the interpolation unit interpolates the initial depth values of the pixels in the unreliable region according to the spatial and luminance similarity between the pixels.
5. The depth revision system of claim 3, wherein the reliable map records a reliable value of each of the pixels, and the inconsistence detection unit sets the reliable value of the pixels in the reliable region to 1 and the reliable value of the pixels in the unreliable region to 0.
6. The depth revision system of claim 1, wherein the depth generator generates a left depth map according to a left image and a right depth map according to a right image, the image being the left image or the right image.
7. The depth revision system of claim 1, further comprising a depth-image-based rendering (DIBR) unit that receives the revised depth map to generate at least one revised left image and at least one revised right image.
8. The depth revision system of claim 1, wherein the inconsistence detection unit detects the inconsistency using a Left-Right-checking or Right-Left-checking technique.
9. The depth revision system of claim 1, wherein the interpolation unit comprises a trilateral filter.
10. A depth revision method for a three-dimensional (3D) image pair, comprising:
receiving at least one initial depth map associated with one image of the 3D image pair, wherein the image comprises a plurality of pixels and the initial depth map records an initial depth value of each of the pixels;
detecting the inconsistency among the pixels of the image to estimate a reliable map; and
interpolating the initial depth values according to the reliable map and the neighboring pixels, so as to generate a revised depth map by revising the initial depth values.
11. The depth revision method of claim 10, wherein estimating the reliable map comprises:
computing for each of the pixels a cost value representing its inconsistency with the neighboring pixels; and
estimating the reliable map according to the detected inconsistency.
12. The depth revision method of claim 11, wherein the reliable map records a reliable value of each of the pixels, and estimating the reliable map further comprises:
providing a predetermined threshold;
setting the reliable value of the pixels whose cost values are less than the predetermined threshold to 1; and
setting the reliable value of the pixels whose cost values are greater than the predetermined threshold to 0.
13. The depth revision method of claim 12, wherein interpolating the initial depth values comprises:
computing the initial depth values according to the spatial and luminance similarity between the pixels.
14. The depth revision method of claim 10, wherein the inconsistency is detected using a Left-Right-checking or Right-Left-checking technique.
15. The depth revision method of claim 10, wherein the initial depth values are interpolated using a trilateral filter.
16. The depth revision method of claim 10, wherein the at least one initial depth map comprises a left depth map corresponding to a left image and a right depth map corresponding to a right image, the image being the left image or the right image.
17. The depth revision method of claim 10, further comprising:
receiving the revised depth map to generate at least one revised left image and at least one revised right image.
TW100120089A 2011-06-09 2011-06-09 System and method of revising depth of a 3d image pair TWI514325B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
TW100120089A TWI514325B (en) 2011-06-09 2011-06-09 System and method of revising depth of a 3d image pair

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
TW100120089A TWI514325B (en) 2011-06-09 2011-06-09 System and method of revising depth of a 3d image pair

Publications (2)

Publication Number Publication Date
TW201250628A true TW201250628A (en) 2012-12-16
TWI514325B TWI514325B (en) 2015-12-21

Family

ID=48139320

Family Applications (1)

Application Number Title Priority Date Filing Date
TW100120089A TWI514325B (en) 2011-06-09 2011-06-09 System and method of revising depth of a 3d image pair

Country Status (1)

Country Link
TW (1) TWI514325B (en)


Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TW201033936A (en) * 2009-03-11 2010-09-16 Univ Nat Cheng Kung Method of synthesizing stereoscopic video

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TWI503618B (en) * 2012-12-27 2015-10-11 Ind Tech Res Inst Device for acquiring depth image, calibrating method and measuring method therefore
US9319660B2 (en) 2012-12-27 2016-04-19 Industrial Technology Research Institute Device for acquiring depth image, calibrating method and measuring method therefor

Also Published As

Publication number Publication date
TWI514325B (en) 2015-12-21

Similar Documents

Publication Publication Date Title
US8629901B2 (en) System and method of revising depth of a 3D image pair
JP6016061B2 (en) Image generation apparatus, image display apparatus, image generation method, and image generation program
US8854425B2 (en) Method and apparatus for depth-related information propagation
JP5287702B2 (en) Image processing apparatus and method, and program
TWI496452B (en) Stereoscopic image system, stereoscopic image generating method, stereoscopic image adjusting apparatus and method thereof
JP5879713B2 (en) Image processing apparatus, image processing method, and program
Cheng et al. Spatio-temporally consistent novel view synthesis algorithm from video-plus-depth sequences for autostereoscopic displays
JP5755571B2 (en) Virtual viewpoint image generation device, virtual viewpoint image generation method, control program, recording medium, and stereoscopic display device
US20110080463A1 (en) Image processing apparatus, method, and recording medium
Reel et al. Joint texture-depth pixel inpainting of disocclusion holes in virtual view synthesis
TW201301202A (en) Image processing method and image processing apparatus thereof
KR101918030B1 (en) Method and apparatus for rendering hybrid multi-view
JP2015012429A (en) Image processing apparatus, image processing method, and image processing program
JP5692051B2 (en) Depth estimation data generation apparatus, generation method and generation program, and pseudo stereoscopic image generation apparatus, generation method and generation program
KR20140001358A (en) Method and apparatus of processing image based on occlusion area filtering
Liu et al. Hole-filling based on disparity map and inpainting for depth-image-based rendering
TWI478100B (en) Method of image depth estimation and apparatus thereof
TW201250628A (en) System and method of revising depth of a 3D image pair
TW201230770A (en) Apparatus and method for stereoscopic effect adjustment on video display
KR101329069B1 (en) Depth estimation data generating device, computer readable recording medium having depth estimation data generating program recorded thereon, and pseudo-stereo image display device
TWI479455B (en) Method for generating all-in-focus image
JP5871113B2 (en) Stereo image generation apparatus, stereo image generation method, and stereo image generation program
JP5459231B2 (en) Pseudo stereoscopic image generation apparatus, pseudo stereoscopic image generation program, and pseudo stereoscopic image display apparatus
Yang et al. Depth image-based rendering with edge-oriented hole filling for multiview synthesis
Jovanov et al. Depth video enhancement for 3D displays