TW201017092A - Three-dimensional model reconstruction method and system thereof - Google Patents

Three-dimensional model reconstruction method and system thereof

Info

Publication number
TW201017092A
TW201017092A TW97140331A
Authority
TW
Taiwan
Prior art keywords
optical pattern
dimensional model
optical
resolution
color
Prior art date
Application number
TW97140331A
Other languages
Chinese (zh)
Other versions
TWI388797B (en)
Inventor
Jia-Yan Chen
Ji-Fa Chen
Yu-Yao Zhang
Original Assignee
Univ Ishou
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Univ Ishou filed Critical Univ Ishou
Priority to TW97140331A priority Critical patent/TWI388797B/en
Publication of TW201017092A publication Critical patent/TW201017092A/en
Application granted granted Critical
Publication of TWI388797B publication Critical patent/TWI388797B/en

Landscapes

  • Length Measuring Devices By Optical Means (AREA)

Abstract

A three-dimensional model reconstruction method includes the following steps: (a) extracting profile information of a DUT (device under test); (b) generating, based on the profile information, an optical pattern to be projected; (c) carrying out the projection according to the optical pattern; (d) extracting an optical imaging map corresponding to the optical pattern; (e) finding the surface information of the DUT from the optical imaging map using a triangulation method; (f) determining, based on the surface information of the DUT, whether the resolution of the optical pattern needs to be adjusted; and (g) if the resolution needs to be adjusted, adjusting the resolution of the optical pattern to be projected and returning to step (c).

Description

IX. Description of the Invention

[Technical Field]

The present invention relates to a three-dimensional model reconstruction technique, and more particularly to a three-dimensional model reconstruction method, and a system thereof, that adjust the color and resolution of the light source according to the target region.

[Prior Art]

In recent years, advances in technology have made the field of photoelectric inspection ever more important. Among machine-vision approaches to acquiring three-dimensional information automatically, the structured-light method is one of the more accurate ways of capturing three-dimensional information.
The structured-light method typically places masks of different forms in front of a light source to project different optical patterns onto the surface of an object, and many related patents exist. For example, U.S. Patent No. 5,675,407 discloses splitting light with a prism and using the differently refracted wavelength spectra to project colored light onto an object, but the projection range and color are fixed. U.S. Patent No. 6,853,458 discloses fixed color-stripe light, increasing the number of colors in the projected structured optical pattern and moving the object or the projector to obtain three-dimensional information of the object; however, the surface color of the object may cause errors in decoding the identification code.

The above conventional structured-light techniques for reconstructing three-dimensional models already achieve considerable accuracy. Nevertheless, for objects with multiple colors there remain recognition ambiguities, and the resolution cannot be adapted to the fine details of an object, so part of the reconstructed data is incomplete. Moreover, the prior art must capture multiple sets of images to obtain sufficient surface information; the amount of data to be processed is large, and for real-time applications the processing speed is somewhat slow. A solution is therefore needed to remedy these drawbacks.

[Summary of the Invention]

It is therefore an object of the present invention to provide a three-dimensional model reconstruction method.
Accordingly, the three-dimensional model reconstruction method of the present invention comprises the following steps: (a) capturing contour information of an object under test; (b) generating an optical pattern to be projected according to the contour information; (c) projecting according to the optical pattern; (d) capturing an optical pattern image corresponding to the optical pattern; (e) computing a surface-information set of the object under test from the optical pattern image using triangulation; (f) determining, according to the surface-information set, whether the resolution of the optical pattern needs to be adjusted; and (g) if the resolution needs to be adjusted, adjusting the resolution of the optical pattern to be projected and then returning to step (c).

Another object of the present invention is to provide a three-dimensional model reconstruction system. Accordingly, the three-dimensional model reconstruction system of the present invention comprises an optical-pattern generating unit, a light projection unit, a surface-information computing unit, a resolution determining unit, and a model reconstruction unit. The optical-pattern generating unit generates an optical pattern to be projected according to contour information of an object under test. The light projection unit projects according to the optical pattern. The surface-information computing unit computes a surface-information set of the object under test from an optical pattern image corresponding to the optical pattern using triangulation, and determines from the surface-information set whether the resolution of the optical pattern needs to be adjusted.
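The steps above form a closed feedback loop: project, measure, and refine the pattern resolution until the surface data suffice. A minimal control-flow sketch, with hypothetical helper names (the patent does not prescribe an implementation):

```python
def reconstruct_surface(capture_contour, generate_pattern, project,
                        capture_image, triangulate, needs_refinement,
                        max_iters=8):
    """Feedback loop of steps (a)-(g): the pattern resolution is
    doubled whenever the surface data indicate refinement is needed."""
    contour = capture_contour()                        # (a) contour of the object
    resolution = 1                                     # initial pattern resolution
    pattern = generate_pattern(contour, resolution)    # (b) pattern to project
    surface = None
    for _ in range(max_iters):
        project(pattern)                               # (c) project the pattern
        image = capture_image()                        # (d) capture pattern image
        surface = triangulate(image)                   # (e) triangulation
        if not needs_refinement(surface):              # (f) depth-based test
            return surface                             # store for reconstruction
        resolution *= 2                                # (g) finer pattern, retry
        pattern = generate_pattern(contour, resolution)
    return surface
```

The loop is bounded by `max_iters` (an assumption; the patent simply returns to step (c) until no adjustment is needed).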
The resolution determining unit adjusts the resolution of the optical pattern generated by the optical-pattern generating unit. The model reconstruction unit performs three-dimensional model reconstruction according to the surface-information set of the object under test.

[Embodiments]

The foregoing and other technical contents, features, and effects of the present invention will become clear in the following detailed description of a preferred embodiment with reference to the drawings.

Referring to FIG. 1, FIG. 2, and FIG. 3, a preferred embodiment of the three-dimensional model reconstruction system of the present invention comprises a light sensing unit 1, a light projection unit 2, a rotation-angle control unit 31, an image buffer unit 32, a color determining unit 33, an optical-pattern generating unit 34, a surface-information computing unit 35, a resolution determining unit 36, and a model reconstruction unit 37; the model reconstruction unit 37 includes a model reconstruction database 371. The system performs photoelectric inspection of an object under test 5 placed on a turntable 41 of a rotation unit 4 and reconstructs a three-dimensional model of the object under test 5.

The rotation-angle control unit 31 generates a motor control signal to a stepping motor (not shown) of the rotation unit 4 according to a preset angle, so that the stepping motor rotates the turntable 41 by the preset angle. The preset angle may be varied with the stepping motor according to the required resolution of the system, for example 20 degrees per step or 30 degrees per step, on the principle that it divides 360 degrees evenly.
With the turntable 41 at a given rotation angle, the image buffer unit 32 buffers color information and contour information of the object under test 5 captured by the light sensing unit 1. The optical-pattern generating unit 34 generates an optical pattern to be projected according to the contour information; the color determining unit 33 determines a color of the optical pattern according to the color information; the light projection unit 2 then projects according to the optical pattern; the light sensing unit 1 next captures an optical pattern image corresponding to the optical pattern, which the image buffer unit 32 also buffers. The surface-information computing unit 35 computes a surface-information set of the object under test from the optical pattern image using triangulation and stores it in the model reconstruction database 371; it further determines from the surface-information set whether the resolution of the optical pattern needs to be adjusted. If adjustment is needed, the resolution determining unit 36 adjusts the resolution of the optical pattern. The model reconstruction unit 37 performs three-dimensional model reconstruction according to the surface-information sets stored in the model reconstruction database 371.

In the preferred embodiment, the light sensing unit 1 is a digital camera and the light projection unit 2 is a single-gun color projector; the rotation-angle control unit 31, image buffer unit 32, color determining unit 33, optical-pattern generating unit 34, surface-information computing unit 35, resolution determining unit 36, and model reconstruction unit 37 are integrated as software executable on a computer.
Corresponding to the above system, a preferred embodiment of the three-dimensional model reconstruction method of the present invention comprises the following steps. It is worth mentioning that, before the method is performed, the lens focal length (denoted f) of the digital camera (i.e., the light sensing unit 1) must be calibrated, and the projection angle (denoted α) of the single-gun color projector (i.e., the light projection unit 2) must be calibrated, to reduce measurement error. In the preferred embodiment, the angle between the object under test 5, the light sensing unit 1, and the light projection unit 2 is 90 degrees, and the following steps describe the processing performed with the turntable 41 at a given rotation angle.

In step 61, the light sensing unit 1 captures color information and contour information of the object under test 5, and the image buffer unit 32 buffers the color information and contour information.

In step 62, the optical-pattern generating unit 34 generates an optical pattern to be projected according to the contour information of the object under test 5. In the preferred embodiment, the optical-pattern generating unit 34 determines the top-to-bottom and left-to-right extent of the object under test 5 from the contour information, preliminarily determines the resolution, and then generates the optical pattern to be projected onto the object under test 5 accordingly.

In step 63, the color determining unit 33 determines the color of the optical pattern generated by the optical-pattern generating unit 34 according to the color information of the object under test 5.
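A minimal sketch of the color decision in step 63, assuming the object's color information is summarized as a single average RGB triple and the pattern color is taken as its RGB complement (the preferred embodiment uses the complementary color; the function name and representation are illustrative):

```python
def pattern_color(object_rgb):
    """Choose a projection color that differs from the object's color.
    Here: the 8-bit RGB complement, as in the preferred embodiment,
    so the projected points remain easy to segment in the image."""
    r, g, b = object_rgb
    return (255 - r, 255 - g, 255 - b)
```

For example, a reddish object would receive a cyan-leaning pattern, avoiding the recognition problems caused by projecting a color identical or too close to the surface color.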
In the preferred embodiment, the color is the complementary color of the color information. In practice, however, the color is chosen mainly to differ from the color information of the object under test 5, so as to avoid projecting a color that is identical or too similar and therefore hard to recognize; the invention is not limited to what this embodiment discloses.

In step 64, the light projection unit 2 projects according to the optical pattern. In the preferred embodiment, the optical pattern used is a dot-pattern optical pattern; that is, as shown in FIG. 3, the light projection unit 2 projects the optical pattern, at its resolution and in its color, onto the surface of the object under test 5 as a plurality of scanning projection points 71.

In steps 65-66, the light sensing unit 1 captures the optical pattern image 8 corresponding to the optical pattern, and the surface-information computing unit 35 then computes the surface-information set of the object under test from the optical pattern image 8 using triangulation.

Referring to FIG. 1, FIG. 3, and FIG. 4, assume the center of the imaging plane of the optical pattern image 8 is the origin of the three-dimensional coordinate system (i.e., the intersection point O of the X, Y, and Z axes in FIG. 4). The light projection unit 2 lies on the X axis, the lens focal length f of the light sensing unit 1 lies along the Z axis, and the coordinate information of each imaging point of the optical pattern image 8 is expressed as (x, y).
The coordinate information of the scanning projection point 71 of the light projection unit 2 on the object under test 5 is represented by P(X0, Y0, Z0). The surface-information set of the object under test comprises the coordinate information of the scanning projection points 71, and each coordinate entry has a depth coordinate (i.e., Z0). The angle between the scanning projection point 71, the light projection unit 2, and the light sensing unit 1 is α, and the distance between the light projection unit 2 and the light sensing unit 1 is k. The relationship between the coordinate information (x, y) of an imaging point and the coordinate information P(X0, Y0, Z0) of the scanning projection point 71 is given by equation (1). Applying the tangent function through equations (2)-(4) yields X0, and Y0 and Z0 follow similarly; the coordinate information (X0, Y0, Z0) of the scanning projection point 71 of the object under test 5 is thus collected in equations (5)-(7).

x / X0 = y / Y0 = f / Z0 .......... (1)

tan(α) = Z0 / (k − X0) .......... (2)

Substituting Z0 = f·X0 / x from (1) into (2):

X0 · [f / x + tan(α)] = tan(α) · k .......... (3)

X0 = tan(α) · k · x / (f + x·tan(α)) .......... (4)

so that the coordinate information (X0, Y0, Z0) of the scanning projection point 71 of the object under test 5 is collected as:

X0 = tan(α) · k · x / (f + x·tan(α)) .......... (5)
Y0 = tan(α) · k · y / (f + x·tan(α)) .......... (6)
Z0 = tan(α) · k · f / (f + x·tan(α)) .......... (7)

From equations (5)-(7), computing the coordinate information (X0, Y0, Z0) of a scanning projection point 71 of the object under test 5 requires the parameters f, k, x, y, and α, where f is the lens focal length of the light sensing unit 1 (i.e., the camera focal length; the light sensing unit 1 has a fixed factory focal-length range that varies with magnification), k is the known distance between the light projection unit 2 and the light sensing unit 1, x and y are the width and height of the imaging point relative to the image center of the optical pattern image 8, and α is the angle between the scanning projection point 71, the light projection unit 2, and the light sensing unit 1. The angle α is somewhat different for each projection point; it is not the same for all projection points of a projected pattern, and must be determined in advance for each projection point according to the computer-designed pattern.

In step 67, the surface-information computing unit 35 determines from the surface-information set whether the resolution of the optical pattern needs to be adjusted; if so, the flow proceeds to step 68, otherwise to step 69. In the preferred embodiment, if the surface-information computing unit 35 finds from the depth coordinates that the scanning projection points 71 of the object under test 5 differ in depth (as in FIG. 3, where there are three depth-distinct scanning projection points 711), it determines that the resolution of the optical pattern needs to be adjusted.

Referring to FIG. 1, FIG. 3, and FIG. 5, in step 68, the resolution determining unit 36 further increases the resolution of the optical pattern and returns to step 64.

In step 69, the surface-information computing unit 35 stores the surface data set of the object under test in the model reconstruction database 371 for the model reconstruction unit 37 to perform subsequent three-dimensional model reconstruction.
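Equations (5)-(7) and the depth test of step 67 can be sketched as follows. The formulas are transcribed directly; the depth-spread tolerance is an assumption, since the patent only says that points of differing depth trigger refinement:

```python
import math

def project_point(x, y, f, k, alpha):
    """Equations (5)-(7): recover (X0, Y0, Z0) of a scanning projection
    point from image coordinates (x, y), camera focal length f,
    projector-camera distance k, and the per-point projection angle
    alpha (radians), which must be known in advance for each point."""
    t = math.tan(alpha)
    denom = f + x * t
    X0 = t * k * x / denom
    Y0 = t * k * y / denom
    Z0 = t * k * f / denom
    return X0, Y0, Z0

def needs_resolution_adjust(points, tol=1e-6):
    """Step 67: refine the pattern if the depth coordinates Z0 of the
    scanning projection points differ (tolerance is an assumption)."""
    depths = [p[2] for p in points]
    return max(depths) - min(depths) > tol
```

As a sanity check, a point imaged at the image center (x = 0) gives X0 = 0 and Z0 = k·tan(α), which is exactly the geometry of equation (2) with the point on the Z axis.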
It is worth mentioning that the focus of the present invention lies in adjusting the color and resolution of the optical pattern projected by the light projection unit 2 according to the color information and the surface data set of the object under test 5; the detailed implementation of the three-dimensional model reconstruction performed by the model reconstruction unit 37 is well known to those skilled in the art and is not described here.

To summarize, the present invention adaptively adjusts, in a feedback manner, the color and resolution of the optical pattern to be projected, greatly increasing the completeness and robustness of surface recognition of the object under test 5 and making the obtained surface data set more accurate, without having to capture multiple sets of images, as the prior art must, to obtain sufficient surface information. This favors real-time applications, and the objects of the present invention are indeed achieved.

The foregoing is merely a preferred embodiment of the present invention and shall not limit the scope of practice of the invention; all simple equivalent changes and modifications made according to the claims and the description of the invention remain within the scope covered by this patent.

[Brief Description of the Drawings]

FIG. 1 is a block diagram illustrating a preferred embodiment of the three-dimensional model reconstruction system of the present invention and its setup;
FIG. 2 is a flow chart illustrating a preferred embodiment of the three-dimensional model reconstruction method of the present invention;
FIG. 3 is a schematic diagram illustrating the scanning projection points and the optical pattern image used in the present invention;
FIG. 4 is a schematic diagram illustrating the triangulation method used in the present invention; and
FIG. 5 is a schematic diagram illustrating the scanning projection points and the optical pattern image after resolution adjustment according to the three-dimensional model reconstruction method of the present invention.

[Description of Reference Numerals]

1 — light sensing unit
2 — light projection unit
31 — rotation-angle control unit
32 — image buffer unit
33 — color determining unit
34 — optical-pattern generating unit
35 — surface-information computing unit
36 — resolution determining unit
37 — model reconstruction unit
371 — model reconstruction database
4 — rotation unit
41 — turntable
5 — object under test
61-69 — steps
71 — scanning projection points
711 — depth-distinct scanning projection points
8 — optical pattern image

Claims (12)

X. Claims:

1. A three-dimensional model reconstruction method, comprising the following steps:
(a) capturing contour information of an object under test;
(b) generating an optical pattern to be projected according to the contour information;
(c) projecting according to the optical pattern;
(d) capturing an optical pattern image corresponding to the optical pattern;
(e) computing a surface-information set of the object under test from the optical pattern image using triangulation;
(f) determining, according to the surface-information set, whether a resolution of the optical pattern needs to be adjusted; and
(g) if the resolution needs to be adjusted, adjusting the resolution of the optical pattern to be projected and then returning to step (c).

2. The three-dimensional model reconstruction method of claim 1, further comprising, before step (c), the following steps:
(h) capturing color information of the object under test; and
(i) determining a color of the optical pattern to be projected according to the color information.

3. The three-dimensional model reconstruction method of claim 2, wherein the color is a complementary color of the color information.

4. The three-dimensional model reconstruction method of claim 1, wherein the surface-information set comprises a plurality of coordinate entries of the surface of the object under test, each coordinate entry including a depth coordinate.

5. The three-dimensional model reconstruction method of claim 4, wherein, in step (f), whether the resolution of the optical pattern needs to be adjusted is determined according to the depth coordinates.

6. The three-dimensional model reconstruction method of claim 1, further comprising a step of storing the surface-information set of the object under test for subsequent three-dimensional model reconstruction if the resolution does not need to be adjusted.

7. A three-dimensional model reconstruction system, comprising:
an optical-pattern generating unit for generating an optical pattern to be projected according to contour information of an object under test;
a light projection unit for projecting according to the optical pattern;
a surface-information computing unit for computing a surface-information set of the object under test from an optical pattern image corresponding to the optical pattern using triangulation, and for determining, according to the surface-information set, whether a resolution of the optical pattern needs to be adjusted;
a resolution determining unit for adjusting the resolution of the optical pattern generated by the optical-pattern generating unit; and
a model reconstruction unit for performing three-dimensional model reconstruction according to the surface-information set of the object under test.

8. The three-dimensional model reconstruction system of claim 7, further comprising a light sensing unit for capturing the contour information, color information, and the optical pattern image of the object under test.

9. The three-dimensional model reconstruction system of claim 8, further comprising a color determining unit for determining a color of the optical pattern generated by the optical-pattern generating unit according to the color information.

10. The three-dimensional model reconstruction system of claim 9, wherein the color is a complementary color of the color information.

11. The three-dimensional model reconstruction system of claim 7, wherein the surface-information set comprises a plurality of coordinate entries of the surface of the object under test, each coordinate entry including a depth coordinate.

12. The three-dimensional model reconstruction system of claim 11, wherein the resolution determining unit determines whether the resolution of the optical pattern needs to be adjusted according to the depth coordinates.
TW97140331A 2008-10-21 2008-10-21 Three-dimensional model reconstruction method and its system TWI388797B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
TW97140331A TWI388797B (en) 2008-10-21 2008-10-21 Three-dimensional model reconstruction method and its system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
TW97140331A TWI388797B (en) 2008-10-21 2008-10-21 Three-dimensional model reconstruction method and its system

Publications (2)

Publication Number Publication Date
TW201017092A true TW201017092A (en) 2010-05-01
TWI388797B TWI388797B (en) 2013-03-11

Family

ID=44830676

Family Applications (1)

Application Number Title Priority Date Filing Date
TW97140331A TWI388797B (en) 2008-10-21 2008-10-21 Three-dimensional model reconstruction method and its system

Country Status (1)

Country Link
TW (1) TWI388797B (en)


Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TWI806294B (en) * 2021-12-17 2023-06-21 財團法人工業技術研究院 3d measuring equipment and 3d measuring method

Cited By (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10671048B2 (en) 2011-11-18 2020-06-02 Nike, Inc. Automated manufacturing of shoe parts
US11317681B2 (en) 2011-11-18 2022-05-03 Nike, Inc. Automated identification of shoe parts
US10194716B2 (en) 2011-11-18 2019-02-05 Nike, Inc. Automated identification and assembly of shoe parts
US10393512B2 (en) 2011-11-18 2019-08-27 Nike, Inc. Automated 3-D modeling of shoe parts
US10552551B2 (en) 2011-11-18 2020-02-04 Nike, Inc. Generation of tool paths for shore assembly
US10667581B2 (en) 2011-11-18 2020-06-02 Nike, Inc. Automated identification and assembly of shoe parts
US9939803B2 (en) 2011-11-18 2018-04-10 Nike, Inc. Automated manufacturing of shoe parts
US11266207B2 (en) 2011-11-18 2022-03-08 Nike, Inc. Automated identification and assembly of shoe parts
US11879719B2 (en) 2011-11-18 2024-01-23 Nike, Inc. Automated 3-D modeling of shoe parts
US11341291B2 (en) 2011-11-18 2022-05-24 Nike, Inc. Generation of tool paths for shoe assembly
US11346654B2 (en) 2011-11-18 2022-05-31 Nike, Inc. Automated 3-D modeling of shoe parts
US11422526B2 (en) 2011-11-18 2022-08-23 Nike, Inc. Automated manufacturing of shoe parts
US11641911B2 (en) 2011-11-18 2023-05-09 Nike, Inc. Automated identification and assembly of shoe parts
US11763045B2 (en) 2011-11-18 2023-09-19 Nike, Inc. Generation of tool paths for shoe assembly
TWI453387B (en) * 2011-12-28 2014-09-21 Univ Nat Kaohsiung Applied Sci Inspection device and method for spring-back performance of plastic hose

Also Published As

Publication number Publication date
TWI388797B (en) 2013-03-11

Similar Documents

Publication Publication Date Title
EP3907702B1 (en) Three-dimensional sensor system and three-dimensional data acquisition method
US10064553B2 (en) Detection of a movable object when 3D scanning a rigid object
TWI291013B (en) Digital-structured micro-optic three-dimensional confocal surface profile measuring system and technique
US9672630B2 (en) Contour line measurement apparatus and robot system
JP3624353B2 (en) Three-dimensional shape measuring method and apparatus
CN104132613B (en) Noncontact optical volume measurement method for complex-surface and irregular objects
US6549288B1 (en) Structured-light, triangulation-based three-dimensional digitizer
CN103649674B (en) Measuring equipment and messaging device
TWI498580B (en) Length measuring method and length measuring apparatus
TW201017092A (en) Three-dimensional model reconstruction method and system thereof
US20150178908A1 (en) Method for capturing the three-dimensional surface geometry of objects
CN109377551B (en) Three-dimensional face reconstruction method and device and storage medium thereof
WO2012020696A1 (en) Device for processing point group position data, system for processing point group position data, method for processing point group position data and program for processing point group position data
CN105358092A (en) Video-based auto-capture for dental surface imaging apparatus
JP6580761B1 (en) Depth acquisition apparatus and method using polarization stereo camera
JP2009192332A (en) Three-dimensional processor and method for controlling display of three-dimensional data in the three-dimensional processor
JP4419570B2 (en) 3D image photographing apparatus and method
JP7353757B2 (en) Methods for measuring artifacts
JP2007093412A (en) Three-dimensional shape measuring device
JP2007508557A (en) Device for scanning three-dimensional objects
JP2014153149A (en) Shape measurement device, structure manufacturing system, shape measurement method and program
CN109521022A (en) Touch screen defect detecting device based on the confocal camera of line
JP2007322259A (en) Edge detecting method, apparatus and program
JP4077755B2 (en) POSITION DETECTION METHOD, DEVICE THEREOF, PROGRAM THEREOF, AND CALIBRATION INFORMATION GENERATION METHOD
CN209624417U (en) Touch screen defect detecting device based on the confocal camera of line

Legal Events

Date Code Title Description
MM4A Annulment or lapse of patent due to non-payment of fees