TWI692968B - 3D model establishing device and calibration method applying to the same - Google Patents


Info

Publication number: TWI692968B
Application number: TW107141879A
Authority: TW (Taiwan)
Prior art keywords: processing unit, rotation, modeling device, camera
Other languages: Chinese (zh)
Other versions: TW201946450A
Inventors: 蕭淳澤, 王銓祺, 陳加珍
Original assignee: 財團法人工業技術研究院 (Industrial Technology Research Institute)
Application filed by 財團法人工業技術研究院
Priority to CN201811532976.8A (CN110415329B)
Priority to US16/394,650 (US10891805B2)
Publication of TW201946450A
Application granted
Publication of TWI692968B

Landscapes

  • User Interface Of Digital Computer (AREA)
  • Processing Or Creating Images (AREA)

Abstract

A 3D model constructing device includes a camera and a wearable display. The camera is configured to obtain a plurality of first frames, a second frame and depth information. The wearable display is coupled to the camera and includes a display unit, a processing unit, a storage unit and a projection unit. The storage unit is configured to store a first module and a second module. When the first module is executed by the processing unit, it causes the processing unit to calculate a first pose of the wearable display. When the second module is executed by the processing unit, it causes the processing unit to calculate a 3D model according to the first frames, the depth information, the first pose and a plurality of calibration parameters, and to update the 3D model according to the second frame. The projection unit is configured to project the 3D model and the second frame onto the display unit according to the first pose, so that they are displayed together with a real image on the display unit.

Description

Three-dimensional modeling device and calibration method applied thereto

The invention relates to a three-dimensional modeling device and method.

Three-dimensional scene modeling is a popular research topic in computer vision. Typically, multiple frames and depth information are used to construct a complete three-dimensional scene model. The quality of the 3D scene model depends on whether the correct pose of each frame can be estimated. In practice, however, poses are prone to drift because of error accumulation or indistinct feature points. Most existing techniques use optimization algorithms to reduce these errors, but with limited success.

The object of the present invention is to provide a three-dimensional modeling device and a calibration method applied thereto, capable of building a three-dimensional model of an environmental scene.

One aspect of the present invention discloses a three-dimensional modeling device, including a camera and a wearable display. The camera is used to obtain a plurality of first frames, a second frame and depth information. The wearable display is coupled to the camera and includes a display unit, a processing unit, a storage unit and a projection unit. The storage unit is coupled to the processing unit and stores a first module and a second module. When the first module is executed by the processing unit, the processing unit is caused to calculate a first pose of the wearable display. When the second module is executed by the processing unit, the processing unit is caused to calculate a 3D model based on the first frames, the depth information, the first pose and a plurality of calibration parameters, and to update the 3D model based on the second frame. The projection unit is coupled to the processing unit and projects the 3D model and the second frame onto the display unit according to the first pose, so that they are displayed together with a real image on the display unit.

Another aspect of the present invention discloses a calibration method applied to a three-dimensional modeling device, including: moving the three-dimensional modeling device linearly along a plurality of directions, and obtaining a plurality of first movement speed vectors of a wearable display of the device corresponding to the respective directions and a plurality of second movement speed vectors of a camera of the device corresponding to the respective directions; calculating a rotation parameter according to the first and second movement speed vectors; moving the three-dimensional modeling device arbitrarily and recording a plurality of rotation records, where each record stores, for a time point at which the angular velocity of the device exceeds a first threshold, the rotation amount of the wearable display at that time point, the displacement of the wearable display at that time point relative to the previous time point, and the displacement of the camera at that time point relative to the previous time point, and computing the rotation-axis unit vector at that time point from the rotation amount; and calculating a displacement parameter according to the rotation records and the rotation-axis unit vectors.

For a better understanding of the above and other aspects of the present invention, embodiments are described in detail below with reference to the accompanying drawings.

10: 3D modeling device
102: Camera
104: Wearable display
1041: Processing unit
1043: Storage unit
1045: Projection unit
1047: Input unit
1049: Display unit
M1: First module
M2: Second module
S301~S307: Steps

FIG. 1 is a block diagram of a three-dimensional modeling device according to an embodiment of the invention.

FIG. 2 is a schematic diagram of a three-dimensional modeling device according to an embodiment of the invention.

FIG. 3 is a flowchart of a calibration parameter generation method according to an embodiment of the invention.

Referring to FIG. 1, a block diagram of a three-dimensional modeling device according to an embodiment of the invention, and also to FIG. 2, a schematic diagram of the same device: the three-dimensional modeling device 10 includes a camera 102 and a wearable display 104. The three-dimensional modeling device 10 is used to build a 3D model of an environmental scene. In one embodiment, the three-dimensional modeling device 10 is an augmented reality (AR) device. In other embodiments, the three-dimensional modeling device 10 may be a virtual reality (VR) device, a substitutional reality device, or a mixed reality (MR) device.

The camera 102 may be an RGB-D camera. Besides capturing color information of the environment, the camera 102 can also capture depth information of the environment with its built-in depth sensor. The camera 102 is used to obtain a plurality of first frames, a second frame and depth information, where the second frame is the most recently acquired frame and the first frames are the frames acquired before the second frame. As shown in FIG. 2, the camera 102 is detachably or fixedly mounted on the wearable display 104.

The wearable display 104 is coupled to the camera 102 so that signals and data can be transmitted between the two. The wearable display 104 includes a display unit 1049, a processing unit 1041, a storage unit 1043, a projection unit 1045 and an input unit 1047. The display unit 1049 is arranged as shown in FIG. 2 and may be a head-mounted display (HMD). The processing unit 1041 may include one or more dedicated integrated-circuit chips or general-purpose processors to perform the 3D modeling computations. The storage unit 1043 may be a non-volatile memory, such as a NAND flash memory, for storing applications and data. The storage unit 1043 includes a first module M1 and a second module M2. The first module M1 and the second module M2 are computer-readable programs comprising a plurality of computer-readable instructions. When the first module M1 is executed by the processing unit 1041, it causes the processing unit 1041 to calculate a first pose of the wearable display 104. The processing unit 1041 may calculate the first pose from one or more sensors (not shown) of the wearable display 104; the calculation can use techniques common in the art and is not detailed here. When the second module M2 is executed by the processing unit 1041, it causes the processing unit 1041 to calculate a 3D model based on the first frames, the depth information, the first pose and a plurality of calibration parameters, and to update the 3D model based on the second frame; details of the calibration parameters are described below. In short, the first module M1 computes the current pose of the wearable display 104, and the second module M2 builds the 3D model of the environmental scene.

The projection unit 1045 is coupled to the processing unit 1041. The projection unit 1045 projects the 3D model built by the processing unit 1041 and the second frame onto the display unit 1049. In mixed-reality applications, the user sees a real image, i.e. the actual environmental scene, through the display unit 1049. The 3D model and the second frame projected onto the display unit 1049 can thus be presented together with the real image on the display unit 1049.

Since the camera 102 and the wearable display 104 have their own coordinate systems, the wearable display 104 must first convert between (or calibrate) these coordinate systems when using the first frames, the second frame and the depth information from the camera 102; otherwise the 3D model would deviate significantly from the actual scene. The calibration parameters mentioned above are the basis for this coordinate-system conversion.

The input unit 1047 is coupled to the processing unit 1041. The input unit 1047 may be an input controller, connectable to the display unit 1049, which receives input commands from the user and forwards them to the processing unit 1041. The processing unit 1041 can adjust the second frame and/or the 3D model in response to the user's input commands. Details of this part are described below.

Referring to FIG. 3, a flowchart of a calibration parameter generation method according to an embodiment of the invention: the calibration parameters include a rotation parameter R and a displacement parameter T, where R and T may be in matrix form.

In step S301, the three-dimensional modeling device 10 is moved linearly along a plurality of directions, and a plurality of first movement speed vectors of the wearable display 104 corresponding to the respective directions and a plurality of second movement speed vectors of the camera 102 corresponding to the respective directions are obtained. For example, a three-axis platform (X, Y and Z axes, mutually perpendicular) can be prepared, and the modeling device 10 is fixed on a carrier unit of the platform. First, the carrier unit is operated to move the modeling device 10 along the X-axis direction, obtaining the first movement speed vector v1 of the wearable display 104 and the second movement speed vector c1 of the camera 102 for the X-axis direction, giving v1 = R c1, where R is the rotation parameter. Next, the carrier unit is operated to move the modeling device 10 along the Y-axis direction, obtaining the first movement speed vector v2 of the wearable display 104 and the second movement speed vector c2 of the camera 102 for the Y-axis direction, giving v2 = R c2. Finally, the carrier unit is operated to move the modeling device 10 along the Z-axis direction, obtaining the first movement speed vector v3 of the wearable display 104 and the second movement speed vector c3 of the camera 102 for the Z-axis direction, giving v3 = R c3.

The first movement speed vectors can be measured by sensors of the wearable display 104, and the second movement speed vectors by sensors of the camera 102. The obtained first and second movement speed vectors can be provided to the processing unit 1041 of the wearable display 104 or to an external computer (not shown) for the subsequent calculations. An "external computer" here means a computer that is not included in the three-dimensional modeling device 10 but is externally coupled to it, such as a backpack computer.

In step S303, the rotation parameter is calculated from the first and second movement speed vectors. In the above example, R = [v1 v2 v3][c1 c2 c3]^(-1).

It is worth mentioning that the number of movement directions is not limited to three; it can be extended to N directions (N > 3) to improve accuracy. With N movement directions, the relationship between the obtained first and second movement speed vectors can be organized as:

[v1 v2 ... vN] = R [c1 c2 ... cN]

The above equation can be solved by the least-squares method, giving R as:

R = [v1 v2 ... vN][c1 c2 ... cN]^T ([c1 c2 ... cN][c1 c2 ... cN]^T)^(-1)
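As a concrete sketch of the least-squares estimate above (function and variable names are illustrative, not from the patent):

```python
import numpy as np

def estimate_rotation(display_vels, camera_vels):
    """Least-squares estimate of R in V = R C.

    display_vels, camera_vels: lists of N (>= 3) 3-vectors measured while the
    device moves linearly along N directions.  Returns R = V C^T (C C^T)^-1.
    """
    V = np.column_stack(display_vels)  # 3 x N matrix [v1 ... vN]
    C = np.column_stack(camera_vels)   # 3 x N matrix [c1 ... cN]
    return V @ C.T @ np.linalg.inv(C @ C.T)

# With noise-free synthetic measurements, the true rotation is recovered.
theta = np.deg2rad(30.0)
R_true = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                   [np.sin(theta),  np.cos(theta), 0.0],
                   [0.0,            0.0,           1.0]])
cams = [np.array([1.0, 0.0, 0.0]), np.array([0.0, 1.0, 0.0]),
        np.array([0.0, 0.0, 1.0]), np.array([1.0, 1.0, 0.5])]
disps = [R_true @ c for c in cams]
R_est = estimate_rotation(disps, cams)
assert np.allclose(R_est, R_true)
```

Note that a plain least-squares solve does not force the estimate to be exactly orthogonal when measurements are noisy; projecting onto the nearest rotation matrix via an SVD (as in the Kabsch algorithm) is a common refinement, although the patent does not specify one.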

After the rotation parameter R has been computed, the displacement parameter T is computed next.

In step S305, the three-dimensional modeling device 10 is moved arbitrarily and a plurality of rotation records are stored. Each record stores, for a time point t at which the angular velocity exceeds a first threshold, the rotation amount R_t of the wearable display 104 at time t, the displacement T_vt of the wearable display 104 at time t relative to the previous time point, and the displacement T_ct of the camera 102 at time t relative to the previous time point; the rotation-axis unit vector ω_t at time t is computed from the rotation amount R_t.

In step S307, the displacement parameter T is calculated from the rotation records and the rotation-axis unit vectors. For example, suppose n rotation records have been collected, where n is an integer greater than or equal to 2. When n is odd, the relationship between the rotation records, the rotation-axis unit vectors and the displacement parameter can be expressed by the following formula:

[equation image 107141879-A0305-02-0009-4 in the source, not reproduced here: it stacks one linear constraint in T per record, built from R_t, ΔT_t and [ω_t]_x]

When n is even, the relationship between the rotation records, the rotation-axis unit vectors and the displacement parameter can be expressed by the following formula:

[equation image 107141879-A0305-02-0009-5 in the source, not reproduced here]

where I is the identity matrix, ΔT_t = T_ct - T_vt, and [ω_t]_x is the cross-product operator, defined as:

            [   0     -ω_zt    ω_yt ]
[ω_t]_x  =  [  ω_zt     0     -ω_xt ]
            [ -ω_yt    ω_xt     0   ]

where ω_xt, ω_yt and ω_zt are the components of the rotation-axis unit vector ω_t along the X, Y and Z axes, respectively.

Note that the rotation amount R_t here is different from the rotation parameter R computed in step S303.

Solving the above relations with the Moore-Penrose pseudo-inverse yields the displacement parameter T.
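Since the stacked equations appear only as images in the source, the sketch below shows the general pattern as an assumption: each rotation record contributes a linear constraint A_t T = b_t (for instance A_t = R_t - I with b_t derived from ΔT_t in a rigid-link model), and the stack is solved with the pseudo-inverse. Names are illustrative:

```python
import numpy as np

def skew(w):
    """Cross-product matrix [w]_x, so that skew(w) @ v == np.cross(w, v)."""
    wx, wy, wz = w
    return np.array([[0.0, -wz,  wy],
                     [ wz, 0.0, -wx],
                     [-wy,  wx, 0.0]])

def solve_displacement(constraints):
    """Pseudo-inverse solve of the stacked linear system A T = b.

    constraints: list of (A_t, b_t) pairs, one 3x3 matrix and one 3-vector per
    recorded rotation event; their exact form follows the patent's equations.
    """
    A = np.vstack([a for a, _ in constraints])       # (3n) x 3
    b = np.concatenate([bt for _, bt in constraints])  # (3n,)
    return np.linalg.pinv(A) @ b
```

With synthetic records built from a known T (e.g. A_t = R_t - I, b_t = (R_t - I) T for two rotations about different axes), the solver recovers T exactly, since the stacked matrix then has full column rank.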

In one embodiment, the processing unit 1041 can update the rotation parameter R and the displacement parameter T in real time by managing a first buffer and a second buffer, for example first-in-first-out (FIFO) buffers. The first buffer stores the information for computing the rotation parameter R, and the second buffer stores the information for computing the displacement parameter T. When the first buffer is full (e.g. n records have accumulated), the processing unit 1041 computes the rotation parameter R from the information stored in the first buffer, using the method described above. Whenever a new record is stored into the first buffer, the oldest record is discarded and the processing unit 1041 can compute the rotation parameter R again to update it. Similarly, the displacement parameter T can be updated in real time in the same manner.
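A minimal sketch of this sliding-window scheme, assuming the recompute step is passed in as a callback (class and parameter names are illustrative, not from the patent):

```python
from collections import deque

class OnlineCalibrator:
    """Keeps the n most recent records in a FIFO window and recomputes the
    calibration parameter whenever the window is full; once filled, each new
    record drops the oldest one and triggers a fresh fit."""

    def __init__(self, n, recompute):
        self.buf = deque(maxlen=n)   # oldest record dropped automatically
        self.recompute = recompute   # e.g. a least-squares fit over the window
        self.param = None

    def push(self, record):
        self.buf.append(record)
        if len(self.buf) == self.buf.maxlen:
            self.param = self.recompute(list(self.buf))
        return self.param
```

For example, pushing velocity-vector pairs into an `OnlineCalibrator(n, fit)` instance keeps R (or, with rotation records, T) continuously refreshed from the latest n measurements.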

After the calibration parameters have been computed, they can be recorded in the second module M2. When the second module M2 is executed by the processing unit 1041, the processing unit 1041 can use these calibration parameters to express the first frames, the second frame and the depth information provided by the camera 102 in the coordinate system of the wearable display 104, and then build/update the 3D model from the first pose and the converted first frames, second frame and depth information.
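The conversion step can be sketched as a rigid transform applied per point. The direction of the mapping (camera frame to display frame as p' = R p + T) is an assumption about the convention, which the patent does not state explicitly:

```python
import numpy as np

def camera_to_display(points_cam, R, T):
    """Express camera-frame 3D points in the wearable display's frame.

    points_cam: (N, 3) array of points from the depth camera.
    R: 3x3 rotation parameter; T: 3-vector displacement parameter.
    Row-vector form of p' = R p + T.
    """
    return points_cam @ R.T + T
```

The same transform would be applied to every depth sample of the first and second frames before they are fused into the model.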

In one embodiment, the user compares the second frame projected on the display unit 1049 with the real image to determine whether the projected second frame matches the real image (or the 3D model). If the user finds that the second frame deviates from the real image, the user can issue a frame adjustment command through the input unit 1047. The frame adjustment command may include translations along the X, Y and Z axes and/or rotations about the X, Y and Z axes. With the second module M2 executed by the processing unit 1041, the processing unit 1041, upon receiving a frame adjustment command, judges that the calibration parameters may need adjustment and generates a plurality of first correction parameters from the frame adjustment command. The processing unit 1041 then updates the calibration parameters according to the first correction parameters.

In one embodiment, the user can tell whether the 3D model built by the processing unit 1041 matches the real image by comparing the 3D model projected on the display unit 1049 with the real image. If the user finds that the 3D model deviates from the real image, or that its position needs adjustment, the user can issue a model adjustment command through the input unit 1047. The model adjustment command may include translations along the X, Y and Z axes and/or rotations about the X, Y and Z axes. With the second module M2 executed by the processing unit 1041, the processing unit 1041, upon receiving a model adjustment command, judges that the first pose may need adjustment and generates a plurality of second correction parameters from the model adjustment command. When the first module M1 is executed by the processing unit 1041, the processing unit 1041 updates the first pose according to the second correction parameters.

In one embodiment, the first correction parameters and the second correction parameters are generated from the translation and/or rotation amounts specified by the frame adjustment command and the model adjustment command, respectively.

In one embodiment, when the second module M2 is executed by the processing unit 1041, it causes the processing unit 1041 to calculate a similarity from the second frame and the 3D model, generate an adjustment prompt from the similarity, and cause the projection unit 1045 to project the adjustment prompt onto the display unit 1049. For example, the processing unit 1041 first computes an overlap region A = Overlap(F, pf, M, pm); a method such as voxel hulls can be used to find the overlap region. It then takes the mean of the squared differences of the corresponding points in the overlap region as the similarity, covering both the RGB information map and the depth information map, each part multiplied by a positive real weight (ω_r, ω_d) and combined into the final similarity S as follows:

S = (ω_r / N) Σ_{p∈A} Δrgb(p)² + (ω_d / N) Σ_{p∈A} Δdepth(p)²

where Δrgb is the RGB-value difference in the RGB information map, Δdepth is the depth difference in the depth information map, and N is the total number of points in the overlap region.
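A sketch of the similarity computation, assuming the overlap region is given as a boolean mask over aligned RGB and depth maps (the overlap-finding step itself, e.g. via voxel hulls, is not shown; names are illustrative):

```python
import numpy as np

def similarity(rgb_f, depth_f, rgb_m, depth_m, overlap, w_r, w_d):
    """Weighted mean of squared RGB and depth differences over the overlap.

    rgb_*: (H, W, 3) arrays (frame and rendered model); depth_*: (H, W)
    arrays; overlap: (H, W) bool mask of the N overlapping points;
    w_r, w_d: positive real weights.  Lower S means better agreement.
    """
    n = overlap.sum()
    d_rgb = ((rgb_f - rgb_m) ** 2).sum(axis=-1)[overlap].sum() / n
    d_depth = ((depth_f - depth_m) ** 2)[overlap].sum() / n
    return w_r * d_rgb + w_d * d_depth
```

Thresholding S (or watching it drop as the user adjusts the frame) is one plausible way the adjustment prompt could be driven; the patent leaves that policy unspecified.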

In another embodiment, the three-dimensional modeling device is a virtual reality device. In this embodiment, the three-dimensional modeling device further includes a first camera for capturing the environmental scene and displaying it on the display unit. The processing unit can cause the projection unit to project the built 3D model and the second frame onto the display unit, or display the 3D model and the second frame on the display unit computationally. The calibration method described above, the adjustment of the second frame/3D model and the similarity computation all apply to this embodiment.

With the three-dimensional modeling device provided by the invention, the camera and the wearable display can be integrated; using the calibration parameters produced by the calibration method, the wearable display converts the frames and depth information obtained from the camera into its own coordinate system and uses them to build the 3D model. In addition, by comparing the current frame provided by the camera with the real image, the user can adjust the position and/or orientation of the frame or the 3D model in real time; based on the user's adjustment, the wearable display can correct its own pose and/or the calibration parameters, so that the updated 3D model matches the actual environmental scene more closely, achieving interactive adjustment of the frames and the scene model. Moreover, the wearable display can compute the similarity between the current frame and the 3D model and generate an adjustment prompt from the similarity as a reference for the user's adjustment.

In summary, although the invention has been disclosed above by way of embodiments, they are not intended to limit the invention. Those with ordinary skill in the art to which the invention pertains can make various changes and refinements without departing from the spirit and scope of the invention. The scope of protection of the invention is therefore defined by the appended claims.


Claims (10)

A three-dimensional modeling device, comprising: a camera for obtaining a plurality of first frames, a second frame and depth information; and a wearable display coupled to the camera, the wearable display comprising: a display unit; a processing unit; a storage unit, coupled to the processing unit, for storing a first module and a second module, wherein when the first module is executed by the processing unit, the processing unit is caused to calculate a first pose of the wearable display, and when the second module is executed by the processing unit, the processing unit is caused to calculate a 3D model according to the first frames, the depth information, the first pose and a plurality of calibration parameters, and to update the 3D model according to the second frame; and a projection unit, coupled to the processing unit, for projecting the 3D model and the second frame onto the display unit according to the first pose so as to be displayed together with a real image on the display unit, wherein the calibration parameters are the basis for conversion between a coordinate system of the camera and a coordinate system of the wearable display.
The three-dimensional modeling device of claim 1, wherein the wearable display further comprises an input unit, coupled to the processing unit, for receiving a frame adjustment command from a user, and when the second module is executed by the processing unit, the processing unit is further caused to generate a plurality of first correction parameters according to the frame adjustment command and to change at least one of the calibration parameters according to the first correction parameters.

The three-dimensional modeling device of claim 1, further comprising an input unit, coupled to the processing unit, for receiving a model adjustment command from a user, wherein when the second module is executed by the processing unit, the processing unit is further caused to generate a plurality of second correction parameters according to the model adjustment command, and when the first module is executed by the processing unit, the processing unit is further caused to change the first pose according to the second correction parameters.

The three-dimensional modeling device of claim 1, wherein when the second module is executed by the processing unit, the processing unit is further caused to calculate a similarity according to the second frame and the 3D model, and to generate an adjustment prompt according to the similarity.
5. The three-dimensional modeling device according to claim 1, wherein the calibration parameters comprise a rotation parameter generated by: moving the three-dimensional modeling device linearly along a plurality of directions, and obtaining a plurality of first movement velocity vectors of the wearable display corresponding to each of the directions and a plurality of second movement velocity vectors of the camera corresponding to each of the directions; and calculating the rotation parameter according to the first movement velocity vectors and the second movement velocity vectors. 6. The three-dimensional modeling device according to claim 1, wherein the calibration parameters comprise a displacement parameter generated by: moving the three-dimensional modeling device arbitrarily and recording a plurality of pieces of rotation information, wherein each piece of rotation information records, for each time point at which the angular velocity of the three-dimensional modeling device is greater than a first threshold, a rotation amount of the wearable display at that time point, a displacement of the wearable display at that time point relative to the previous time point, and a displacement of the camera at that time point relative to the previous time point, and a rotation-axis unit vector for that time point is calculated according to the rotation amount at that time point; and calculating the displacement parameter according to the pieces of rotation information and the rotation-axis unit vectors.
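The rotation-parameter step of claim 5 pairs per-direction velocity vectors from the display's tracker and from the camera, which is an orthogonal Procrustes problem. The patent does not name a solver; one common choice is the Kabsch/SVD method, sketched here with NumPy (the normalization step and all names are assumptions, not the patent's formulation):

```python
import numpy as np

def rotation_from_velocities(v_display, v_camera):
    """Estimate R such that R @ v_camera[i] ~ v_display[i] (Kabsch/SVD).

    v_display, v_camera: (N, 3) arrays of velocity vectors measured while
    moving the device linearly along N directions, as seen by the wearable
    display's tracker and by the camera, respectively.
    """
    A = np.array(v_display, dtype=float)   # copies, so inputs stay untouched
    B = np.array(v_camera, dtype=float)
    # Normalize so each direction contributes equally regardless of speed.
    A /= np.linalg.norm(A, axis=1, keepdims=True)
    B /= np.linalg.norm(B, axis=1, keepdims=True)
    H = B.T @ A                            # cross-covariance of paired directions
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    D = np.diag([1.0, 1.0, d])             # reflection guard: keep det(R) = +1
    return Vt.T @ D @ U.T
```

At least two non-parallel movement directions are needed for the rotation to be fully determined; three roughly orthogonal directions, as the claim's "plurality of directions" permits, make the estimate well conditioned.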
7. A calibration method applied to a three-dimensional modeling device, comprising: moving the three-dimensional modeling device linearly along a plurality of directions, and obtaining a plurality of first movement velocity vectors of a wearable display of the three-dimensional modeling device corresponding to each of the directions and a plurality of second movement velocity vectors of a camera of the three-dimensional modeling device corresponding to each of the directions; calculating a rotation parameter according to the first movement velocity vectors and the second movement velocity vectors; moving the three-dimensional modeling device arbitrarily and recording a plurality of pieces of rotation information, wherein each piece of rotation information records, for each time point at which the angular velocity of the three-dimensional modeling device is greater than a first threshold, a rotation amount of the wearable display at that time point, a displacement of the wearable display at that time point relative to the previous time point, and a displacement of the camera at that time point relative to the previous time point, and a rotation-axis unit vector for that time point is calculated according to the rotation amount at that time point; and calculating a displacement parameter according to the pieces of rotation information and the rotation-axis unit vectors.
8. The calibration method according to claim 7, wherein a processing unit of the wearable display calculates a three-dimensional model according to the rotation parameter, the displacement parameter, a plurality of first frames provided by the camera, depth information, a first pose, and a plurality of calibration parameters, and updates the three-dimensional model according to a second frame provided by the camera. 9. The calibration method according to claim 8, wherein the processing unit generates a plurality of first correction parameters according to a frame adjustment command from a user, and changes at least one of the rotation parameter and the displacement parameter according to the first correction parameters. 10. The calibration method according to claim 8, wherein the processing unit generates a plurality of second correction parameters according to a model adjustment command from a user, and changes the first pose according to the second correction parameters.
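The displacement step of claims 6–7 has the shape of the translation part of a hand-eye (AX = XB) calibration: each over-threshold rotation event, with display rotation R_i and the two displacements d_disp_i and d_cam_i, constrains the unknown offset t through (R_i − I) t = R·d_cam_i − d_disp_i, where R is the rotation parameter from the velocity step. The claim derives rotation-axis unit vectors from the recorded rotation amounts; the sketch below works directly with the per-event rotation matrices, which carry the same information. A least-squares stacking, offered as an assumed illustration rather than the patent's exact computation:

```python
import numpy as np

def axis_angle_to_matrix(axis, angle):
    """Rodrigues' formula: rotation matrix from a rotation-axis unit
    vector and a rotation amount (radians)."""
    axis = np.asarray(axis, dtype=float)
    axis = axis / np.linalg.norm(axis)
    K = np.array([[0.0, -axis[2], axis[1]],
                  [axis[2], 0.0, -axis[0]],
                  [-axis[1], axis[0], 0.0]])
    return np.eye(3) + np.sin(angle) * K + (1.0 - np.cos(angle)) * (K @ K)

def displacement_from_rotations(R_events, d_display, d_camera, R):
    """Least-squares solve of (R_i - I) t = R @ d_cam_i - d_disp_i
    stacked over all recorded rotation events.

    R_events:  per-event 3x3 rotations of the wearable display
    d_display: (N, 3) per-event displacements of the wearable display
    d_camera:  (N, 3) per-event displacements of the camera
    R:         rotation parameter from the velocity-based step
    """
    A_rows, b_rows = [], []
    for R_i, dd, dc in zip(R_events, d_display, d_camera):
        A_rows.append(R_i - np.eye(3))   # 3x3 block per event
        b_rows.append(R @ dc - dd)       # 3-vector per event
    A = np.vstack(A_rows)                # (3N, 3)
    b = np.concatenate(b_rows)           # (3N,)
    t, *_ = np.linalg.lstsq(A, b, rcond=None)
    return t
```

At least two events with non-parallel rotation axes are needed for the stacked (R_i − I) blocks to determine t, which is consistent with the claim's gating on events where the angular velocity exceeds the first threshold: each retained R_i then differs meaningfully from the identity.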
TW107141879A 2018-04-26 2018-11-23 3d model establishing device and calibration method applying to the same TWI692968B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN201811532976.8A CN110415329B (en) 2018-04-26 2018-12-14 Three-dimensional modeling device and calibration method applied to same
US16/394,650 US10891805B2 (en) 2018-04-26 2019-04-25 3D model establishing device and calibration method applying to the same

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201862662781P 2018-04-26 2018-04-26
US62/662,781 2018-04-26

Publications (2)

Publication Number Publication Date
TW201946450A TW201946450A (en) 2019-12-01
TWI692968B true TWI692968B (en) 2020-05-01

Family

ID=69582839

Family Applications (1)

Application Number Title Priority Date Filing Date
TW107141879A TWI692968B (en) 2018-04-26 2018-11-23 3d model establishing device and calibration method applying to the same

Country Status (1)

Country Link
TW (1) TWI692968B (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1497504A (en) * 2002-09-30 2004-05-19 佳能株式会社 Video image combining equipment and video image combining method
CN103988226A (en) * 2011-08-31 2014-08-13 Metaio有限公司 Method for estimating camera motion and for determining three-dimensional model of real environment
CN105320271A (en) * 2014-07-10 2016-02-10 精工爱普生株式会社 HMD calibration with direct geometric modeling
CN105404392A (en) * 2015-11-03 2016-03-16 北京英梅吉科技有限公司 Monocular camera based virtual wearing method and system

Also Published As

Publication number Publication date
TW201946450A (en) 2019-12-01

Similar Documents

Publication Publication Date Title
US11704833B2 (en) Monocular vision tracking method, apparatus and non-transitory computer-readable storage medium
CN110880189B (en) Combined calibration method and combined calibration device thereof and electronic equipment
JP7046189B2 (en) How to calibrate an augmented reality device
US7663649B2 (en) Information processing device and method for aiding control operations relating to controlling the position and orientation of a virtual object and method for changing the positioning and orientation of a virtual object
WO2018076154A1 (en) Spatial positioning calibration of fisheye camera-based panoramic video generating method
JP7326911B2 (en) Control system and control method
US9785249B1 (en) Systems and methods for tracking motion and gesture of heads and eyes
JP6198230B2 (en) Head posture tracking using depth camera
JP3486613B2 (en) Image processing apparatus and method, program, and storage medium
WO2021043213A1 (en) Calibration method, device, aerial photography device, and storage medium
JP2022501684A (en) Shooting-based 3D modeling systems and methods, automated 3D modeling equipment and methods
WO2019104571A1 (en) Image processing method and device
CN111161336B (en) Three-dimensional reconstruction method, three-dimensional reconstruction apparatus, and computer-readable storage medium
JP2015090298A (en) Information processing apparatus, and information processing method
JP2003344018A (en) Unit and method for image processing as well as program and storage medium
US20200294269A1 (en) Calibrating cameras and computing point projections using non-central camera model involving axial viewpoint shift
WO2022000713A1 (en) Augmented reality self-positioning method based on aviation assembly
Fang et al. Self-supervised camera self-calibration from video
JP2018173882A (en) Information processing device, method, and program
KR20130130283A (en) System for generating a frontal-view image for augmented reality based on the gyroscope of smart phone and method therefor
WO2020019175A1 (en) Image processing method and apparatus, and photographing device and unmanned aerial vehicle
CN110415329B (en) Three-dimensional modeling device and calibration method applied to same
TWI726536B (en) Image capturing method and image capturing apparatus
TWI692968B (en) 3d model establishing device and calibration method applying to the same
US11941851B2 (en) Systems and methods for calibrating imaging and spatial orientation sensors