TWI692968B - 3d model establishing device and calibration method applying to the same - Google Patents
- Publication number: TWI692968B
- Application number: TW107141879A
- Authority: TW (Taiwan)
Description
The invention relates to a three-dimensional modeling device and method.
Three-dimensional scene modeling is a popular research topic in computer vision. Typically, multiple image frames and depth information are used to construct a complete three-dimensional scene model. The quality of the resulting model depends on whether the pose of each frame can be estimated correctly. In practice, however, pose estimates are prone to drift due to accumulated error or indistinct feature points. Most existing techniques apply optimization algorithms to reduce these errors, but with limited success.
The object of the present invention is to provide a three-dimensional modeling device, and a calibration method applied thereto, capable of building a three-dimensional model of an environmental scene.
One aspect of the present invention discloses a three-dimensional modeling device including a camera and a wearable display. The camera is used to obtain a plurality of first image frames, a second image frame, and depth information. The wearable display is coupled to the camera and includes a display unit, a processing unit, a storage unit, and a projection unit. The storage unit is coupled to the processing unit and stores a first module and a second module. When the first module is executed by the processing unit, it causes the processing unit to calculate a first pose of the wearable display. When the second module is executed by the processing unit, it causes the processing unit to calculate a three-dimensional model based on the first image frames, the depth information, the first pose, and a plurality of calibration parameters, and to update the three-dimensional model based on the second image frame. The projection unit is coupled to the processing unit and projects the three-dimensional model and the second image frame onto the display unit according to the first pose, so that they are displayed on the display unit together with an actual image.
Another aspect of the present invention discloses a calibration method applied to a three-dimensional modeling device, including: moving the three-dimensional modeling device linearly along a plurality of directions and obtaining, for each direction, a first moving-velocity vector of a wearable display of the device and a second moving-velocity vector of a camera of the device; calculating a rotation parameter from the first and second moving-velocity vectors; moving the device arbitrarily and recording a plurality of pieces of rotation information, where each piece records, for a time point at which the angular velocity of the device exceeds a first threshold, the rotation amount of the wearable display at that time point, the displacement of the wearable display relative to the previous time point, and the displacement of the camera relative to the previous time point, and calculating the rotation-axis unit vector at that time point from the recorded rotation amount; and calculating a displacement parameter from the pieces of rotation information and the rotation-axis unit vectors.
For a better understanding of the above and other aspects of the present invention, embodiments are described in detail below with reference to the accompanying drawings:
10: 3D modeling device
102: camera
104: wearable display
1041: processing unit
1043: storage unit
1045: projection unit
1047: input unit
1049: display unit
M1: first module
M2: second module
S301~S307: steps
FIG. 1 is a block diagram of a three-dimensional modeling device according to an embodiment of the invention.
FIG. 2 is a schematic diagram of a three-dimensional modeling device according to an embodiment of the invention.
FIG. 3 is a flowchart of a calibration-parameter generation method according to an embodiment of the invention.
Please refer to FIG. 1, a block diagram of a three-dimensional modeling device according to an embodiment of the invention, and to FIG. 2, a schematic diagram of the same device. The three-dimensional modeling device 10 includes a camera 102 and a wearable display 104, and is used to build a three-dimensional model of an environmental scene. In one embodiment, the three-dimensional modeling device 10 is an augmented reality (AR) device. In other embodiments, it may be a virtual reality (VR) device, a substitutional reality device, or a mixed reality (MR) device.
The camera 102 may be an RGB-D camera. Besides capturing the color information of the environment, it can also capture the depth information of the environment with its own depth sensor. The camera 102 obtains a plurality of first image frames, a second image frame, and depth information, where the second image frame is the latest frame currently obtained and the first image frames are the frames obtained before it. As shown in FIG. 2, the camera 102 is detachably or fixedly mounted on the wearable display 104.
The wearable display 104 is coupled to the camera 102 so that signals and data can be transmitted between them. The wearable display 104 includes a display unit 1049, a processing unit 1041, a storage unit 1043, a projection unit 1045, and an input unit 1047. The display unit 1049 is arranged as shown in FIG. 2 and may be a head-mounted display (HMD) screen. The processing unit 1041 may include one or more application-specific integrated circuit chips or general-purpose processors for performing the three-dimensional modeling computations. The storage unit 1043 may be a non-volatile memory, such as NAND flash memory, for storing applications and data. The storage unit 1043 contains a first module M1 and a second module M2, which are computer-readable programs comprising a plurality of computer-readable instructions. When the first module M1 is executed by the processing unit 1041, the processing unit 1041 calculates a first pose of the wearable display 104. The processing unit 1041 may calculate the first pose from one or more sensors (not shown) of the wearable display 104, using techniques common in the art that are not repeated here. When the second module M2 is executed by the processing unit 1041, the processing unit 1041 calculates a three-dimensional model from the first image frames, the depth information, the first pose, and a plurality of calibration parameters, and updates the three-dimensional model according to the second image frame; the calibration parameters are detailed below. In short, the first module M1 computes the current pose of the wearable display 104, and the second module M2 builds the three-dimensional model of the environmental scene.
The projection unit 1045 is coupled to the processing unit 1041 and projects the three-dimensional model built by the processing unit 1041, together with the second image frame, onto the display unit 1049. In a mixed reality application, the user sees an actual image, i.e. the actual environmental scene, through the display unit 1049, and the projected three-dimensional model and second image frame are presented on the display unit 1049 together with that actual image.
Since the camera 102 and the wearable display 104 have their own coordinate systems, the wearable display 104 must first perform a coordinate-system conversion (also called calibration) when using the first image frames, the second image frame, and the depth information from the camera 102; otherwise the three-dimensional model would deviate significantly from the actual scene. The calibration parameters described above are the basis for this coordinate-system conversion.
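The coordinate-system conversion described here can be sketched as follows. The rotation matrix, offset, and point values are illustrative assumptions, not values from the patent:

```python
import numpy as np

# Assumed example calibration: a 90-degree rotation about Z plus a small offset.
R = np.array([[0.0, -1.0, 0.0],
              [1.0,  0.0, 0.0],
              [0.0,  0.0, 1.0]])
T = np.array([0.03, 0.0, -0.01])

def camera_to_display(points_cam, R, T):
    """Express camera-frame points in the wearable display's coordinate system."""
    return points_cam @ R.T + T  # row-vector convention: p' = R p + T

p = np.array([[1.0, 0.0, 0.0]])
print(camera_to_display(p, R, T))  # [[0.03, 1.0, -0.01]]
```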
The input unit 1047 is coupled to the processing unit 1041. The input unit 1047 may be an input controller connected with the display unit 1049, used to receive input instructions from the user and forward them to the processing unit 1041. The processing unit 1041 can adjust the second image frame and/or the three-dimensional model in response to the user's input instructions, as detailed below.
Please refer to FIG. 3, a flowchart of a method for generating the calibration parameters according to an embodiment of the present invention. The calibration parameters include a rotation parameter R and a displacement parameter T, where R and T may take the form of matrices.
In step S301, the three-dimensional modeling device 10 is moved linearly along a plurality of directions, and for each direction a first moving-velocity vector of the wearable display 104 and a second moving-velocity vector of the camera 102 are obtained. For example, a three-axis platform (with mutually perpendicular X, Y, and Z axes) may be prepared, and the modeling device 10 fixed to a carrier unit of the platform. First, the carrier unit moves the modeling device 10 along the X-axis to obtain the first moving-velocity vector v1 of the wearable display 104 and the second moving-velocity vector c1 of the camera 102 for the X-axis direction, giving v1 = R c1, where R is the rotation parameter. Next, the carrier unit moves the modeling device 10 along the Y-axis to obtain the first moving-velocity vector v2 and the second moving-velocity vector c2 for the Y-axis direction, giving v2 = R c2. Finally, the carrier unit moves the modeling device 10 along the Z-axis to obtain the first moving-velocity vector v3 and the second moving-velocity vector c3 for the Z-axis direction, giving v3 = R c3.
The first moving-velocity vectors can be measured with the sensors of the wearable display 104, and the second moving-velocity vectors with the sensors of the camera 102. The obtained first and second moving-velocity vectors can be provided to the processing unit 1041 of the wearable display 104, or to an external computer (not shown), for subsequent calculation. An "external computer" is a computer not included in the three-dimensional modeling device 10 but externally coupled to it, such as a backpack computer.
In step S303, the rotation parameter is calculated from the first and second moving-velocity vectors. In the above example, R = [v1 v2 v3][c1 c2 c3]^(-1).
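As a sketch of steps S301-S303, the following uses noise-free synthetic velocity measurements (the ground-truth rotation and per-axis speeds are hypothetical) and recovers R exactly:

```python
import numpy as np

# Ground-truth camera-to-display rotation used to synthesize measurements:
# a 90-degree rotation about Z (assumed for illustration).
R_true = np.array([[0.0, -1.0, 0.0],
                   [1.0,  0.0, 0.0],
                   [0.0,  0.0, 1.0]])

# Second moving-velocity vectors (camera frame), one column per rig axis;
# diagonal entries are assumed platform speeds in m/s.
C = np.diag([0.2, 0.3, 0.25])  # columns c1, c2, c3

# First moving-velocity vectors (display frame): v_i = R c_i.
V = R_true @ C  # columns v1, v2, v3

# Step S303: R = [v1 v2 v3][c1 c2 c3]^(-1)
R = V @ np.linalg.inv(C)
print(np.allclose(R, R_true))  # True
```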
It is worth mentioning that the number of movement directions is not limited to three and can be extended to N (N > 3) to improve accuracy. With N movement directions, the relationship between the obtained first and second moving-velocity vectors can be organized as:

[v1 v2 ... vN] = R [c1 c2 ... cN]
Solving this equation by the least-squares method gives R as:

R = [v1 v2 ... vN][c1 c2 ... cN]^T ([c1 c2 ... cN][c1 c2 ... cN]^T)^(-1)
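The N-direction least-squares solution can be sketched with synthetic data; the randomly generated ground-truth matrix and direction count are assumptions for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
# Random orthonormal matrix as a stand-in for the true rotation.
R_true = np.linalg.qr(rng.normal(size=(3, 3)))[0]

# N > 3 movement directions: columns of C are camera velocities c_i,
# columns of V the corresponding display velocities v_i = R c_i.
N = 8
C = rng.normal(size=(3, N))
V = R_true @ C

# Least-squares solution R = V C^T (C C^T)^(-1).
R = V @ C.T @ np.linalg.inv(C @ C.T)
print(np.allclose(R, R_true))  # True
```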
After the calculation of the rotation parameter R is completed, the displacement parameter T is calculated.
In step S305, the three-dimensional modeling device 10 is moved arbitrarily and a plurality of pieces of rotation information are recorded. Each piece records, for a time point t at which the angular velocity exceeds a first threshold, the rotation amount Rt of the wearable display 104 at time t, the displacement Tvt of the wearable display 104 at time t relative to the previous time point, and the displacement Tct of the camera 102 at time t relative to the previous time point; the rotation-axis unit vector ωt at time t is then computed from the rotation amount Rt.
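One common way to obtain a rotation-axis unit vector from a rotation matrix uses the antisymmetric part of the matrix; the patent does not state its exact extraction method, so this is a plausible sketch of how ωt could be derived from Rt:

```python
import numpy as np

def rotation_axis_unit_vector(R_t):
    """Unit rotation axis from a rotation matrix via its antisymmetric part.

    (R - R^T)/2 = sin(theta) * [axis]_x, so the off-diagonal entries give
    the axis direction (valid when sin(theta) != 0).
    """
    a = np.array([R_t[2, 1] - R_t[1, 2],
                  R_t[0, 2] - R_t[2, 0],
                  R_t[1, 0] - R_t[0, 1]])
    return a / np.linalg.norm(a)

theta = 0.3  # assumed rotation about the Z axis
Rz = np.array([[np.cos(theta), -np.sin(theta), 0.0],
               [np.sin(theta),  np.cos(theta), 0.0],
               [0.0,            0.0,           1.0]])
print(rotation_axis_unit_vector(Rz))  # ~ [0, 0, 1]
```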
In step S307, the displacement parameter T is calculated from the rotation information and the rotation-axis unit vectors. Suppose n pieces of rotation information have been collected, where n is an integer greater than or equal to 2; the method gives separate stacked forms of the relationship between the rotation information, the rotation-axis unit vectors, and the displacement parameter for odd and even n. Each time point t contributes a linear constraint on T built from [ωt]x and ΔTt, and the constraints are stacked into an over-determined linear system, where I is the identity matrix, ΔTt = Tct − Tvt, and [ωt]x is the cross-product (skew-symmetric) operator defined as:

            [   0    −ωzt    ωyt ]
    [ωt]x = [  ωzt     0    −ωxt ]
            [ −ωyt    ωxt     0  ]

where ωxt, ωyt, and ωzt are the X-, Y-, and Z-axis components of the unit rotation-axis vector ωt.
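The [ωt]x operator is the standard cross-product matrix. A minimal sketch, verifying that multiplying by it reproduces the vector cross product:

```python
import numpy as np

def skew(w):
    """Cross-product matrix [w]_x such that skew(w) @ a == np.cross(w, a)."""
    wx, wy, wz = w
    return np.array([[0.0, -wz,  wy],
                     [ wz, 0.0, -wx],
                     [-wy,  wx, 0.0]])

w = np.array([0.0, 0.0, 1.0])
a = np.array([1.0, 2.0, 3.0])
print(np.allclose(skew(w) @ a, np.cross(w, a)))  # True
```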
Note that the rotation amount Rt here is different from the rotation parameter R calculated in step S303.
The displacement parameter T is then obtained by solving the above system with the Moore-Penrose pseudo-inverse.
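The stack-and-pseudo-inverse step can be illustrated as follows. The per-time-point constraint (I − Rt) T = ΔTt used here is an assumed stand-in, since the exact stacked equations are not reproduced in this text; the example only demonstrates that stacking several such constraints and applying the Moore-Penrose pseudo-inverse recovers T:

```python
import numpy as np

rng = np.random.default_rng(1)
T_true = np.array([0.05, -0.02, 0.10])  # assumed camera-to-display offset (metres)

A_blocks, b_blocks = [], []
for _ in range(5):
    # Random rotation axis and angle for each recorded time point.
    axis = rng.normal(size=3); axis /= np.linalg.norm(axis)
    th = rng.uniform(0.2, 1.0)
    K = np.array([[0.0, -axis[2], axis[1]],
                  [axis[2], 0.0, -axis[0]],
                  [-axis[1], axis[0], 0.0]])
    R_t = np.eye(3) + np.sin(th) * K + (1 - np.cos(th)) * K @ K  # Rodrigues
    A_blocks.append(np.eye(3) - R_t)
    b_blocks.append((np.eye(3) - R_t) @ T_true)  # synthetic dT_t

A = np.vstack(A_blocks)      # stacked over-determined system A T = b
b = np.concatenate(b_blocks)
T = np.linalg.pinv(A) @ b    # Moore-Penrose pseudo-inverse solve
print(np.allclose(T, T_true))  # True
```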
In one embodiment, the processing unit 1041 can update the rotation parameter R and the displacement parameter T on the fly by managing a first buffer and a second buffer, for example first-in-first-out (FIFO) buffers. The first buffer stores the information used to compute the rotation parameter R; the second buffer stores the information used to compute the displacement parameter T. When the first buffer is full (for example, n pieces of information have accumulated), the processing unit 1041 computes the rotation parameter R from the buffered information in the manner described above. Whenever a new piece of information is stored in the first buffer, the oldest piece is discarded and the processing unit 1041 recomputes the rotation parameter R, thereby updating it. The displacement parameter T can be updated on the fly in a similar manner.
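The rolling-buffer update scheme might be sketched as below; the class and method names are hypothetical, and only the buffer bookkeeping (not the parameter solve) is shown:

```python
from collections import deque

class RollingCalibration:
    """Sketch of the FIFO-buffer scheme described above (assumed design).

    Each buffer keeps the n most recent measurements; whenever a buffer is
    full, the corresponding calibration parameter can be recomputed from
    its current contents.
    """
    def __init__(self, n=6):
        self.rotation_buf = deque(maxlen=n)      # (v_i, c_i) velocity pairs for R
        self.displacement_buf = deque(maxlen=n)  # (R_t, dT_t) records for T

    def push_rotation_sample(self, v, c):
        # Appending to a full deque silently drops the oldest entry.
        self.rotation_buf.append((v, c))
        return len(self.rotation_buf) == self.rotation_buf.maxlen  # ready to solve?

buf = RollingCalibration(n=3)
ready = [buf.push_rotation_sample(i, i) for i in range(5)]
print(ready)  # [False, False, True, True, True]
```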
After the calibration parameters have been computed, they can be recorded in the second module M2. When the second module M2 is executed by the processing unit 1041, the processing unit 1041 uses the calibration parameters to express the first image frames, the second image frame, and the depth information provided by the camera 102 in the coordinate system of the wearable display 104, and then builds or updates the three-dimensional model from the first pose and the converted first image frames, second image frame, and depth information.
In one embodiment, the user compares the second image frame projected on the display unit 1049 with the actual image to determine whether the projected second image frame matches the actual image (or the three-dimensional model). If the user finds that the second image frame deviates from the actual image, the user can issue a frame-adjustment instruction through the input unit 1047. The frame-adjustment instruction may include translations along the X, Y, and Z axes and/or rotations about the X, Y, and Z axes. When the second module M2 is executed by the processing unit 1041 and a frame-adjustment instruction is received, the processing unit 1041 determines that the calibration parameters may need adjustment, generates a plurality of first correction parameters from the frame-adjustment instruction, and then updates the calibration parameters according to the first correction parameters.
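One plausible way to fold the first correction parameters into the calibration parameters is a rigid-transform composition; the exact update rule is not spelled out here, so this is an assumption for illustration:

```python
import numpy as np

def apply_correction(R, T, R_corr, t_corr):
    """Compose a user correction (R_corr, t_corr) onto calibration (R, T).

    Assumed composition: the corrected transform maps p -> R_corr(R p + T) + t_corr.
    """
    return R_corr @ R, R_corr @ T + t_corr

R = np.eye(3)
T = np.array([0.0, 0.0, 0.1])
# User nudges the frame 1 cm along X, no rotation.
R_new, T_new = apply_correction(R, T, np.eye(3), np.array([0.01, 0.0, 0.0]))
print(T_new)  # [0.01 0.   0.1 ]
```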
In one embodiment, the user can tell whether the three-dimensional model built by the processing unit 1041 matches the actual image by comparing the model projected on the display unit 1049 with the actual image. If the user finds that the three-dimensional model deviates from the actual image, or that its position needs adjustment, the user can issue a model-adjustment instruction through the input unit 1047. The model-adjustment instruction may include translations along the X, Y, and Z axes and/or rotations about the X, Y, and Z axes. When the second module M2 is executed by the processing unit 1041 and a model-adjustment instruction is received, the processing unit 1041 determines that the first pose may need adjustment and generates a plurality of second correction parameters from the model-adjustment instruction. When the first module M1 is executed by the processing unit 1041, the processing unit 1041 updates the first pose according to the second correction parameters.
In one embodiment, the first and second correction parameters are generated from the translation and/or rotation amounts specified by the frame-adjustment and model-adjustment instructions, respectively.
In one embodiment, when the second module M2 is executed by the processing unit 1041, the processing unit 1041 computes a similarity between the second image frame and the three-dimensional model, generates an adjustment prompt based on the similarity, and causes the projection unit 1045 to project the adjustment prompt onto the display unit 1049. For example, the processing unit 1041 first computes an overlap region A = Overlap(F, pf, M, pm), which may be found with a method such as voxel hulls. It then computes, over the corresponding points of the overlap region, the mean of the squared differences of two components, the RGB information map and the depth information map, each weighted by a positive real weight (ωr, ωd), and combines them into the final similarity S:

S = (1/N) Σ (ωr · Δrgb² + ωd · Δdepth²)
where Δrgb is the RGB-value difference in the RGB information map, Δdepth is the depth difference in the depth information map, and N is the total number of points in the overlap region.
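The similarity S can be sketched directly from the definitions above, with Δrgb² interpreted as the squared per-channel RGB difference summed over channels (an assumption):

```python
import numpy as np

def similarity(rgb_a, rgb_b, depth_a, depth_b, w_r=1.0, w_d=1.0):
    """Mean weighted squared difference over the N overlapping points."""
    d_rgb = np.sum((rgb_a - rgb_b) ** 2, axis=-1)   # per-point RGB difference
    d_depth = (depth_a - depth_b) ** 2              # per-point depth difference
    n = d_depth.size                                # N points in the overlap region
    return (w_r * d_rgb.sum() + w_d * d_depth.sum()) / n

# Four overlap points, uniform offsets of 1 per RGB channel and 2 in depth.
rgb = np.zeros((4, 3)); depth = np.zeros(4)
print(similarity(rgb, rgb + 1.0, depth, depth + 2.0))  # (12 + 16)/4 = 7.0
```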
In another embodiment, the three-dimensional modeling device is a virtual reality device. In this embodiment, the three-dimensional modeling device further includes a first camera for capturing the environmental scene and displaying it on the display unit. The processing unit can cause the projection unit to project the built three-dimensional model and the second image frame onto the display unit, or display them on the display unit computationally. The calibration method, the adjustment of the second image frame and the three-dimensional model, and the similarity calculation described above all apply to this embodiment.
With the three-dimensional modeling device provided by the present invention, the camera and the wearable display can be integrated, and through the calibration parameters produced by the calibration method, the wearable display converts the image frames and depth information obtained from the camera into its own coordinate system and uses them to build a three-dimensional model. In addition, by comparing the current image frame provided by the camera with the actual image, the user can adjust the position and/or angle of the frame or of the three-dimensional model in real time; based on the user's adjustments, the wearable display can correct its own pose and/or the calibration parameters, so that the updated three-dimensional model better matches the actual environmental scene, achieving interactive adjustment of frames and scene model. Moreover, the wearable display can compute the similarity between the current image frame and the three-dimensional model and generate an adjustment prompt from it as a reference for the user's adjustments.
In summary, although the present invention has been disclosed above by way of embodiments, these are not intended to limit the invention. Those of ordinary skill in the art to which the invention pertains can make various changes and modifications without departing from the spirit and scope of the invention. The scope of protection of the invention is therefore defined by the appended claims.
Claims (10)
Priority Applications:
- US62/662,781 (provisional), filed 2018-04-26
- CN201811532976.8A (CN110415329B), filed 2018-12-14
- US16/394,650 (US10891805B2), filed 2019-04-25
Publications: TW201946450A, published 2019-12-01; TWI692968B, granted 2020-05-01.