TWI278772B - Augmented reality system and method with mobile and interactive function for multiple users - Google Patents

Info

Publication number
TWI278772B
TWI278772B (application TW94105365A)
Authority
TW
Taiwan
Prior art keywords
computer
augmented reality
user
virtual image
viewing point
Prior art date
Application number
TW94105365A
Other languages
Chinese (zh)
Other versions
TW200630865A (en)
Inventor
Kuen-Meau Chen
Lin-Lin Chen
Ming-Jen Wang
Whey-Fone Tsai
Ching-Yu Yang
Original Assignee
Nat Applied Res Lab Nat Ce
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nat Applied Res Lab Nat Ce
Priority to TW94105365A
Publication of TW200630865A
Application granted
Publication of TWI278772B

Landscapes

  • User Interface Of Digital Computer (AREA)
  • Processing Or Creating Images (AREA)

Abstract

An augmented reality system with mobile and interactive functions for multiple users includes two major portions: a computer system that handles augmented reality, and a user system for each user. The augmented-reality computer system provides powerful image computation, converting computer graphics data into a three-dimensional virtual image for each user system. A user system mainly includes a head-mounted display (HMD), a microphone, and a PDA. A user sees the virtual image on the HMD and uses the microphone or the PDA to communicate with the other users.

Description

Description of the Invention

[Technical Field] The present invention relates to an augmented reality system for mobile, interactive use by multiple users, for example for the evaluation and discussion of a virtual prototype of a car.

[Prior Art] Augmented reality is a virtual reality technique that combines images of the existing physical environment with computer-generated virtual images. The Augmented Reality Toolkit (ARToolKit) gives users a natural way of browsing: the virtual model moves or rotates with the direction of the user's gaze, which is more intuitive than browsing with a mouse or keyboard. Augmented reality, however, demands substantial computing power. Ordinary mobile computing devices such as personal digital assistants (PDAs) cannot carry such computation, while computers that can provide it are too large to remain mobile. Beyond mobility, remote discussion is needed, and even the ability for one user to see another user's viewpoint in real time, because such functions let multiple users interact and discuss efficiently.

[Summary of the Invention] The main object of the present invention is to let multiple users hold interactive discussions in augmented reality, for example when evaluating the virtual prototype of a car: the users can discuss in real time and across different sites, and a user can even see another user's viewpoint in real time, so that multiple users can interact and discuss efficiently. The discussions can also be paired with a personal digital assistant (PDA), on which annotations can be entered for later use.

To achieve this object, the augmented reality system for multi-user mobile interaction of the present invention comprises two major parts: a computer system capable of handling augmented reality, and a user system owned by each user. The augmented-reality computer system chiefly provides powerful image computation that converts computer graphics data into a stereoscopic virtual image and sends it to the user system. A user system mainly comprises a head-mounted display (with built-in speakers), a camera and a microphone mounted on the display, and a handheld computer. The user sees the stereoscopic virtual image through the head-mounted display and communicates with other users via the microphone or handheld computer, while the camera's view is used to determine the position of the user's viewing point. When a user wants to see what another user sees, the other user's viewing-point position is obtained, and the computer system computes the augmented-reality view as seen by that user.
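The core of the scheme is that only a compact viewing-point position parameter travels between machines, while each machine renders the virtual image locally. A minimal sketch of such a parameter and its wire encoding follows; the field names, units, and JSON encoding are illustrative assumptions, not taken from the patent.

```python
import json
from dataclasses import dataclass, asdict

@dataclass
class ViewingPoint:
    # Hypothetical "viewing-point position parameter": position and
    # orientation of one user's HMD relative to the reference object.
    # Field names and units are illustrative, not from the patent.
    x: float
    y: float
    z: float
    yaw: float
    pitch: float
    roll: float

def encode(vp: ViewingPoint) -> bytes:
    """Serialize a viewing point for transmission to the other subsystem."""
    return json.dumps(asdict(vp)).encode("utf-8")

def decode(packet: bytes) -> ViewingPoint:
    """Rebuild the viewing point on the receiving subsystem."""
    return ViewingPoint(**json.loads(packet.decode("utf-8")))

vp = ViewingPoint(x=0.5, y=1.2, z=-2.0, yaw=30.0, pitch=-5.0, roll=0.0)
packet = encode(vp)
# The packet is a few dozen bytes, versus roughly 900 KB for one raw
# 640x480 24-bit frame, which is why streaming the pose and rendering
# locally can stay nearly synchronous.
assert decode(packet) == vp
```

This is why the description below can note that viewpoint sharing is "almost completely synchronized": the shared state is a handful of numbers, not image data.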
[Embodiments]

Please refer to FIG. 1, the system architecture of the present invention applied to the design of a car's exterior; for ease of explanation, two users 80a, 80b using the system at the same time are taken as the example. The augmented reality system 10 for multi-user mobile interaction comprises two major parts: the computer system 20 capable of handling augmented reality, and the user system 50.

In this embodiment the computer system 20 includes a first augmented-reality computer subsystem 20a and a second augmented-reality computer subsystem 20b, electrically connected together; each subsystem 20a, 20b is essentially one computer. Referring also to FIG. 3, each subsystem 20a, 20b includes an augmented reality system application 21, which in the present invention mainly comprises computer image computation code 22, data transmission code 23, viewing-point position analysis code 24, and three-dimensional computer graphics data 25. In this embodiment the graphics data 25 is the drawing data of the car's exterior design; the remaining code is described later.

The user system 50 includes the user systems 50a, 50b used by users 80a, 80b. User system 50a includes a head-mounted display 30a (usually with built-in speakers), a camera 31a and a microphone 32a mounted on the display, and a handheld computer 40a; user system 50b likewise includes a head-mounted display 30b, camera 31b, microphone 32b, and handheld computer 40b.

Each user 80a, 80b wears a head-mounted display 30a, 30b. As a user moves, and the occupied position and head angle change, the virtual image 60 is displayed as it would really be seen. The embodiment therefore includes a reference object 70: users 80a, 80b can move around it, and the virtual image 60 appears as if displayed at the reference object's position, as in the schematic of FIG. 7.

The example of FIG. 1 is a use environment in essentially one area. FIG. 2 shows use at two different sites: the two subsystems 20a, 20b are electrically connected through the Internet 90 (an intranet connection can be used over shorter distances). Because users 80a, 80b are at different places, there are two position reference objects 70a, 70b, and the virtual images 60a, 60b appear on each of them respectively.

Please refer to FIG. 4 for the virtual-image generation flow, taking user 80a's end as the example:
Step 401: the augmented-reality subsystem 20a obtains the three-dimensional computer graphics data 25.
Step 402: obtain an image of the position reference object 70. Camera 31a is mounted on head-mounted display 30a, so when user 80a faces the reference object 70, camera 31a captures its image and transmits it to subsystem 20a.
Step 403: analyze the image of reference object 70 to obtain viewing-point position parameters. The viewing-point position analysis code 24 of subsystem 20a analyzes the reference object to obtain the position of user 80a's viewing point. The reference object 70 carries a reference mark 71 (such as the word "MARKER"); by analyzing the size, shape, and orientation of the mark in the image, the position of user 80a's viewing point can be deduced, yielding viewing-point position parameters (for example a coordinate position, or a vector representation). As this is a known technique, it is not detailed here.
Step 404: compute the stereoscopic virtual image 60 from the viewing-point position parameters. With the parameters available, the computer image computation code 22 performs this basic stereoscopic-image operation, which is likewise not repeated here.
Step 405: transmit the virtual image to head-mounted display 30a / handheld computer 40a, so that user 80a can see the virtual image 60.

Please refer to FIG. 5, the flow of the system in multi-user use. The example below is user 80a conveying his views to user 80b; the same applies in the opposite direction, with user 80b conveying his views to user 80a:
Step A1: record an annotation. User 80a records on his handheld computer 40a his thoughts about the virtual image 60, for instance on the car's styling or color, and can also input commands through handheld computer 40a to operate subsystem 20a, for instance to change the car's color. Referring to FIG. 6, the handheld computer 40 is for example a PDA, whose screen 41 shows a virtual image window 42 and an annotation window 43.
Step A2: transmit the annotation. By operating handheld computer 40a, the virtual image window 42 and annotation window 43 are transmitted through subsystem 20a to subsystem 20b.
Step B1: receive the annotation. Subsystem 20b receives the virtual image window 42 and annotation window 43 transmitted by subsystem 20a and passes them to handheld computer 40b.
Step B2: execute a switch-image command. If user 80b wants to discuss directly with user 80a, it is best to discuss over the virtual image 60a as seen by user 80a; user 80b issues a switch-image command through handheld computer 40b.
Step B3: transmit the switch-image command. Subsystem 20b transmits the command to subsystem 20a.
Step A3: subsystem 20a receives the switch-image command.
Step A4: subsystem 20a continuously transmits the first viewing-point position parameter, that is, the viewing point of user 80a.
Step B4: subsystem 20b receives the first viewing-point position parameter.
Step B5: compute the stereoscopic virtual image seen by the first user. Subsystem 20b uses the first viewing-point position parameter to compute the virtual image 60a.
Step B6: transmit the virtual image 60a to head-mounted display 30b / handheld computer 40b. What user 80b then sees on display 30b is exactly what user 80a sees; since the data volume of the viewing-point position parameter is very small, the two views are almost completely synchronized. As user 80a changes viewing position, step A4 continues, so steps B4 to B6 continue as well.

Note that users 80a, 80b can also communicate by voice, especially when they are not at the same site (FIG. 2), for instance through microphones 32a, 32b and the speakers usually built into head-mounted displays 30a, 30b, without necessarily transmitting annotations; that is, steps A1, A2, B1 can be omitted. The handheld computers 40a, 40b may also have built-in microphones and speakers (not shown), making separate microphones 32a, 32b and speakers on the head-mounted displays unnecessary. All data transfers in the above steps, between the two subsystems 20a, 20b or between the subsystems and the handheld computers 40a, 40b, are completed through the data transmission code 23.

Note also that the above is only an embodiment and not a limitation; anything not departing from the basic architecture of the invention falls within the claimed scope, which is governed by the patent claims. For example, the computer system 20 of FIG. 1 could be a single sufficiently powerful computer rather than multiple subsystems, or the graphics data 25 could be placed on another computer shared by both.

[Brief Description of the Drawings]
FIG. 1 is the system architecture of the invention, showing a use environment in essentially one area.
FIG. 2 is the system architecture, showing a use environment at two different sites.
FIG. 3 is the software architecture of the system.
FIG. 4 is the flowchart of the invention for displaying a virtual image.
FIG. 5 is the flowchart of the system in multi-user use.
FIG. 6 illustrates an embodiment of the handheld computer.
FIG. 7 is a schematic of the system in use.

[Description of Element Symbols]
Augmented reality system for multi-user mobile interaction 10; computer system capable of handling augmented reality 20; augmented-reality computer subsystems 20a, 20b; augmented reality system application 21; computer image computation code 22; data transmission code 23; viewing-point position analysis code 24; three-dimensional computer graphics data 25; head-mounted displays 30a, 30b; cameras 31a, 31b; microphones 32a, 32b; handheld computers 40a, 40b; screen 41; virtual image window 42; annotation window 43; user systems 50, 50a, 50b; virtual images 60, 60a, 60b; position reference objects 70, 70a, 70b; reference mark 71; users 80a, 80b; Internet 90.
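Step 403 relies on known, ARToolKit-style marker analysis: the size, shape, and direction of the mark in the camera image determine where the camera is. As a toy illustration of the principle only, under an assumed pinhole camera with a known focal length and a square marker of known physical size (all constants below are invented for the example; a real system calibrates the camera and recovers a full 6-DOF pose), the marker's apparent size yields distance and its image offset yields bearing:

```python
import math

# Assumed constants for the sketch; a real system calibrates these.
FOCAL_PX = 800.0    # camera focal length, in pixels
MARKER_MM = 100.0   # physical side length of the square reference mark

def viewing_point(marker_side_px: float, cx_px: float, cy_px: float,
                  img_w: int = 640, img_h: int = 480):
    """Estimate camera distance (mm) and bearing (degrees) to the mark.

    Under a pinhole model, apparent size is inversely proportional to
    distance, and the mark's offset from the image centre gives the
    direction from which the HMD camera is looking.
    """
    distance_mm = FOCAL_PX * MARKER_MM / marker_side_px
    yaw_deg = math.degrees(math.atan2(cx_px - img_w / 2, FOCAL_PX))
    pitch_deg = math.degrees(math.atan2(cy_px - img_h / 2, FOCAL_PX))
    return distance_mm, yaw_deg, pitch_deg

# A 100 mm mark seen 160 px wide, centred vertically, 80 px right of centre:
d, yaw, pitch = viewing_point(marker_side_px=160.0, cx_px=400.0, cy_px=240.0)
# d is 500.0 mm; yaw about 5.7 degrees; pitch 0.0
```

The values returned here correspond to the "viewing-point position parameters" that steps 404 and A4 consume and transmit.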

Claims (1)

1. An augmented reality system for mobile, interactive use by multiple users, comprising:
a computer system capable of handling augmented reality, providing the following mechanisms: a storage mechanism for storing at least one set of three-dimensional computer graphics data; a computer image computation mechanism for converting the three-dimensional graphics data into a stereoscopic virtual image; and a data transmission mechanism for receiving and outputting data;
a first user system and a second user system, provided to a first user and a second user respectively, each user system including a head-mounted display electrically connected to the computer system capable of handling augmented reality; and
a user head-position detecting device for determining the position of each head-mounted display;
whereby, through the above construction, the computer system can compute a first stereoscopic virtual image from the position of the first user system's head-mounted display (a first viewing-point position parameter) via the computer image computation mechanism and transmit it to the first user system's head-mounted display, and likewise compute a second stereoscopic virtual image from a second viewing-point position parameter and transmit it to the second user system's head-mounted display; and the first user system's head-mounted display can also display the second stereoscopic virtual image, and the second user system's head-mounted display can also display the first stereoscopic virtual image.

2. The augmented reality system for mobile, interactive multi-user use of claim 1, wherein the user head-position detecting device comprises: a position reference object; and two cameras, one mounted on each head-mounted display and electrically connected to the computer system, each camera capturing an image of the position reference object; and wherein the computer system further comprises a viewing-point position analysis mechanism that uses the images of the reference object to determine the position of each head-mounted display, yielding the first viewing-point position parameter and the second viewing-point position parameter.

3. The augmented reality system for mobile, interactive multi-user use of claim 1, wherein the computer system comprises a first augmented-reality computer subsystem and a second augmented-reality computer subsystem, each electrically connected to its corresponding user system, and each including the mechanisms of the computer system capable of handling augmented reality.

4. The augmented reality system for mobile, interactive multi-user use of claim 2, wherein the computer system comprises a first augmented-reality computer subsystem and a second augmented-reality computer subsystem, each electrically connected to its corresponding user system, and each including the mechanisms of the computer system capable of handling augmented reality.

5. The augmented reality system for mobile, interactive multi-user use of claim 4, wherein, when the first user system's head-mounted display is to display the second stereoscopic virtual image, the second augmented-reality subsystem transmits the second viewing-point position parameter to the first augmented-reality subsystem, and the first subsystem uses that parameter to compute the second stereoscopic virtual image via the computer image computation mechanism.

6. The augmented reality system for mobile, interactive multi-user use of claim 1, wherein each user system further comprises a handheld computer, including a screen, electrically connected to the computer system; each user can make annotations on his handheld computer, and the annotations are relayed in real time to the other handheld computer through the computer system.

7. The augmented reality system for mobile, interactive multi-user use of claim 6, wherein each handheld computer can display the first stereoscopic virtual image and the second stereoscopic virtual image.

8. An augmented reality method for mobile, interactive use by multiple users, letting a user see the virtual image seen by another user, comprising:
at a first computer at the first user's end:
step A1: computing a first viewing-point position parameter;
step B1: computing a first stereoscopic virtual image from the first viewing-point position parameter;
step C1: transmitting the first stereoscopic virtual image to a first head-mounted display 30, so that the first user can see it;
at a second computer at the second user's end:
step A2: computing a second viewing-point position parameter;
step B2: computing a second stereoscopic virtual image from the second viewing-point position parameter;
step C2: transmitting the second stereoscopic virtual image to a second head-mounted display 30, so that the second user can see it;
step D2: receiving the first viewing-point position parameter transmitted by the first computer;
step E2: computing the first stereoscopic virtual image from the first viewing-point position parameter; and
step F2: transmitting the first stereoscopic virtual image to the second head-mounted display 30, so that the second user can see the first stereoscopic virtual image.

9. The augmented reality method for mobile, interactive multi-user use of claim 8, wherein the second computer performs step D2 only after receiving a switch-image command.

10. The augmented reality method for mobile, interactive multi-user use of claim 8, wherein:
the first user is provided with a first handheld computer electrically connected to the first computer, such that the first handheld computer can display the first stereoscopic virtual image and the first user can enter annotations on it; and
the second user is provided with a second handheld computer electrically connected to the second computer, such that the second handheld computer can display the second stereoscopic virtual image and the second user can enter annotations on it;
the method further comprising, at the second computer at the second user's end:
step G2: receiving an annotation from the first handheld computer and transmitting the annotation to the second handheld computer.
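The viewpoint-switching method of claims 8 and 9 can be summarized in a short simulation. The `Computer` class and the `render` stand-in below are illustrative assumptions for exposition, not an implementation from the patent:

```python
# Toy simulation of the viewpoint-switching method (claims 8 and 9).
# "render" stands in for the stereoscopic image computation; all names
# here are illustrative, not from the patent.

def render(graphics_data: str, viewpoint: tuple) -> str:
    """Stand-in for computing a stereoscopic virtual image (steps B1/B2/E2)."""
    return f"{graphics_data}@{viewpoint}"

class Computer:
    """One user's end: holds its own viewpoint and, optionally, a remote one."""

    def __init__(self, graphics_data):
        self.graphics_data = graphics_data
        self.own_vp = (0.0, 0.0, 0.0)
        self.remote_vp = None
        self.use_remote = False  # set once a switch-image command arrives (claim 9)

    def frame(self):
        vp = self.remote_vp if (self.use_remote and self.remote_vp) else self.own_vp
        return render(self.graphics_data, vp)

first, second = Computer("car"), Computer("car")
first.own_vp = (0.5, 1.2, -2.0)    # step A1: first viewing-point parameter
second.own_vp = (1.0, 1.1, -2.5)   # step A2: second viewing-point parameter
assert second.frame() != first.frame()  # before switching, each sees his own view

second.use_remote = True           # claim 9: switch-image command received
second.remote_vp = first.own_vp    # step D2: first parameter streamed over
assert second.frame() == first.frame()  # steps E2/F2: both HMDs show the same view
```

Note that, as in claim 8, only the viewpoint parameter crosses between the two computers; each one renders from the shared graphics data locally.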
TW94105365A 2005-02-23 2005-02-23 Augmented reality system and method with mobile and interactive function for multiple users TWI278772B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
TW94105365A TWI278772B (en) 2005-02-23 2005-02-23 Augmented reality system and method with mobile and interactive function for multiple users

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
TW94105365A TWI278772B (en) 2005-02-23 2005-02-23 Augmented reality system and method with mobile and interactive function for multiple users

Publications (2)

Publication Number Publication Date
TW200630865A TW200630865A (en) 2006-09-01
TWI278772B true TWI278772B (en) 2007-04-11

Family

ID=38645233

Family Applications (1)

Application Number Title Priority Date Filing Date
TW94105365A TWI278772B (en) 2005-02-23 2005-02-23 Augmented reality system and method with mobile and interactive function for multiple users

Country Status (1)

Country Link
TW (1) TWI278772B (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102788577A (en) * 2011-05-17 2012-11-21 财团法人工业技术研究院 Positioning device and positioning method using augmented reality technology
TWI413034B (en) * 2010-07-29 2013-10-21 Univ Nat Central The System of Mixed Reality Realization and Digital Learning
CN105378801A (en) * 2013-04-12 2016-03-02 微软技术许可有限责任公司 Holographic snap grid
TWI694353B (en) * 2018-12-27 2020-05-21 仁寶電腦工業股份有限公司 Augmented reality positioning sharing system and method using same

Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9250703B2 (en) 2006-03-06 2016-02-02 Sony Computer Entertainment Inc. Interface with gaze detection and voice input
US8730156B2 (en) 2010-03-05 2014-05-20 Sony Computer Entertainment America Llc Maintaining multiple views on a shared stable virtual space
CN105843396B (en) * 2010-03-05 2019-01-01 索尼电脑娱乐美国公司 The method of multiple view is maintained on shared stabilization Virtual Space
EP2558176B1 (en) * 2010-04-13 2018-11-07 Sony Computer Entertainment America LLC Calibration of portable devices in a shared virtual space
US10120438B2 (en) 2011-05-25 2018-11-06 Sony Interactive Entertainment Inc. Eye gaze to alter device behavior
TWI628634B (en) * 2016-05-25 2018-07-01 國立中央大學 Interactive teaching systems and methods thereof

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TWI413034B (en) * 2010-07-29 2013-10-21 Univ Nat Central The System of Mixed Reality Realization and Digital Learning
CN102788577A (en) * 2011-05-17 2012-11-21 财团法人工业技术研究院 Positioning device and positioning method using augmented reality technology
CN105378801A (en) * 2013-04-12 2016-03-02 微软技术许可有限责任公司 Holographic snap grid
CN105378801B (en) * 2013-04-12 2018-09-11 微软技术许可有限责任公司 Hologram snapshot grid
TWI694353B (en) * 2018-12-27 2020-05-21 仁寶電腦工業股份有限公司 Augmented reality positioning sharing system and method using same

Also Published As

Publication number Publication date
TW200630865A (en) 2006-09-01

Similar Documents

Publication Publication Date Title
TWI278772B (en) Augmented reality system and method with mobile and interactive function for multiple users
KR102590841B1 (en) virtual object driving Method, apparatus, electronic device, and readable storage medium
US10482673B2 (en) System and method for role negotiation in multi-reality environments
KR101687017B1 (en) Hand localization system and the method using head worn RGB-D camera, user interaction system
AU2013370334B2 (en) System and method for role-switching in multi-reality environments
JP6377082B2 (en) Providing a remote immersive experience using a mirror metaphor
JP5871345B2 (en) 3D user interface device and 3D operation method
WO2010062117A2 (en) Immersive display system for interacting with three-dimensional content
WO2017010614A1 (en) System and method for acquiring partial space in augmented space
CN107491174A (en) Method, apparatus, system and electronic equipment for remote assistance
CN104508600A (en) Three-dimensional user-interface device, and three-dimensional operation method
EP3196734A1 (en) Control device, control method, and program
WO2019087564A1 (en) Information processing device, information processing method, and program
US11412050B2 (en) Artificial reality system with virtual wireless channels
JP6409861B2 (en) Information processing apparatus, information processing system, control method thereof, and program
Park et al. New design and comparative analysis of smartwatch metaphor-based hand gestures for 3D navigation in mobile virtual reality
Scheggi et al. Shape and weight rendering for haptic augmented reality
CN116848556A (en) Enhancement of three-dimensional models using multi-view refinement
Zhang et al. A hybrid 2D–3D tangible interface combining a smartphone and controller for virtual reality
JP6564484B1 (en) Same room communication system
Basu et al. Ubiquitous collaborative activity virtual environments
JP2005148844A (en) Display system
Soares et al. Collaborative hybrid virtual environment
Liarokapis et al. Design experiences of multimodal mixed reality interfaces
JP2005011275A (en) System and program for displaying stereoscopic image