TWI518634B - Augmented reality method and system - Google Patents

Augmented reality method and system

Info

Publication number
TWI518634B
TWI518634B
Authority
TW
Taiwan
Prior art keywords
mobile device
augmented reality
physical
reference object
angle
Prior art date
Application number
TW103143813A
Other languages
Chinese (zh)
Other versions
TW201624424A (en)
Inventor
張騰文
梁哲瑋
陳思瑋
陳毅承
Original Assignee
財團法人工業技術研究院
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 財團法人工業技術研究院
Priority to TW103143813A
Priority to CN201410826020.4A
Application granted
Publication of TWI518634B
Publication of TW201624424A

Landscapes

  • Processing Or Creating Images (AREA)
  • User Interface Of Digital Computer (AREA)

Description

Augmented reality method and system

The present disclosure relates to an augmented reality method and system.

Through physical home staging, a prospective buyer can see how furniture is arranged inside a house, which improves the closing rate and reveals consumer preferences. Since augmented reality (AR) was introduced, designers/vendors have been able to upload design styles and virtual furniture to an AR development platform, so that consumers can see virtual furniture combined with the physical objects of a house on a mobile device.

Therefore, how to track the user's position and angle wherever the user is in the physical space, and to present an appropriate AR display accordingly, is one of the key problems in this field.

The present disclosure proposes an augmented reality method and system that can track the position and angle of a mobile device in multiple ways for AR presentation.

The present disclosure relates to an augmented reality method and system that estimate the physical position and physical angle of a mobile device in a physical space, based on a detected physical reference object and/or multiple environmental features and/or the angle parameter and displacement parameter of the mobile device, in order to perform augmented reality presentation.

According to an embodiment of the present disclosure, an augmented reality method applied to an augmented reality system is proposed. The augmented reality method includes: in a physical space, detecting a physical reference object, a plurality of environmental features, and an angle parameter and a displacement parameter of a mobile device; tracking the physical reference object to estimate a physical position and a physical angle of the mobile device in the physical space, in order to perform augmented reality presentation; when the physical reference object cannot be tracked, tracking the plurality of environmental features to estimate the physical position and the physical angle of the mobile device in the physical space, in order to perform augmented reality presentation; and when neither the physical reference object nor the plurality of environmental features can be tracked, estimating the physical position and the physical angle of the mobile device in the physical space according to the angle parameter and the displacement parameter of the mobile device, in order to perform augmented reality presentation.

According to an embodiment of the present disclosure, an augmented reality system is proposed, including a mobile device and an augmented reality platform coupled to the mobile device. In a physical space, the mobile device detects a physical reference object, a plurality of environmental features, and an angle parameter and a displacement parameter of the mobile device. The mobile device tracks the physical reference object to estimate a physical position and a physical angle of the mobile device in the physical space, in order to perform augmented reality presentation. When the mobile device cannot track the physical reference object, the mobile device tracks the plurality of environmental features to estimate the physical position and the physical angle of the mobile device in the physical space, in order to perform augmented reality presentation. When the mobile device can track neither the physical reference object nor the plurality of environmental features, the mobile device estimates the physical position and the physical angle of the mobile device in the physical space according to the angle parameter and the displacement parameter of the mobile device, in order to perform augmented reality presentation.

For a better understanding of the above and other aspects of the present disclosure, embodiments are described in detail below with reference to the accompanying drawings.

100‧‧‧Augmented reality system

110‧‧‧Mobile device

150‧‧‧AR platform

170‧‧‧Network

111‧‧‧Camera unit

112‧‧‧Rotation sensor

113‧‧‧Acceleration sensor

114‧‧‧Screen

115‧‧‧AR application

121‧‧‧AR tracking module

122‧‧‧Environmental feature tracking module

124‧‧‧Feature database

123‧‧‧Movement estimation module

152‧‧‧AR content size module

151‧‧‧AR editing module

220A-220D‧‧‧Environmental features

210‧‧‧Physical reference object

320, 320’‧‧‧AR furniture

230‧‧‧AR furniture

310‧‧‧Physical reference object

Figure 1 shows a schematic diagram of an augmented reality system according to an embodiment of the present disclosure.

Figures 2A to 2E are schematic diagrams of estimating the position and angle of the mobile device for AR presentation, according to an embodiment of the present disclosure.

Figures 3A and 3B are schematic diagrams of editing an AR object according to an embodiment of the present disclosure.

Figure 4 shows a flowchart of an augmented reality method according to an embodiment of the present disclosure.

The technical terms in this specification follow the customary usage of the technical field; where this specification explains or defines certain terms, the explanation or definition given here prevails. Each embodiment of the present disclosure has one or more technical features. Where implementation permits, a person of ordinary skill in the art may selectively implement some or all of the technical features of any embodiment, or selectively combine some or all of the technical features of these embodiments.

Reference is now made to Figure 1, which shows a schematic diagram of an augmented reality system according to an embodiment of the present disclosure. As shown in Figure 1, the augmented reality system 100 includes a mobile device 110 and an AR platform 150. The mobile device 110 and the AR platform 150 communicate with each other through a network 170, which may be a wired or wireless network.

The mobile device 110 includes a camera unit 111, a rotation sensor 112, an acceleration sensor 113, a screen 114, and an AR application 115. The AR application 115 is installed in a memory (not shown) of the mobile device 110 and includes an AR tracking module 121, an environmental feature tracking module 122, a movement estimation module 123, and a feature database 124. The AR platform 150 includes an AR editing module 151 and an AR content size module 152.

The camera unit 111 captures physical objects in the physical space. The rotation sensor 112 senses angle parameters of the mobile device 110, such as angular acceleration and angular momentum, in order to sense the angle of the mobile device 110; it may be, for example, a gyroscope. The acceleration sensor 113 senses the acceleration/displacement parameters of the mobile device 110 in order to sense its position. The screen 114 presents images, such as images captured by the camera unit 111 or AR presentation images from the AR application 115.

The AR application 115 presents the AR display on the screen 114 in real time. That is, while the user watches the screen 114, it can simultaneously show the physical images captured by the camera unit 111 (for example, physical objects in the physical space, such as people, the physical reference object, or walls) together with the AR presentation images from the AR application 115.

If the mobile device 110 moves or rotates, the AR configuration displayed on the screen 114 of the mobile device 110 moves or rotates accordingly (under the control of the AR application 115), so that what the user sees stays consistent with the current situation.

When the camera unit 111 of the mobile device 110 can capture the physical reference object, the AR tracking module 121 can calculate the physical position of the mobile device 110 in the physical space from it. The physical reference object is usually a flat object placed on the floor. At initialization (for example, when the AR application 115 is run and detects the physical reference object for the first time), the AR application 115 can calculate the initial physical position of the mobile device 110 in the physical space from the physical reference object and, optionally, other environmental features. Afterwards, as long as the camera unit 111 keeps capturing/detecting the physical reference object, the AR tracking module 121 can track the physical position and physical angle of the mobile device 110 in the physical space accordingly.

When the camera unit 111 of the mobile device 110 can capture environmental features in the physical space (for example, a picture hanging on a wall), the environmental feature tracking module 122 can calculate/track the physical position of the mobile device 110 in the physical space from them. In detail, after the AR tracking module 121 calculates the initial physical position of the mobile device 110 in the physical space, the environmental feature tracking module 122 can keep calculating/tracking the physical position of the mobile device 110 in the physical space according to the environmental features.

Further, the environmental feature tracking module 122 can find environmental features using computer vision (CV), and continuously build an environmental feature map (stored in the feature database 124). As long as the physical reference object can still be detected, the environmental feature tracking module 122 repeats these operations. If the physical reference object cannot be detected, the environmental feature tracking module 122 can use an algorithm such as RANdom SAmple Consensus (RANSAC) to find the environmental features located near the physical reference object, and then use another algorithm such as Iterative Closest Point (ICP) to remove, from those found features, one or more environmental features that may be noise. In this way, the environmental feature tracking module 122 can estimate the position and angle of the mobile device 110 from the environmental features.
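The noise-rejection step above is only named (RANSAC, then ICP) without further detail. As a minimal sketch of the RANSAC idea applied to feature matches, under a deliberately simplified 2D-translation model with illustrative names (not the patent's implementation):

```python
import random

def ransac_translation(matches, iters=200, tol=0.5, seed=0):
    """Minimal RANSAC sketch: estimate a 2D translation between matched
    feature points across frames and keep only the consistent (inlier)
    matches, discarding noisy ones.
    matches: list of ((x1, y1), (x2, y2)) correspondences."""
    rng = random.Random(seed)
    best_t, best_inliers = None, []
    for _ in range(iters):
        (x1, y1), (x2, y2) = rng.choice(matches)   # minimal sample: one match
        tx, ty = x2 - x1, y2 - y1                  # candidate translation
        inliers = [m for m in matches
                   if abs((m[1][0] - m[0][0]) - tx) < tol
                   and abs((m[1][1] - m[0][1]) - ty) < tol]
        if len(inliers) > len(best_inliers):       # keep the best consensus
            best_t, best_inliers = (tx, ty), inliers
    return best_t, best_inliers
```

The same consensus principle carries over to the full 6-DoF camera-pose case; only the model sampled in each iteration changes.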

After the AR tracking module 121 calculates the initial physical position of the mobile device 110 in the physical space, the movement estimation module 123 can continuously track/estimate the position and angle of the mobile device 110 based on the rotation sensor 112 and the acceleration sensor 113.

Further, the movement estimation module 123 can estimate the angle parameter of the mobile device 110 from the angular velocity detected by the rotation sensor 112, and estimate the displacement and position parameters of the mobile device 110 from the velocity obtained via the acceleration sensor 113. The movement estimation module 123 can therefore continuously track/estimate the position and angle of the mobile device 110.
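The paragraph above only states that angular velocity and acceleration are integrated into an angle and a displacement. A hedged dead-reckoning sketch of that idea, reduced to a planar model with illustrative names (not the patent's implementation):

```python
import math

def dead_reckon(samples, dt):
    """Integrate gyro angular velocity into a heading, and accelerometer
    readings into velocity and then position. Planar (1-D heading, 2-D
    position) for brevity. samples: list of (omega, ax, ay) per time step,
    with (ax, ay) in the device frame."""
    theta = 0.0            # heading (rad)
    vx = vy = x = y = 0.0  # velocity and position in the world frame
    for omega, ax, ay in samples:
        theta += omega * dt                      # integrate angular velocity
        # rotate device-frame acceleration into the world frame
        wx = ax * math.cos(theta) - ay * math.sin(theta)
        wy = ax * math.sin(theta) + ay * math.cos(theta)
        vx += wx * dt; vy += wy * dt             # integrate to velocity
        x += vx * dt;  y += vy * dt              # integrate to position
    return theta, (x, y)
```

In practice, double-integrating accelerometer noise drifts quickly, which is consistent with the text treating this source as the last resort in the tracking hierarchy.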

In terms of data priority, the AR tracking module 121 ranks above the environmental feature tracking module 122, which in turn ranks above the movement estimation module 123. That is, in an embodiment of the present disclosure, as long as the mobile device 110 can keep capturing the physical reference object, the physical position estimated/tracked by the AR tracking module 121 is preferred, because its accuracy is higher than that of the other two modules 122/123. Once the mobile device 110 cannot capture the physical reference object but can still capture enough environmental features, the physical position estimated/tracked by the environmental feature tracking module 122 is adopted as the next-best choice. And if the mobile device 110 cannot capture the physical reference object and the number of captured environmental features is insufficient for the environmental feature tracking module 122 to estimate/track the physical position of the mobile device 110, then the physical position estimated/tracked by the movement estimation module 123 is adopted.
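The three-level priority just described amounts to a simple selection rule. A sketch (the `min_features` threshold and the names are assumptions for illustration; the patent does not specify them):

```python
def select_pose(marker_pose, feature_pose, imu_pose, n_features, min_features=4):
    """Three-level fallback: prefer the marker tracker, then the
    environmental-feature tracker when enough features are visible,
    then IMU dead reckoning as the last resort."""
    if marker_pose is not None:                    # marker visible: most accurate
        return marker_pose
    if feature_pose is not None and n_features >= min_features:
        return feature_pose                        # enough environment features
    return imu_pose                                # sensors only
```

Running this selection every frame yields the behavior described in the text: the display degrades gracefully as the marker, then the features, drop out of view.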

The feature database 124 stores the captured/detected feature points (including the physical reference object and the environmental features).

The AR editing module 151 can edit the AR presentation according to the data returned by the AR application 115 of the mobile device 110 (for example, the dimensions of the physical space). For instance, if the user feels the current AR presentation does not meet their needs (say, the floor area the AR presentation corresponds to does not match the floor area of the physical space the user is in, or the user wants to change the AR style), the mobile device 110 can send data back to the AR platform 150, where the AR editing module 151 performs AR editing (for example, adjusting the placement, rotation angle, or size of the AR virtual objects) for customization.

The AR content size module 152 calculates the corresponding area of the AR configuration in the physical space. For example, if the physical area of the physical reference object is A, the AR area of the AR configuration is B, and the AR area of the AR reference object in the AR configuration is C, then the physical area occupied by the AR configuration in the physical space is Y = A*B/C. That is, for the area conversion, the AR reference object is mapped onto the physical reference object to obtain the converted physical footprint of the AR configuration.
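The formula Y = A*B/C above is a single proportional scaling. As a one-line sketch with a worked example (variable names are illustrative):

```python
def ar_to_physical_area(a, b, c):
    """Y = A*B/C: A = physical area of the physical reference object,
    B = AR area of the AR configuration, C = AR area of the AR reference
    object inside that configuration."""
    return a * b / c
```

For instance, if the AR reference object is drawn at one third of the physical marker's area (A = 3C), the whole configuration's physical footprint is the AR area B blown up by the same factor of 3.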

For example, but not by way of limitation, if an A0-sized object is used as the physical reference object, then A = 1189 mm × 841 mm (the A0 paper dimensions). The physical reference object is a reference object placed in the physical space; when the AR application 115 detects it, the physical reference object is treated as the reference anchor point. Typically, the physical reference object is placed at the center of the physical space. For example, if the physical space is a living room, the physical reference object is placed at the center of the living room.

A so-called AR configuration typically includes the type, size, shape, and placement of the AR furniture to be placed. For example, an AR configuration designed for a living room typically includes common AR furniture such as a coffee table, a TV cabinet, and a sofa. When an AR configuration is designed, the AR area corresponding to that AR configuration is designed along with it. That is, if a house includes a living room (5 ping), a dining room (3 ping), a master bedroom (5 ping), and a regular bedroom (4 ping) (1 ping ≈ 3.3 m²), separate AR configurations suitable for the living room/dining room/bedrooms are designed/generated.

When the AR configuration is designed, an AR reference object is also set in the AR configuration; its default virtual placement is likewise at the center of the AR space, so as to correspond to the physical reference object.

Therefore, when performing the area conversion, the AR content size module 152 maps the position of the AR reference object to the position of the physical reference object, scales the area of the AR reference object by a ratio until it matches the physical reference object, and then scales the AR area of the AR configuration by the same ratio to obtain the corresponding area of the AR configuration in the physical space.
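Since areas scale with the square of the linear scale, the position-and-area mapping just described implies a linear factor of sqrt(A/C) applied around the marker. A sketch under that assumption (names and the planar layout are illustrative, not from the patent):

```python
import math

def place_ar_furniture(items, a_marker_area, c_ar_marker_area, marker_xy):
    """Compute the linear scale from the AR reference object's area (C) to
    the physical marker's area (A), then map each AR furniture item's offset
    from the AR reference object into physical coordinates around the
    physical marker. items: list of (dx, dy) offsets in AR space."""
    s = math.sqrt(a_marker_area / c_ar_marker_area)   # linear scale factor
    placed = []
    for dx, dy in items:
        placed.append((marker_xy[0] + s * dx, marker_xy[1] + s * dy))
    return placed
```

With A four times C, the linear factor is 2, so an AR offset of (1, 0) lands 2 units from the marker in physical space.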

In an embodiment of the present disclosure, after the AR virtual objects are adjusted/edited through the AR editing module 151, the AR platform 150 can send the result back to the AR application 115 of the mobile device 110. The AR application 115 thereby obtains the adjusted AR configuration without performing 3D registration again.

How an embodiment of the present disclosure tracks/estimates the position and angle of the mobile device 110 will now be described. As shown in Figure 2A, the user holds the mobile device 110 and aims it at the physical reference object (marker) 210, whereby the AR tracking module 121 can calculate the initial physical position of the mobile device 110 in the physical space. If the mobile device 110 moves to a position from which the physical reference object 210 cannot be captured, as shown in Figure 2B, the environmental feature tracking module 122 can detect the currently visible environmental features 220A-220D to track/estimate the position and angle of the mobile device 110, so that the AR application 115 can display on the screen 114 the AR presentation corresponding to the current position and angle (such as the AR furniture 230), as shown in Figure 2C.

If the mobile device 110 moves to a position from which neither the physical reference object 210 nor the environmental features 220A-220D can be captured (or the captured environmental features are insufficient for the environmental feature tracking module 122 to estimate/track the physical position of the mobile device 110), as shown in Figure 2D, the movement estimation module 123 can obtain the displacement and angle of the mobile device 110 at each time point from the rotation sensor 112 and the acceleration sensor 113, and thereby know the physical coordinates of the mobile device 110 at each moment. The movement estimation module 123 can therefore also track/estimate the position and angle of the mobile device 110, so that the AR application 115 can display on the screen 114 the AR presentation corresponding to the current position and angle, as shown in Figure 2E.

In addition, in an embodiment of the present disclosure, the AR tracking module 121, the environmental feature tracking module 122, and the movement estimation module 123 keep running continuously, so as to keep estimating the position and angle of the mobile device 110.

Furthermore, if during the AR presentation the user is not satisfied with the AR furniture in the AR configuration (for example, the AR furniture size is unsuitable), the user can send parameters back to the AR platform 150 through the AR application 115, so that a designer can modify/adjust the AR furniture in the AR configuration through the AR editing module 151 and/or the AR content size module 152 of the AR platform 150 and return the result to the user. Taking Figure 3A as an example, during AR presentation the AR application 115 of the mobile device 110 can measure the area of the physical space, in order to pick an AR configuration with a suitable area and a preferred AR style from many pre-designed AR configurations. Here, an AR style refers to a pre-designed combination of multiple pieces of AR furniture. The AR application 115 displays the user-selected AR configuration on the screen, as shown in Figure 3A; the physical reference object 310 and the AR furniture 320 are shown on the screen. If the user finds the size/color of the AR furniture 320 wrong or unappealing, the user can send this back to the AR platform 150 through the AR application 115. The AR platform 150 modifies the AR furniture 320 into the AR furniture 320' and returns it to the AR application 115 for display on the screen 114, as shown in Figure 3B, in which the color/size of the AR furniture 320' has been modified.

Reference is now made to Figure 4, which shows a flowchart of an augmented reality method according to an embodiment of the present disclosure. As shown in Figure 4, in step 410, a physical reference object, a plurality of environmental features, and an angle parameter and a displacement parameter of a mobile device are detected in a physical space. In step 420, the physical reference object is tracked to estimate a physical position and a physical angle of the mobile device in the physical space, in order to perform augmented reality presentation. In step 430, when the physical reference object cannot be tracked, the plurality of environmental features are tracked to estimate the physical position and the physical angle of the mobile device in the physical space, in order to perform augmented reality presentation. In step 440, when neither the physical reference object nor the plurality of environmental features can be tracked, the physical position and the physical angle of the mobile device in the physical space are estimated according to the angle parameter and the displacement parameter of the mobile device, in order to perform augmented reality presentation.

In summary, in the embodiments of the present disclosure, because the position and angle of the mobile device can be estimated in multiple ways, even when the mobile device moves to where the physical reference object (usually placed at the center of the room) cannot be detected, the position and angle of the mobile device can still be estimated from the environmental features and/or the acceleration/rotation sensors, so that the AR presentation is not distorted by the movement of the mobile device.

In summary, although the present disclosure has been set forth above by way of embodiments, they are not intended to limit it. A person of ordinary skill in the art may make various changes and modifications without departing from the spirit and scope of the present disclosure. The scope of protection is therefore defined by the appended claims.

410~440‧‧‧Steps

Claims (12)

An augmented reality method, applied to an augmented reality system, the augmented reality method comprising: in a physical space, detecting a physical reference object, a plurality of environmental features, and an angle parameter and a displacement parameter of a mobile device; tracking the physical reference object to estimate a physical position and a physical angle of the mobile device in the physical space, in order to perform augmented reality presentation; when the physical reference object cannot be tracked, tracking the plurality of environmental features to estimate the physical position and the physical angle of the mobile device in the physical space, in order to perform augmented reality presentation; and when neither the physical reference object nor the plurality of environmental features can be tracked, estimating the physical position and the physical angle of the mobile device in the physical space according to the angle parameter and the displacement parameter of the mobile device, in order to perform augmented reality presentation.

The augmented reality method of claim 1, further comprising: detecting the physical reference object for the first time to calculate an initial physical position of the mobile device in the physical space; and, if the physical reference object remains detected, accordingly tracking the physical position and the physical angle of the mobile device in the physical space, in order to perform augmented reality presentation.
The augmented reality method of claim 1, wherein the step of detecting the plurality of environmental features comprises: finding the plurality of environmental features by computer vision; continuously building and storing an environmental feature map; when the physical reference object cannot be detected, finding those environmental features located near the physical reference object; removing, from the found environmental features, one or more environmental features that may be noise; and estimating the physical position and the physical angle of the mobile device according to the remaining environmental features, in order to perform augmented reality presentation.

The augmented reality method of claim 1, wherein the step of detecting the angle parameter and the displacement parameter of the mobile device comprises: detecting the angle parameter of the mobile device with a rotation sensor; and detecting the displacement parameter of the mobile device with an acceleration sensor.

The augmented reality method of claim 1, further comprising: transmitting data from the mobile device back to an augmented reality platform, for editing of the augmented reality presentation; and, after editing on the augmented reality platform, transmitting the result back to the mobile device for augmented reality presentation.
6. The augmented reality method of claim 1, further comprising: mapping an augmented reality reference object in an augmented reality configuration to the physical reference object; scaling the area of the augmented reality reference object by a ratio until its size equals that of the physical reference object; and scaling an augmented reality area of the augmented reality configuration by the same ratio to obtain the corresponding area of the augmented reality configuration in the physical space.

7. An augmented reality system, comprising: a mobile device; and an augmented reality platform coupled to the mobile device; wherein, in a physical space, the mobile device detects a physical reference object, a plurality of environmental features, and an angle parameter and a displacement parameter of the mobile device; the mobile device tracks the physical reference object to estimate a physical position and a physical angle of the mobile device in the physical space for augmented reality display; when the mobile device cannot track the physical reference object, the mobile device tracks the plurality of environmental features to estimate the physical position and the physical angle of the mobile device in the physical space for augmented reality display; and when the mobile device can track neither the physical reference object nor the plurality of environmental features, the mobile device estimates the physical position and the physical angle of the mobile device in the physical space according to the angle parameter and the displacement parameter of the mobile device for augmented reality display.

8. The augmented reality system of claim 7, wherein the mobile device calculates an initial physical position of the mobile device in the physical space from the first detection of the physical reference object; and, as long as the mobile device continues to detect the physical reference object, the mobile device tracks the physical position and the physical angle of the mobile device in the physical space accordingly for augmented reality display.

9. The augmented reality system of claim 7, wherein the mobile device finds the plurality of environmental features by computer vision; the mobile device continuously builds and stores an environmental feature map; when the mobile device cannot detect the physical reference object, the mobile device finds the environmental features located near the physical reference object; the mobile device removes, from the found environmental features, one or more environmental features that may be noise; and, from the remaining environmental features, the mobile device estimates the physical position and the physical angle for augmented reality display.

10. The augmented reality system of claim 7, wherein the mobile device detects the angle parameter of the mobile device with a rotation sensor; and the mobile device detects the displacement parameter of the mobile device with an acceleration sensor.
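The area scaling of claim 6 reuses a single ratio: the one that makes the AR reference object's area match the physical reference object's area. A minimal sketch; the function and parameter names are assumptions for illustration:

```python
def scale_layout(virtual_ref_area, physical_ref_area, virtual_layout_area):
    """Scale the whole AR configuration by the ratio that maps the AR
    reference object's area onto the physical reference object's area."""
    ratio = physical_ref_area / virtual_ref_area
    return virtual_layout_area * ratio
```

For example, if the AR reference object occupies 4.0 units of area but the physical reference object occupies 1.0, the ratio is 0.25, so an 80.0-unit AR layout corresponds to 20.0 units in the physical space.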
11. The augmented reality system of claim 7, wherein the mobile device transmits data back to the augmented reality platform for editing of the augmented reality display; and, after editing on the augmented reality platform, the augmented reality platform transmits the result back to the mobile device for augmented reality display.

12. The augmented reality system of claim 7, wherein: the mobile device maps an augmented reality reference object in an augmented reality configuration to the physical reference object; the mobile device scales the area of the augmented reality reference object by a ratio until its size equals that of the physical reference object; and, by scaling an augmented reality area of the augmented reality configuration by the same ratio, the mobile device obtains the corresponding area of the augmented reality configuration in the physical space.
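When only the sensors of claims 4 and 10 remain available, the displacement parameter has to be integrated into a position (dead reckoning). A minimal sketch of one integration step, assuming constant acceleration across the step; in practice this drifts quickly, which is consistent with the claims using it only as the final fallback:

```python
def dead_reckon(position, velocity, accel, dt):
    """One semi-implicit Euler step: acceleration -> velocity -> position.
    All three inputs are (x, y, z) tuples; dt is the step in seconds."""
    new_velocity = tuple(v + a * dt for v, a in zip(velocity, accel))
    new_position = tuple(p + v * dt for p, v in zip(position, new_velocity))
    return new_position, new_velocity
```

The angle parameter from the rotation sensor would be applied separately to orient the accelerometer readings into the physical space's frame before integrating.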
TW103143813A 2014-12-16 2014-12-16 Augmented reality method and system TWI518634B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
TW103143813A TWI518634B (en) 2014-12-16 2014-12-16 Augmented reality method and system
CN201410826020.4A CN105786166B (en) 2014-12-16 2014-12-26 Augmented reality method and system


Publications (2)

Publication Number Publication Date
TWI518634B true TWI518634B (en) 2016-01-21
TW201624424A TW201624424A (en) 2016-07-01

Family

ID=55640453


Country Status (2)

Country Link
CN (1) CN105786166B (en)
TW (1) TWI518634B (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TWI586936B (en) * 2016-05-20 2017-06-11 國立交通大學 A transform method between a physical image and a virtual image and a system thereof
TWI603227B (en) * 2016-12-23 2017-10-21 李雨暹 Method and system for remote management of virtual message for a moving object
CN106843493B (en) * 2017-02-10 2019-11-12 成都弥知科技有限公司 A kind of picture charge pattern method and the augmented reality implementation method using this method
CN114935994A (en) * 2022-05-10 2022-08-23 阿里巴巴(中国)有限公司 Article data processing method, device and storage medium

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8688443B2 (en) * 2009-12-23 2014-04-01 At&T Intellectual Property I, L.P. Multimodal augmented reality for location mobile information service
KR100989663B1 (en) * 2010-01-29 2010-10-26 (주)올라웍스 Method, terminal device and computer-readable recording medium for providing information on an object not included in visual field of the terminal device
US8933986B2 (en) * 2010-05-28 2015-01-13 Qualcomm Incorporated North centered orientation tracking in uninformed environments
KR101303948B1 (en) * 2010-08-13 2013-09-05 주식회사 팬택 Apparatus and Method for Providing Augmented Reality Information of invisible Reality Object
KR101330805B1 (en) * 2010-08-18 2013-11-18 주식회사 팬택 Apparatus and Method for Providing Augmented Reality
TWI522966B (en) * 2010-12-15 2016-02-21 Augmented reality system without object image
US9214137B2 (en) * 2012-06-18 2015-12-15 Xerox Corporation Methods and systems for realistic rendering of digital objects in augmented reality
US20140192164A1 (en) * 2013-01-07 2014-07-10 Industrial Technology Research Institute System and method for determining depth information in augmented reality scene

Also Published As

Publication number Publication date
CN105786166A (en) 2016-07-20
CN105786166B (en) 2019-01-29
TW201624424A (en) 2016-07-01
