TWI590189B - Augmented reality method, system and computer-readable non-transitory storage medium - Google Patents

Augmented reality method, system and computer-readable non-transitory storage medium

Info

Publication number
TWI590189B
Authority
TW
Taiwan
Prior art keywords
augmented reality
virtual
reference line
real
virtual object
Prior art date
Application number
TW104143405A
Other languages
Chinese (zh)
Other versions
TW201724031A (en)
Inventor
廖歆蘭
曹修銓
陳泰安
梁哲瑋
Original Assignee
財團法人工業技術研究院 (Industrial Technology Research Institute)
Priority date
Filing date
Publication date
Application filed by 財團法人工業技術研究院 (Industrial Technology Research Institute)
Priority to TW104143405A priority Critical patent/TWI590189B/en
Priority to CN201511010572.9A priority patent/CN106910249A/en
Application granted
Publication of TW201724031A publication Critical patent/TW201724031A/en
Publication of TWI590189B publication Critical patent/TWI590189B/en


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00 Manipulating 3D models or images for computer graphics
    • G06T19/006 Mixed reality
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality

Description

Augmented reality method, system and computer-readable non-transitory storage medium

The present disclosure relates to an augmented reality method, an augmented reality system, and a computer-readable non-transitory storage medium.

Through physical home staging, prospective buyers can see how furniture is arranged inside a house, which raises the closing rate and reveals consumer preferences. Since augmented reality (AR) was introduced, designers and vendors have been able to upload design styles and virtual furniture to an augmented reality platform in real time, allowing consumers to see the combination of virtual furniture and the physical house on a mobile smart device.

At present, as far as augmented reality applications are concerned, the ready-to-assemble furniture market keeps growing. If the friendliness of interactive arrangement and real-time preview can be improved, transactions can be closed faster.

Therefore, how to let users edit and quickly preview the overall augmented reality arrangement, so that a suitable augmented reality presentation can be shown, is one of the key issues in this field.

The present disclosure relates to an augmented reality method, system and computer-readable non-transitory storage medium that correct the initial coordinate system of the augmented reality based on depth information.

According to an embodiment of the present disclosure, an augmented reality method is provided. A virtual layout related to a real environment is selected. A virtual placement position of at least one virtual object in the virtual layout is configured. A coordinate system is initialized and corrected. Augmented reality is displayed in the corrected coordinate system.

According to another embodiment of the present disclosure, an augmented reality system is provided, including a three-dimensional sensing unit, an image capturing unit, and an augmented reality application module. A sensing result of the three-dimensional sensing unit and an image captured by the image capturing unit are transmitted to the augmented reality application module. In response to a user selection, the augmented reality application module selects a virtual layout related to a real environment. The augmented reality application module configures a virtual placement position of at least one virtual object in the virtual layout, performs initialization and correction of a coordinate system, and displays augmented reality in the corrected coordinate system.

According to still another embodiment of the present disclosure, a computer-readable non-transitory storage medium is provided. When the medium is read by a computer, the computer performs the augmented reality method described above.

In order to better understand the above and other aspects of the present disclosure, embodiments are described in detail below with reference to the accompanying drawings:

110-140‧‧‧Steps
210-240‧‧‧Steps
310-330, 310'-330'‧‧‧Virtual objects
M‧‧‧Virtual augmented reality marker
L‧‧‧Virtual reference line
M'‧‧‧Real augmented reality marker
L'‧‧‧Real reference line
410-455‧‧‧Steps
510-550‧‧‧Steps
1000‧‧‧Augmented reality system
1010‧‧‧Three-dimensional sensing unit
1015‧‧‧Gravity sensing unit
1020‧‧‧Image capturing unit
1030‧‧‧Augmented reality application module
1040‧‧‧Display unit

Figure 1 shows a flowchart of an augmented reality (AR) method according to an embodiment of the present disclosure.
Figure 2 shows a flowchart of an augmented reality (AR) method according to another embodiment of the present disclosure.
Figure 3A is a schematic diagram of a virtual layout arrangement according to an embodiment of the present disclosure.
Figure 3B is a schematic diagram of a virtual reference line L according to an embodiment of the present disclosure.
Figure 3C is a schematic diagram of a real reference line L' according to an embodiment of the present disclosure.
Figure 3D shows the augmented reality displayed by the augmented reality system according to an embodiment of the present disclosure.
Figure 4 shows a flowchart of the coordinate system initialization and correction architecture according to an embodiment of the present disclosure.
Figure 5 shows the flow of the Z-axis reference point distance estimation according to an embodiment of the present disclosure.
Figure 6 is a schematic diagram of the fast marching image inpainting technique.
Figure 7 is a schematic diagram of viewing angle correction according to an embodiment of the present disclosure.
Figure 8 is a schematic diagram of real reference line length estimation according to an embodiment of the present disclosure.
Figure 9 shows an example of a coordinate system established according to an embodiment of the present disclosure.
Figure 10 shows a functional block diagram of an augmented reality system according to an embodiment of the present disclosure.

The technical terms used in this specification refer to the customary terms of this technical field. Where this specification describes or defines some of those terms, the interpretation of those terms is based on the description or definition given in this specification. Each embodiment of the present disclosure has one or more technical features. To the extent that implementation is possible, a person having ordinary skill in the art may selectively implement some or all of the technical features of any embodiment, or selectively combine some or all of the technical features of the embodiments.

第1圖顯示根據本案一實施例之擴增實境(AR)方法之流程圖。第1圖之擴增實境(AR)方法可應用於擴增實境系統中。於步驟110中,使用者選擇相關於所處真實環境的虛擬格局。亦即,回應於一使用者選擇,擴增實境系統選擇相關於一真實環境的虛擬格局。舉例來說,假設使用者想要在房間內操作擴增實境系統來預覽傢俱的擺放配置的話,則使用者可以在擴增實境系統上操作,來選擇適合目前所處房間的一虛擬格局(例如房間形狀、房間功能、設計風格等)。在本案一實施例中,擴增實境系統與方法可由行動智慧裝置(例如但不受限於,智慧型手機、平板電腦等)或頭戴式裝置(包括顯示器)來完成。 Figure 1 shows a flow chart of an augmented reality (AR) method in accordance with an embodiment of the present invention. The Augmented Reality (AR) method of Figure 1 can be applied to augmented reality systems. In step 110, the user selects a virtual landscape that is related to the real environment in which they are located. That is, in response to a user selection, the augmented reality system selects a virtual landscape that is related to a real environment. For example, if the user wants to operate the augmented reality system in the room to preview the placement of the furniture, the user can operate on the augmented reality system to select a virtual one suitable for the current room. Pattern (such as room shape, room function, design style, etc.). In an embodiment of the present invention, the augmented reality system and method may be performed by a mobile smart device (such as, but not limited to, a smart phone, a tablet, etc.) or a head mounted device (including a display).

In step 120, the augmented reality system configures a virtual placement position of at least one virtual object in the virtual layout, which may be entered by a user operation or through another input mode. Here, a virtual object is, for example but not limited to, furniture, home decoration, and the like. In detail, in step 120, according to the virtual layout selected by the user (which is related to the real environment), the augmented reality system selects at least one suitable virtual object from a database in the augmented reality system and places the selected virtual object(s) in the virtual layout selected by the user.

In step 130, coordinate system initialization and correction are performed. The details of step 130 are described below.

In step 140, the augmented reality is displayed on the display unit of the device using the corrected coordinate system.

Figure 2 shows a flowchart of an augmented reality (AR) method according to another embodiment of the present disclosure. Steps 210, 220, 230 and 240 of Figure 2 are identical or similar to steps 110, 120, 130 and 140 of Figure 1.

In step 210, the user selects a virtual layout related to the real environment where the user is located.

In step 215, the augmented reality system displays the virtual layout selected by the user on an interactive interface.

In step 220, the user operates the interactive interface of the augmented reality system (that is, in response to an operation of the interactive interface) to configure the virtual placement positions of the virtual objects in the real environment. Here, the virtual placement positions of the virtual objects are displayed in a plan view. In an embodiment of the present disclosure, the plan view is composed of the selected virtual layout and the virtual objects. The plan view may be a layout constructed by a perspective method, including but not limited to one of the following or any combination thereof: a bird's-eye view, a parallel perspective view, a two-point perspective view, an oblique projection view, and so on. The virtual layout may be represented by a geometric figure. The virtual objects may be selected from a virtual object database in the augmented reality system, or may be customized by the user. The user may arrange the virtual placement positions of the virtual objects on the interactive interface by dragging, tapping, or similar operations. Figure 3A is a schematic diagram of a virtual layout arrangement according to an embodiment of the present disclosure. In Figure 3A, objects 310-330 represent virtual objects. The user can configure and/or change the virtual placement positions of the virtual objects 310-330 on the interactive interface.

In the embodiments of the present disclosure, the functions of the interactive interface further include reference value setting. Reference value setting refers to setting the correspondence between a plan-view reference object and the real environment. The interactive interface may automatically designate any one or more virtual objects in the plan view as the reference object. The aforementioned correspondence is the comparison, calibration and setting between the designated plan-view reference object and real environment information, where the real environment information is obtained by one or more sensors and includes, but is not limited to, image signals and depth information.

In step 222, the augmented reality system prompts the user to select, within the selected virtual layout, a virtual reference line used for coordinate initialization and correction. The augmented reality system generates a prompt signal to the user and, in response to the user selection, selects the virtual reference line used for coordinate initialization and correction in the selected virtual layout. After the user has selected the virtual reference line, the augmented reality system turns on the three-dimensional sensing unit of the mobile smart device and other related sensors (for example, but not limited to, a camera, a gravity sensing unit (g-sensor), a gyroscope, and so on) to detect the corresponding real reference line (step 224). In an embodiment of the present disclosure, the three-dimensional sensing unit is, for example, a three-dimensional depth sensing unit.

In the embodiments of the present disclosure, the virtual reference line is defined as a line segment in the plan view extending from a virtual augmented reality marker M to a boundary (usually a wall) of the virtual layout selected by the user. Figure 3B is a schematic diagram of a virtual reference line L according to an embodiment of the present disclosure; the virtual reference line L extends from the virtual augmented reality marker M to the boundary of the virtual layout. The length of the virtual reference line L is known. That is, since the dimensions of the virtual layout selected by the user are known and the virtual augmented reality marker M is located at the center (or another position) of the virtual layout, the length of the virtual reference line L can be obtained. In an embodiment of the present disclosure, the position of the virtual augmented reality marker M can be designated through the interactive interface.
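
As an illustration only (the variable names, layout dimensions and marker position below are assumptions, not values from this disclosure), the known length of the virtual reference line L follows directly from the selected layout dimensions and the marker position:

```python
# Sketch: compute the virtual reference line length L in plan-view units.
# The layout dimensions and the marker position are illustrative assumptions.
room_width, room_depth = 15.0, 30.0               # plan-view size of the selected layout
marker_m = (room_width / 2.0, room_depth / 2.0)   # virtual marker M at the layout centre

# L runs from M to the layout boundary (here, the far wall along +y).
L = room_depth - marker_m[1]
print(L)   # 15.0 plan-view units
```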

The real reference line corresponding to this virtual reference line is a line segment extending from a real augmented reality marker M' placed in the real environment to a boundary (usually a wall) of the real environment. Figure 3C is a schematic diagram of a real reference line L' according to an embodiment of the present disclosure; the real reference line L' extends from the real augmented reality marker M' to the boundary of the real environment.

For example, the real augmented reality marker M' may be a real object of a known size, such as an A4-sized sheet of white paper, placed at the central position of the real environment where the user is located (at the central position, for example, but not limited thereto). The augmented reality system must therefore successfully detect the real augmented reality marker M' and the boundary of the real environment.

In practice, however, a mobile smart device held in the user's hand inevitably has some viewing angle deviation or displacement. Therefore, in an embodiment of the present disclosure, if the mobile smart device has a viewing angle deviation or displacement, coordinate system initialization and correction are performed based on the depth information (step 230) to estimate the length of the real reference line L', and the virtual placement positions of the virtual objects in the plan view within the real environment are then derived, so that the user can experience the augmented reality content.

In step 235, the scaling ratio of the virtual objects can be derived from the real environment information detected by the three-dimensional sensing unit and other related sensors (for example, but not limited to, a camera, a gravity sensing unit, a gyroscope, and so on). The relative positions of the virtual objects in the real environment can be derived from the ratio between the length of the virtual reference line L and the length of the real reference line L'. As described above, the length of the virtual reference line L is known and the length of the real reference line L' has been detected, so the scaling ratio can be obtained from the ratio between these two lengths, and this scaling ratio can be applied to the virtual placement positions of all virtual objects in the real environment.

In step 240, the augmented reality is displayed on the augmented reality system. Figure 3D shows the augmented reality displayed by the augmented reality system according to an embodiment of the present disclosure. For example, for convenience of explanation, assume that the distance between the virtual object 310 of Figure 3A and the boundary of the virtual layout is A. After the scaling adjustment, the distance between the virtual object 310' shown in Figure 3D and the boundary of the real environment is A*(L'/L).
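
A minimal sketch of this ratio-based placement (the function name and the plan-view coordinates are assumptions introduced for illustration, not part of this disclosure):

```python
def to_real_offset(plan_xy, marker_plan_xy, L_virtual, L_real):
    """Map a plan-view position to a real-world offset from the marker
    using the scaling ratio L'/L described above."""
    scale = L_real / L_virtual
    dx = (plan_xy[0] - marker_plan_xy[0]) * scale
    dy = (plan_xy[1] - marker_plan_xy[1]) * scale
    return dx, dy

# Example: a virtual object placed A = 2.0 plan units from the layout boundary
# ends up A * (L'/L) metres from the corresponding real wall.
```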

As shown in Figure 3D, the user can see the placement of the virtual objects 310', 320' and 330' in the real environment on the screen of the mobile smart device.

While the augmented reality is displayed, the position of the user's mobile smart device and the camera angle can be continuously tracked to continuously correct the displayed augmented reality.

How the embodiments of the present disclosure perform coordinate system initialization and correction will now be described. Figure 4 shows a flowchart of the coordinate system initialization and correction architecture according to an embodiment of the present disclosure.

In step 410, depth information is obtained by the three-dimensional sensing unit, for example with a three-dimensional depth sensing technique. Examples of three-dimensional depth sensing techniques include, but are not limited to, ultrasonic sensing, time-of-flight (ToF) ranging, structured lighting, and stereo vision. The embodiments of the present disclosure take obtaining depth information by time-of-flight ranging as an example: the depth information is pre-processed to estimate the Z-axis reference point distance, which facilitates coordinate system initialization and correction.

In step 420, an environment image is captured, for example, but not limited to, by the camera of the mobile smart device. In step 425, displacement information is detected, for example, but not limited to, by the gravity sensing unit of the mobile smart device, which detects the displacement information of the mobile smart device.

In step 430, Z-value pre-processing is performed. The Z-value pre-processing of step 430 may be based on the depth information obtained in step 410, the environment image captured in step 420 and the displacement information obtained in step 425; its details are described below.

In step 435, reference point pre-processing is performed. The reference point pre-processing of step 435 may be based on the depth information obtained in step 410, the environment image captured in step 420 and the displacement information obtained in step 425; its details are described below.

In step 440, a distance is estimated based on the Z-value pre-processing result of step 430 and the reference point pre-processing result of step 435. The estimated distance is, for example, the length of the real reference line. The details of step 440 are described below.

In step 445, a coordinate system is established. In step 450, the virtual objects are placed in the established coordinate system.

In step 455, coordinate system initialization and correction are performed.

During coordinate system initialization, distance back-calculation and scaling conversion can be introduced, and the coordinate system transformation matrix can be obtained.

During distance back-calculation, the depth information sensed by the three-dimensional sensing unit and the image signal captured by the camera can be used to estimate the length of the real reference line to assist positioning, or positioning can be assisted by Bluetooth signals, radio frequency, visible light communication signals, and so on. Distance back-calculation includes, but is not limited to, noise processing, reference point positioning, distance back-calculation matrix operations, and fault-tolerant control.

Scaling conversion is based on a known value and a scale factor: the object sizes and the relative relationships between objects in the known coordinate system are scaled according to the scale factor to the corresponding sizes and relative relationships.

The coordinate system transformation matrices include, but are not limited to, transformations among coordinate systems such as drawing coordinates, virtual three-dimensional model coordinates, camera coordinates, three-dimensional sensing unit coordinates, and world coordinates.

During coordinate system correction, the information obtained by the camera, the gravity sensing unit, the gyroscope, the three-dimensional sensing unit, and so on can be used. In addition, the virtual three-dimensional model used for coordinate system correction includes, but is not limited to, the placement coordinates of the virtual objects and the rendering effect settings of the three-dimensional model.

During coordinate system correction, the information detected/captured by these sensors can be integrated and compared with one another for consistency in order to estimate an error value, and the error estimation result is used to modify the original output result.

Figure 5 shows the flow of the Z-axis reference point distance estimation according to an embodiment of the present disclosure. In the embodiments of the present disclosure, in order to use an image inpainting technique to repair the depth information sensed by the time-of-flight method, the depth information obtained in step 410 is converted into a two-dimensional grayscale image to visualize the near/far (distance) relationships of the current scene. Using the farthest and nearest sensing distance limits of the three-dimensional sensing unit as references, each measured distance is converted into a floating-point number (in the interval 0.0-1.0) and then mapped to a grayscale value between 0 and 255 (step 520). Depth information whose sensing distance is not within the trusted range (set by the estimation procedure) is cropped out as part of the region to be repaired (step 510); the region to be repaired includes the cropped depth information as well as missing return values (depth information) caused by object edges or material problems of the sensed surfaces.

Three-dimensional to two-dimensional information mapping (step 520): in an embodiment, but not limited thereto, the depth information detected by the three-dimensional sensing unit is divided by the farthest sensing distance limit to obtain a floating-point number, the result is multiplied by 255 and rounded down, and the result is taken as the grayscale value. Assuming that the measured depth is d and the farthest sensing distance limit is d_MAX, the mapped grayscale value is g = ⌊255·(d/d_MAX)⌋. Applying this formula to all of the detected depth information yields the two-dimensional mapping result.
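
The mapping of step 520 and the trusted-range cropping of step 510 can be sketched as follows (the function and variable names are assumptions; the depth map is taken to be a NumPy array in metres):

```python
import numpy as np

def depth_to_gray(depth_m, d_min, d_max):
    """Map a metric depth map to an 8-bit grayscale image (step 520) and build
    a mask of untrusted / missing samples to be repaired (step 510)."""
    invalid = ~np.isfinite(depth_m) | (depth_m < d_min) | (depth_m > d_max)
    gray = np.floor(np.clip(depth_m / d_max, 0.0, 1.0) * 255).astype(np.uint8)
    mask = invalid.astype(np.uint8) * 255   # 255 marks pixels to inpaint
    return gray, mask
```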

Two-dimensional image inpainting (step 530) may use the image inpainting technique of the Fast Marching Method; Figure 6 is a schematic diagram of the fast marching image inpainting technique. The two-dimensional image is divided into a pixel region ε and a region to be repaired Ω. Assuming that the contour of the region to be repaired Ω is δΩ, inpainting fills in the pixel contour δΩ ring by ring, from the outside inwards.

For each pixel p on δΩ that is to be repaired, its pixel value is determined by equation (1):

I(p) = Σ_{q∈B_ε(p)} w(p,q) [ I(q) + ∇I(q)·(p−q) ] / Σ_{q∈B_ε(p)} w(p,q)  (1)

where B_ε(p) ⊂ ε is the set of pixel-region points within a given radius centered on p, I(·) is the output pixel value, and ∇I(·) is the gradient function. w(p,q) is the weight of the pixel q in B_ε(p) with respect to the pixel p to be repaired, and w(p,q) is determined by the three factors in equation (2):

w(p,q) = dir(p,q) · dst(p,q) · lev(p,q)  (2)

The direction factor dir(p,q) assigns larger weight to points closer to the normal of the tangent direction at the pixel p to be repaired; the direction factor dir(p,q) is given by equation (3):

dir(p,q) = ((p−q) / ‖p−q‖) · N(p)  (3)

The geometric distance factor dst(p,q) assigns larger weight to points closer to the pixel p to be repaired; the geometric distance factor dst(p,q) is given by equation (4):

dst(p,q) = d0² / ‖p−q‖²  (4)

The level set distance factor lev(p,q) assigns larger weight to points closer to the contour line δΩ; the level set distance factor lev(p,q) is given by equation (5):

lev(p,q) = T0 / (1 + |T(p) − T(q)|)  (5)

where d0 is the geometric distance parameter, T represents the distance from a pixel to δΩ, and T0 is the level set parameter. Using the image inpainting technique reduces the number of missing pixel values in the depth information cropped out in step 510.
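
In practice, a weighted fast-marching inpainting of this kind is available off the shelf; the sketch below uses OpenCV's Telea inpainting as one such implementation (the gray and mask inputs come from the depth pre-processing sketch above and are assumptions, as is the 3-pixel radius):

```python
import cv2

# gray: 8-bit grayscale depth image; mask: 8-bit mask with 255 at pixels to repair.
repaired = cv2.inpaint(gray, mask, 3, cv2.INPAINT_TELEA)   # inpainting radius of 3 pixels
```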

Viewing angle correction (step 540) is shown in Figure 7. The left part of Figure 7 illustrates a common viewing angle deviation, where M' represents a real augmented reality marker of known size and shape (for example, an A4 image) and r' represents the detected Z-axis reference point distance (that is, to the wall surface, which corresponds to the virtual layout boundary). When the user stands at a position off the real augmented reality marker M' (a y-axis rotation) and tilts the device forward out of natural hand-held habit (an x-axis rotation), the detected r' is longer than the ideal Z-axis reference point distance r (where r represents the distance between the user and the ideal Z-axis reference point), which degrades the accuracy of coordinate system establishment (step 445) and object placement (step 450). To reduce the effect of this distance estimation error (caused by the viewing angle deviation) on the derivation of the length of the real reference line L', a projective transformation matrix R is used to rectify the aforementioned inpainting result image, as shown in the right part of Figure 7.

The edges and vertices of the real augmented reality marker M' are detected by augmented reality marker recognition, and the projective transformation matrix R is derived in reverse from the deformed shape in the captured image (such as the real augmented reality marker on the left of Figure 7, for example the deformed A4 image) and the ideal shape (such as the ideal shape of the real augmented reality marker M' on the right of Figure 7, for example an isosceles trapezoid), as shown in equation (6):

R = [ c11 c12 c13 ; c21 c22 c23 ; c31 c32 c33 ]  (6)

[ c11 c12 ; c21 c22 ] is the rotation matrix, defining the degree of rotation and scaling of the figure.

[ c13 c23 ]ᵀ is the translation vector, defining the degree of displacement of each point.

[ c31 c32 ] is the projection vector, defining the degree of perspective distortion.

By rectifying each point of the inpainting result through the projective transformation matrix R, the viewing angle of a user standing at the midpoint of the real augmented reality marker can be simulated. At this point, the deviation has been corrected to the Z-axis reference point corresponding to the midpoint of the real augmented reality marker, at distance r'' (as shown in the right part of Figure 7). The forward tilt angle θ of the mobile smart device is then computed from the angular displacement information sensed by the gravity sensing unit or the gyroscope, and r''·cos θ is taken as the ideal Z-axis reference point distance (step 550).
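
A sketch of the rectification and tilt compensation described above, using OpenCV (the corner coordinates, the tilt angle and the value of r'' below are illustrative assumptions, and the `repaired` image comes from the inpainting sketch above):

```python
import cv2
import numpy as np

# Four detected corners of the A4 marker in the inpainted depth image, and the
# corners of its ideal (undistorted) shape; both sets are illustrative values.
detected = np.float32([[412, 233], [660, 241], [688, 402], [388, 396]])
ideal    = np.float32([[400, 230], [690, 230], [690, 440], [400, 440]])

R = cv2.getPerspectiveTransform(detected, ideal)         # plays the role of equation (6)
h, w = repaired.shape[:2]
rectified = cv2.warpPerspective(repaired, R, (w, h))     # simulate the centred viewpoint

theta = np.radians(12.0)          # forward tilt from gravity sensor / gyroscope (assumed)
r_pp = 3.2                        # corrected Z-axis reference distance r'' in metres (assumed)
r_ideal = r_pp * np.cos(theta)    # ideal Z-axis reference point distance (step 550)
```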

As for the distance estimation of step 440, the length of the real reference line L' is estimated as shown in Figure 8. In Figure 8, the parameter x represents the length of the real reference line L' (that is, the distance from the real augmented reality marker M' to the wall). From Figure 8, x = r − d_r, where d_r is the horizontal distance from the mobile smart device (held by the user) to the real augmented reality marker M'. Since the preceding procedure has already detected the position of the real augmented reality marker M', the depth value at that position can be taken as the distance d_m from the mobile smart device (held by the user) to the real augmented reality marker M'. The distance s from the mobile smart device to the wall corner is obtained by first locating the junction with an edge detection algorithm and then taking the depth value there as the distance s.
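
A rough sketch of these two readings (the junction heuristic, function names and inputs are assumptions; the disclosure does not specify how the junction is chosen among the detected edges):

```python
import cv2
import numpy as np

def wall_corner_distance(depth_m, camera_gray):
    """Locate the wall/floor junction with an edge detector and read its depth as s.
    camera_gray: 8-bit image aligned with the metric depth map depth_m."""
    edges = cv2.Canny(camera_gray, 50, 150)
    ys, xs = np.nonzero(edges)
    jy, jx = ys.min(), xs[ys.argmin()]        # crude guess: topmost edge pixel (assumption)
    return float(depth_m[jy, jx])

def real_reference_length(r, d_r):
    """x = r - d_r, the marker-to-wall distance along the real reference line."""
    return r - d_r
```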

Overall, in the embodiments of the present disclosure, the Z-value pre-processing (that is, the depth information pre-processing) and the reference point pre-processing complement each other in deriving the length of the real reference line L'. The Z-value pre-processing (depth information pre-processing) of step 430 includes cropping the depth information (step 510), mapping (step 520), inpainting (step 530), the viewing angle correction operation (step 540), and obtaining the Z-axis reference point distance (step 550). The reference point pre-processing of step 435 includes sensor signal processing (such as the image viewing angle correction operation), augmented reality marker detection, edge detection, and angular displacement information output. Of course, the methods, procedures, information and signal sources used to derive the length of the real reference line L' are not limited to those disclosed in this embodiment.

After the length of the real reference line L' is obtained (step 440), the approximate size of the real environment where the user is located is derived from the virtual layout selected by the user according to the scaling ratio, and the coordinate system is established accordingly (step 445).

Figure 9 shows an example of a coordinate system established according to an embodiment of the present disclosure. As shown in Figure 9, a cuboid space model 30 feet long, 15 feet wide and 10 feet high (an example only; the present disclosure is not limited thereto) is derived proportionally from the detected length of the real reference line and the selected virtual layout. The same ratio can also be used to convert the positions of the virtual objects in the plan view (such as the virtual objects 310-330 of Figure 3A) relative to the origin of the coordinate system (that is, the coordinates of the centroids of the virtual objects), completing the coordinate system initialization and correction. Once the coordinate system is established, the placement positions of the virtual objects can be determined (as shown in Figure 9).
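
A sketch of how the established coordinate system might be populated (the object list, the scale value and the floor-plane assumption are illustrative, not values from this disclosure):

```python
# Plan-view centroids of virtual objects (e.g. 310-330 in Figure 3A), in plan units.
plan_objects = {"310": (3.0, 4.0), "320": (7.5, 12.0), "330": (12.0, 25.0)}

scale = 0.9144   # example ratio L'/L converting plan units to metres (assumption)

# World coordinates relative to the coordinate system origin (the marker position).
world_objects = {
    name: (x * scale, y * scale, 0.0)     # objects assumed to sit on the floor plane, z = 0
    for name, (x, y) in plan_objects.items()
}
```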

Another embodiment of the present disclosure provides an augmented reality system executable on an electronic device, such as, but not limited to, a mobile smart device or a head-mounted device. Figure 10 shows a functional block diagram of an augmented reality system according to an embodiment of the present disclosure. The augmented reality system 1000 includes at least a three-dimensional sensing unit 1010, a gravity sensing unit 1015, an image capturing unit 1020, an augmented reality application module 1030 and a display unit 1040. The sensing results of the three-dimensional sensing unit 1010 and the gravity sensing unit 1015 and the image captured by the image capturing unit 1020 can be transmitted to the augmented reality application module 1030. The gravity sensing unit 1015 is used to sense displacement information and may also be called a displacement sensing unit. When the augmented reality application module 1030 is executed, the electronic device can perform the augmented reality method described above and display the augmented reality on the display unit 1040. The augmented reality application module 1030 can be implemented as a hardware circuit or as software, and can be implemented by an integrated circuit or a processing unit. The subsequent presentation of the augmented reality content can rely on real-time localization techniques to find the coordinates of the user in the real environment and detect the camera viewing angle, so that the virtual objects are correctly fitted into their designated positions and a surround augmented reality experience is achieved.

Another embodiment of the present disclosure provides a computer-readable non-transitory storage medium which stores the augmented reality method described above.

The embodiments of the present disclosure correct the initial coordinate system of the augmented reality based on depth information, in order to solve the problems of how to find the scaling of the surround augmented reality content and how to find the coordinate system transformation matrix when a plan view is converted into a virtual three-dimensional model. As can be seen from the above embodiments, by using the three-dimensional sensing unit together with other techniques, the embodiments of the present disclosure can easily set the virtual reference line and detect the length of the real reference line. This is further complemented by an interactive interface that allows the user to edit and preview the augmented reality experience content. The embodiments of the present disclosure let the user immediately experience the overall arrangement effect, which improves the user experience.

The embodiments of the present disclosure use the three-dimensional sensing unit to obtain depth information and combine the depth information with the real augmented reality marker to find the scale conversion relationship. In this way, the user can preview in augmented reality the overall arrangement designed on the interactive interface, which improves the user experience.

In summary, although the present disclosure has been disclosed above by way of embodiments, the embodiments are not intended to limit the present disclosure. Those having ordinary skill in the art to which the present disclosure belongs may make various changes and modifications without departing from the spirit and scope of the present disclosure. Therefore, the scope of protection of the present disclosure shall be defined by the appended claims.

210-240‧‧‧Steps

Claims (17)

1. An augmented reality method, comprising: selecting a virtual layout related to a real environment; configuring a virtual placement position of at least one virtual object in the virtual layout; performing initialization and correction of a coordinate system; and displaying augmented reality in the corrected coordinate system, wherein the step of performing initialization and correction of the coordinate system comprises: obtaining depth information; capturing an environment image; obtaining displacement information; performing Z-value pre-processing and reference point pre-processing according to the depth information, the environment image and the displacement information; estimating a length of a real reference line of the real environment according to a Z-value pre-processing result and a reference point pre-processing result; establishing the coordinate system; and placing the at least one virtual object in the established coordinate system.

2. The augmented reality method according to claim 1, further comprising: displaying the selected virtual layout on an interactive interface; and configuring, in response to an operation of the interactive interface, the virtual placement position of the at least one virtual object in the real environment, wherein the virtual placement position of the at least one virtual object is displayed in a plan view.

3. The augmented reality method according to claim 2, wherein: the plan view includes the virtual layout and the at least one virtual object; the plan view is constructed by a perspective method, including one of the following or any combination thereof: a bird's-eye view, a parallel perspective view, a two-point perspective view and an oblique projection view; the virtual layout is represented by a geometric figure; and the at least one virtual object is selected from a virtual object database through the interactive interface, or is customized by a user.

4. The augmented reality method according to claim 3, further comprising: prompting to select a virtual reference line; and, after the virtual reference line is selected, turning on at least one three-dimensional sensing unit to detect a corresponding real reference line, wherein the virtual reference line extends from a virtual augmented reality marker to a boundary of the virtual layout, a length of the virtual reference line being known, and the real reference line extends from a real augmented reality marker placed in the real environment to a boundary of the real environment.
5. The augmented reality method according to claim 4, wherein the initialization of the coordinate system is performed after the real reference line is detected, and wherein, if a viewing angle deviation or a displacement exists, the coordinate system is corrected to estimate the length of the real reference line, and the virtual placement position of the at least one virtual object of the plan view in the real environment is then derived.

6. The augmented reality method according to claim 5, further comprising: after the initialization and correction of the coordinate system are performed, deriving a scaling ratio of the at least one virtual object and converting a relative position of the at least one virtual object in the real environment, wherein the relative position is obtained according to a ratio between the length of the virtual reference line and the length of the real reference line.

7. The augmented reality method according to claim 1, wherein the step of estimating the length of the real reference line comprises: cropping the depth information; mapping three-dimensional information to two-dimensional information; repairing a two-dimensional image; correcting a viewing angle; and obtaining a Z-axis reference point distance, wherein, after the length of the real reference line is obtained, the size of the real environment is derived from the selected virtual layout according to the scaling ratio, and the coordinate system is established accordingly.

8. The augmented reality method according to claim 1, further comprising: while the augmented reality is displayed, continuously tracking a position and a camera angle to continuously correct the displayed augmented reality.
9. An augmented reality system, comprising: a three-dimensional sensing unit; an image capturing unit; a displacement sensing unit; and an augmented reality application module, wherein a sensing result of the three-dimensional sensing unit, an image captured by the image capturing unit and displacement information of the displacement sensing unit are transmitted to the augmented reality application module, and the augmented reality application module: selects, in response to a user selection, a virtual layout related to a real environment; configures a virtual placement position of at least one virtual object in the virtual layout; performs initialization and correction of a coordinate system; and displays augmented reality in the corrected coordinate system, wherein, in performing the initialization and correction of the coordinate system, the augmented reality application module: obtains depth information; captures an environment image; obtains displacement information; performs Z-value pre-processing and reference point pre-processing according to the depth information, the environment image and the displacement information; estimates a length of a real reference line of the real environment according to a Z-value pre-processing result and a reference point pre-processing result; establishes the coordinate system; and places the at least one virtual object in the established coordinate system.

10. The augmented reality system according to claim 9, wherein the augmented reality application module: displays the selected virtual layout on an interactive interface; and configures, in response to an operation of the interactive interface, the virtual placement position of the at least one virtual object in the real environment, wherein the virtual placement position of the at least one virtual object is displayed in a plan view.

11. The augmented reality system according to claim 10, wherein: the plan view includes the virtual layout and the at least one virtual object; the plan view is constructed by a perspective method, including one of the following or any combination thereof: a bird's-eye view, a parallel perspective view, a two-point perspective view and an oblique projection view; the virtual layout is represented by a geometric figure; and the at least one virtual object is selected from a virtual object database through the interactive interface, or is customized by a user.
12. The augmented reality system according to claim 11, wherein the augmented reality application module: generates a prompt signal; selects, in response to a user selection, a virtual reference line; and, after the virtual reference line is selected, turns on the three-dimensional sensing unit to detect a corresponding real reference line, wherein the virtual reference line extends from a virtual augmented reality marker to a boundary of the virtual layout, a length of the virtual reference line being known, and the real reference line extends from a real augmented reality marker placed in the real environment to a boundary of the real environment.

13. The augmented reality system according to claim 12, wherein, if a viewing angle deviation or a displacement exists, the augmented reality application module performs the initialization and correction of the coordinate system to estimate the length of the real reference line, and then derives the virtual placement position of the at least one virtual object of the plan view in the real environment.

14. The augmented reality system according to claim 13, wherein the augmented reality application module, after performing the initialization and correction of the coordinate system, derives a scaling ratio of the at least one virtual object and converts a relative position of the at least one virtual object in the real environment, wherein the relative position is obtained according to a ratio between the length of the virtual reference line and the length of the real reference line.

15. The augmented reality system according to claim 9, wherein, in estimating the length of the real reference line, the augmented reality application module: crops the depth information; maps three-dimensional information to two-dimensional information; repairs a two-dimensional image; corrects a viewing angle; and obtains a Z-axis reference point distance, wherein, after the length of the real reference line is obtained, the size of the real environment is derived from the selected virtual layout according to the scaling ratio, and the coordinate system is established accordingly.

16. The augmented reality system according to claim 9, wherein the augmented reality application module, while the augmented reality is displayed, continuously tracks a position and a camera angle to continuously correct the displayed augmented reality, and the augmented reality application module is implemented by an integrated circuit or a processor.

17. A computer-readable non-transitory storage medium which, when read by a computer, causes the computer to perform the augmented reality method according to claim 1.
TW104143405A 2015-12-23 2015-12-23 Augmented reality method, system and computer-readable non-transitory storage medium TWI590189B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
TW104143405A TWI590189B (en) 2015-12-23 2015-12-23 Augmented reality method, system and computer-readable non-transitory storage medium
CN201511010572.9A CN106910249A (en) 2015-12-23 2015-12-29 Augmented reality method and system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
TW104143405A TWI590189B (en) 2015-12-23 2015-12-23 Augmented reality method, system and computer-readable non-transitory storage medium

Publications (2)

Publication Number Publication Date
TW201724031A TW201724031A (en) 2017-07-01
TWI590189B true TWI590189B (en) 2017-07-01

Family

ID=59206090

Family Applications (1)

Application Number Title Priority Date Filing Date
TW104143405A TWI590189B (en) 2015-12-23 2015-12-23 Augmented reality method, system and computer-readable non-transitory storage medium

Country Status (2)

Country Link
CN (1) CN106910249A (en)
TW (1) TWI590189B (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TWI712004B (en) * 2018-07-27 2020-12-01 開曼群島商創新先進技術有限公司 Coordinate system calibration method and device of augmented reality equipment
US11527048B2 (en) 2020-06-24 2022-12-13 Optoma Corporation Method for simulating setting of projector by augmented reality and terminal device therefor

Families Citing this family (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107742232A (en) * 2017-08-21 2018-02-27 珠海格力电器股份有限公司 A kind of selection method of electrical equipment, device and terminal
TWI679555B (en) * 2017-10-12 2019-12-11 華碩電腦股份有限公司 Augmented reality system and method for providing augmented reality
CN109685907A (en) * 2017-10-18 2019-04-26 深圳市掌网科技股份有限公司 Image combination method and system based on augmented reality
CN108955723B (en) * 2017-11-08 2022-06-10 北京市燃气集团有限责任公司 Method for calibrating augmented reality municipal pipe network
CN108154558B (en) * 2017-11-21 2021-10-15 中电海康集团有限公司 Augmented reality method, device and system
CN108109208B (en) * 2017-12-01 2022-02-08 同济大学 Augmented reality method for offshore wind farm
CN108510596A (en) * 2018-03-30 2018-09-07 北京华麒通信科技股份有限公司 A kind of Design of Indoor Signal Distributed System method, apparatus, medium and equipment
CN110769245A (en) * 2018-07-27 2020-02-07 华为技术有限公司 Calibration method and related equipment
CN110825279A (en) * 2018-08-09 2020-02-21 北京微播视界科技有限公司 Method, apparatus and computer readable storage medium for inter-plane seamless handover
CN110827411B (en) * 2018-08-09 2023-07-18 北京微播视界科技有限公司 Method, device, equipment and storage medium for displaying augmented reality model of self-adaptive environment
CN109561278B (en) * 2018-09-21 2020-12-29 中建科技有限公司深圳分公司 Augmented reality display system and method
TWI811292B (en) * 2019-01-25 2023-08-11 信義房屋股份有限公司 Interactive photography device and method
CN109801341B (en) * 2019-01-30 2020-11-03 北京经纬恒润科技有限公司 Calibration target position calibration method and device
TWI700671B (en) 2019-03-06 2020-08-01 廣達電腦股份有限公司 Electronic device and method for adjusting size of three-dimensional object in augmented reality
CN110264818B (en) * 2019-06-18 2021-08-24 国家电网有限公司 Unit water inlet valve disassembly and assembly training method based on augmented reality
TWI731430B (en) 2019-10-04 2021-06-21 財團法人工業技術研究院 Information display method and information display system
CN114359524B (en) * 2022-01-07 2024-03-01 合肥工业大学 Intelligent furniture experience official system based on inversion augmented reality

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5423406B2 (en) * 2010-01-08 2014-02-19 ソニー株式会社 Information processing apparatus, information processing system, and information processing method
JP5966510B2 (en) * 2012-03-29 2016-08-10 ソニー株式会社 Information processing system
EP2879022A4 (en) * 2012-07-27 2016-03-23 Nec Solution Innovators Ltd Three-dimensional user-interface device, and three-dimensional operation method
US20140132595A1 (en) * 2012-11-14 2014-05-15 Microsoft Corporation In-scene real-time design of living spaces
CN203950352U (en) * 2014-02-19 2014-11-19 中国园林博物馆北京筹备办公室 Landscape architecture interactive system based on augmented reality
CN104899920B (en) * 2015-05-25 2019-03-08 联想(北京)有限公司 Image processing method, image processing apparatus and electronic equipment
TWM514072U (en) * 2015-10-12 2015-12-11 Hsiao-Chen Lee Three-dimensional virtual reality interactive house browsing system


Also Published As

Publication number Publication date
TW201724031A (en) 2017-07-01
CN106910249A (en) 2017-06-30

Similar Documents

Publication Publication Date Title
TWI590189B (en) Augmented reality method, system and computer-readable non-transitory storage medium
US11805861B2 (en) Foot measuring and sizing application
US10102639B2 (en) Building a three-dimensional composite scene
CN103718213B (en) Automatic scene is calibrated
US9519968B2 (en) Calibrating visual sensors using homography operators
US10420397B2 (en) Foot measuring and sizing application
WO2018019272A1 (en) Method and apparatus for realizing augmented reality on the basis of plane detection
US9727776B2 (en) Object orientation estimation
WO2022019975A1 (en) Systems and methods for reducing a search area for identifying correspondences between images
CN110599432B (en) Image processing system and image processing method
EP3189493B1 (en) Depth map based perspective correction in digital photos
TWI567473B (en) Projection alignment
Wan et al. A study in 3D-reconstruction using kinect sensor
TWI691932B (en) Image processing system and image processing method
JP6579659B2 (en) Light source estimation apparatus and program
US10339702B2 (en) Method for improving occluded edge quality in augmented reality based on depth camera
US11551368B2 (en) Electronic devices, methods, and computer program products for controlling 3D modeling operations based on pose metrics
EP4186029A1 (en) Systems and methods for continuous image alignment of separate cameras
EP4186028A1 (en) Systems and methods for updating continuous image alignment of separate cameras
JP2016024728A (en) Information processing device, method for controlling information processing device and program