TWI379217B - Input device of space design - Google Patents

Input device of space design

Info

Publication number
TWI379217B
TWI379217B TW97137757A
Authority
TW
Taiwan
Prior art keywords
light
drawing system
unit
input device
Prior art date
Application number
TW97137757A
Other languages
Chinese (zh)
Other versions
TW201015385A (en)
Inventor
Chih Chieh Huang
Shen Guan Shih
Original Assignee
Univ Nat Taiwan Science Tech
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Univ Nat Taiwan Science Tech filed Critical Univ Nat Taiwan Science Tech
Priority to TW97137757A priority Critical patent/TWI379217B/en
Publication of TW201015385A publication Critical patent/TW201015385A/en
Application granted granted Critical
Publication of TWI379217B publication Critical patent/TWI379217B/en

Links

Landscapes

  • Position Input By Displaying (AREA)
  • Processing Or Creating Images (AREA)
  • User Interface Of Digital Computer (AREA)

Description

[Technical Field of the Invention]

The present invention relates to an input device, and more particularly to an input device for spatial design.

[Prior Art]

With the development of multimedia technology, computers can now accurately render the colorful three-dimensional images of the real world. By operating computer-aided design software such as 3D-Max, Sketch-up, and AutoCAD through input devices such as a mouse, keyboard, or drawing tablet, users can draw and edit three-dimensional images.

In spatial design, however, whether working with computer-aided design software or drafting by hand, a designer can only express and present the design at reduced scale. Scale drawings are not easily understood by the general public without architectural design training, which often leads to poor communication between the designer and the client, and the drawings the designer presents easily diverge from what the client has in mind. It is therefore very important to be able to grasp the spatial scale effectively when carrying out spatial design.

[Summary of the Invention]

The present invention provides an input device for spatial design, which allows a designer to effectively grasp the design at spatial scale and draw objects in a three-dimensional image.

The present invention proposes an input device for spatial design, suitable for a drawing system. The input device includes a light emitting module and a photography module. The light emitting module includes a first transceiver unit and a light source. The first transceiver unit enables the light emitting module to communicate with the drawing system. The light source is coupled to the first transceiver unit. When the light emitting module receives a first input command, the light source emits light into a real environment. The photography module includes a second transceiver unit, an image capturing unit, and a ray tracing unit. The second transceiver unit enables the photography module to communicate with the drawing system. The image capturing unit, coupled to the second transceiver unit, captures the real environment and outputs a background image to the drawing system, so that the drawing system generates a three-dimensional image corresponding to the background image. The ray tracing unit, also coupled to the second transceiver unit, detects the movement path of the light emitted into the real environment and generates a first control signal to the drawing system, so that the drawing system draws an object in the three-dimensional image.

In an embodiment of the invention, when the light emitting module further receives a second input command, the drawing system extends the surface of the object in the three-dimensional image according to the movement path of the light.

In an embodiment, the input device further includes a distance measuring unit coupled to the first transceiver unit. When the light emitting module receives a third input command, the distance measuring unit generates a second control signal to the drawing system according to the variation of the distance at which the light is emitted into the real environment, so that the drawing system extends the object into a solid in the three-dimensional image.

In an embodiment, the input device further includes an axial rotation detecting unit coupled to the first transceiver unit. When the light emitting module receives a fourth input command, the axial rotation detecting unit generates a third control signal to the drawing system according to the variation of the movement angle of the light emitted into the real environment, so that the drawing system rotates the object.

In an embodiment, the input device further includes an axial rotation detecting unit coupled to the first transceiver unit. When the light emitting module receives the third input command, the axial rotation detecting unit generates a third control signal to the drawing system according to the variation of the movement angle of the light emitted into the real environment, so that the drawing system extends the object into a solid.

The input device of the invention uses the photography module to capture the real environment, so that the drawing system generates a three-dimensional image corresponding to the real environment, and draws objects in the three-dimensional image by detecting the movement path of the light in the real environment. In this way, the user can grasp the spatial scale and carry out spatial design in the real environment.

In order to make the above and other objects, features, and advantages of the invention more comprehensible, preferred embodiments of the invention are described in detail below with reference to the accompanying drawings.

[Embodiments]

FIG. 1 illustrates an input device according to an embodiment of the present invention. Referring to FIG. 1, the input device 100 includes a light emitting module 110 and a photography module 120. The light emitting module 110 includes a transceiver unit 111, a light source 112, and a positioning unit 113. In the light emitting module 110, the transceiver unit 111 establishes communication between the light emitting module 110 and the drawing system 130. When the light emitting module 110 receives an input command CMD1, the light source 112 emits light into a real environment 140. The light source 112 may be a point light source, a line light source, or a surface light source; a point light source is taken here as an example. The positioning unit 113, coupled to the transceiver unit 111, locates the coordinates of the light emitting module 110 in space and transmits this coordinate information to the drawing system 130.

The photography module 120 includes a transceiver unit 121, an image capturing unit 122, a ray tracing unit 123, and a positioning unit 124. In the photography module 120, the transceiver unit 121 establishes communication between the photography module 120 and the drawing system 130. The image capturing unit 122, coupled to the transceiver unit 121, captures the real environment 140 and outputs a background image to the drawing system 130, so that the drawing system 130 generates a three-dimensional image 133 corresponding to the background image. The ray tracing unit 123, coupled to the transceiver unit 121, detects the movement path 141 of the light emitted into the real environment 140 and generates a control signal CON1 to the drawing system 130, so that the drawing system 130 draws an object 132 in the three-dimensional image 133.

Here, the ray tracing unit 123 may detect the movement path 141 of the light emitted into the real environment 140 by means of image processing. For example, by analyzing the moving content of the background image, the ray tracing unit 123 can determine the position, within the background image, of the light spot projected into the real environment 140, and thereby obtain the movement path 141. The positioning unit 124, coupled to the transceiver unit 121, locates the coordinates of the photography module 120 in space and transmits this information to the drawing system 130. In this embodiment, through the operation of the positioning units 113 and 124, the coordinates of the light spot projected into the real environment 140 can be located, which improves the accuracy of detecting the movement path 141.

From the above, when the light emitting module 110 receives the input command CMD1, it emits light into the real environment 140. The photography module 120 detects the movement path of the light and generates the control signal CON1 accordingly, so the drawing system 130 can draw objects in the three-dimensional image, for example a line segment, or a polygon composed of several line segments. In addition, when the light emitting module 110 further receives an input command CMD2, the drawing system 130 extends the surface of the object in the three-dimensional image according to the movement path of the light (that is, as controlled by the control signal CON1), for example extending a line segment into a plane or changing the size of a plane.

The light emitting module 110 of this embodiment further includes a distance measuring unit 114 and an axial rotation detecting unit 115. The distance measuring unit 114 is coupled to the transceiver unit 111. When the light emitting module 110 receives an input command CMD3, the distance measuring unit 114 generates a control signal CON2 to the drawing system 130 according to the variation of the distance at which the light is emitted into the real environment 140, so that the drawing system 130 extends the object into a solid in the three-dimensional image 133, for example extending a plane into a solid.

The axial rotation detecting unit 115 is coupled to the transceiver unit 111. When the light emitting module 110 receives an input command CMD4, the axial rotation detecting unit 115 generates a control signal CON3 to the drawing system 130 according to the variation of the movement angle of the light emitted into the real environment 140, so that the drawing system 130 rotates the object in the three-dimensional image 133. When the light emitting module 110 receives an input command CMD5, the drawing system 130 moves the position of the object in the three-dimensional image 133 according to the movement path of the light (that is, as controlled by the control signal CON1) and the variation of the movement angle (that is, as controlled by the control signal CON3). In addition, when the light emitting module 110 receives the input command CMD3, the axial rotation detecting unit 115 generates the control signal CON3 to the drawing system 130 according to the variation of the movement angle of the light, so that the drawing system 130 extends the object into a solid in the three-dimensional image 133.

Since the light emitting module 110 communicates with the drawing system 130 through the transceiver unit 111, the drawing system 130 knows which input command the light emitting module 110 has received, and changes the state of the object according to the control signals it receives.

FIGS. 2A to 2G are schematic diagrams of drawing objects in a three-dimensional image with the input device according to an embodiment of the invention. Referring to FIG. 2A, the light emitting module 110 takes the form of, for example, a gun or a pen, and emits light (for example, laser light) into the real environment 140 upon receiving the input command CMD1. The user aims at the desired starting point and pulls the trigger (generating the input command CMD1) to emit the laser light, then moves the spot position L at which the laser light is projected onto the real environment 140 (the ray tracing unit 123 generating the control signal CON1), so that the drawing system 130 draws a line segment 132a in the three-dimensional image 133.

Referring to FIG. 2B, the user aims at the desired starting point and pulls the trigger (generating CMD1) to emit the laser light, then moves the spot position L (the ray tracing unit 123 generating CON1), so that the drawing system 130 draws a rectangle 132b in the three-dimensional image 133.

Referring to FIG. 2C, the user aims at a drawn line segment, pulls the trigger (generating CMD1) to emit light, and presses the draw-surface button (generating the input command CMD2). By moving the spot position L (the ray tracing unit 123 generating CON1), the drawing system 130 extends the line segment into a plane 132c in the three-dimensional image 133.

Referring to FIG. 2D, the user aims at a drawn plane, pulls the trigger (generating CMD1) to emit light, and presses the draw-solid button (generating the input command CMD3). By changing the distance D at which the laser light is projected onto the real environment 140 (the distance measuring unit 114 generating the control signal CON2), the drawing system 130 extends the plane into a solid 132d in the three-dimensional image 133.

Referring to FIG. 2E, the user aims at a drawn plane, pulls the trigger (generating CMD1) to emit light, and presses the draw-solid button (generating CMD3). By changing the movement angle A at which the laser light is projected onto the real environment 140 (the axial rotation detecting unit 115 generating the control signal CON3), the drawing system 130 extends the plane into a solid 132e in the three-dimensional image 133.

Referring to FIG. 2F, the user aims at a drawn solid, pulls the trigger (generating CMD1) to emit light, and presses the rotate-object button (generating the input command CMD4). By changing the movement angle A (the axial rotation detecting unit 115 generating CON3), the drawing system 130 rotates the object 132f in the three-dimensional image 133.

Referring to FIG. 2G, the user aims at a drawn solid, pulls the trigger (generating CMD1) to emit the laser light, and presses the move-object button (generating the input command CMD5). By moving the spot position L (the ray tracing unit 123 generating CON1) and changing the movement angle A (the axial rotation detecting unit 115 generating CON3), the drawing system 130 moves the object 132g in the three-dimensional image 133.

According to the above descriptions of FIGS. 2A to 2G, the input device 100 controls the drawing system 130 to draw various objects such as line segments, polygons, planes, and solids in the three-dimensional image 133, and can further control the drawing system 130 to move and rotate objects.

It is worth mentioning that the input device 100 may further include a display (not shown in FIG. 1). This display may apply augmented reality technology to combine the objects 132 in the three-dimensional image 133 with the real environment 140, so that the user can see the drawn objects in the real environment 140; however, the invention is not limited thereto.

In summary, the input device of the above embodiments uses the photography module to capture the real environment, so that the drawing system generates a corresponding three-dimensional image, and controls the drawing system to draw objects in the three-dimensional image by detecting the movement path of the light emitted into the real environment. The drawing system can also change the state of an object according to the variation of the movement angle or the distance of the light emitted into the real environment, for example extending the surface of an object, extending a plane into a solid, or moving and rotating an object. Moreover, in combination with augmented reality techniques, the drawn objects can be merged with the real environment. In this way, the user can effectively grasp the spatial scale and carry out spatial design in the real environment.

Although the present invention has been disclosed above by way of preferred embodiments, they are not intended to limit the invention. Anyone with ordinary skill in the art may make slight modifications and refinements without departing from the spirit and scope of the invention; the protection scope of the invention shall therefore be determined by the appended claims.

[Brief Description of the Drawings]

FIG. 1 illustrates an input device according to an embodiment of the invention.

FIGS. 2A to 2G are schematic diagrams of drawing objects in a three-dimensional image with the input device according to an embodiment of the invention.

[Description of Main Reference Numerals]

100: input device
110: light emitting module
111, 121: transceiver unit
112: light source
113: positioning unit
114: distance measuring unit
115: axial rotation detecting unit
120: photography module
122: image capturing unit
123: ray tracing unit
124: positioning unit
130: drawing system
133: three-dimensional image
132, 132a~132g: object
140: real environment
141: movement path
CMD1: input command
CON1: control signal
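The spot-tracking step described above — isolating the laser point in each captured frame by comparing it against the background image and reporting its position — can be sketched with simple frame differencing. This is an illustrative reconstruction, not the patent's implementation: the function names, the fixed brightness threshold, and the use of NumPy are all assumptions introduced here.

```python
import numpy as np

def detect_spot(background: np.ndarray, frame: np.ndarray, threshold: int = 60):
    """Return (row, col) of the brightest changed pixel, or None.

    background, frame: 2-D grayscale images of identical shape.
    A projected laser spot appears as a small bright region present in
    `frame` but absent from `background`, so subtracting the two and
    thresholding isolates it.
    """
    diff = frame.astype(np.int16) - background.astype(np.int16)
    diff[diff < threshold] = 0           # keep only strong brightenings
    if not diff.any():
        return None                       # no spot detected in this frame
    return np.unravel_index(np.argmax(diff), diff.shape)

def track_path(background, frames, threshold=60):
    """Accumulate spot positions over a frame sequence, approximating
    the movement path reported to the drawing system."""
    path = []
    for frame in frames:
        pos = detect_spot(background, frame, threshold)
        if pos is not None:
            path.append(pos)
    return path
```

Dragging the spot across three frames then yields a three-point path, which a drawing system could interpret as a traced line segment.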

Claims (1)

1. An input device for spatial design, adapted to a drawing system, comprising:
a light emitting module, comprising:
a first transceiver unit, for enabling the light emitting module to communicate with the drawing system; and
a light source, coupled to the first transceiver unit, wherein when the light emitting module receives a first input command, the light source emits a light into a real environment, the first input command being used to determine whether the light source emits the light into the real environment; and
a photography module, comprising:
a second transceiver unit, for enabling the photography module to communicate with the drawing system;
an image capturing unit, coupled to the second transceiver unit, for capturing the real environment and outputting a background image to the drawing system, so that the drawing system generates a three-dimensional image corresponding to the background image; and
a ray tracing unit, coupled to the second transceiver unit, for detecting a movement path of the light emitted into the real environment and generating a first control signal to the drawing system, so that the drawing system draws an object in the three-dimensional image.

2. The input device as claimed in claim 1, wherein the light emitting module further comprises:
a first positioning unit, coupled to the first transceiver unit, for locating a first coordinate of the light emitting module in space and transmitting the first coordinate information to the drawing system.

3. The input device as claimed in claim 1, wherein the photography module further comprises:
a second positioning unit, coupled to the second transceiver unit, for locating a second coordinate of the photography module in space and transmitting the second coordinate information to the drawing system.

4. The input device as claimed in claim 1, wherein when the light emitting module receives a second input command, the drawing system extends a surface of the object in the three-dimensional image, the second input command being used to determine whether the drawing system extends the surface of the object in the three-dimensional image.

5. The input device as claimed in claim 1, wherein the light emitting module further comprises:
a distance measuring unit, coupled to the first transceiver unit, wherein when the light emitting module receives a third input command, the distance measuring unit generates a second control signal to the drawing system according to a variation of a distance at which the light is emitted into the real environment, so that the drawing system extends the object into a solid in the three-dimensional image, the third input command being used to determine whether the drawing system extends the object into a solid in the three-dimensional image.

6. The input device as claimed in claim 1, wherein the light emitting module further comprises:
an axial rotation detecting unit, coupled to the first transceiver unit, wherein when the light emitting module receives a fourth input command, the axial rotation detecting unit generates a third control signal to the drawing system according to a variation of a movement angle of the light emitted into the real environment, so that the drawing system rotates the object, the fourth input command being used to determine whether the drawing system rotates the object in the three-dimensional image.

7. The input device as claimed in claim 6, wherein when the light emitting module receives a fifth input command, the drawing system moves the position of the object in the three-dimensional image according to the first control signal and the third control signal, the fifth input command being used to determine whether the drawing system moves the object in the three-dimensional image.

8. The input device as claimed in claim 1, wherein the light emitting module further comprises:
an axial rotation detecting unit, coupled to the first transceiver unit, wherein when the light emitting module receives a third input command, the axial rotation detecting unit generates a third control signal to the drawing system according to a variation of a movement angle of the light emitted into the real environment, so that the drawing system extends the object into a solid, the third input command being used to determine whether the drawing system extends the object into a solid in the three-dimensional image.

9. The input device as claimed in claim 1, wherein the light source is one of a point light source, a line light source, and a surface light source, or a combination thereof.

10. The input device as claimed in claim 1, further comprising:
a display, for displaying a view in which the object is combined with the real environment.
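Claims 4 and 5 above recite the two geometric operations at the heart of the workflow illustrated in FIGS. 2C and 2D: sweeping a drawn line segment into a plane, then extruding that plane into a solid by the measured distance change. The vector arithmetic behind such operations can be illustrated minimally as follows; `extend_to_plane` and `extrude_to_solid` are names invented for this sketch, not part of the patented drawing system.

```python
import numpy as np

def extend_to_plane(segment, sweep):
    """Sweep a 3-D line segment along `sweep` to form a planar quad
    (the draw-surface step: line -> plane).

    segment: (2, 3) array-like, the two endpoints.
    sweep:   (3,) displacement traced by the moving light spot.
    Returns the quad's four corners in order.
    """
    a, b = np.asarray(segment, dtype=float)
    s = np.asarray(sweep, dtype=float)
    return np.array([a, b, b + s, a + s])

def extrude_to_solid(quad, distance, normal=None):
    """Extrude a planar quad by `distance` along its normal
    (the draw-solid step: plane -> solid, with `distance` standing in
    for the change measured by the distance measuring unit).
    Returns the eight corners of the resulting box-like solid."""
    quad = np.asarray(quad, dtype=float)
    if normal is None:
        n = np.cross(quad[1] - quad[0], quad[3] - quad[0])
        normal = n / np.linalg.norm(n)   # unit normal of the quad's plane
    return np.vstack([quad, quad + distance * np.asarray(normal)])
```

For example, sweeping the unit segment along the y-axis yields a unit square in the xy-plane, and extruding it by 2 lifts a copy of the square to z = 2, giving the eight corners of a box.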
TW97137757A 2008-10-01 2008-10-01 Input device of space design TWI379217B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
TW97137757A TWI379217B (en) 2008-10-01 2008-10-01 Input device of space design

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
TW97137757A TWI379217B (en) 2008-10-01 2008-10-01 Input device of space design

Publications (2)

Publication Number Publication Date
TW201015385A TW201015385A (en) 2010-04-16
TWI379217B (en) 2012-12-11

Family

ID=44830012

Family Applications (1)

Application Number Title Priority Date Filing Date
TW97137757A TWI379217B (en) 2008-10-01 2008-10-01 Input device of space design

Country Status (1)

Country Link
TW (1) TWI379217B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TWI413034B (en) * 2010-07-29 2013-10-21 Univ Nat Central The System of Mixed Reality Realization and Digital Learning

Also Published As

Publication number Publication date
TW201015385A (en) 2010-04-16

Similar Documents

Publication Publication Date Title
Molyneaux et al. Interactive environment-aware handheld projectors for pervasive computing spaces
TWI454968B (en) Three-dimensional interactive device and operation method thereof
Wacker et al. Physical guides: An analysis of 3d sketching performance on physical objects in augmented reality
US10088971B2 (en) Natural user interface camera calibration
JP6539816B2 (en) Multi-modal gesture based interactive system and method using one single sensing system
US20160026361A1 (en) Three-Dimensional Tracking of a User Control Device in a Volume Using Position Sensing
KR102011163B1 (en) Optical tablet stylus and indoor navigation system
US8665213B2 (en) Spatial, multi-modal control device for use with spatial operating system
US20150138086A1 (en) Calibrating control device for use with spatial operating system
Bellarbi et al. Hand gesture interaction using color-based method for tabletop interfaces
TW201508561A (en) Speckle sensing for motion tracking
JP2014517361A (en) Camera-type multi-touch interaction device, system and method
CN105190703A (en) Using photometric stereo for 3D environment modeling
JP2013069273A (en) Motion detection method of input body and input device using the same
CN107240148A (en) Transparent substance three-dimensional surface rebuilding method and device based on background stration technique
CN104516532A (en) Method and apparatus for determining the pose of a light source using an optical sensing array
TW200817974A (en) Space positioning and directing input system and processing method therefor
US10698496B2 (en) System and method for tracking a human hand in an augmented reality environment
TWI379217B (en) Input device of space design
US10229313B1 (en) System and method for identifying and tracking a human hand in an interactive space based on approximated center-lines of digits
US11747910B2 (en) Method and apparatus for object tracking using objects with tracking marks
CN105446550B (en) Input unit and its localization method, electronic equipment and input system
US20120050160A1 (en) Method and apparatus for measuring of a three-dimensional position of mouse pen
Prima et al. A Pointing Device for 3D Interactive Spherical Displays
KR20120114767A (en) Game display system throwing objects and a method thereof

Legal Events

Date Code Title Description
MM4A Annulment or lapse of patent due to non-payment of fees