TW201015385A - Input device of space design - Google Patents

Input device of space design

Info

Publication number
TW201015385A
TW201015385A
Authority
TW
Taiwan
Prior art keywords
light
unit
input device
transceiver unit
drawing system
Prior art date
Application number
TW97137757A
Other languages
Chinese (zh)
Other versions
TWI379217B (en)
Inventor
Chih-Chieh Huang
Shen-Guan Shih
Original Assignee
Univ Nat Taiwan Science Tech
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Univ Nat Taiwan Science Tech filed Critical Univ Nat Taiwan Science Tech
Priority to TW97137757A priority Critical patent/TWI379217B/en
Publication of TW201015385A publication Critical patent/TW201015385A/en
Application granted granted Critical
Publication of TWI379217B publication Critical patent/TWI379217B/en

Abstract

An input device of space design adapted to a graphic system is disclosed. The input device includes a light-emitting module and a photography module. The light-emitting module includes a first transceiver unit and a light source, and the photography module includes a second transceiver unit, an image capturing unit, and a light tracing unit. The first and second transceiver units place the light-emitting module and the photography module, respectively, in communication with the graphic system. When the light-emitting module receives a first input command, the light source emits a light beam into a real environment. The image capturing unit films the real environment and outputs a background image to the graphic system, so that the graphic system displays a three-dimensional (3D) graphic corresponding to the background image. The light tracing unit traces the moving path of the emitted beam and generates a first control signal to the graphic system, so that the graphic system draws an object in the 3D graphic.

Description

IX. Description of the Invention

[Technical Field]

The present invention relates to an input device, and more particularly to an input device for space design.

[Prior Art]

With the development of multimedia technology, computers can now accurately render the rich, colorful three-dimensional images of the real world. By operating computer-aided design software such as 3D-Max, Sketch-up, and AutoCAD with input devices such as a mouse, a keyboard, or a graphics tablet, users can draw and edit three-dimensional images.

In space design, however, whether working with computer-aided design software or drafting by hand, a designer can only express and present a design at reduced scale. Scale drawings are hard for a general public untrained in architectural design to understand, which often leads to poor communication between the designer and the client, and the design the designer presents easily diverges from what the client has in mind. Being able to grasp spatial scale effectively while designing a space is therefore of considerable importance.

[Summary of the Invention]

The present invention provides an input device for space design that lets a designer grasp the design at true spatial scale and draw objects into a three-dimensional image.

The invention proposes an input device for space design adapted to a drawing system. The input device includes a light-emitting module and a photography module. The light-emitting module includes a first transceiver unit and a light source. The first transceiver unit places the light-emitting module in communication with the drawing system. When the light-emitting module receives a first input command, the light source emits a light beam into a real environment. The photography module includes a second transceiver unit, an image capturing unit, and a ray tracing unit. The second transceiver unit places the photography module in communication with the drawing system. The image capturing unit films the real environment and outputs a background image to the drawing system, which generates a three-dimensional image corresponding to the background image. The ray tracing unit detects the moving path of the light beam in the real environment and generates a first control signal to the drawing system, which draws an object into the three-dimensional image accordingly.

In one embodiment, when the light-emitting module receives a second input command, the drawing system extends the surface of the object in the three-dimensional image according to the first control signal.

In one embodiment, the input device further includes a distance measuring unit coupled to the first transceiver unit. When the light-emitting module receives a third input command, the distance measuring unit generates a second control signal to the drawing system according to the change in the distance at which the light beam strikes the real environment, and the drawing system extends the object into a solid in the three-dimensional image.

In one embodiment, the input device further includes an axial rotation detecting unit coupled to the first transceiver unit. When the light-emitting module receives a fourth input command, the axial rotation detecting unit generates a third control signal to the drawing system according to the change in the aiming angle of the light beam, and the drawing system rotates the object accordingly. In another embodiment, the axial rotation detecting unit generates the third control signal when the light-emitting module receives the third input command, and the drawing system extends the object into a solid.

The input device of the invention films the real environment with a photography module so that the drawing system generates a corresponding three-dimensional image, and draws objects into that image by detecting the moving path of a light beam in the real environment. The user can thereby grasp spatial scale and design a space directly in the real environment.

To make the above and other objects, features, and advantages of the invention more comprehensible, preferred embodiments are described in detail below with reference to the accompanying drawings.

[Embodiments]

FIG. 1 illustrates an input device according to an embodiment of the invention. Referring to FIG. 1, the input device 100 includes a light-emitting module 110 and a photography module 120. The light-emitting module 110 includes a transceiver unit 111, a light source 112, and a positioning unit 113. The transceiver unit 111 establishes communication between the light-emitting module 110 and the drawing system 130. When the light-emitting module 110 receives the input command CMD1, the light source 112 emits a light beam into the real environment 140. The light source 112 may be a point source, a line source, or a surface source; a point source is used here as an example. The positioning unit 113, coupled to the transceiver unit 111, locates the coordinates of the light-emitting module 110 in space and transmits this coordinate information to the drawing system 130.

The photography module 120 includes a transceiver unit 121, an image capturing unit 122, a ray tracing unit 123, and a positioning unit 124. The transceiver unit 121 establishes communication between the photography module 120 and the drawing system 130. The image capturing unit 122, coupled to the transceiver unit 121, films the real environment 140 and outputs a background image to the drawing system 130, which generates a three-dimensional image 133 corresponding to the background image. The ray tracing unit 123, coupled to the transceiver unit 121, detects the moving path 141 of the light beam in the real environment 140 and generates a control signal CON1 to the drawing system 130, which draws an object 132 into the three-dimensional image 133.

The ray tracing unit 123 may detect the moving path 141 by image processing: for example, by analyzing the moving object across frames of the background image, it locates the position of the light spot projected into the real environment 140 within the background image, and thereby recovers the moving path 141 of the beam.
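The patent leaves the spot-detection algorithm unspecified. A common assumption for a projected laser dot is that it forms the brightest region of each frame, and the minimal sketch below follows that assumption; the frame format (grayscale NumPy arrays) and every name in it are illustrative, not part of the disclosure.

```python
# Minimal spot-tracking sketch for the ray tracing unit 123
# (assumed brightest-pixel approach; not specified by the patent).
import numpy as np

def track_spot(frames, threshold=240):
    """Recover the moving path 141 as a list of (row, col) spot positions.

    frames: iterable of 2D grayscale uint8 arrays from the image
            capturing unit 122.
    """
    path = []
    for frame in frames:
        idx = np.argmax(frame)                    # brightest pixel
        row, col = np.unravel_index(idx, frame.shape)
        if frame[row, col] >= threshold:          # skip frames with no spot
            path.append((int(row), int(col)))
    return path

# Synthetic example: a spot moving diagonally across three 480x640 frames.
frames = []
for i in range(3):
    f = np.zeros((480, 640), dtype=np.uint8)
    f[100 + 10 * i, 200 + 10 * i] = 255
    frames.append(f)
print(track_spot(frames))   # [(100, 200), (110, 210), (120, 220)]
```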
The positioning unit 124, coupled to the transceiver unit 121, locates the coordinates of the photography module 120 in space and transmits them to the drawing system 130. In one embodiment, the coordinates reported by the positioning units 113 and 124 are used together to compute the coordinates of the light spot in the real environment 140, improving the accuracy of the detected moving path 141, as sketched below.
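The patent states only that the two coordinate reports improve path accuracy, not how they are combined. One plausible reading is ray-ray triangulation: the spot lies near the intersection of the camera ray through the detected pixel and the emitter's beam ray. The sketch below implements that assumption only; deriving the unit ray directions from the module poses is omitted.

```python
# Hedged triangulation sketch: midpoint of the shortest segment between
# the camera ray (p1 + t*d1) and the emitter's beam ray (p2 + s*d2).
import numpy as np

def triangulate_spot(p1, d1, p2, d2):
    """Estimate the 3D spot position from two rays given as NumPy vectors.

    p1, d1: camera position (positioning unit 124) and ray direction
            toward the detected spot pixel.
    p2, d2: emitter position (positioning unit 113) and beam direction.
    Returns None when the rays are (nearly) parallel.
    """
    d1 = d1 / np.linalg.norm(d1)
    d2 = d2 / np.linalg.norm(d2)
    w0 = p1 - p2
    a, b, c = d1 @ d1, d1 @ d2, d2 @ d2
    d, e = d1 @ w0, d2 @ w0
    denom = a * c - b * b
    if abs(denom) < 1e-9:
        return None
    t = (b * e - c * d) / denom       # parameter along the camera ray
    s = (a * e - b * d) / denom       # parameter along the beam ray
    return (p1 + t * d1 + p2 + s * d2) / 2.0

# Example: two rays that intersect exactly at (5, 0, 0).
p = triangulate_spot(np.zeros(3), np.array([1.0, 0, 0]),
                     np.array([5.0, 5, 0]), np.array([0.0, -1, 0]))
print(p)   # [5. 0. 0.]
```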

When the drawing system 130 receives the control signal CON1 describing the path along which the beam moves through the real environment 140, it can draw the object 132 as a line segment, or as a polygon composed of several line segments. When the light-emitting module 110 receives the input command CMD2, the drawing system 130 extends the surface of the object in the three-dimensional image 133 according to the first control signal, for example extending a line segment into a plane or resizing a plane.

The light-emitting module 110 of this embodiment further includes a distance measuring unit 114 and an axial rotation detecting unit 115. The distance measuring unit 114 is coupled to the transceiver unit 111. When the light-emitting module 110 receives the input command CMD3, the distance measuring unit 114 generates a control signal CON2 to the drawing system 130 according to the change in the distance at which the beam strikes the real environment 140, and the drawing system 130 extends the object into a solid in the three-dimensional image 133, for example extruding a plane into a solid.

The axial rotation detecting unit 115 is coupled to the transceiver unit 111. When the light-emitting module 110 receives the input command CMD4, the axial rotation detecting unit 115 generates a control signal CON3 to the drawing system 130 according to the change in the aiming angle of the beam, and the drawing system 130 rotates the object in the three-dimensional image 133. When the light-emitting module 110 receives the input command CMD5, the drawing system 130 moves the position of the object in the three-dimensional image 133 according to both the moving path of the beam (control signal CON1) and the change in aiming angle (control signal CON3). In addition, when the light-emitting module 110 receives the input command CMD3, the axial rotation detecting unit 115 can generate the control signal CON3 according to the change in aiming angle, so that the drawing system 130 extends the object into a solid in the three-dimensional image 133.

Because the light-emitting module 110 communicates with the drawing system 130 through the transceiver unit 111, the drawing system 130 knows which input command the module 110 has received, and changes the state of the object according to the control signals it receives, as the sketch below illustrates.
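The behavior above amounts to a small dispatch in the drawing system 130: each input command selects which control signal drives which edit of the object. The sketch below expresses that mapping; the patent defines only the behavior, so the data structures, units, and parameter names here are assumptions.

```python
# Hedged sketch of command dispatch in the drawing system 130
# (assumed representation; the patent specifies behavior only).
from dataclasses import dataclass, field

@dataclass
class DrawnObject:
    vertices: list = field(default_factory=list)   # path points from CON1
    depth: float = 0.0                             # extrusion depth (CON2)
    angle: float = 0.0                             # rotation angle (CON3)
    offset: tuple = (0.0, 0.0, 0.0)                # object position

def handle(cmd, obj, con1=None, con2=None, con3=None):
    """Apply one input command CMD1-CMD5 to the drawn object.

    con1: a spot position (CMD1/CMD2) or displacement (CMD5) on path 141
    con2: change in beam-to-surface distance from the ranging unit 114
    con3: change in aiming angle from the axial rotation unit 115
    """
    if cmd == "CMD1" and con1 is not None:      # draw segment / polygon
        obj.vertices.append(con1)
    elif cmd == "CMD2" and con1 is not None:    # extend segment into plane
        obj.vertices.append(con1)
    elif cmd == "CMD3" and con2 is not None:    # extrude plane into solid
        obj.depth += con2
    elif cmd == "CMD4" and con3 is not None:    # rotate object
        obj.angle += con3
    elif cmd == "CMD5" and con1 is not None:    # move object along path
        obj.offset = tuple(o + d for o, d in zip(obj.offset, con1))
    return obj
```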
FIGS. 2A-2G illustrate drawing objects into a three-dimensional image with the input device according to an embodiment of the invention. Referring to FIG. 2A, the light-emitting module 110 may take the form of, for example, a gun or a pen, and emits a beam (e.g., laser light) into the real environment 140 upon receiving the input command CMD1. The user aims at the intended starting point and pulls the trigger (generating the input command CMD1) to emit the laser, then moves the projected spot across the environment to spot position L (the ray tracing unit 123 generating the control signal CON1), causing the drawing system 130 to draw a line segment 132a in the three-dimensional image 133.

Referring to FIG. 2B, the user aims at the intended starting point, pulls the trigger (CMD1) to emit the laser, and moves the spot position L (CON1) so that the drawing system 130 draws a rectangle 132b in the three-dimensional image 133.

Referring to FIG. 2C, the user aims at a drawn line segment, pulls the trigger (CMD1) to emit the beam, and presses the draw-surface button (CMD2). By moving the projected spot position L (CON1), the drawing system 130 extends the segment into a plane 132c in the three-dimensional image 133.

Referring to FIG. 2D, the user aims at a drawn plane, pulls the trigger (CMD1) to emit the beam, and presses the draw-solid button (CMD3). By changing the distance D at which the laser strikes the real environment 140 (the distance measuring unit 114 generating CON2), the drawing system 130 extrudes the plane into a solid 132d in the three-dimensional image 133.

Referring to FIG. 2E, the user aims at a drawn plane, pulls the trigger (CMD1) to emit the beam, and presses the draw-solid button (CMD3). By changing the aiming angle A of the laser (the axial rotation detecting unit 115 generating CON3), the drawing system 130 extends the plane into a solid 132e in the three-dimensional image 133.

Referring to FIG. 2F, the user aims at a drawn solid, pulls the trigger (CMD1) to emit the beam, and presses the rotate-object button (CMD4). By changing the aiming angle A (CON3), the drawing system 130 rotates the object 132f in the three-dimensional image 133.

Referring to FIG. 2G, the user aims at a drawn solid, pulls the trigger (CMD1) to emit the laser, and presses the move-object button (CMD5). By moving the projected spot position L (CON1) and changing the aiming angle A (CON3), the drawing system 130 moves the object 132g in the three-dimensional image 133.

As described with reference to FIGS. 2A-2G, the input device 100 controls the drawing system 130 to draw line segments, polygons, planes, solids, and other objects into the three-dimensional image 133, and can further control the drawing system 130 to move and rotate those objects.

It is worth noting that the input device 100 may further include a display (not shown in FIG. 1). This display can apply augmented-reality techniques to combine the object 132 in the three-dimensional image 133 with the real environment 140, letting the user see the drawn object in the real environment 140, although the invention is not limited in this regard.
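Such a display amounts to video see-through compositing: the drawn object is rendered into an overlay that is alpha-blended onto the camera frame. A minimal sketch follows, assuming the rendering step has already produced an object layer and an alpha mask; the patent does not describe the display pipeline.

```python
# Minimal compositing sketch for an augmented-reality display
# (assumed pipeline; the patent leaves the display unspecified).
import numpy as np

def composite(background, object_layer, alpha):
    """Blend a rendered object 132 over a frame of the real environment 140.

    background:   H x W x 3 uint8 camera frame
    object_layer: H x W x 3 uint8 rendering of the drawn object
    alpha:        H x W float mask in [0, 1]; 1 where the object is opaque
    """
    a = alpha[..., None]                      # broadcast over color channels
    blended = background * (1.0 - a) + object_layer * a
    return blended.astype(np.uint8)
```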
In summary, the input device of the above embodiments films the real environment with a photography module so that the drawing system can generate a corresponding three-dimensional image, and controls the drawing system to draw objects into that image by detecting the moving path of the emitted beam. The drawing system can also change the state of an object according to changes in the beam's moving path, striking distance, or aiming angle, for example extending an object's surface, extruding a plane into a solid, or moving or rotating an object, and augmented-reality techniques can further combine the drawn objects with the real environment. The user can thus grasp spatial scale effectively and design a space directly in the real environment.

Although the invention has been disclosed above by way of preferred embodiments, they are not intended to limit the invention. Anyone with ordinary skill in the art may make minor changes and refinements without departing from the spirit and scope of the invention; the scope of protection of the invention is defined by the appended claims.

[Brief Description of the Drawings]

FIG. 1 illustrates an input device according to an embodiment of the invention.

FIGS. 2A-2G illustrate drawing objects into a three-dimensional image with the input device according to an embodiment of the invention.

[Description of Main Reference Numerals]

100: input device
110: light-emitting module
111, 121: transceiver unit
112: light source
113: positioning unit
114: distance measuring unit
115: axial rotation detecting unit
120: photography module
122: image capturing unit
123: ray tracing unit
124: positioning unit
130: drawing system
133: three-dimensional image
132, 132a-132g: object
140: real environment
141: moving path
CMD1: input command
CON1: control signal

Claims (10)

X. Claims:

1. An input device for space design, adapted to a drawing system, comprising:
a light-emitting module, comprising:
a first transceiver unit for placing the light-emitting module in communication with the drawing system; and
a light source, coupled to the first transceiver unit, emitting a light beam into a real environment when the light-emitting module receives a first input command; and
a photography module, comprising:
a second transceiver unit for placing the photography module in communication with the drawing system;
an image capturing unit, coupled to the second transceiver unit, for filming the real environment and outputting a background image to the drawing system so that the drawing system generates a three-dimensional image corresponding to the background image; and
a ray tracing unit, coupled to the second transceiver unit, for detecting the moving path of the light beam in the real environment and generating a first control signal to the drawing system so that the drawing system draws an object into the three-dimensional image accordingly.

2. The input device of claim 1, wherein the light-emitting module further comprises:
a first positioning unit, coupled to the first transceiver unit, for locating a first coordinate of the light-emitting module in the space and transmitting the first coordinate information to the drawing system.

3. The input device of claim 1, wherein the photography module further comprises:
a second positioning unit, coupled to the second transceiver unit, for locating a second coordinate of the photography module in the space and transmitting the second coordinate to the drawing system.

4. The input device of claim 1, wherein when the light-emitting module receives a second input command, the drawing system extends the surface of the object in the three-dimensional image according to the first control signal.

5. The input device of claim 1, wherein the light-emitting module further comprises:
a distance measuring unit, coupled to the first transceiver unit, generating a second control signal to the drawing system according to a change in the distance at which the light beam strikes the real environment when the light-emitting module receives a third input command, so that the drawing system extends the object into a solid in the three-dimensional image.

6. The input device of claim 1, wherein the light-emitting module further comprises:
an axial rotation detecting unit, coupled to the first transceiver unit, generating a third control signal to the drawing system according to a change in the aiming angle of the light beam in the real environment when the light-emitting module receives a fourth input command, so that the drawing system rotates the object accordingly.

7. The input device of claim 6, wherein when the light-emitting module receives a fifth input command, the drawing system moves the object in the three-dimensional image according to the first and third control signals.

8. The input device of claim 1, wherein the light-emitting module further comprises:
an axial rotation detecting unit, coupled to the first transceiver unit, generating a third control signal to the drawing system according to a change in the aiming angle of the light beam when the light-emitting module receives the third input command, so that the drawing system extends the object into a solid.

9. The input device of claim 1, wherein the light source is one of a point light source, a line light source, and a surface light source, or a combination thereof.

10. The input device of claim 1, further comprising:
a display for showing a picture of the object combined with the real environment.
TW97137757A 2008-10-01 2008-10-01 Input device of space design TWI379217B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
TW97137757A TWI379217B (en) 2008-10-01 2008-10-01 Input device of space design

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
TW97137757A TWI379217B (en) 2008-10-01 2008-10-01 Input device of space design

Publications (2)

Publication Number Publication Date
TW201015385A true TW201015385A (en) 2010-04-16
TWI379217B TWI379217B (en) 2012-12-11

Family

ID=44830012

Family Applications (1)

Application Number Title Priority Date Filing Date
TW97137757A TWI379217B (en) 2008-10-01 2008-10-01 Input device of space design

Country Status (1)

Country Link
TW (1) TWI379217B (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TWI413034B (en) * 2010-07-29 2013-10-21 Univ Nat Central The System of Mixed Reality Realization and Digital Learning


Also Published As

Publication number Publication date
TWI379217B (en) 2012-12-11

Similar Documents

Publication Publication Date Title
Molyneaux et al. Interactive environment-aware handheld projectors for pervasive computing spaces
US20210011556A1 (en) Virtual user interface using a peripheral device in artificial reality environments
TWI454968B (en) Three-dimensional interactive device and operation method thereof
Kato et al. Marker tracking and hmd calibration for a video-based augmented reality conferencing system
US9142062B2 (en) Selective hand occlusion over virtual projections onto physical surfaces using skeletal tracking
JP2019149202A5 (en)
JP2014533347A (en) How to extend the range of laser depth map
JP2013521544A (en) Augmented reality pointing device
JP2010217719A (en) Wearable display device, and control method and program therefor
CN101866243A (en) Three-dimensional space touch control operation method and hand gestures thereof
US20180113596A1 (en) Interface for positioning an object in three-dimensional graphical space
WO2018176773A1 (en) Interactive system for three-dimensional space and operation method therefor
TW201439813A (en) Display device, system and method for controlling the display device
Lee et al. Tunnelslice: Freehand subspace acquisition using an egocentric tunnel for wearable augmented reality
TW201015385A (en) Input device of space design
Tait et al. A projected augmented reality system for remote collaboration
Adeen et al. RemoAct: portable projected interface with hand gesture interaction
Song et al. A crowdsensing-based real-time system for finger interactions in intelligent transport system
VanWaardhuizen et al. Table top augmented reality system for conceptual design and prototyping
TWI253005B (en) 3D index device
Reddy et al. IIMR: A Framework for Intangible Mid-Air Interactions in a Mixed Reality Environment
Yuan Visual tracking for seamless 3d interactions in augmented reality
Ap Cenydd et al. Using a kinect interface to develop an interactive 3d tabletop display
TW202325013A (en) Method for performing interactive operation upon a stereoscopic image and system for displaying stereoscopic image
Yoshida et al. Twinkle: interface for using handheld projectors to interact with physical surfaces

Legal Events

Date Code Title Description
MM4A Annulment or lapse of patent due to non-payment of fees