TWM409872U - Care interacting instrument - Google Patents

Care interacting instrument

Info

Publication number
TWM409872U
TWM409872U (application TW100203504U)
Authority
TW
Taiwan
Prior art keywords
unit
display
response
motion
track
Prior art date
Application number
TW100203504U
Other languages
Chinese (zh)
Inventor
Jin-Lun Lai
Hai-Zhou Tian
Original Assignee
Laio Li Shi
Jin-Lun Lai
Lai Jin Ding
Hai-Zhou Tian
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Laio Li Shi, Jin-Lun Lai, Lai Jin Ding, and Hai-Zhou Tian
Priority to TW100203504U priority Critical patent/TWM409872U/en
Publication of TWM409872U publication Critical patent/TWM409872U/en

Landscapes

  • Processing Or Creating Images (AREA)

Description

M409872

V. Description of the Utility Model

[Technical Field of the Utility Model]

This creation relates to entertainment equipment, and in particular to an interactive device.

[Prior Art]

Some ball-sport venues, such as baseball batting ranges and golf driving ranges, provide facilities where ball-sport enthusiasts can practice and display their skills. Such venues usually require extensive grounds to accommodate many users, as well as the space a ball needs to fly through the air after being struck. Because venue construction is expensive, and the balls provided for practice wear out easily and must continually be replaced at further cost, these expenses are passed on to the consumer, so batting practice at such venues is far from cheap.

In addition, a consumer who wishes to practice batting must travel to such a venue, and outdoor practice further depends on the weather.

[Summary of the Utility Model]

This creation relates to an interactive device that lets a user interact with virtual objects output by a display device, so as to play interactive games such as ball sports.

According to one implementation of this creation, the interactive device allows the user to carry out interactive activities, such as batting practice, with a display device. In one embodiment, the interactive device includes an image capture module, a processing module, and a drawing unit. The image capture module captures a plurality of images of a response object in order to identify a second motion trajectory of the response object. The processing module includes a trajectory generation unit, a trajectory recognition unit, an intersection point calculation unit, and a reaction trajectory calculation unit. The trajectory generation unit is coupled to the display device and generates a first motion trajectory of a display object. The trajectory recognition unit receives the images of the response object to identify the second motion trajectory of the response object. The intersection point calculation unit, coupled to the trajectory generation unit and the trajectory recognition unit, calculates the intersection coordinates at which the display object, moving along the first motion trajectory, meets the response object, moving along the second motion trajectory. The reaction trajectory calculation unit, coupled to the intersection point calculation unit, calculates from the first motion trajectory, the second motion trajectory, and the intersection coordinates the reaction motion trajectory of the display object after it meets the response object. The drawing unit draws the stereoscopic images of the display object along the first motion trajectory and the reaction motion trajectory.

In short, the interactive device provided by this creation lets the user interact, with any response object, with the stereoscopic image of the display object presented by the display device; the interactive device computes the reaction the display object should have and visualizes (visualization) it, thereby achieving the effect of a game.

[Embodiments]

The following embodiments describe the interactive device provided by this creation, which presents a virtual display object for the user to swing at with a response object, judges through intersection analysis whether the response object and the virtual display object actually meet, and then simulates the corresponding reaction. The user can thus carry out batting practice for ball sports such as baseball, tennis, or badminton outside a real sports venue, play other games based on a similar concept, or use the device as a control interface to communicate and interact directly with the displayed content and issue commands.

The embodiments of this creation therefore provide an interactive device that overcomes the venue and space limitations described above and offers enthusiasts a more flexible and effective way to practice their swing.

[Embodiment of the Interactive Device]

FIG. 1 shows the usage environment of the interactive device of this creation. A user 5 can stand in front of a display device 1 holding a response object 2, such as a baseball bat or a tennis racket, and swing at a display object 4, such as the image of a baseball or a tennis ball. This embodiment is described using a baseball bat and a baseball.

The interactive device 3 includes a processing module (shown in FIG. 2) that controls the display device 1 to output continuous stereoscopic images, presenting a display object 4 that appears to fly toward the user 5. Image capture devices continuously capture multiple images of the response object 2, from which the movement of the response object 2, and thus the force of the user 5's swing, can be calculated. The interactive device 3 receives the images captured by the image capture module 30, computes the motion trajectory of the response object 2, and, together with the trajectory of the display object 4 output under the control of the processing module, determines whether the display object 4 and the response object 2 meet in three-dimensional space.

Although the user 5 swings at a virtual image, through the computation of the interactive device 3 and the picture presented by the display device 1, the user obtains an effect like that of striking a physical object.

The display device 1 may be an autostereoscopic display using, for example, lenticular lens or parallax barrier technology, so that the user 5 perceives stereoscopic images with depth. The display device 1 may also be an ordinary display screen, in which case the user 5 must wear a suitable auxiliary device, such as red-blue filter glasses or polarized glasses, to view the display device 1 and perceive stereoscopic images with depth. The interactive device 3 may be placed near the display device 1 or integrated into the display device 1.

FIG. 2 is a block diagram of an interactive device provided by this embodiment of the creation. The interactive device 3 includes an image capture module 30, a calibration module 32, a processing module 34, a position recognition unit 35, a drawing unit 36, and a signal transmission unit 38. The calibration module 32, the processing module 34, and the position recognition unit 35 are each coupled to the image capture module 30; the drawing unit 36 is coupled to the calibration module 32 and the processing module 34; and the signal transmission unit 38 transmits signals to the response object 2. The image capture module 30 includes two or more image capture devices (such as video cameras or still cameras), so that it can simultaneously capture, from different viewing angles, multiple images of the response object 2 as it is swung.

The interactive device 3 can operate in two modes, a calibration mode and an interactive mode, which are described in turn below.

[Calibration Mode]

Referring again to FIG. 2, the calibration module 32 includes an object coordinate output unit 320, a response coordinate calculation unit 322, and a coordinate correspondence unit 324. In the calibration mode, the interactive device 3 is calibrated and initialized for the response object 2 used by the user, so that with different users and different response objects it can still accurately determine whether the display object meets the response object 2.

The object coordinate output unit 320 of the calibration module 32 is coupled to the drawing unit 36 and provides the object coordinates of the display object according to data recorded in the interactive device 3.

The drawing unit 36 draws the stereoscopic image of the display object according to the object coordinates and outputs it. The object coordinates are the three-dimensional coordinates of the display object relative to the display plane of the display device 1.

Referring to FIG. 2 and FIG. 3, FIG. 3 is a schematic diagram of the object coordinates of the display object.
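As the following paragraphs describe, the stereoscopic effect is produced by drawing the display object in two planar images offset from one another by the disparity zD. A minimal sketch of that mapping is given below; it is an illustration only, and the function name is hypothetical rather than part of this creation:

```python
# Hypothetical sketch: 2D draw positions for a stereo pair, derived from
# object coordinates (xD, yD, zD). zD is the disparity between the two
# planar images; positive values place the object in front of the focal plane.

def stereo_pair(xD, yD, zD):
    """Return the display-plane positions of the display object in the
    first and second planar images."""
    first = (xD, yD)          # reference (first) image position
    second = (xD + zD, yD)    # second image, shifted horizontally by the disparity
    return first, second

print(stereo_pair(120.0, 80.0, 15.0))   # → ((120.0, 80.0), (135.0, 80.0))
```

A larger zD therefore makes the object appear to float farther out of the display plane.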

The planar images output by the display device 1 are presented on a display plane 10, and every pixel of a planar image has two-dimensional coordinates: taking the upper-left corner of the display plane 10 as the origin of both axes, the horizontal direction is the xD axis and the vertical direction is the yD axis. A stereoscopic image further includes a depth component, the zD axis. To present the effect of a stereoscopic image, the display device 1 outputs two planar images containing the same display object at different positions, and the disparity of the display object 4 between the two viewing angles produces the perception of depth. In this example, the magnitude along the zD axis is the disparity of the display object between the two planar images output by the display device 1. The object coordinate output unit 320 therefore outputs, according to the data recorded in the memory unit, a set of object coordinates to control the drawing unit 36 to draw the stereoscopic image of the display object 4.

For example, suppose the display object 4 is a sphere. The object coordinate output unit 320 outputs object coordinates (xD1, yD1, zD1) to the drawing unit 36, and the drawing unit 36 generates two planar images from these coordinate values, a first image and a second image. Here xD1 and yD1 are the two-dimensional coordinates of the sphere on the display plane 10 in the first image, and zD1 is the disparity of the sphere between the first and second images, where positive and negative values denote positions in front of and behind the focal plane respectively. The two-dimensional coordinates of the sphere on the display plane 10 in the second image can therefore be (xD1+zD1, yD1). The stereoscopic image composed of the first image and the second image appears on the display device 1 as a three-dimensional sphere, producing the visual effect of a sphere floating outside the display plane, and the coordinates of this floating stereoscopic sphere are (xD1, yD1, zD1).

Referring again to FIG. 2, the response coordinate calculation unit 322 is coupled to the image capture module 30 and receives the images of the response object 2 captured by multiple cameras from different viewing angles. From the disparity of the response object 2 between the different images, it computes the response coordinates of the response object 2, which are three-dimensional coordinates relative to the capture plane defined by the image capture module 30.

Referring to FIG. 2 and FIG. 4, FIG. 4 is a schematic diagram of the response coordinates of the response object. The image capture module 30 obtains images of the response object 2 from different viewing angles with multiple cameras. For example, it may include two cameras set at the same horizontal height, whose common vertical plane defines the capture plane 300. The response coordinate calculation unit 322 takes the planar position of the response object 2 from the two-dimensional coordinates in one of the two images (for example, using one image as the reference), and obtains the depth zH of the response object 2 relative to the capture plane 300, that is, its distance from the camera lenses, from the disparity of the response object between the two images. The response coordinates, for example (xH1, yH1, zH1), are thereby obtained.

To make the response coordinates of the response object 2 easy to identify, a contact point 22 may be marked in advance on the response object 2 (the contact point 22 being, for example, a point or shape of a specific color or pattern), or a contact line (not shown), so that the response coordinate calculation unit 322 can compute coordinates representing the entire response object 2 from the position of the contact point 22 in the images. For ease of illustration, the following description takes the contact point 22 as the example.

The coordinate correspondence unit 324 serves to map the object coordinates (xD, yD, zD) and the response coordinates to one another, so as to establish the spatial correspondence between the response coordinates, referenced to the capture plane 300, and the object coordinates, referenced to the display plane 10.

In the calibration mode, the user, observing the position in three-dimensional space of the display object shown by the display device 1, moves the response object 2 to the position where it appears to touch the display object, so that the image capture module 30 can capture images of the response object 2. Referring to FIG. 2 and FIG. 5, FIG. 5 is a schematic diagram of the display object meeting the response object. Taking baseball as the example, when the display object 4 (the sphere) shown by the display device 1 is touched by the response object 2 (the bat), the object coordinates (xD1, yD1, zD1) and the response coordinates (xH1, yH1, zH1) in fact correspond to the same point in three-dimensional space. Under the premise that the object coordinates and the response coordinates are linearly related, the coordinate system of the display plane and that of the capture plane can be assumed to have the following correspondence f:

xH = c1·xD + c2·yD + c3·zD + c4
yH = c5·xD + c6·yD + c7·zD + c8
zH = c9·xD + c10·yD + c11·zD + c12

Since twelve variables, c1 to c12, must be solved to obtain the linear correspondence between the two coordinate systems, the object coordinate output unit 320 can, near the object coordinates (xD1, yD1, zD1), sequentially output three further sets of object coordinates, (xD2, yD2, zD2) to (xD4, yD4, zD4), so that the display device 1 displays the stereoscopic image of the display object 4 at the corresponding positions. The user then, following the displayed positions, touches the contact point 22 of the response object 2 to each stereoscopic image of the display object 4, so that the response coordinate calculation unit 322 likewise computes the other three sets of response coordinates, (xH2, yH2, zH2) to (xH4, yH4, zH4), from the images captured by the image capture module 30. From the four pairs of corresponding coordinates, the coordinate correspondence unit 324 obtains twelve simultaneous linear equations and solves for the values of the variables c1 to c12, establishing the correspondence f between the sphere 4 and the response object 2 in three-dimensional space. Although the correspondence illustrated above is assumed to be linear, those of ordinary skill in the art may adopt other, more elaborate computational models, solving for more variables to obtain the correspondence between the response coordinates of the response object 2 and the object coordinates of the display object 4.

Referring to FIG. 6A and FIG. 6B, these two figures show the corresponding positions of the response object and the display object when the user stands at different positions. Standing at different positions in front of the display device 1, the user observes through the disparity an intersection point of the response object and the display object that differs in three-dimensional space according to the viewing angle and distance. Therefore, even if the object coordinate output unit 320 reads and outputs the same object coordinates (xD, yD, zD), when the user stands at different positions, the position at which the user visually judges the contact point 22 of the response object 2 to meet the display object 4 may differ.

For example, referring first to FIG. 6A: when the user faces the display device 1, is positioned at a measurement point L1 (not drawn in FIG. 6A), and observes the display object 4, the user can move the response object 2 so that its contact point 22 touches the stereoscopic image of the display object 4, allowing the response coordinate calculation unit 322 to compute the response coordinates (xHL11, yHL11, zHL11) of the response object 2 (corresponding to the contact point 22 of FIG. 6A). Referring then to FIG. 6B: when the user moves to a measurement point L2 (not drawn in FIG. 6B) behind the measurement point L1, the object coordinates of the display object 4 do not change, and the user positioned at L2 perceives the display object 4 at the same distance from himself as when positioned at L1. However, the response coordinates (xHL21, yHL21, zHL21) at which the user moves the response object 2 so that the contact point 22 touches the display object 4 are farther from the display device than the response coordinates (xHL11, yHL11, zHL11) corresponding to measurement point L1.

In other words, when the user is at measurement point L1 or L2, the display object 4 seen from the user's viewing angle is displayed at the same apparent position (for example, at L1 the display object 4 appears half a meter from the user, and at L2 it still appears half a meter from the user), yet the response coordinates at which the user touches the same display object 4 with the response object 2 at L1 and at L2 differ (the (xHL11, yHL11, zHL11) and (xHL21, yHL21, zHL21) above). If the correspondence between the response coordinates and the object coordinates is not changed according to the measurement point at which the user stands, the erroneous result may occur that the user believes the display object has been touched while the interactive device 3 judges that no intersection occurred. It follows that the coordinate correspondences between the response object 2 and the display object 4 produced at two different measurement points should be adjusted accordingly. Note that although measurement points L1 and L2 at different front-rear positions are taken as the example, this is not a limitation: measurement points differing left and right likewise have different coordinate correspondences.

To let the user interact with the display object 4 using the response object 2 at any position in front of the display device 1, the calibration mode can further determine the user's position at each measurement point (such as L1 and L2): the position recognition unit 35 receives the images captured by the image capture module 30 and determines the three-dimensional coordinates of the user's position at each measurement point. At each measurement point, the object coordinate output unit 320 displays the display object 4 several times (four times in the example above) according to the same sets of object coordinates; after the corresponding touches, the coordinate correspondence unit 324 receives the three-dimensional coordinates of the user's measurement point and computes the relevant variables (c1 to c12 above), so that the coordinate transformation relationship f(L) at each measurement point is obtained. In other words, with the four sets of object coordinates (xD1, yD1, zD1) to (xD4, yD4, zD4) and the corresponding response coordinates (xHL11, yHL11, zHL11) to (xHL14, yHL14, zHL14), the coordinate correspondence unit 324 computes the correspondence f(L1) between the response object 2 and the display object when the user is positioned at measurement point L1. Likewise, with the same four sets of object coordinates (xD1, yD1, zD1) to (xD4, yD4, zD4) and the corresponding response coordinates (xHL21, yHL21, zHL21) to (xHL24, yHL24, zHL24), the correspondence f(L2) for measurement point L2 is computed, and so on. The four sets of object coordinates used for comparison at measurement point L1 are the same as those used at measurement point L2, so that the different response coordinates with which the user, at different positions, touches the display object 4 displayed at the same object coordinates can be captured and recorded. The point used to determine the user's position may be the user's eyes or a specific position on the body.

For ease of understanding, this embodiment likewise assumes a linear relationship between the coordinate correspondences at different positions, but those of ordinary skill in the art may adopt other, more elaborate computations to obtain the coordinate correspondence f(L) between the response object 2 and the display object 4 at different measurement points, that is, at the user's different positions.
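The twelve variables c1 to c12 above amount to a small linear system per response-coordinate axis, solvable from the four calibration pairs. The sketch below illustrates this under the linearity assumption stated in the description; the variable names and the synthetic calibration data are assumptions for demonstration, not values from this creation:

```python
import numpy as np

# Four object coordinates (xD, yD, zD) shown during calibration.
obj = np.array([
    [0.0, 0.0, 0.0],
    [1.0, 0.0, 0.0],
    [0.0, 1.0, 0.0],
    [0.0, 0.0, 1.0],
])

# The four response coordinates the user's touches would yield. Here they
# are generated from a known linear map so the solved result can be checked.
true_c = np.array([
    [2.0, 0.0, 0.0, 5.0],   # xH = 2*xD + 5  (c1..c4)
    [0.0, 2.0, 0.0, 6.0],   # yH = 2*yD + 6  (c5..c8)
    [0.0, 0.0, 2.0, 7.0],   # zH = 2*zD + 7  (c9..c12)
])
A = np.hstack([obj, np.ones((4, 1))])   # each row is [xD, yD, zD, 1]
resp = A @ true_c.T                     # simulated response coordinates

# Four touches give four equations per axis: solve A @ c = resp for all
# three axes at once (columns of `coeffs` are c1..c4, c5..c8, c9..c12).
coeffs = np.linalg.solve(A, resp)

def f(p):
    """The calibrated correspondence: object coordinates -> response coordinates."""
    return np.append(p, 1.0) @ coeffs

print(f(np.array([0.5, 0.5, 0.5])))   # → [6. 7. 8.]
```

With noisy real measurements, or more than four touches, `np.linalg.lstsq` would be the natural replacement for the exact solve.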

In this embodiment, once the coordinate correspondences f(L) between the response object 2 and the display object 4 have been computed for at least two different positions, the user may thereafter swing at the display object 4 from any position in front of the display device 1. The coordinate correspondence unit 324 can then, from the user's position captured by the image capture module 30 and the linear relationship among the coordinate correspondences obtained through the computation above, estimate a further coordinate correspondence between the display object 4 and the response object 2 for the position where the user stands. For example, when the user stands midway between L1 and L2 (call this point L3), linear interpolation can be used to estimate, from the known correspondences f(L1) and f(L2), the intermediate correspondence f(L3) between the response object 2 and the display object 4 at measurement point L3.

In summary, the linear correspondence between several pairs of response coordinates and object coordinates established by the coordinate correspondence unit 324 can be used to judge, at a given measurement point, the manner in which the response object 2 contacts the display object 4.
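The estimate of f(L3) from f(L1) and f(L2) can be sketched as a linear interpolation of the coefficient sets; the coefficient values below are invented for illustration, and a real device would use its calibrated values:

```python
import numpy as np

def interpolate_correspondence(c_L1, c_L2, t):
    """Estimate the coefficients of f at a position between two calibrated
    measurement points. t = 0 gives f(L1), t = 1 gives f(L2); standing
    midway between them, as in the example, corresponds to t = 0.5."""
    return (1.0 - t) * c_L1 + t * c_L2

c_L1 = np.array([2.0, 0.0, 0.0, 5.0])   # c1..c4 calibrated at L1 (example values)
c_L2 = np.array([4.0, 0.0, 0.0, 9.0])   # c1..c4 calibrated at L2 (example values)
c_L3 = interpolate_correspondence(c_L1, c_L2, 0.5)
print(c_L3)   # → [3. 0. 0. 7.]
```

The same interpolation would be applied to the c5..c8 and c9..c12 coefficient sets.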

對應關係,則可動X躺同—制者在^同定位位置時, 回應物件2與齡物件4彳_的模式。#互動式f置 互動模式下運㈣,若回應物件2欲接聰㈣置 不出來的顯純件切置3即可計翻—時間點的 物件座標與回應座標是否與座標對應單元324對 座標對應_減,而_回應物件2的接觸點2°i 觸到所述顯示物件4。 π [互動模式] 請繼續,關2,處理模組Μ包括執跡計算單元· 軌跡產生単7G 342、交會點計算單元344、通知單元祕 】3/28 ,以及轨跡辨識單元348。 請同時參照圖2與圖7,圖7為回應物件與瞥 跡之4圖。當互動式裝置3在互動模式下運作時,2 ,組34的_產生單元342可根據記憶單元(未給於二 ::攄7广錄的資料或隨機地產生一第一運動軌二^, 亚根據弟一運動執跡控制繪圖單 二立體影像並輸出到顯示裝置卜使^示 ==著第一運動軌跡運動的顯示物件= :速度―件4在第—運動軌跡“= ,使用示裝置1輸出具有立體感的顯示物件4時 使用者5不論從顯示裝置i 了 的影像都會相對於使用者5維持二=看:頁:物件4 面對顯示裝置i站立於^ 问一位置。例如使用者5 的顯示物件4後,使用^ 5二看到顯示在使用者5右方 動,所觀看到_干物^即使面對顯示裝置1朝右方移 顯示物件二仍然會與移動前相同, 示物件4顯示在顯示裝相同的距離。故每一次顯 使顯示物件4以第—運^’若轨跡產生單幻42欲 5的移練態,依日需要參照使用者 像_取模組30可即护枸讯爻.,,、員不物件4的物件座標。影 軍元35根據影像辨的影像,並由位置辨識 目艮晴位置)。 有5的疋位位置(如使用者5的 軌跡產生單元 位位置以及第—運動轨跡識單元35所辨識出的定 02即時運算顯示物件4相對應 14/28 M409872 於使用者之定純置的物件座標。藉此 物件4觀看起來’會隨著’二 近使用者D。以上述使用者5 妁扣動而接 所根據的第-運動執跡62假孟執跡產生單元342Corresponding relationship, the movable X lies the same as the mode in which the producer responds to the object 2 and the aged object 4彳_. #互动式f定互动模式下运(4), if the response object 2 wants to connect to Cong (4) can not be placed out of the pure piece cut 3 can count - the object coordinate at the time point and the response coordinate with the coordinate corresponding unit 324 to the coordinates Corresponding to _ minus, the contact point 2 θ of the response object 2 touches the display object 4. π [interaction mode] Please continue, close 2, the processing module Μ includes the trajectory calculation unit trajectory generation 単 7G 342, the intersection point calculation unit 344, the notification unit secret 】 3/28, and the trajectory identification unit 348. Please refer to FIG. 2 and FIG. 7 at the same time. FIG. 7 is a diagram of the response object and the stalk. When the interactive device 3 operates in the interactive mode, 2, the _ generating unit 342 of the group 34 can generate a first motion track according to the memory unit (not given to the data recorded in the second:: 摅7). 
According to the movement of the younger brother, the control unit draws a single stereo image and outputs it to the display device. The display object == the first moving track movement display object =: speed - the piece 4 in the first motion track "=, the display device 1 When the display object 4 having the stereoscopic effect is output, the image of the user 5 regardless of the display device i is maintained relative to the user 5. 2. The page: the object 4 stands facing the display device i at a position. For example, After displaying the object 4 of the user 5, using the ^5 2 to see the display on the right side of the user 5, the viewed object _ dry matter ^ even if facing the display device 1 to the right to move the object 2 will still be the same as before the move, The object 4 is displayed at the same distance in the display. Therefore, each time the display object 4 is caused to generate a single illusion 42 to 5 in the first trajectory, the user needs to refer to the user image _ takeout module 30 It can be the guardian of the 枸 爻.,,, the object coordinates of the object 4 The element 35 is based on the image identified by the image, and is identified by the position of the position. There are 5 position positions (such as the position of the user 5 to generate the unit position and the position recognized by the first motion path recognition unit 35) The instant computing display object 4 corresponds to the object coordinate of 14/28 M409872 which is set by the user. By this object 4, it will be 'followed' with the user 2, and the user 5 妁According to the first-exercise track 62 false monk obstruction generating unit 342

轴與”由的執跡,當使用顯示平面1。之X 所產生_'示物件4的物件座標 342 漸而朝向使用者5的方向靠近,換^上^^間經過逐The axis and the "deformation of the object, when using the display plane 1. X produced by the object coordinate 342 of the object 4 is gradually approaching the direction of the user 5, changing ^^^^

件座標X轴數值越來越小。藉此,、;:用;=二_ 時’才不致於產生顯示物件4始終與制者5間隔二2 距離而無法使回應物件2接_顯示物件4的問題。“ 單具體例示說明,當顯示裝置1根據繪圖 早το 36攸制以顯示出以第—運動轨跡運動的棒球的立The coordinate of the coordinate X axis is getting smaller and smaller. Therefore, the use of "==2" does not cause the problem that the display object 4 is always spaced from the maker 5 by two distances and the response object 2 cannot be connected to the display object 4. "Single specific illustration, when the display device 1 is based on the drawing, it is displayed to show the movement of the baseball moving in the first motion trajectory.

體影像時’制者即可贱轉置丨觀看縣是從顯示裝 置1向使用者投擲而來的虛擬棒球。此時,使用者可握持 回應物件2 ’即轉’向虛擬的球體影像揮擊。影像掏取模 組30可持續擷取球棒的影像,並傳送到軌跡辨識單元3牝 。由於球棒會根據使用者的揮擊動作,而隨著時間經過產 生位移,因此,執跡辨識單元348可依照影像擷取模組3〇 在不同時間所擷取到的多個影像判斷球棒的位移,特別是 球棒上預定的接觸點22的位移距離及方向,以辨識出回應 物件2運動的第二運動執跡64。 父會點计异單元344耗接執跡產生單元342及執跡辨 識單元348,並分別根據轨跡產生單元342產生的第一運動 執跡62及轨跡辨識單元348計算出的第二運動軌跡64計 鼻顯示物件4及回應物件2交會時,該交會點I的交會座標 15/28 钟j跡產生單元342和執跡辨識單元348可分別包括有 早元(未繪於圖2),因此執跡產生單元產生第一In the case of a body image, the system can be placed in a state where the county is a virtual baseball that is thrown from the display device 1 to the user. At this point, the user can hold the response object 2 'turn' to swipe the virtual sphere image. The image capture module 30 can continuously capture the image of the bat and transmit it to the track recognition unit 3牝. Since the bat will be displaced according to the user's swipe action, the trace recognition unit 348 can judge the bat according to the plurality of images captured by the image capture module 3 at different times. The displacement, in particular the displacement distance and direction of the predetermined contact point 22 on the bat, identifies a second motion trace 64 that responds to the movement of the object 2. The parent point counting unit 344 consumes the track generating unit 342 and the track identifying unit 348, and respectively calculates the second motion track according to the first motion track 62 and the track recognizing unit 348 generated by the track generating unit 342. When the 64-meter display object 4 and the response object 2 meet, the intersection coordinates 15/28 clock trace generation unit 342 and the trace recognition unit 348 of the intersection point I may include early elements (not shown in FIG. 2), The execution generation unit generates the first
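The frame-to-frame displacement recognition described above can be sketched in a few lines. This is only an illustrative model, not the patent's implementation: the sampling format and the function name are assumptions, and the trajectory recognition unit 348 is taken to receive a time-stamped 3D position of the contact point 22 in every captured frame, so that the second motion trajectory 64 is approximated by per-frame displacement vectors.

```python
def second_motion_trajectory(samples):
    """Estimate the response object's motion (second motion trajectory 64)
    from time-stamped positions of its contact point 22.

    samples: list of (t, (x, y, z)) tuples, one per frame captured by the
    image capture module 30. Returns a list of (t, velocity_vector)
    segments, one per consecutive pair of frames.
    """
    segments = []
    for (t0, p0), (t1, p1) in zip(samples, samples[1:]):
        dt = t1 - t0
        # Displacement divided by elapsed time gives the segment velocity.
        v = tuple((b - a) / dt for a, b in zip(p0, p1))
        segments.append((t0, v))
    return segments

# A straight swing moving 0.2 units along -x every 0.1 time units:
samples = [(0.0, (4.0, 1.0, 0.0)), (0.1, (3.8, 1.0, 0.0)), (0.2, (3.6, 1.0, 0.0))]
```

A real implementation would first extract the contact point's pixel position from each image and map it into the response coordinate system; here that step is assumed done.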

Thus, when the trajectory generation unit 342 outputs the first motion trajectory 62 for display and when the trajectory recognition unit 348 receives the plurality of images of the response object 2 captured by the image capture module 30, the time at which each image of the display object 4 is output and the time at which each image of the response object 2 is captured can be recorded respectively. The intersection point calculation unit 344 then compares the object coordinate positions of the display object 4 with the response coordinates of the response object 2, using the correspondence between the two coordinate systems computed by the coordinate correspondence unit 324, to determine whether the ball moving along the first motion trajectory 62 and the bat moving along the second motion trajectory 64 meet at the same point at the same time. If they do, the intersection coordinate and the intersection time are obtained together. If the first motion trajectory 62 and the second motion trajectory 64 do not intersect, for example when the two paths have no intersection at all, or when they cross the same point but at different time points, the notification unit 346 coupled to the intersection point calculation unit 344 generates a miss message, which may include a corresponding scene and/or related strike-record data, and outputs it to the display device 1.

When the first motion trajectory 62 and the second motion trajectory 64 do produce an intersection at the same time point, the trajectory calculation unit 340 uses the intersection coordinate, together with the correspondence obtained by the calibration module 32, to calculate the reaction motion trajectory 66 that the display object 4 should follow after being struck, that is, after the display object 4 and the response object 2 meet. The reaction motion trajectory 66 includes the reaction distance and the reaction direction of the display object 4, both of which are affected by the response force of the response object 2, the speed of the display object 4 when it is hit, and the incident angle at which the display object 4 contacts the response object 2. Since the simulated strike is not a real-world collision, it can be calculated with a simplified model in which the impulse produced by the swing of the response object 2 is received entirely by the display object 4. Moreover, since the first motion trajectory 62 of the display object 4 is produced by the interactive device 3 from data recorded in the memory unit (not shown in FIG. 2), the mass m0 of the display object 4, as well as its initial velocity v0 and acceleration a0, can be defined in advance.
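A toy version of the hit/miss decision can make the two conditions explicit: the trajectories must pass through (nearly) the same point at (nearly) the same time. The tolerances and the data layout are assumptions for illustration only; both tracks are taken to be already expressed in one coordinate system via the correspondence from the coordinate correspondence unit 324.

```python
def find_intersection(track1, track2, pos_tol=0.1, time_tol=0.05):
    """Return the intersection coordinate if the display object's first
    motion trajectory 62 and the response object's second motion
    trajectory 64 pass through nearly the same point at nearly the same
    time; otherwise return None (a miss, which the notification unit 346
    would report). Each track is a list of (t, (x, y, z)) samples.
    """
    for t1, p1 in track1:
        for t2, p2 in track2:
            if abs(t1 - t2) > time_tol:
                continue  # same place at a different time is still a miss
            dist = sum((a - b) ** 2 for a, b in zip(p1, p2)) ** 0.5
            if dist <= pos_tol:
                return p1
    return None
```

Note that crossing paths alone is not enough: the time filter rejects the case where the bat passes through the ball's path before or after the ball arrives, which is exactly the "same point, different time" miss described above.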

Suppose that, through the settings of the trajectory generation unit 342 and the calculations of the trajectory recognition unit 348, it is determined that the display object 4 meets the response object 2 after the display object 4 has been output for t1 units of time and the response object 2 has been swung for t2 units of time, and that during those t2 units of time the position of the response object 2 moves from a starting point P (x_HP, y_HP, z_HP) to the intersection point I (x_HI, y_HI, z_HI). The moving distance s of the response object 2 is then computed from the response coordinates of the starting point P and the intersection point I:

s = √((x_HP − x_HI)² + (y_HP − y_HI)² + (z_HP − z_HI)²)

With the moving distance s, the acceleration a1 of the response object 2 can further be obtained from the distance formula for uniformly accelerated motion, s = ½·a1·t2²:

a1 = 2s / t2²

From the mass m' of the response object 2 and the acceleration a1, the response force F exerted when the response object 2 strikes the ball is computed as

F = m'·a1

where the mass m' of the response object 2 is recorded in advance in the memory unit of the interactive device 3. After the display object 4 is hit by the response object 2, its rebound acceleration a2 results from the response force F acting on the mass m0 of the display object 4:

a2 = F / m0

In addition, from the initial velocity v0 and acceleration a0 of the display object 4 and the time t1 it travels along the first motion trajectory, the speed of the display object 4 when it reaches the intersection point I is

v1 = v0 + a0·t1

and the rebound speed v2 at which the display object 4 moves along the reaction motion trajectory 66 follows from the impulse relation

F·t3 = m0·(v2 − v1)

where t3 is the length of time during which the response object 2 is in contact with the display object 4. It is worth mentioning that in the real world a bat that strikes a ball stays in contact with it for only about 0.6 to 0.7 milliseconds, so t3 can be preset to 0.6 or 0.7 milliseconds. Given the time t4 during which the display object 4 flies through the air along the reaction motion trajectory, the distance the ball moves along the reaction motion trajectory is

s1 = v2·t4

As a supplementary note, the change of speed of the struck object described above is expressed by the impulse relation of physics, F·Δt = m0·Δv, where Δv = v2 − v1 is the amount by which the speed of the display object 4 changes.
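The chain of formulas above can be collected into one sketch. It only mirrors the relations stated in the specification; the numeric inputs used in the test (bat and ball masses, times) are placeholders of my own, not values from the patent.

```python
import math

def rebound_speed(p_start, p_hit, t2, m_bat, m_ball, v0, a0, t1, t3=0.0006):
    """Chain of relations from the specification:
    s  : swing distance from starting point P to intersection point I
    a1 : from s = 1/2 * a1 * t2**2 (swing assumed to start from rest)
    F  : response force, F = m_bat * a1
    v1 : ball speed at the intersection, v1 = v0 + a0 * t1
    v2 : rebound speed from the impulse relation F * t3 = m_ball * (v2 - v1)
    """
    s = math.dist(p_start, p_hit)          # Euclidean distance P -> I
    a1 = 2 * s / t2 ** 2                   # swing acceleration
    F = m_bat * a1                         # response force
    v1 = v0 + a0 * t1                      # speed at impact
    return v1 + F * t3 / m_ball            # rebound speed v2

def flight_distance(v2, t4):
    """Distance along the reaction motion trajectory 66: s1 = v2 * t4."""
    return v2 * t4
```

The default contact time t3 of 0.6 ms is the preset value the specification suggests for a real-world bat-ball contact.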
To calculate the length of time t4 during which the display object 4 flies along the reaction motion trajectory, in other words the time the display object 4 stays in the air before falling to the ground, the incident angle formed when the display object 4 meets the response object 2, measured with respect to a contact plane formed by the response object 2, must first be computed.

Please refer to FIG. 2 and FIG. 8A; FIG. 8A is a schematic diagram of the incident angle and the reflection angle of the display object. The contact plane 70 is a virtual plane formed by the response object 2 as it swings and touches the display object 4. The trajectory recognition unit 348 analyzes the contact plane 70 of the response object 2 from the multiple images of the response object 2 continuously captured by the image capture module 30, and the intersection point calculation unit 344 determines the incident angle, relative to the tangential direction of the contact plane 70, at which the display object 4 enters the contact plane 70 along the first motion trajectory 62. After being hit, the display object 4 rebounds, relative to the tangential direction of the contact plane 70, at a reflection angle equal in size to the incident angle. The trajectory calculation unit 340 can therefore calculate, from the reflection angle and the formula for gravitational acceleration, the flight time t4 of the display object 4 as it moves along the reaction motion trajectory 66. The larger the incident angle of the ball, the larger its reflection angle, as shown by the incident angle in FIG. 8B.
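Geometrically, an equal reflection angle about the contact plane 70 is the standard mirror reflection of the incoming velocity about the plane's normal. The vector form below is my paraphrase of that geometric rule, not code from the specification:

```python
def reflect(velocity, normal):
    """Reflect an incoming velocity about the contact plane's normal:
    v' = v - 2 * (v . n) * n, with n the unit normal. This makes the
    reflection angle equal the incidence angle measured from the plane.
    """
    n_len = sum(c * c for c in normal) ** 0.5
    n = tuple(c / n_len for c in normal)            # normalize the normal
    d = sum(v * c for v, c in zip(velocity, n))     # dot product v . n
    return tuple(v - 2 * d * c for v, c in zip(velocity, n))
```

With the rebound velocity vector in hand, the flight time t4 would then follow from its vertical component and the gravitational acceleration, as the text describes.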
Referring again to FIG. 2, with the calculation results of the variables described above, the trajectory calculation unit 340 obtains the rebound speed and direction of the display object 4 after it is struck by the response object 2, and produces the reaction motion trajectory 66 for the drawing unit 36. The drawing unit 36 then draws, from the content of the reaction motion trajectory, the images of the display object 4 and outputs them to the display device 1, which presents the display object rebounding and flying away. The trajectory calculation unit 340 may further compute the reaction motion trajectory in real time as the flight path changes, and the drawing unit 36 may draw various background scenes corresponding to different reaction motion trajectories, so that the display device 1 presents a sense of presence, as if the user were swinging on a real field. Because the positioning position of the user may move, the position recognition unit 35 continuously receives the user images captured by the image capture module 30 and computes the coordinates of the user. When the intersection point calculation unit 344 computes the intersection point of the first motion trajectory 62 and the second motion trajectory 64, it can read from the memory unit (not shown in FIG. 2) the coordinate correspondence pre-estimated by the coordinate correspondence unit 324 for the position where the user is located, and thus perform the correct computation.

In another embodiment, the response object 2 may include a feedback unit 20, for example a vibration generating unit, a sound effect unit or a light emitting unit. Taking the vibration generating unit as an example, after the intersection point calculation unit 344 computes the response force F of the response object 2, the signal transmission unit 38 can transmit, by wire or wirelessly, a feedback signal converted from the response force F to the vibration generating unit, which then produces a vibration corresponding to the response force F according to the feedback signal, letting the user feel a more realistic striking experience. Likewise, when the feedback unit 20 is a sound effect unit or a light emitting unit, it produces sound or light relative to the response force F: the stronger the response force F, the louder the sound emitted by the sound effect unit, and the weaker the force, the softer the sound; similarly, the stronger the response force F, the brighter the light emitted by the light emitting unit, and the weaker the force, the dimmer the light.

In the drawings of the present device, the response object is exemplified as a bat. The user may also input control commands to the interactive device through a remote control, for example to switch the item shown as the display object, each item corresponding to a different ball-striking exercise. Different exercises differ greatly; the motion paths of a baseball and of a tennis ball, for instance, are quite distinct. The calibration module 32 and the processing module 34 therefore generate the first motion trajectory and output the images of the display object according to the game designated for the interactive device 3.

Please refer to FIG. 9, which is a block diagram of an interactive device 3a according to another embodiment of the present disclosure. The interactive device 3a shown in FIG. 9 differs from the interactive device 3 described above in that the drawing unit 36 of this embodiment is coupled to the signal transmission unit 38, through which the image signals of the display object 4, drawn by the drawing unit 36 according to the trajectory generation unit 342, the trajectory calculation unit 340 or the object coordinate output unit 320, are transmitted for output. In addition, the coordinate correspondences computed by the coordinate correspondence unit 324 for the user's positions may be recorded in the memory unit 37, for the intersection point calculation unit 344 to use when judging whether the display object and the response object meet. Moreover, besides a striking implement such as a bat or a racket swung at a virtual baseball or tennis ball, the response object may also be a limb of the user's own body.
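The proportional feedback just described (a stronger response force F yields stronger vibration, louder sound, brighter light) amounts to a simple monotone mapping. The 0-to-1 drive range, the clamping, and the reference force below are assumptions for illustration only, not values from the specification:

```python
def feedback_levels(force, max_force=100.0):
    """Map the response force F to drive levels in [0, 1] for the
    vibration generating unit, sound effect unit and light emitting unit
    of the feedback unit 20. All three scale with F, as described above.
    """
    # Clamp so that negative or out-of-range forces stay in [0, 1].
    level = max(0.0, min(force / max_force, 1.0))
    return {"vibration": level, "volume": level, "brightness": level}
```

In a device, each level would be converted to the corresponding actuator's drive signal by the feedback unit after receiving the feedback signal from the signal transmission unit 38.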
The object coordinate output unit 322 of the calibration module 32 or the trajectory recognition unit 348 of the processing module 34 can recognize, through image recognition techniques, the user's limb in the captured images, in particular the palm at the end of an upper limb, and its contact with the display object 4.

[Still another embodiment of the interactive device]

Please refer to FIG. 10. In another embodiment, the interactive device may further include a connection unit 39 connected to the processing module 34, through which, via a connection line 390 or wireless transmission, different users, each with an interactive device, can link their devices together for two-way interactive play against each other. The trajectory generation unit 342 converts the reaction motion trajectory of the other party's display object into the first motion trajectory of its own display object, so that two-way interaction over a connection is achieved.

[Possible effects of the embodiments]

As shown in the contents and operation of the embodiments above, the interactive device captures images of the user as well as images of the response object while it responds, and uses various kinds of image analysis and computation to determine the movement of the response object, to learn whether the response object meets the display object, and to display the corresponding activity, so that the user can swing at the stereoscopic image with the response object. Scenes of interactive activities such as batting in a sports venue are presented through the virtual flight path of the display object, so no large venue needs to be built and no real balls are consumed, which indirectly helps protect the environment. In addition, there is no concern that a struck object will damage surrounding articles or furnishings, so the interactive device is suitable for indoor spaces such as a living room or a study. The interactive device can implement each of its analysis and computation elements with application-specific integrated circuits (ASICs), or achieve the functions for which each unit is responsible with software programs, so its construction cost is low and updates are easy to carry out.

The above are merely embodiments of the present disclosure and are not intended to limit it.

[Brief description of the drawings]

FIG. 1 is a schematic diagram of an interactive device provided by an embodiment of the present disclosure;
FIG. 2 is a block diagram of an interactive device provided by an embodiment of the present disclosure;
FIG. 3 is a schematic diagram of the __ object provided by an embodiment of the present disclosure;
FIG. 4 is a schematic diagram of the response coordinates of the response object provided by an embodiment of the present disclosure;
FIG. 5 is a schematic diagram of the response object meeting the display object provided by an embodiment of the present disclosure;
FIGS. 6A and 6B are schematic diagrams of the corresponding positions of the response object and the display object provided by an embodiment of the present disclosure;
FIG. 7 is a schematic diagram of the trajectories of the response object and the display object provided by an embodiment of the present disclosure;
FIGS. 8A and 8B are schematic diagrams of the incident angle and reflection angle of the struck object provided by an embodiment of the present disclosure;
FIG. 9 is a block diagram of an interactive device provided by an embodiment of the present disclosure; and
FIG. 10 is a block diagram of an interactive device provided by still another embodiment of the present disclosure.
[Description of main reference numerals]
1, 1a-1c display device
10 display plane
2 response object
20 feedback unit
22, 22' contact point
3, 3a-3b interactive device
30 image capture module
300 capture plane
32 calibration module
320 object coordinate output unit
322 response coordinate calculation unit
324 coordinate correspondence unit
34 processing module
340 trajectory calculation unit
342 trajectory generation unit
344 intersection point calculation unit
346 notification unit
348 trajectory recognition unit
35 position recognition unit
36 drawing unit
37 memory unit
38 signal transmission unit
39 connection unit
390 connection line
39 wireless transmission unit
4, 4a-4c display object
5, 5a-5c user
62 first motion trajectory
64 second motion trajectory
66 reaction motion trajectory
70 contact plane

Claims (1)

1. An interactive device for a user to play a game through a display device, comprising:
an image capture module, capturing a plurality of images of a response object; and
a processing module, including a trajectory generation unit, a trajectory recognition unit, an intersection point calculation unit and a trajectory calculation unit, wherein:
the trajectory generation unit is coupled to the display device and generates a first motion trajectory of a display object;
the trajectory recognition unit is coupled to the image capture module and receives the images of the response object to recognize a second motion trajectory of the response object;
the intersection point calculation unit is coupled to the trajectory generation unit and the trajectory recognition unit, and calculates an intersection coordinate at which the display object moving along the first motion trajectory and the response object moving along the second motion trajectory meet; and
the trajectory calculation unit is coupled to the intersection point calculation unit, and calculates, according to the first motion trajectory, the second motion trajectory and the intersection coordinate, a reaction motion trajectory of the display object after the display object and the response object meet;
a drawing unit, coupled to the trajectory generation unit, the trajectory calculation unit and the display device, drawing a stereoscopic image of the display object according to the first motion trajectory and a stereoscopic image of the display object corresponding to the reaction motion trajectory, and respectively outputting the drawn stereoscopic images to the display device; and
a notification unit, coupled to the intersection point calculation unit and the display device, generating a miss message and outputting it to the display device when the display object and the response object, moving respectively along the first motion trajectory and the second motion trajectory, do not meet.

2. The interactive device as claimed in claim 1, wherein the trajectory calculation unit further calculates, according to the moving distance and moving time of the response object and the mass of the response object, a response force exerted when the response object contacts the display object.

3. The interactive device as claimed in claim 2, wherein the intersection point calculation unit calculates the moving distance according to the displacement of the response object in the images captured by the image capture module, and calculates the response force according to the mass and acceleration of the response object.

4. The interactive device as claimed in claim 2, wherein the reaction motion trajectory includes a reaction distance and a reaction speed of the display object, and wherein the trajectory calculation unit calculates the reaction speed according to the response force, the mass of the display object and the speed of the display object moving along the first motion trajectory, and calculates the reaction distance according to the reaction speed and a reaction time.

5. The interactive device as claimed in claim 4, wherein the reaction motion trajectory includes a reflection angle of the display object, and the trajectory calculation unit calculates the reflection angle according to a contact plane formed when the response object meets the display object and an incident angle at which the display object reaches the contact plane.

6. The interactive device as claimed in claim 2, further comprising a signal transmission unit, coupled to the intersection point calculation unit, for outputting a feedback signal converted from the response force to the response object.
TW100203504U 2011-02-25 2011-02-25 Care interacting instrument TWM409872U (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
TW100203504U TWM409872U (en) 2011-02-25 2011-02-25 Care interacting instrument

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
TW100203504U TWM409872U (en) 2011-02-25 2011-02-25 Care interacting instrument

Publications (1)

Publication Number Publication Date
TWM409872U true TWM409872U (en) 2011-08-21

Family

ID=45085684

Family Applications (1)

Application Number Title Priority Date Filing Date
TW100203504U TWM409872U (en) 2011-02-25 2011-02-25 Care interacting instrument

Country Status (1)

Country Link
TW (1) TWM409872U (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8982047B2 (en) 2012-09-25 2015-03-17 Au Optronics Corp. Autostereoscopic display system and control method thereof
US9300950B2 (en) 2012-08-20 2016-03-29 Au Optronics Corporation Entertainment displaying system and interactive stereoscopic displaying method of the same


Similar Documents

Publication Publication Date Title
CN106796453B (en) Driving a projector to generate a shared space augmented reality experience
US8282481B2 (en) System and method for cyber training of martial art on network
KR101007947B1 (en) System and method for cyber training of martial art on network
US9599821B2 (en) Virtual reality system allowing immersion in virtual space to consist with actual movement in actual space
JP6555513B2 (en) program
CN101991949B (en) Computer based control method and system of motion of virtual table tennis
WO2015180497A1 (en) Motion collection and feedback method and system based on stereoscopic vision
TW201244776A (en) Virtual golf simulation apparatus and sensing device and method used for the same
US12569721B2 (en) Virtual evaluation tools for augmented reality exercise experiences
JP2014531588A (en) System and method for detecting a user-dependent state of a sports item
WO2019187862A1 (en) Information processing device, information processing method, and recording medium
CN103732299A (en) 3d device and 3d game device using a virtual touch
JP2016047219A (en) Physical movement practice support system
US20070021199A1 (en) Interactive games with prediction method
US20190172271A1 (en) Information processing device, information processing method, and program
CN104258555B (en) Adopt two body-building interactive systems of fisting the goal of RGBD visual sensing
US20070021207A1 (en) Interactive combat game between a real player and a projected image of a computer generated player or a real player with a predictive method
TWI492096B (en) 3d image interactive system and position-bias compensation method of the same
CN108434698B (en) Sports ball game teaching system
TWI423114B (en) Interactive device and operating method thereof
CN111672089B (en) An electronic scoring system and implementation method for multiplayer confrontation projects
TWI835289B (en) Virtual and real interaction method, computing system used for virtual world, and virtual reality system
JP7248353B1 (en) Hitting analysis system and hitting analysis method
TWM409872U (en) Care interacting instrument
CN202142008U (en) interactive installation

Legal Events

Date Code Title Description
MM4K Annulment or lapse of a utility model due to non-payment of fees