TW200811690A - A camera-based human-computer interaction system - Google Patents

A camera-based human-computer interaction system Download PDF

Info

Publication number
TW200811690A
TW200811690A TW95131220A
Authority
TW
Taiwan
Prior art keywords
camera
computer interaction
human
screen
display
Prior art date
Application number
TW95131220A
Other languages
Chinese (zh)
Other versions
TWI407333B (en)
Inventor
Jyh-Horng Chen
Original Assignee
Iotech Co Ltd
Jyh-Horng Chen
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Iotech Co Ltd, Jyh-Horng Chen filed Critical Iotech Co Ltd
Priority to TW95131220A priority Critical patent/TWI407333B/en
Publication of TW200811690A publication Critical patent/TW200811690A/en
Application granted granted Critical
Publication of TWI407333B publication Critical patent/TWI407333B/en

Landscapes

  • Controls And Circuits For Display Device (AREA)
  • Studio Devices (AREA)

Abstract

The present invention develops a human-computer interaction method based on a portable camera device. It adopts real-time digital image processing to calculate the position of the display screen within the captured images. The relative position between the camera and the monitor screen can then be easily determined to support human-computer interaction applications. With this invention, users can easily and intuitively point the portable camera at desired points in the image display region to control the user interface of a set-top box/media center, replacing the function of the conventional mouse in a computer system.

Description

IX. Description of the Invention:

[Technical Field of the Invention]

The present invention relates to a camera-based human-computer interaction method in which a camera captures images of a screen display and the captured images are processed to determine the relative relationship between the camera and the screen display, thereby achieving the purpose of interactive control.

[Prior Art]

In recent years, with the continuous evolution and innovation of technology, a new generation of displays offering large screen sizes, such as plasma and liquid-crystal panels, has replaced the conventional cathode ray tube (CRT). Conventional light-gun controllers, however, rely on the scanning behavior of CRT screens and therefore cannot be used with the new generation of displays, so light-gun style first-person shooting games no longer work on flat panels. Prior approaches instead apply edge (outline) processing to the captured image in order to isolate the screen display, but the result of the outlining step is difficult to work with and degrades the fluency of operation, a drawback that affects the user experience.

In view of this, the present invention provides a new human-computer interaction method: a mobile camera captures the screen display, the positioning markers in the captured image are recognized, and the coordinate value of the point on the displayed picture at which the camera is aimed is computed, thereby achieving human-computer interaction control. The positioning markers are combinations of R, G, B color blocks in specific orderings, which reduces interference from other colored pixels in the displayed image; moreover, even if the camera frame captures only part of the screen display, the coordinate value can still be obtained.

[Summary of the Invention]

With this device, users can interact with a set-top box or multimedia center far more conveniently, greatly improving the ease of operation; in addition, it resolves the predicament that light guns cannot be used with new-generation displays. By placing positioning markers on the display and applying image processing to the captured images, the invention defines the coordinate values of the screen display within the whole captured image and converts them into the aiming coordinates of the camera center on the display, thereby achieving interactive control. The camera positioning device realized by this invention has the following features:

• The positioning markers are presented on the picture of the screen display itself, which raises the application value of the device for system integration.
• No additional sensors outside the camera are required to assist in locating the screen display; tolerance to background complexity and to the ambient light source is high, and the system maintains excellent execution stability and accuracy.
• Near-natural control characteristics: absolute pointing replaces relative displacement, and the movement of the mouse cursor is controlled by the pointing action of the camera.
• Even when the complete screen display is not captured during execution, the aiming coordinate of the camera center within the displayed picture can still be computed, so the user can control the movement of the mouse cursor without restriction.

To achieve the above functions and features, the positioning markers and the image processing techniques adopted by the present invention, together with the corresponding methods and steps, are explained in detail below with reference to the drawings so as to give a full understanding of the invention.

[Embodiments]

To make the features of the present invention and its industrial value clear, the camera cursor device of the invention is described in detail with reference to the attached drawings. The operation of the invention is illustrated in Figure 1: a user (1) controls the computer mouse cursor (4) with a mobile camera device (3). The video camera serves as the image input device; it captures the picture shown on the display (2), and real-time image processing of the captured images computes the position of the screen display within the captured image. From these coordinates the computer mouse cursor is controlled so that its center point moves to the position at which the camera is aimed; as the user moves the camera device (5), the cursor (6) follows accordingly.
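As an illustrative sketch only (the patent names no software libraries; OpenCV for frame capture and pyautogui for cursor control are assumptions introduced here), the overall control loop just described might be organized as follows, with the image correction, marker recognition and coordinate mapping steps described later standing behind the placeholder aim_point function:

```python
# Hypothetical sketch of the overall control loop; not the patent's own code.
# OpenCV (capture) and pyautogui (cursor control) are assumed libraries.
import cv2        # pip install opencv-python
import pyautogui  # pip install pyautogui

def aim_point(frame) -> tuple[float, float]:
    # Placeholder: the correction, marker-recognition and mapping steps
    # sketched later in this description would compute where the camera is
    # aimed on the display.  Here we simply return the frame centre.
    return 0.5, 0.5

def run_demo(camera_index: int = 0) -> None:
    cap = cv2.VideoCapture(camera_index)
    cap.set(cv2.CAP_PROP_FRAME_WIDTH, 320)   # capture size used in the patent
    cap.set(cv2.CAP_PROP_FRAME_HEIGHT, 240)
    screen_w, screen_h = pyautogui.size()
    try:
        while True:
            ok, frame = cap.read()           # grab one frame from the webcam
            if not ok:
                break
            u, v = aim_point(frame)          # normalized display coordinates
            pyautogui.moveTo(u * screen_w, v * screen_h)
    finally:
        cap.release()

if __name__ == "__main__":
    run_demo()
```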

The mobile camera device (7) is shown in Figure 2: it is built around a video camera module whose lens (8) captures the screen, and it provides the user with buttons for left-click (9) and right-click (10) operations.

In the human-computer interaction method of this invention, four differently ordered combinations of R, G, B color blocks are used as positioning markers and are displayed together with the playing video in the picture shown on the display. After the camera captures the displayed picture, these four positioning markers are recognized so as to define the coordinate values of the screen display within the captured image, and from these the position at which the camera is aimed within the displayed picture is derived. The markers are R, G, B color-block combinations placed at the upper-left corner (11), upper-right corner (12), lower-right corner (13) and lower-left corner (14) of the displayed picture, as shown in Figure 3; their distinct orderings greatly reduce interference from similarly colored pixels elsewhere in the displayed image. The method is implemented with digital image processing and is organized into the blocks of the architecture diagram of Figure 4: block A (15), the image capture procedure; block B (16), the image correction procedure; block C (17), the real-time positioning-marker recognition procedure; and block D (18), the mouse cursor control procedure.

A. Image capture procedure
Frames are acquired continuously so that the positioning markers can be recognized in real time by the subsequent image processing stages. Considering development cost and later integration, an ordinary off-the-shelf webcam is used as the image capture device while preserving the fluency of marker tracking. The captured images are 320x240 pixels and are input in 24-bit RGB format.

B. Image correction procedure
The input image format is RGB. The RGB color space is composed of the red, green and blue primaries, in which the luminance and chrominance components are mixed together across the three channels; the color of a pixel is expressed by the proportions of the three primaries, so luminance and chrominance are not independent in this color model. Consequently, when the ambient light source changes, the image captured by the camera changes as well, and because the influence of the light source is distributed across the R, G and B color bands, color-based image processing can become unstable. To reduce or eliminate the effect of ambient-light variation and to improve the system's ability to recognize the colored positioning markers, the captured image is therefore corrected before marker recognition.
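The passage above does not spell out the exact correction formula. Purely as an illustrative assumption, one common way to decouple chromaticity from overall brightness before color-based detection is to normalize each pixel by its channel sum:

```python
# Illustrative only: the patent does not specify its correction formula.
# Normalized rgb removes much of the dependence on overall brightness, so the
# colour-block detection becomes less sensitive to ambient-light changes.
import numpy as np

def normalize_rgb(image: np.ndarray) -> np.ndarray:
    """Convert an HxWx3 uint8 RGB image to per-pixel chromaticity in [0, 1]."""
    img = image.astype(np.float32)
    total = img.sum(axis=2, keepdims=True)
    total[total == 0] = 1.0        # avoid division by zero on pure black pixels
    return img / total             # the three channels now sum to 1 per pixel
```

Any comparable correction, for example converting to a luminance/chrominance color space and discarding the luminance component, would serve the same purpose described for block B.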

C. Real-time positioning-marker recognition procedure
Besides the picture of the screen display, the captured image also contains other background objects, and the colors of those objects are not fixed. When deciding whether detected R, G and B blocks constitute a positioning marker of the displayed picture, the system therefore not only uses the ordering of the RGB color blocks as a criterion but also checks that the blocks are adjacent and satisfy the geometric relationship of a positioning point. The unique ordering of the four positioning markers is then used to tell the corners apart: if the distance |RB| is smaller than |GB|, the marker is a corner of the left half of the picture, whereas |RB| greater than |GB| indicates a corner of the right half; the sign of the cross product Z of the vectors RG and RB distinguishes the upper half from the lower half, a value of Z greater than zero being judged a lower-half corner marker and a value less than zero an upper-half corner marker.

These two conditions, the difference between the distances |RB| and |GB| and the sign of the cross product Z, define the relative position of a positioning marker within the displayed picture. Taking the upper-left marker of Figure 3 as an example, |BR| is smaller than |BG|, so the RGB marker is judged to be a corner marker of the left half, and the cross product Z is less than zero, which indicates the upper half; the marker is therefore identified as the positioning point at the upper-left corner of the displayed picture. Finally, when some markers are not visible, the positions of the missing corners are estimated from the proportional dimensions of the screen display obtained from previously successful captures, so during execution the system keeps tracking the coordinate values of the screen display within the captured image even while only part of the display is in view.
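The corner test can be restated compactly as below. The distance test and the sign conventions follow the text (|RB| smaller than |GB| indicates a left-half corner, Z less than zero an upper-half corner); the choice of RG and RB as the two vectors entering the cross product is an assumption made for this sketch.

```python
# Hypothetical reconstruction of the corner-classification test.
# r, g, b are the (x, y) centroids of the red, green and blue blocks of one
# detected marker, given in image coordinates.
import numpy as np

def classify_corner(r: np.ndarray, g: np.ndarray, b: np.ndarray) -> str:
    side = "left" if np.linalg.norm(b - r) < np.linalg.norm(b - g) else "right"  # |RB| vs |GB|
    rg, rb = g - r, b - r
    z = rg[0] * rb[1] - rg[1] * rb[0]          # 2-D cross product (assumed RG x RB)
    half = "upper" if z < 0 else "lower"
    return f"{half}-{side}"
```

With four marker layouts whose R, G, B orderings differ as in Figure 3, the four possible outputs identify the corners uniquely.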

D. Mouse cursor control procedure
After the relevant information of the captured image (20) has been obtained (Figure 5), the center point (22) of the captured image is defined with respect to the image coordinate origin (21); this center point is the aiming point of the camera. It can therefore be converted into the aiming coordinate of the camera within the displayed picture, which determines the position of the computer mouse cursor (a minimal sketch of one such mapping is given after the element list below). Moreover, even if only one or two positioning-point markers are captured during execution, the system can still estimate the coordinate of the camera's aiming point within the displayed picture, so the user can move the mouse cursor without restriction.

In summary, the human-computer interaction method of this invention displays four differently ordered R, G, B positioning-marker combinations together with the playing video in the picture shown on the display; after the camera captures the displayed picture, the four markers are recognized to determine the position in the displayed picture at which the camera center is aimed. The system requires no additional hardware attached to the frame of the screen display. Presenting the positioning markers in other forms within the displayed picture still falls within the scope of the appended claims, and implementing the method steps adopted by the present invention in hardware likewise does not depart from the scope of the claims.

[Brief Description of the Drawings]
Figure 1 is a schematic diagram of the operation of an embodiment of the present invention.
Figure 2 shows the mobile camera device of an embodiment of the present invention.
Figure 3 shows the positioning markers of an embodiment of the present invention.
Figure 4 is the method architecture diagram of an embodiment of the present invention.
Figure 5 is a schematic diagram of the mouse cursor control mechanism of an embodiment of the present invention.

[Description of the Main Element Symbols]
1. user;
2. display;
3. mobile camera device;
4. computer mouse cursor;
5. mobile camera device;
6. computer mouse cursor;
7. mobile camera device;
8. lens of the camera module;
9. left mouse click button;
10. right mouse click button;
11. positioning point at the upper-left corner;
12. positioning point at the upper-right corner;
13. positioning point at the lower-right corner;
14. positioning point at the lower-left corner;
15. block diagram A;
16. block diagram B;
17. block diagram C;
18. block diagram D;
19. screen display;
20. captured image;
21. coordinate origin of the captured image;
22. coordinate value of the center point of the captured image
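For block D the patent does not name a particular mapping; a perspective (homography) transform from the four detected corner markers to the display corners is one standard choice and is sketched below with OpenCV. The further case of estimating the aiming point from only one or two visible markers and the previously observed screen proportions is not covered by this sketch.

```python
# Illustrative sketch of block D: map the centre of the captured image (the
# camera aiming point) into normalized display coordinates using the four
# detected corner markers.  The mapping choice is an assumption, not the
# patent's stated method.
import cv2
import numpy as np

def aim_point_on_display(corners_img: np.ndarray,
                         frame_size: tuple[int, int] = (320, 240)) -> tuple[float, float]:
    """corners_img: 4x2 array of marker centroids ordered UL(11), UR(12), LR(13), LL(14).
    Returns the aiming point as normalized display coordinates in [0, 1]."""
    w, h = frame_size
    display = np.float32([[0, 0], [1, 0], [1, 1], [0, 1]])   # UL, UR, LR, LL of the display
    H = cv2.getPerspectiveTransform(np.float32(corners_img), display)
    centre = np.float32([[[w / 2.0, h / 2.0]]])              # image centre, element (22)
    u, v = cv2.perspectiveTransform(centre, H)[0, 0]
    return float(u), float(v)
```

Multiplying the returned (u, v) by the display resolution gives the pixel position at which to place the mouse cursor.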

Claims (1)

X. Scope of the Patent Application:

1. A camera-based human-computer interaction method, in which a camera captures the screen display and the captured image is processed to obtain the corresponding coordinate value, thereby achieving the purpose of human-computer interaction.

2. The camera-based human-computer interaction method as described in claim 1, wherein the display may be a picture produced by an external projector.

3. The camera-based human-computer interaction method as described in claim 1, wherein image processing is used to determine the coordinates within the displayed picture, the steps comprising an image capture procedure, an image correction procedure and a real-time positioning-marker recognition procedure.

4. The camera-based human-computer interaction method as described in claim …, whereby the accuracy and stability of recognizing the positioning markers are improved.

5. The camera-based human-computer interaction method as described in claim …, wherein … .

6. The camera-based human-computer interaction method as described in claim 3, wherein the image processing technique may be executed on a computer of any operating platform, or realized in a chip and attached to further devices as an integrated application.

7. The camera-based human-computer interaction method as described in claim …, wherein it is applied to a set-top box or media center, providing the user with control over the content menus shown in the displayed picture.

8. The camera-based human-computer interaction method as described in claim 6, wherein the cursor is applied as a computer mouse input device, providing control of the computer mouse cursor.

9. The camera-based human-computer interaction method as described in claim 6, wherein it is applied as a light gun for game machines, allowing the user to play shooting games.

10. The camera-based human-computer interaction method as described in claim …, wherein the positioning markers that define the coordinate values of the four corners of the screen display, besides being shown in the displayed picture, may also be physically affixed to the screen display.

11. The camera-based human-computer interaction method as described in claim …, wherein, when the complete screen display is not captured, the aiming coordinate of the camera center within the displayed picture of the screen display can still be calculated, thereby achieving the purpose of human-computer interaction control.
TW95131220A 2006-08-25 2006-08-25 A camera-based human-computer interaction system and device TWI407333B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
TW95131220A TWI407333B (en) 2006-08-25 2006-08-25 A camera-based human-computer interaction system and device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
TW95131220A TWI407333B (en) 2006-08-25 2006-08-25 A camera-based human-computer interaction system and device

Publications (2)

Publication Number Publication Date
TW200811690A true TW200811690A (en) 2008-03-01
TWI407333B TWI407333B (en) 2013-09-01

Family

ID=44767801

Family Applications (1)

Application Number Title Priority Date Filing Date
TW95131220A TWI407333B (en) 2006-08-25 2006-08-25 A camera-based human-computer interaction system and device

Country Status (1)

Country Link
TW (1) TWI407333B (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TWI484376B (en) * 2012-08-09 2015-05-11 Pixart Imaging Inc Interacting system and remote controller

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6791467B1 (en) * 2000-03-23 2004-09-14 Flextronics Semiconductor, Inc. Adaptive remote controller
TW588258B (en) * 2002-06-18 2004-05-21 Zeroplus Technology Co Ltd Device for pointer positioning using photography method
TW200600167A (en) * 2004-06-30 2006-01-01 Zeroplus Technology Co Ltd Method for photographic indexing based on specific frame

Also Published As

Publication number Publication date
TWI407333B (en) 2013-09-01

Similar Documents

Publication Publication Date Title
US8818027B2 (en) Computing device interface
JP6153564B2 (en) Pointing device with camera and mark output
WO2013121455A1 (en) Projector, graphical input/display device, portable terminal and program
JP6723512B2 (en) Image processing apparatus, image processing method and program
CN110096209A (en) Handwriting trace display methods and device
WO2015029554A1 (en) Information processing device, information processing method, and program
CN103197778A (en) Display device, projector, display system, and method of switching device
TWI354220B (en) Positioning apparatus and related method of orient
JP2011164243A (en) Display processing device and program
US8752967B2 (en) Projection system and image processing method thereof
US8723893B2 (en) Color conversion apparatus, imaging apparatus, storage medium storing color conversion program and storage medium storing imaging program
JP5083697B2 (en) Image display device, input device, and image display method
WO2018032674A1 (en) Color gamut mapping method and device
CN106210701A (en) A kind of mobile terminal for shooting VR image and VR image capturing apparatus thereof
CN103543829A (en) Intelligent 3D (three-dimensional) visual presenter capable of realizing projection touch
JP2012215660A (en) Image display device, image processing method and image processing program
TW200811690A (en) A camera-based human-computer interaction system
KR100660137B1 (en) Input apparatus using a raser pointer and system for offering presentation using the apparatus
JP2015184988A (en) display device, projector, and display control method
EP1542471A4 (en) Image processing device, method, information processing device, method, recording medium, and program
US20090103811A1 (en) Document camera and its method to make an element distinguished from others on a projected image
JP5114140B2 (en) Image display system and method
JP2001350585A (en) Image display device with coordinate input function
JPH06103353A (en) Marker tracking/drawing device
US20120079435A1 (en) Interactive presentaion control system

Legal Events

Date Code Title Description
MM4A Annulment or lapse of patent due to non-payment of fees