TW201112105A - Method and system of dynamic operation of interactive objects - Google Patents

Method and system of dynamic operation of interactive objects

Info

Publication number
TW201112105A
Authority
TW
Taiwan
Prior art keywords
unit
interactive
user interface
image
information
Prior art date
Application number
TW98131430A
Other languages
Chinese (zh)
Inventor
Chueh-Pin Ko
Original Assignee
Acer Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Acer Inc filed Critical Acer Inc
Priority to TW98131430A priority Critical patent/TW201112105A/en
Publication of TW201112105A publication Critical patent/TW201112105A/en

Landscapes

  • User Interface Of Digital Computer (AREA)

Abstract

The present invention relates to a method and system of dynamic operation of interactive objects, suitable for an electronic system having a user interface module and an interaction module. The method includes: receiving a start signal output from a start unit of the electronic system; activating the interactive objects in the interaction module according to the start signal; and receiving detection information output from a detection unit of the electronic system, so that the interaction module generates a response command for operating the interactive objects based on the detection information and accordingly operates the interactive objects of the user interface currently output from the user interface module.

Description

[Technical Field]

The present invention relates to a method and system for dynamically operating interactive objects, and more particularly to a method and system that use a detection unit to operate the interactive objects of a user interface.

[Prior Art]

At present, the user interfaces of existing operating systems such as Mac OS X, Linux and Windows are all presented as two-dimensional images, mainly because those user interfaces are designed for conventional input devices such as a mouse or a keyboard.

However, today's application software offers ever more functions, and each file or object may correspond to several different applications or functions. The user interfaces designed for such software therefore grow with the number of functions and applications, and their function menus contain more and more levels, so that a user must make many inputs before a single operation can be completed.

Taking the Microsoft Windows operating system as an example, suppose the user wants to launch the Paint application and the input device is a touch screen. The user must at least tap the Start object on the screen, then an item in the Start menu, then the Accessories item, and finally Paint itself. Several touches are needed to complete one operation, which is quite inconvenient for the user.

[Summary of the Invention]

It is therefore an object of the present invention to provide a method and system of dynamically operating interactive objects, so as to solve the problem that conventional user interfaces are inconvenient to operate.

According to one aspect of the present invention, a method of dynamically operating interactive objects is provided, which is suitable for an electronic system having a user interface module and an interaction module. The method includes: receiving a start signal output by a start unit of the electronic system; activating, according to the start signal, an interactive object in the interaction module that corresponds to the user interface currently presented by the user interface module; receiving detection information output by a detection unit of the electronic system; obtaining a corresponding response command in the interaction module according to the detection information; and operating, by the electronic system, the interactive object of the currently presented user interface according to the response command.

In one embodiment, the detection unit includes an image capture section and an image computation section, and the step of generating the detection information at least includes: receiving a plurality of image data output by the image capture section; computing at least one local feature from each image datum to obtain variation information; and generating the corresponding detection information according to the variation information. The variation information is the relative motion between the user and the image capture section.

In another embodiment, the detection unit is an acceleration sensor (G-sensor), and the step of generating the detection information at least includes: receiving acceleration data of at least one axis output by the acceleration sensor; calculating displacement information from the acceleration data; and generating the corresponding detection information according to the displacement information.

The step of operating the user interface according to the response command may further include changing the angle and shape at which the user interface is presented according to an autostereoscopic technique, thereby changing the user interface into a three-dimensional object.

Furthermore, after the electronic system activates the interaction module according to the start signal, or after the electronic system operates the interactive object of the user interface according to the response command, the method may further include: receiving an operation signal output by a control unit of the electronic system; obtaining a corresponding operation command in the interaction module according to the operation signal; and operating, by the electronic system, the interactive object according to the operation command.
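To make the summarized flow concrete, the following Python sketch models one pass of the claimed method (start signal, activate interactive object, detection information, response command, operate the user interface). It is an illustrative sketch only; the class and function names, the command table and the string results are assumptions, not structures disclosed by the patent.

```python
# Illustrative sketch only -- names and structure are assumptions, not the
# patent's reference implementation.

class InteractionModule:
    """Maps detection information to response commands for the active UI."""

    def __init__(self, command_map):
        self.command_map = command_map          # e.g. {"tilt_left": "pan_left"}
        self.active_object = None

    def activate(self, ui_name):
        self.active_object = f"interactive object for {ui_name}"

    def response_command(self, detection_info):
        return self.command_map.get(detection_info)


def run_once(start_signal, detection_info, interaction, ui_name="photo browser"):
    """One pass of the claimed flow: S10 -> S20 -> S30/S40 -> S50, else S60."""
    if not start_signal:                        # S10 fails -> S60: keep current UI
        return "maintain current user interface"
    interaction.activate(ui_name)               # S20: activate interactive object
    command = interaction.response_command(detection_info)   # S30 / S40
    if command is None:
        return "no matching response command"
    return f"operate {interaction.active_object}: {command}"  # S50


if __name__ == "__main__":
    module = InteractionModule({"tilt_left": "slide function menu in"})
    print(run_once(True, "tilt_left", module))
    print(run_once(False, "tilt_left", module))
```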

According to another aspect of the present invention, a system of dynamically operating interactive objects is provided. The system includes a user interface module, an interaction module, a display unit, a start unit and a detection unit. The user interface module has a plurality of user interfaces, and the interaction module has a plurality of interactive objects and a plurality of response commands. The display unit displays at least one user interface and the interactive objects. The start unit outputs a start signal, and the detection unit outputs at least one piece of detection information. The interaction module is activated upon receiving the start signal, outputs a corresponding response command according to the detection information, and operates the interactive objects of the user interface according to the response command.

The method and system of dynamically operating interactive objects according to the present invention may provide at least the following two advantages: (1) because the interactive objects are operated through the relative displacement of the electronic system, or through the relative motion between the user and the electronic system, the number of mouse, keyboard or touch-screen operations can be reduced; and (2) operating the interactive objects through the motion of the electronic system, combined with mouse, keyboard or touch-screen operations, gives the electronic system a variety of operation modes.

[Embodiments]

Please refer to FIGS. 1 and 2, which are respectively a schematic diagram of the system architecture of the present invention and a flowchart of the method of dynamically operating interactive objects of the present invention. The method is applied to an electronic system 1 that has a display unit 10, a user interface module 12, an interaction module 14, a start unit 16 and a detection unit 18. The display unit 10 outputs a user interface 120 of the user interface module 12 (for example a graphical user interface, GUI) together with a human-machine interaction interface. The method includes:

(S10) determining whether a start signal 160 output by the start unit 16 is continuously received; if so, proceeding to step (S20), otherwise proceeding to step (S60);
(S20) activating, according to the start signal 160, an interactive object 140 in the interaction module 14 that corresponds to the current user interface 120;
(S30) receiving detection information 180 output by the detection unit 18;
(S40) obtaining a corresponding response command 142 in the interaction module 14 according to the detection information 180;
(S50) operating, by the electronic system 1, the interactive object 140 of the user interface 120 according to the response command 142, and then returning to step (S10); and
(S60) maintaining the current user interface 120.

As described above, before the interaction module 14 is closed, the detection unit 18 of the electronic system 1 keeps obtaining detection information 180 and response commands 142, so that the electronic system 1 can present the interactive object 140 on the user interface 120 and operate the interactive object 140 of the user interface 120. In particular, different detection information 180 produces different response commands 142, so that different interactive objects 140 are generated for the user interface 120 and operated accordingly. In this way, repeated tapping on the user interface 120 is reduced, which simplifies the operation steps of the electronic system 1.

Please refer to FIG. 3, which is a schematic diagram of the use state of the first embodiment of the present invention. In this embodiment, the electronic system 1 is a computer system, and the display unit 10 may be a cathode ray tube (CRT) display, a liquid crystal display (LCD) or a touch display. The detection unit 18 includes an image capture section 182 and an image computation section 184. The image capture section 182 may be a web cam, a smart camera or any device capable of capturing images; it is disposed on the display unit 10 facing toward a user, or may be disposed on the user facing toward the display unit 10, and the image computation section 184 is connected to the image capture section 182. Please refer to FIG. 4, which is a flowchart of how the image capture section and the image computation section generate the detection information in this embodiment, including:

(S301) the image capture section 182 captures a plurality of image data 1820;
(S302) the image computation section 184 receives each image datum 1820;
(S303) the image computation section 184 computes at least one local feature of each image datum 1820 to generate variation information 1840; and
(S304) the image computation section 184 generates the corresponding detection information 180 according to the variation information 1840.

The variation information 1840 is the relative motion between the user and the image capture section 182, such as the user moving toward or away from the image capture section 182, moving toward the left or right side of the display unit, nodding, shaking or turning the head, or making various gestures, together with the speed and angle of each of these relative motions. From this, detection information 180 such as approaching, moving away, moving left or right, nodding, shaking or turning the head is generated. Thus, whether it is the user's motion that changes or the display unit 10 that is moved, corresponding detection information 180 can be generated.
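As a rough illustration of steps S301 to S304 above, the sketch below derives a piece of variation information from two grayscale frames by looking at where and how much the image changed, and then maps it to a detection label. The frame-differencing approach, the thresholds and the label names are assumptions chosen for the example; the patent does not specify a particular image algorithm.

```python
# Illustrative sketch (assumed thresholds and labels); not the patent's algorithm.
import numpy as np

def variation_info(prev_frame: np.ndarray, curr_frame: np.ndarray, thresh=25):
    """Compare two grayscale frames and classify the dominant change (S303)."""
    diff = np.abs(curr_frame.astype(int) - prev_frame.astype(int)) > thresh
    if not diff.any():
        return "no_motion"
    ys, xs = np.nonzero(diff)
    dx = xs.mean() - prev_frame.shape[1] / 2          # horizontal offset of the change
    area_ratio = diff.mean()                          # fraction of changed pixels
    if area_ratio > 0.5:
        return "approach"        # large changed region -> user moved toward the camera
    if dx < -prev_frame.shape[1] * 0.1:
        return "move_left"
    if dx > prev_frame.shape[1] * 0.1:
        return "move_right"
    return "small_motion"

def detection_info(variation: str) -> str:
    """S304: map variation information to detection information."""
    table = {"approach": "user_approaching", "move_left": "user_moving_left",
             "move_right": "user_moving_right"}
    return table.get(variation, "none")

if __name__ == "__main__":
    prev = np.zeros((120, 160), dtype=np.uint8)
    curr = prev.copy()
    curr[:, :40] = 255                                # change concentrated on the left
    print(detection_info(variation_info(prev, curr)))  # -> user_moving_left
```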
Please refer to FIG. 5, which is a schematic diagram of the use state of the second embodiment of the present invention. Its architecture is substantially the same as that of the first embodiment; the difference is that the detection unit 18 is an acceleration sensor (G-sensor), which is attached to any part of the user's body (for example the head, a hand or a foot) or is built into the display unit 10. Alternatively, please refer to FIG. 6, a schematic diagram of the use state of the third embodiment of the present invention, in which the electronic system 1 is a handheld device (for example a mobile phone, a navigator or a positioning device) and the detection unit 18 is likewise a G-sensor built into the handheld device. Please refer to FIG. 7, which is a flowchart of how the acceleration sensor generates the detection information in these embodiments, including:

(S301') the acceleration sensor senses acceleration data of at least one axis;
(S302') displacement information is calculated from the acceleration data; and
(S303') the corresponding detection information 180 is generated according to the displacement information.

The displacement information is the movement or rotation of the body part on which the acceleration sensor is mounted, of the display unit 10, or of the electronic system 1, together with the speed and angle of each such displacement. From this, detection information 180 such as approaching, moving away, moving left or right, nodding, shaking or turning the head, or various gestures is generated.

Please refer to FIG. 8, which is a schematic diagram of another use state of the third embodiment. Here the electronic system 1 is a mobile phone. When the mobile phone is rotated, detection information 180 is output immediately, and the interactive object 140 (for example the function menu 20) slides into view over the picture 2 currently presented by the display unit 10, following the direction of rotation. A picture-browsing interface is taken as the example here, but it does not limit the field of application of the present invention.
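Steps S301' to S303' can be pictured with the following sketch, which numerically integrates acceleration samples into a displacement estimate and maps the dominant axis to a detection label. The sampling interval, the thresholds and the axis-to-label mapping are invented for illustration and are not taken from the patent.

```python
# Illustrative sketch (assumed sampling, thresholds, labels); not the patent's algorithm.

def displacement(accel_samples, dt=0.01):
    """S302': integrate acceleration -> velocity -> displacement per axis."""
    disp = [0.0] * 3
    vel = [0.0] * 3
    for sample in accel_samples:                 # sample = (ax, ay, az) in m/s^2
        for i in range(3):
            vel[i] += sample[i] * dt
            disp[i] += vel[i] * dt
    return disp

def detection_info(disp, thresh=0.02):
    """S303': map the dominant displacement axis to a detection label."""
    axis = max(range(3), key=lambda i: abs(disp[i]))
    if abs(disp[axis]) < thresh:
        return "none"
    labels = {0: ("tilt_right", "tilt_left"),
              1: ("tilt_up", "tilt_down"),
              2: ("move_away", "approach")}
    positive, negative = labels[axis]
    return positive if disp[axis] > 0 else negative

if __name__ == "__main__":
    # 0.5 s of a small negative x acceleration, e.g. the phone tipped to the left.
    samples = [(-0.6, 0.0, 0.0)] * 50
    d = displacement(samples)
    print(d, detection_info(d))                  # -> tilt_left
```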
Please refer to FIG. 9, which is a schematic diagram of yet another use state of the third embodiment. When the mobile phone is rotated downward, an interactive object 140 (the message box 22 shown in FIG. 9) slides into the display of the mobile phone following the direction of rotation, and the message box indicates the location at which the picture was taken. Please refer to FIG. 10, which shows the mobile phone of FIG. 9 being rotated further. When the mobile phone continues to be rotated downward, further detection information 180 is output and the content of the interactive object 140 (the message box 22 of FIG. 9) is augmented, for example so that the message box indicates both the location and the time at which the picture was taken.

Please refer to FIG. 11, which is a further schematic diagram of the third embodiment, in which the current user interface of the display unit 10 of the mobile phone is a map navigation view. When the mobile phone is tilted to the left, leftward-tilt detection information 180 is generated and the map view displays an interactive object 140 (such as the position of the marker P (parking lot) shown at the upper left of FIG. 11); when the mobile phone is tilted to the right, rightward-tilt detection information 180 is generated and the map view displays the position of the marker 7 (convenience store) shown at the upper right of FIG. 11; when the mobile phone is tilted upward, upward-tilt detection information 180 is generated and the map view displays an interactive object 140 (such as the position of the gas-station marker shown in FIG. 11); and when the mobile phone is tilted downward, downward-tilt detection information 180 is generated and the map view displays an interactive object 140 (such as the secondary navigation route shown as the hatched portion at the lower right of FIG. 11). Of course, the detection information 180 for tilting or rotating up, down, left and right may instead be used as response commands 142 for moving up, down, left and right, so that the map view pans accordingly.

In summary, how a response command of the present invention operates the user interface depends entirely on how the interaction module defines the detection information and the response commands. Therefore, on the user interfaces corresponding to different application software, each kind of detection information 180 has its own response command 142 for operating that user interface.

Please refer to FIGS. 12 and 13, which are respectively a flowchart of the electronic system of the embodiments operating the user interface according to the response command and a schematic diagram of the three-dimensional object of the embodiments. The flow includes:

(S501) changing the angle and shape at which the user interface 120 is presented according to an autostereoscopic technique, thereby changing the user interface 120 into a three-dimensional object 3;
(S502) presenting the user interface 120 on one of the visible faces of the three-dimensional object 3; and
(S503) forming interactive objects 140 (for example a function menu or a message box) on the other visible faces of the three-dimensional object 3.

Referring again to FIG. 13, the user interfaces 120 are changed into three-dimensional objects 3 of different shapes, and the different faces of a three-dimensional object 3 are shown in different colors, so that the individual user interfaces 120 can be distinguished from one another.
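As noted above, the binding between detection information and response commands is defined per application by the interaction module. One simple way to picture that binding is as per-application lookup tables, as sketched below. The table contents loosely restate the examples of FIGS. 8, 9 and 11; both the contents and the data structure are assumptions for illustration, not definitions from the patent.

```python
# Illustrative sketch: per-application response-command tables (contents assumed).

RESPONSE_COMMANDS = {
    "map_navigation": {
        "tilt_left":  "show parking-lot markers (P)",
        "tilt_right": "show convenience-store markers",
        "tilt_up":    "show gas-station markers",
        "tilt_down":  "show secondary navigation route",
    },
    "photo_browser": {
        "tilt_down":  "slide in message box with shooting location",
        "tilt_left":  "slide in function menu",
    },
}

def response_command(application: str, detection: str) -> str:
    """Return the response command bound to this detection information, if any."""
    return RESPONSE_COMMANDS.get(application, {}).get(detection, "no-op")

if __name__ == "__main__":
    print(response_command("map_navigation", "tilt_left"))   # parking-lot markers
    print(response_command("photo_browser", "tilt_left"))    # function menu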
Please refer to FIGS. 1 and 14, where FIG. 14 is a flowchart of the actions performed after the control unit of the embodiments of the present invention is operated. After the electronic system activates the interaction module 14 according to the start signal 160, or after the electronic system operates the interactive object 140 of the user interface 120 according to the response command 142, the method further includes:

(S70) receiving an operation signal 190 output by a control unit 19 of the electronic system 1;
(S80) obtaining a corresponding operation command 144 in the interaction module 14 according to the operation signal 190; and
(S90) operating, by the electronic system 1, the interactive object 140 according to the operation command 144.

Please refer to FIG. 15, which is a schematic diagram of the use state of the fourth embodiment of the present invention. Here the electronic system 1 is a mobile phone, the detection unit 18 is an acceleration sensor (G-sensor) disposed in the mobile phone, and the display unit 10 is a touch display. When one of the interactive objects 140 in the user interface 120 of the mobile phone (the icon 4 shown in FIG. 15) is selected by the touch of the user's finger, with the finger not yet leaving the object, the mobile phone treats the selection of the icon 4 as an operation signal 190. The mobile phone is then tilted to the left, which produces a response command 142 that gradually brings the next display range of the user interface 120 into view; the selected icon 4 is further dragged into that next display range, which the mobile phone treats as another operation signal 190. The mobile phone is kept tilted to the left until the display range of the display unit 10 shows the next display range, and when the finger leaves the touch display the action of moving the icon 4 to the next display range is complete. Deleting or copying an icon or a file can of course be accomplished in the same way, and is not described further here.

Furthermore, in the embodiments of the present invention, the start unit 16 or the control unit 19 may be an input signal section of a mouse, a keyboard or a touch display; the start signal 160 or the operation signal 190 is produced after the input signal section of the mouse, keyboard or touch display is operated.
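The combined touch-and-tilt interaction of the fourth embodiment can be sketched as a small event handler in which a touch-and-hold supplies the operation signal, tilt events pan the visible display range, and releasing the finger commits the move. The event names and the page model are assumptions made for the example, not an interface defined by the patent.

```python
# Illustrative sketch of the FIG. 15 interaction (event names and page model assumed).

class IconDragSession:
    def __init__(self, icon, page=0, page_count=2):
        self.icon, self.page, self.page_count = icon, page, page_count
        self.holding = False

    def handle(self, event):
        if event == "touch_hold":                 # S70: operation signal from the touch
            self.holding = True
        elif event == "tilt_left" and self.holding:
            # response command: pan toward the next display range
            self.page = min(self.page + 1, self.page_count - 1)
        elif event == "release" and self.holding:  # finger leaves the touch display
            self.holding = False
            return f"'{self.icon}' moved to display range {self.page}"
        return None

if __name__ == "__main__":
    session = IconDragSession("icon 4")
    for ev in ("touch_hold", "tilt_left", "release"):
        result = session.handle(ev)
        if result:
            print(result)                         # -> 'icon 4' moved to display range 1
```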
Referring again to FIG. 1, the system of dynamically operating interactive objects of the present invention includes a display unit 10, a user interface module 12, an interaction module 14, a start unit 16 and a detection unit 18. The user interface module 12 has a plurality of user interfaces 120, and the interaction module 14 has a plurality of interactive objects 140 and a plurality of response commands 142. The display unit 10 displays at least one user interface 120 and the interactive objects 140. The start unit 16 outputs a start signal 160, and the detection unit 18 outputs at least one piece of detection information 180. The interaction module 14 is activated upon receiving the start signal 160, outputs a corresponding response command 142 according to the detection information 180, and operates the user interface 120 with the interactive object 140 according to the response command 142.

Referring again to FIG. 3, the detection unit 18 may include an image capture section 182 and an image computation section 184: the image capture section 182 captures a plurality of image data 1820, and the image computation section 184 receives each image datum 1820, computes variation information 1840 from at least one local feature of each image datum, and generates the corresponding detection information 180 from the variation information 1840. Referring again to FIG. 5, the detection unit 18 may instead be an acceleration sensor that senses acceleration data of at least one axis, from which displacement information is calculated and the detection information 180 is generated. Referring again to FIG. 1, the system may further include a control unit 19 that outputs an operation signal 190; the interaction module 14 further has at least one operation command 144, outputs the corresponding operation command 144 according to the operation signal 190, and operates the interactive object 140 accordingly.

As described above, the electronic system operates the interactive objects with the operation commands and the response commands, which simplifies the operation flow of the user interface and makes the electronic system more convenient for the user to operate. The embodiments described above are illustrative only and are not restrictive; any equivalent modification or variation that does not depart from the spirit and scope of the present invention shall be included in the appended claims.

[Brief Description of the Drawings]

FIG. 1 is a schematic diagram of the system architecture of the present invention;
FIG. 2 is a flowchart of the operation of the present invention;
FIG. 3 is a schematic diagram of the use state of the first embodiment of the present invention;
FIG. 4 is a flowchart of the image capture section and the image computation section of the first embodiment generating the detection information;
FIG. 5 is a schematic diagram of the use state of the second embodiment of the present invention;
FIG. 6 is a schematic diagram of the use state of the third embodiment of the present invention;
FIG. 7 is a flowchart of the acceleration sensor of the third embodiment generating the detection information;
FIG. 8 is a schematic diagram of another use state of the third embodiment of the present invention;
FIG. 9 is a schematic diagram of yet another use state of the third embodiment of the present invention;
FIG. 10 is a schematic diagram of the mobile phone of FIG. 9 being rotated further;
FIG. 11 is a schematic diagram of a further use state of the third embodiment of the present invention;
FIG. 12 is a flowchart of the electronic system of the embodiments of the present invention operating the user interface according to the response command;
FIG. 13 is a schematic diagram of the three-dimensional object of the embodiments of the present invention;
FIG. 14 is a flowchart of the actions performed after the control unit of the embodiments of the present invention is operated; and
FIG. 15 is a schematic diagram of the use state of the fourth embodiment of the present invention.

[Description of Reference Numerals]

1: electronic system; 10: display unit; 12: user interface module; 120: user interface; 14: interaction module; 140: interactive object; 142: response command; 144: operation command; 16: start unit; 160: start signal; 18: detection unit; 180: detection information; 182: image capture section; 1820: image data; 184: image computation section; 1840: variation information; 19: control unit; 190: operation signal; 2: picture; 20: function menu; 22: message box; 3: three-dimensional object; 4: icon; S10-S60: steps; S301-S304: steps; S301'-S303': steps; S501-S503: steps; S70-S90: steps.


Claims (1)

1. A method of dynamically operating interactive objects, applied to an electronic system having a display unit, a user interface module and an interaction module, the method comprising: activating, according to a start signal, an interactive object in the interaction module that corresponds to the user interface; receiving detection information output by a detection unit of the electronic system; obtaining a corresponding response command in the interaction module according to the detection information; and operating, by the electronic system, the user interface with the interactive object according to the response command.

2. The method of claim 1, wherein the detection unit comprises an image capture section and an image computation section, and the step of the image capture section and the image computation section generating the detection information comprises: the image capture section capturing a plurality of image data; the image computation section receiving each of the image data; the image computation section computing at least one local feature from each of the image data to generate variation information; and the image computation section generating the corresponding detection information according to the variation information.

3. The method of claim 2, wherein the variation information is a relative motion between a user and the image capture section.

4. The method of claim 3, wherein the relative motion comprises the user moving toward the image capture section, the user moving away from the image capture section, moving toward the left side of the display unit, moving toward the right side of the display unit, the user nodding, shaking or turning the head, or a gesture.

5. The method of claim 1, wherein the detection unit is an acceleration sensor, and the step of the acceleration sensor generating the detection information comprises: the acceleration sensor sensing acceleration data of at least one axis; calculating displacement information according to the acceleration data; and generating the corresponding detection information according to the displacement information.

6. The method of claim 5, wherein the displacement information is a displacement of movement or rotation of the part of a user on which the acceleration sensor is mounted, together with the speed and angle of each displacement made by the user.

7. The method of claim 5, wherein the displacement information is a displacement of movement or rotation of the display unit, together with the speed and angle of each displacement of the display unit.

8. The method of claim 5, wherein the displacement information is a movement or rotation of the electronic system, together with the speed and angle of each displacement made by the electronic system.

9. The method of claim 1, wherein the step of the electronic system operating the user interface according to the response command comprises: changing the angle and shape at which the user interface is presented according to an autostereoscopic technique, thereby changing the user interface into a three-dimensional object; presenting the user interface on one of the visible faces of the three-dimensional object; and forming the interactive object on the other visible faces of the three-dimensional object.

10. The method of claim 9, wherein the interactive object is one of a function menu and a message box.

11. The method of claim 1, wherein after the electronic system activates the interaction module, the method further comprises: receiving an operation signal output by a control unit of the electronic system; obtaining a corresponding operation command in the interaction module according to the operation signal; and operating, by the electronic system, the interactive object according to the operation command.

12. The method of claim 11, wherein the control unit is an input signal section of a mouse, a keyboard or a touch display, and the operation signal is produced after the input signal section of the mouse, the keyboard or the touch display is operated.

13. The method of claim 1, wherein after the electronic system operates the interactive object of the user interface, the method further comprises: receiving an operation signal output by a control unit of the electronic system; obtaining a corresponding operation command in the interaction module according to the operation signal; and operating, by the electronic system, the interactive object according to the operation command.

14. The method of claim 13, wherein the control unit is an input signal section of a mouse, a keyboard or a touch display, and the operation signal is produced after the input signal section of the mouse, the keyboard or the touch display is operated.

15. The method of claim 1, wherein the start unit is an input signal section of a mouse, a keyboard or a touch display, and the start signal is produced after the input signal section of the mouse, the keyboard or the touch display is operated.

16. The method of claim 1, wherein once the electronic system no longer receives the start signal, the interactive object is restored to the user interface.

17. A system of dynamically operating interactive objects, comprising: a user interface module having a plurality of user interfaces; a start unit providing an output of a start signal; a detection unit providing an output of at least one piece of detection information; an interaction module having a plurality of interactive objects and a plurality of response commands; and a display unit displaying the user interfaces and the interactive objects; wherein the interaction module is activated upon receiving the start signal, outputs a corresponding response command according to the detection information, and operates the user interface with the interactive object according to the response command.

18. The system of claim 17, wherein the detection unit comprises: an image capture section that captures a plurality of image data; and an image computation section that receives each of the image data, computes variation information from at least one local feature of each of the image data, and generates the corresponding detection information according to the variation information.

19. The system of claim 17, wherein the detection unit is an acceleration sensor that senses acceleration data of at least one axis, calculates displacement information therefrom, and generates the detection information accordingly.

20. The system of claim 17, further comprising a control unit that outputs an operation signal, wherein the interaction module further has a plurality of operation commands, outputs a corresponding operation command according to the operation signal, and operates the interactive object accordingly.
TW98131430A 2009-09-17 2009-09-17 Method and system of dynamic operation of interactive objects TW201112105A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
TW98131430A TW201112105A (en) 2009-09-17 2009-09-17 Method and system of dynamic operation of interactive objects

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
TW98131430A TW201112105A (en) 2009-09-17 2009-09-17 Method and system of dynamic operation of interactive objects

Publications (1)

Publication Number Publication Date
TW201112105A true TW201112105A (en) 2011-04-01

Family

ID=44909137

Family Applications (1)

Application Number Title Priority Date Filing Date
TW98131430A TW201112105A (en) 2009-09-17 2009-09-17 Method and system of dynamic operation of interactive objects

Country Status (1)

Country Link
TW (1) TW201112105A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TWI547855B (en) * 2011-12-01 2016-09-01 新力股份有限公司 Information processing device, information processing method and program


Similar Documents

Publication Publication Date Title
US10948950B2 (en) Information processing device, table, display control method, program, portable terminal, and information processing system
US10511778B2 (en) Method and apparatus for push interaction
US9924018B2 (en) Multi display method, storage medium, and electronic device
WO2020253655A1 (en) Method for controlling multiple virtual characters, device, apparatus, and storage medium
KR102051418B1 (en) User interface controlling device and method for selecting object in image and image input device
JP5807686B2 (en) Image processing apparatus, image processing method, and program
US20130342459A1 (en) Fingertip location for gesture input
JP2012514786A (en) User interface for mobile devices
JP5561092B2 (en) INPUT DEVICE, INPUT CONTROL SYSTEM, INFORMATION PROCESSING METHOD, AND PROGRAM
TW200941302A (en) Control device, input device, control system, control method, and hand-held device
JP2012212345A (en) Terminal device, object control method and program
CN103513894A (en) Display apparatus, remote controlling apparatus and control method thereof
JP2011511379A (en) Select background layout
KR20130142824A (en) Remote controller and control method thereof
KR20140060818A (en) Remote controller and display apparatus, control method thereof
EP2843533A2 (en) Method of searching for a page in a three-dimensional manner in a portable device and a portable device for the same
JP6386897B2 (en) Electronic blackboard, information processing program, and information processing method
US9400575B1 (en) Finger detection for element selection
US20130201157A1 (en) User interface device and method of providing user interface
CN109033100A (en) The method and device of content of pages is provided
TW201112105A (en) Method and system of dynamic operation of interactive objects
CN109416599A (en) For handling the device and method of touch input based on the input parameter of adjustment
JP6008904B2 (en) Display control apparatus, display control method, and program
US20150042621A1 (en) Method and apparatus for controlling 3d object
JP2009205609A (en) Pointing device