TW201117089A - Mobile electronic device and method for controlling 3D operation interface thereof - Google Patents
Mobile electronic device and method for controlling 3D operation interface thereof
- Publication number
- TW201117089A
- Authority
- TW
- Taiwan
- Prior art keywords
- angle
- screen
- operation interface
- electronic device
- preset
Landscapes
- User Interface Of Digital Computer (AREA)
- Position Input By Displaying (AREA)
Description
VI. Description of the Invention

[Technical Field]

The present invention relates to an operation method for an electronic device, and more particularly to a method for controlling a three-dimensional (3D) operation interface and to a mobile electronic device.

[Prior Art]

Virtual reality (VR) refers to a virtual world constructed by a computer that simulates a real environment with techniques such as computer graphics and image synthesis. In general, a user operates objects in the virtual reality through equipment such as a head-mounted display and three-dimensional (3D) sensing gloves: the virtual-reality image is shown on the head-mounted display, while the 3D sensing gloves detect the motion of the user's hands so as to change the displayed image accordingly and let the user touch objects in the virtual reality. However, both head-mounted displays and 3D sensing gloves require rather complicated technology and high manufacturing cost, so it is not easy for ordinary users to enjoy the convenience of virtual reality in daily life.

As technology advances, more and more electronic devices provide a 3D interface that gives the user an operating experience similar to virtual reality. For example, a 3D desktop program on a personal computer presents the desktop background and the icons of application shortcuts, files, and folders in stereoscopic form on the screen. However, current 3D desktop programs merely render the desktop elements stereoscopically and are not true 3D virtual-reality designs. Moreover, even if a personal computer could support true 3D virtual reality, a user operating it with a two-dimensional (2D) input device such as a mouse or keyboard would still run into many limitations. In other words, genuine 3D manipulation of a virtual reality still depends on expensive and highly complex virtual-reality equipment.

[Summary of the Invention]

The present invention provides a control method for a three-dimensional (3D) operation interface that adjusts the interface according to the 3D movement produced by a mobile electronic device in 3D space. The present invention also provides a mobile electronic device that lets the user manipulate objects as if handling them in the real world, so that the mobile electronic device can be operated more intuitively.
The proposed control method for a 3D operation interface is adapted to a mobile electronic device having a screen. In this method, the screen first displays a first partial region of the 3D operation interface of the mobile electronic device at a first view angle, where the first view angle corresponds to a current reference position, a current horizontal azimuth, and a current vertical azimuth.
If, while the mobile electronic device produces a 3D movement in 3D space, a selection instruction for a specific object in the first partial region is continuously detected, the screen is controlled according to the 3D movement to display a second partial region of the 3D operation interface at a second view angle, and at the same time the display position of the specific object is changed according to the 3D movement so that the specific object is displayed in the second partial region.

From another point of view, the present invention proposes a mobile electronic device that includes a screen, a selection detection module, a 3D movement detection module, and a processing module. The selection detection module detects selection instructions for specific objects in the 3D operation interface of the mobile electronic device. The 3D movement detection module detects the 3D movement produced by the mobile electronic device in 3D space. The processing module is coupled to the screen, the selection detection module, and the 3D movement detection module, and controls the screen to display a first partial region of the 3D operation interface at a first view angle.
The first view angle corresponds to a current reference position, a current horizontal azimuth, and a current vertical azimuth. If the 3D movement detection module detects a 3D movement of the mobile electronic device while the selection detection module continuously detects the selection instruction for a specific object in the first partial region, the processing module controls the screen according to the 3D movement to display a second partial region of the 3D operation interface at a second view angle and changes the display position of the specific object according to the 3D movement, so that the specific object is displayed in the second partial region.

Based on the above, the present invention imitates the way a user manipulates objects in a real environment: the view angle of the 3D operation interface and the position of a specific object within it are changed according to the 3D movement produced when the user moves the mobile electronic device in 3D space. The user therefore experiences, on the mobile electronic device, the convenience of handling objects as in the real world, and the difficulty of operating the device is greatly reduced. To make the above features and advantages of the present invention more comprehensible, embodiments are described in detail below with reference to the accompanying drawings.

[Description of the Embodiments]

FIG. 1 is a block diagram of a mobile electronic device according to an embodiment of the present invention. Referring to FIG. 1, the mobile electronic device 100 includes a screen 110, a selection detection module 120, a three-dimensional (3D) movement detection module 130, and a processing module 140. In this embodiment the mobile electronic device 100 is, for example, a mobile phone, a personal digital assistant (PDA), a PDA phone, or a smartphone, but the scope is not limited thereto.

The screen 110 may be a resistive or capacitive touch screen and displays the various operation or usage images of the mobile electronic device 100. In this embodiment the mobile electronic device 100 has a 3D operation interface that includes a plurality of preset objects; each preset object has a stereoscopic appearance and represents, for example, an application, a file, or a folder of the mobile electronic device 100. The screen 110 displays the image of the 3D operation interface, so the user operates the mobile electronic device 100, and in particular a specific object, through the 3D operation interface; the specific object is, for example, any one of the preset objects. When the screen 110 is a touch screen, the selection detection module 120 detects the selection instruction that is generated when the specific object is touched by the user with an input tool such as a finger or a stylus.
When the user walks around carrying the mobile electronic device 100, or shakes, rotates, or flicks it, the mobile electronic device 100 produces a corresponding 3D movement, which includes a 3D displacement change and a 3D angle change. The 3D movement detection module 130 detects this 3D movement produced in 3D space as the user operates the device. In this embodiment the 3D movement detection module 130 includes, for example, an acceleration sensor and an electronic compass: the acceleration sensor may be a gravity sensor that senses the change of acceleration, from which the 3D displacement change of the mobile electronic device 100 is estimated, while the electronic compass accurately detects the 3D angle change of the mobile electronic device 100.
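The patent itself does not spell out how the two sensor readings are combined into a single 3D movement amount. The sketch below is only one plausible realization (double-integrating acceleration into a displacement change and differencing compass headings into an angle change); the class, the function name, and the sampling interval are assumptions introduced for illustration.

```python
from dataclasses import dataclass

@dataclass
class Movement3D:
    """3D movement amount: a displacement change and an angle change."""
    dx: float = 0.0
    dy: float = 0.0
    dz: float = 0.0
    d_azimuth: float = 0.0    # horizontal angle change, degrees
    d_elevation: float = 0.0  # vertical angle change, degrees

def accumulate_movement(accel_samples, compass_start, compass_end, dt):
    """Estimate the 3D movement produced between two compass readings.

    accel_samples: list of (ax, ay, az) in m/s^2 with gravity already removed.
    compass_start, compass_end: (azimuth, elevation) readings in degrees.
    dt: sampling interval of the acceleration sensor, in seconds.
    """
    vx = vy = vz = 0.0
    m = Movement3D()
    for ax, ay, az in accel_samples:
        # Double integration: acceleration -> velocity -> displacement change.
        vx += ax * dt; vy += ay * dt; vz += az * dt
        m.dx += vx * dt; m.dy += vy * dt; m.dz += vz * dt
    # Angle change taken directly from the electronic-compass readings,
    # wrapped into the range [-180, 180) degrees.
    m.d_azimuth = (compass_end[0] - compass_start[0] + 180.0) % 360.0 - 180.0
    m.d_elevation = compass_end[1] - compass_start[1]
    return m

if __name__ == "__main__":
    samples = [(0.0, 0.2, 0.0)] * 50          # 0.5 s of a gentle forward push
    print(accumulate_movement(samples, (10.0, 0.0), (25.0, 5.0), dt=0.01))
```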
The processing module 140 is coupled to the screen 110, the selection detection module 120, and the 3D movement detection module 130. When the 3D movement detection module 130 detects a 3D movement of the mobile electronic device 100 while the selection detection module 120 continuously detects the selection instruction for a specific object in the first partial region, the processing module 140 changes the view angle at which the 3D operation interface is displayed according to the 3D movement and, at the same time, moves the specific object to another position in the 3D operation interface according to the 3D movement.

In other words, under the control of the processing module 140, the screen 110 displays different partial regions of the 3D operation interface at different view angles as the mobile electronic device 100 produces 3D movement. In addition, the user may select any object in the 3D operation interface by pressing the screen 110; if the user keeps pressing the selected object while carrying the mobile electronic device 100 around in 3D space, the selected object is moved from one place in the 3D operation interface to another. The following embodiment describes the detailed operation of the mobile electronic device 100.
FIG. 2 is a flowchart of a control method for the 3D operation interface according to an embodiment of the present invention; please refer to FIG. 1 and FIG. 2 together. In this embodiment, each preset object on the 3D operation interface of the mobile electronic device 100 represents an application, a file, or a folder of the device. The mobile electronic device 100 records a predefined origin position of the 3D operation interface (for example, at the center of the interface), an initial horizontal azimuth on the horizontal plane of the 3D operation interface (between 0 and 360 degrees), and an initial vertical azimuth on the vertical plane of the 3D operation interface. In another embodiment, the origin position, the initial horizontal azimuth, and the initial vertical azimuth may instead be set by the user according to personal habits.

As shown in step 201, the processing module 140 controls the screen 110 to display the first partial region of the 3D operation interface at the first view angle, where the first view angle corresponds to a current reference position, a current horizontal azimuth, and a current vertical azimuth. In detail, the processing module 140 first determines whether the current reference position matches the origin position. If it does, the mobile electronic device 100 may have just been started by the user. In that case the processing module 140 defines the visible range corresponding to the first view angle as the range centered on the origin position that spans, on the horizontal plane, the initial horizontal azimuth plus or minus a first specific angle (for example, 25 degrees) and, on the vertical plane, the initial vertical azimuth plus or minus a second specific angle (for example, 30 degrees). Next, the processing module 140 obtains the object position of each preset object in the 3D operation interface and computes the vector angle formed by each object position and the origin position.
All preset objects whose vector angles fall within the visible range are then displayed on the screen 110. If, on the other hand, the current reference position does not match the origin position, the user has already moved around with or otherwise operated the device. In that case the processing module 140 defines the visible range of the first view angle as the range centered on the current reference position that spans the current horizontal azimuth plus or minus the first specific angle on the horizontal plane and the current vertical azimuth plus or minus the second specific angle on the vertical plane. After obtaining the object positions of the preset objects, the processing module 140 computes the vector angle formed by each object position and the current reference position and displays all preset objects whose vector angles fall within that visible range; the image shown on the screen 110 therefore differs with the reference position. The processing module 140 also applies a hidden-surface (blanking) algorithm to the preset objects that can be displayed within the visible range, so that nearer objects occlude farther ones.
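A minimal sketch of the visible-range test just described: compute the horizontal and vertical angles of the vector from the reference position to each preset object, and keep the objects whose angles fall inside the azimuth plus-or-minus first-specific-angle and elevation plus-or-minus second-specific-angle windows. The function names, coordinate conventions, and object data are illustrative assumptions; the 25-degree and 30-degree defaults are simply the example values named in the text.

```python
import math

def vector_angles(reference, obj_pos):
    """Horizontal (azimuth) and vertical (elevation) angles, in degrees, of
    the vector from the reference position to the object position."""
    dx, dy, dz = (obj_pos[i] - reference[i] for i in range(3))
    azimuth = math.degrees(math.atan2(dy, dx)) % 360.0
    elevation = math.degrees(math.atan2(dz, math.hypot(dx, dy)))
    return azimuth, elevation

def visible_objects(objects, reference, h_azimuth, v_azimuth,
                    first_angle=25.0, second_angle=30.0):
    """Return the names of the preset objects whose vector angles fall
    inside the visible range centred on (h_azimuth, v_azimuth)."""
    visible = []
    for name, pos in objects.items():
        az, el = vector_angles(reference, pos)
        # Horizontal angular difference wrapped into [-180, 180).
        d_az = (az - h_azimuth + 180.0) % 360.0 - 180.0
        if abs(d_az) <= first_angle and abs(el - v_azimuth) <= second_angle:
            visible.append(name)
    return visible

if __name__ == "__main__":
    presets = {"mail_app": (3.0, 1.0, 0.5), "photo_folder": (0.0, -4.0, 2.0)}
    print(visible_objects(presets, reference=(0.0, 0.0, 0.0),
                          h_azimuth=15.0, v_azimuth=0.0))   # -> ['mail_app']
```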
In this embodiment the screen 110 is assumed to be a touch screen. Besides the 3D movement that the mobile electronic device 100 produces in 3D space, which is detected by the 3D movement detection module 130, the flow also watches for a touch operation in which the user touches the screen with a finger or a stylus; such a touch occurs at a first two-dimensional (2D) coordinate on the screen 110 (step 205).

Next, in step 210, it is determined whether this touch operation can serve as a selection instruction for a specific object in the first partial region, i.e. whether the user has tapped an object in the first partial region of the 3D operation interface with a finger or stylus. To do so, the processing module 140 converts the 2D coordinate of the touch operation into a corresponding position in the 3D operation interface and, having obtained the object positions of all preset objects in the 3D operation interface, compares the corresponding position with each object position. If it matches none of them, then, as shown in step 215, the processing module 140 controls the screen 110 only according to the 3D movement so as to display the second partial region of the 3D operation interface at the second view angle (the details of how the processing module 140 displays the second partial region are described later), and the control flow returns to step 205 to wait for another touch operation.

If the touch position matches the object position of a preset object, the user has pressed that object with a finger (or stylus). As shown in step 220, the processing module 140 records the time at which the user selected the object as a first reference time and regards the preset object whose position matches as the specific object selected by the user, and the selection detection module 120 treats the touch action as the selection instruction for the specific object.
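The patent only states that the 2D touch coordinate is converted into a corresponding position in the 3D operation interface and compared with the object positions; it does not define the mapping. One simple way to realize an equivalent comparison is to go in the opposite direction: project each object into screen coordinates and accept the touch as a selection when it lands close enough to some projection. Everything below (projection formula, screen size, tolerance, object data) is an illustrative assumption, not the patent's own algorithm.

```python
import math

SCREEN_W, SCREEN_H = 480, 800
HIT_TOLERANCE = 30.0   # screen points

def project(obj_pos, reference, h_azimuth, fov_h=50.0, fov_v=60.0):
    """Rough projection of an object position onto the screen, relative to
    the current reference position and horizontal azimuth."""
    dx, dy, dz = (obj_pos[i] - reference[i] for i in range(3))
    az = math.degrees(math.atan2(dy, dx)) % 360.0
    el = math.degrees(math.atan2(dz, math.hypot(dx, dy)))
    d_az = (az - h_azimuth + 180.0) % 360.0 - 180.0
    x = SCREEN_W / 2 + d_az / (fov_h / 2) * (SCREEN_W / 2)
    y = SCREEN_H / 2 - el / (fov_v / 2) * (SCREEN_H / 2)
    return x, y

def hit_test(touch_xy, objects, reference, h_azimuth):
    """Return the name of the preset object under the touch, or None."""
    best, best_dist = None, HIT_TOLERANCE
    for name, pos in objects.items():
        d = math.dist(touch_xy, project(pos, reference, h_azimuth))
        if d < best_dist:
            best, best_dist = name, d
    return best

if __name__ == "__main__":
    presets = {"mail_app": (3.0, 1.0, 0.5), "photo_folder": (3.0, -1.5, 0.0)}
    print(hit_test((410, 285), presets, reference=(0.0, 0.0, 0.0), h_azimuth=0.0))
```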
Step 225 then determines whether the selection instruction persists. If it does, i.e. the selection detection module 120 keeps detecting the selection instruction while the 3D movement detection module 130 keeps detecting 3D movement of the mobile electronic device 100, the user is walking around with the device while continuing to press the specific object. The processing module 140 therefore controls the screen 110 to display the 3D operation interface at different view angles, and the display position of the specific object within the 3D operation interface changes accordingly. In step 230 the processing module 140 computes the display position of the specific object in the 3D operation interface from the 3D movement, and in step 235 it controls the screen 110 according to the 3D movement to display the second partial region of the 3D operation interface at the second view angle while the specific object is displayed in the second partial region.

In detail, the processing module 140 computes a new reference position corresponding to the second view angle from the displacement change relative to the first view angle (for example, the new reference position is the sum of the current reference position and the displacement change), takes the horizontal component of the 3D angle change as the new horizontal azimuth corresponding to the second view angle, and takes the vertical component as the new vertical azimuth. The processing module 140 then defines the visible range of the second view angle as the range centered on the new reference position that spans the new horizontal azimuth plus or minus the first specific angle on the horizontal plane and the new vertical azimuth plus or minus the second specific angle on the vertical plane. After obtaining the object positions of the preset objects and computing the vector angle formed by each object position and the new reference position, the processing module 140 displays on the screen 110 all preset objects whose vector angles fall within that visible range, again applying the hidden-surface algorithm so that nearer objects occlude farther ones.

While the display view angle of the 3D operation interface changes in response to the user pressing the object and moving the mobile electronic device 100 through 3D space, the processing module 140 also changes the display position of the specific object correspondingly. In this embodiment the processing module 140 uses the new reference position as the current display position of the specific object, obtains the 3D model data of the specific object, and renders the specific object at that display position according to the 3D model data, so that the specific object is displayed in the second partial region.
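Steps 230 and 235 therefore reduce to: add the displacement change to the reference position, add the angle-change components to the two azimuths, rebuild the visible range around the new values, and carry a held object along to the new reference position. A minimal sketch under those assumptions; the dictionary layout and names are illustrative.

```python
def update_view(view, movement, held_object=None, objects=None):
    """Apply a 3D movement amount to the current view state.

    view: dict with 'reference' (x, y, z), 'h_azimuth', 'v_azimuth'.
    movement: dict with 'displacement' (dx, dy, dz), 'd_azimuth', 'd_elevation'.
    held_object: name of the object currently being pressed, or None.
    objects: dict of preset-object positions (updated when an object is held).
    """
    # New reference position = current reference position + displacement change.
    view["reference"] = tuple(r + d for r, d in
                              zip(view["reference"], movement["displacement"]))
    # New azimuths = current azimuths + the components of the angle change.
    view["h_azimuth"] = (view["h_azimuth"] + movement["d_azimuth"]) % 360.0
    view["v_azimuth"] = view["v_azimuth"] + movement["d_elevation"]
    # An object that is still being pressed is carried along: it is redrawn
    # at the new reference position.
    if held_object is not None and objects is not None:
        objects[held_object] = view["reference"]
    return view

if __name__ == "__main__":
    view = {"reference": (0.0, 0.0, 0.0), "h_azimuth": 0.0, "v_azimuth": 0.0}
    move = {"displacement": (0.5, 0.0, 0.0), "d_azimuth": 10.0, "d_elevation": -5.0}
    presets = {"mail_app": (3.0, 1.0, 0.5)}
    print(update_view(view, move, held_object="mail_app", objects=presets))
    print(presets)
```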
Returning to step 225 of FIG. 2, if it is determined that the selection instruction no longer persists (the selection instruction disappears because the touch action has disappeared), the user has probably lifted the finger (or stylus) and is no longer pressing the specific object. As shown in step 240, the processing module 140 obtains the second 2D coordinate on the screen 110 at which the touch was located just before the touch action disappeared, and records the time at which the touch action disappeared as a second reference time.

Next, in step 245, the processing module 140 determines whether the difference between the first reference time and the second reference time is smaller than a first time preset value (for example, 0.5 second). If it is, the user has performed a click on the selected specific object, so, as shown in step 250, the processing module 140 executes the function corresponding to the specific object. For example, if the specific object corresponds to an application of the mobile electronic device 100, the processing module 140 launches the application; if the specific object corresponds to a file in the mobile electronic device 100, the processing module 140 opens the file and presents its content to the user through the screen 110; and if the specific object corresponds to a folder, the processing module 140 opens the folder so that the user can browse the files in it. In other words, while the user carries the mobile electronic device 100 around, the processing module 140 controls the screen 110 to display the 3D operation interface at view angles that follow the 3D movement; when the partial region shown on the screen 110 contains a specific object that the user wishes to execute or open, the user only has to tap and release that object within the first time preset value. Afterwards the flow returns to step 215, in which the processing module 140 displays the second partial region of the 3D operation interface at the second view angle according to the 3D movement, and then to step 205 to wait for another touch operation.
If the difference between the first reference time and the second reference time is not smaller than the first time preset value, the processing module 140 determines whether the display position of the specific object matches a predetermined specific position in the 3D operation interface. If the display position matches the specific position, the processing module 140 deletes the specific object or moves the specific object accordingly, and then displays the corresponding partial region of the 3D operation interface; the flow then returns to step 205 to wait for another touch operation.

If the display position does not match the specific position, step 265 determines whether the distance between the first 2D coordinate and the second 2D coordinate is smaller than a distance preset value (for example, 10 points) and the difference between the first reference time and the second reference time is larger than a second time preset value (for example, 1 second). If both conditions hold, the user pressed and held the specific object, carried the mobile electronic device 100 around, and then released the object; in that case the processing module 140 fixes the specific object at its current display position. More precisely, as in step 230 the processing module 140 computes the display position of the specific object from the 3D movement (for example, using the new reference position as the position of the specific object), and, as shown in step 235, controls the screen 110 according to the 3D movement to display the second partial region of the 3D operation interface at the second view angle while the specific object is displayed in the second partial region.

Returning to step 265, when the distance between the first 2D coordinate and the second 2D coordinate is larger than or equal to the distance preset value and the difference between the two reference times is smaller than or equal to the second time preset value, the flow continues with step 270: the processing module 140 computes a parabolic distance from the first and second 2D coordinates, determines a target position in the 3D operation interface according to the parabolic distance, and uses the target position as the display position of the specific object. Then, in step 235, the second partial region is displayed according to the 3D movement while the specific object is displayed at that position. In this embodiment, after the display action of step 235 is completed, the flow returns to step 205 to wait for the next operation that makes the mobile electronic device 100 produce a 3D movement or for the next touch action on the screen 110.

In other words, after being started, the mobile electronic device 100 repeatedly executes the flow of FIG. 2. If the user taps a specific object in the 3D operation interface and carries the mobile electronic device 100 around, the screen 110 not only shows different partial regions of the 3D operation interface but also, when the user keeps pressing the specific object, decides according to how long the object has been pressed and where it is released whether to execute the function corresponding to the object, whether to delete the object, or whether to change the display position of the specific object.
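Steps 245 to 270 thus form a small decision tree over the press and release times and the two touch coordinates. The sketch below captures that logic; the 0.5-second, 1-second, and 10-point thresholds are the example values named in the text, while the parabolic-distance formula is only a placeholder, since the patent does not specify how that distance is computed.

```python
import math

FIRST_TIME_PRESET = 0.5    # seconds: below this, the touch counts as a tap
SECOND_TIME_PRESET = 1.0   # seconds: above this, a short drag counts as "place"
DISTANCE_PRESET = 10.0     # screen points

def classify_release(t_press, t_release, p_press, p_release):
    """Decide what happens when the user lifts the finger or stylus.

    Returns 'execute', 'place', or 'throw'.
    """
    dt = t_release - t_press
    dist = math.dist(p_press, p_release)
    if dt < FIRST_TIME_PRESET:
        return "execute"          # quick tap: run the object's function
    if dist < DISTANCE_PRESET and dt > SECOND_TIME_PRESET:
        return "place"            # press, carry the device around, then drop
    return "throw"                # otherwise: fling the object along a parabola

def parabolic_target(p_press, p_release, scale=0.05):
    """Placeholder mapping from the on-screen throw vector to a distance in
    the 3D interface; the patent does not define the actual formula."""
    return scale * math.dist(p_press, p_release)

if __name__ == "__main__":
    print(classify_release(0.0, 0.3, (100, 100), (102, 101)))   # execute
    print(classify_release(0.0, 2.5, (100, 100), (103, 101)))   # place
    print(classify_release(0.0, 0.8, (100, 100), (220, 40)))    # throw
    print(parabolic_target((100, 100), (220, 40)))
```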
With this, the user no longer needs expensive and complicated virtual-reality equipment such as a head-mounted display or 3D sensing gloves, and can still experience virtual-reality-like operation on the mobile electronic device 100. In other words, the user can simply carry the mobile electronic device around to change the display view angle and depth of field of the 3D operation interface. The following embodiment further provides a way to change the displayed view instantly, so that manipulating the 3D operation interface feels even more responsive.

FIG. 3 is a flowchart of a control method for the 3D operation interface according to another embodiment of the present invention; please refer to FIG. 1 and FIG. 3 together. First, as shown in step 310, the screen 110 displays the first partial region of the 3D operation interface at the first view angle. Since the step of controlling the screen 110 to display the first partial region at the first view angle is the same as or similar to that of the previous embodiment, it is not repeated here.

Next, in step 320, it is determined whether the selection detection module 120 detects a selection instruction for a specific object in the first partial region. If the selection detection module 120 detects no selection instruction, the user has not yet selected any object; in that case, as shown in step 330, when the 3D displacement change detected by the 3D movement detection module 130 exceeds a preset value within a specific time, the processing module 140 changes the depth of field currently corresponding to the first view angle according to the 3D displacement change and thereby displays a sub-region of the first partial region. That is, whenever the user applies a specific operation to the mobile electronic device 100 (for example, quickly shaking or flicking it) so that the device produces a large change of acceleration in an instant, the depth of field of the image displayed on the screen 110 changes accordingly.
If the result of step 320 shows that the selection detection module 120 has detected a selection instruction, then, as shown in step 340, it is determined whether the selection instruction persists. In this embodiment, if the selection instruction disappears, then, as shown in step 350, the processing module 140 keeps the specific object fixed at its current display position. In other embodiments of the present invention, when the selection instruction disappears, the processing module 140 may instead decide, according to the interval between the time the user selected the specific object and the time it was released, whether to execute the function corresponding to the specific object or whether to delete it, or it may act according to the display position of the specific object at the moment the user released it. If the selection instruction persists, then, as shown in step 360, when the 3D displacement change exceeds the preset value within the specific time, the processing module 140 changes the depth of field corresponding to the first view angle according to the 3D displacement change so that a sub-region of the first partial region is displayed on the screen 110, and at the same time changes the display position of the specific object according to the 3D movement so that the specific object is displayed in the sub-region. That is, whenever the selection detection module 120 continuously detects the selection instruction and the 3D displacement change detected by the 3D movement detection module 130 exceeds the preset value within the specific time, the user is applying the specific operation (for example, quickly shaking or flicking the mobile electronic device 100) while pressing the specific object, and both the depth of field of the displayed image and the display position of the selected object change instantly.
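Steps 330 and 360 only require noticing that the accumulated displacement change inside some time window exceeds a preset value and then zooming the view. The sketch below shows one way to realize that; the window length, threshold, and zoom step are made-up values, and moving the reference position forward along the viewing direction is just one possible interpretation of "changing the depth of field".

```python
import math

def shake_detected(displacement_samples, window=0.3, dt=0.01, threshold=0.4):
    """Return True if the displacement change accumulated inside any
    `window`-second span exceeds `threshold` (arbitrary units)."""
    n = max(1, int(window / dt))
    for start in range(len(displacement_samples) - n + 1):
        if sum(displacement_samples[start:start + n]) > threshold:
            return True
    return False

def apply_depth_change(view, zoom_step=1.0):
    """Move the reference position forward along the current viewing
    direction so that a sub-region of the current partial region fills
    the screen."""
    az = math.radians(view["h_azimuth"])
    x, y, z = view["reference"]
    view["reference"] = (x + zoom_step * math.cos(az),
                         y + zoom_step * math.sin(az), z)
    return view

if __name__ == "__main__":
    samples = [0.005] * 20 + [0.05] * 30 + [0.005] * 20   # a burst of fast motion
    view = {"reference": (0.0, 0.0, 0.0), "h_azimuth": 30.0, "v_azimuth": 0.0}
    if shake_detected(samples):
        print(apply_depth_change(view))
```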
In the following embodiment, a first key (not shown) of the mobile electronic device 100 is defined in advance as corresponding to a preset 3D movement, which includes a preset 3D displacement change, a preset horizontal azimuth, and a preset vertical azimuth. When the user presses this key, the effect is the same as if the mobile electronic device 100 had produced the preset 3D movement in response to the user's operation, and the view angle of the 3D operation interface and the display positions of the objects on it change accordingly.

FIG. 4 is a flowchart of a control method for the 3D operation interface according to still another embodiment of the present invention; please refer to FIG. 1 and FIG. 4 together. First, as shown in step 410, the processing module 140 controls the screen 110 to display the first partial region of the 3D operation interface at the first view angle. Since the detailed steps of displaying the first partial region are the same as or similar to those of the previous embodiments, they are not repeated here. Next, in step 420, it is determined whether the selection detection module 120 detects a selection instruction for a specific object in the first partial region. If no selection instruction is detected, the user has not selected any object, and, as shown in step 430, when the first key is pressed the processing module 140 controls the screen 110 according to the preset 3D movement corresponding to the first key so as to display a third partial region of the 3D operation interface at a third view angle. If a selection instruction for a specific object is detected when the first key is pressed, then, as shown in step 440, the processing module 140 also changes the display position of the specific object according to the preset 3D movement so that the specific object is displayed in the third partial region. In this embodiment, once the selection instruction disappears, the processing module 140 keeps the specific object fixed at its current display position; in other embodiments, when the selection instruction disappears, the processing module 140 may decide, according to how long the user pressed the specific object or according to the display position at which it was released, whether to execute the function corresponding to the object or whether to delete it. In short, the user only has to press the key corresponding to the preset 3D movement to switch the displayed region of the 3D operation interface quickly. Since the previous embodiments have already described how the processing module 140 controls the screen 110 to display different partial regions of the 3D operation interface and how the display position of a specific object is changed according to a 3D movement so that the object moves from one place in the 3D operation interface to another, those details are not repeated here.
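Binding a hardware key to a preset 3D movement (and, as the following paragraph describes, binding a restore key to a preset restore position) is essentially a lookup from key codes to movement amounts or view states. A tiny sketch with invented key names and values:

```python
# Hypothetical bindings: the "first key" maps to a preset 3D movement and the
# "restore key" maps to a preset restore pose of the 3D operation interface.
PRESET_MOVEMENT = {"displacement": (0.0, 2.0, 0.0), "d_azimuth": 45.0, "d_elevation": 0.0}
RESTORE_POSE = {"reference": (0.0, 0.0, 0.0), "h_azimuth": 0.0, "v_azimuth": 0.0}

def on_key(view, key):
    """Dispatch a hardware-key press to a change of the view state."""
    if key == "first_key":
        # Treat the key press as if the device had produced the preset
        # 3D movement: shift the reference position and both azimuths.
        view["reference"] = tuple(
            r + d for r, d in zip(view["reference"], PRESET_MOVEMENT["displacement"]))
        view["h_azimuth"] = (view["h_azimuth"] + PRESET_MOVEMENT["d_azimuth"]) % 360.0
        view["v_azimuth"] += PRESET_MOVEMENT["d_elevation"]
    elif key == "restore_key":
        # Jump straight back to the preset restore pose.
        view = dict(RESTORE_POSE)
    return view

if __name__ == "__main__":
    view = {"reference": (1.0, 5.0, 0.0), "h_azimuth": 340.0, "v_azimuth": 10.0}
    view = on_key(view, "first_key")
    print(view)
    print(on_key(view, "restore_key"))
```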
In another embodiment, the mobile electronic device 100 has a restore key (not shown) that corresponds to a preset restore position, a preset restore horizontal azimuth, and a preset restore vertical azimuth of the 3D operation interface. Whenever the user presses the restore key, the processing module 140 controls the screen 110 according to the preset restore position, the preset restore horizontal azimuth, and the preset restore vertical azimuth so as to display, at a fourth view angle, a preset partial region of the 3D operation interface (that is, an image centered on the preset restore position). Accordingly, even if the user temporarily loses track of the current position after frequently manipulating the 3D operation interface, pressing the restore key quickly brings the view back to the preset restore position. The steps of displaying the preset partial region are the same as or similar to those of the previous embodiments and are not repeated here.

In summary, the mobile electronic device and the control method for its 3D operation interface described in the above embodiments change the display view angle and depth of field of the 3D operation interface according to the 3D movement produced by the mobile electronic device in 3D space, while letting the user tap, move, and execute the objects in the 3D operation interface. The embodiments give the user the feeling of standing inside the 3D operation interface and handling the various objects. Even a user who is not used to operating electronic devices does not need to spend extra time learning to control the 3D operation interface, so the mobile electronic device becomes more intuitive and convenient to use.

Although the present invention has been disclosed above by way of embodiments, they are not intended to limit the present invention. Anyone with ordinary knowledge in the technical field may make slight changes and modifications without departing from the spirit and scope of the present invention, and the scope of protection of the present invention is therefore defined by the appended claims.

[Brief Description of the Drawings]
FIG. 1 is a block diagram of a mobile electronic device according to an embodiment of the present invention.
FIG. 2 is a flowchart of a control method for a 3D operation interface according to an embodiment of the present invention.
FIG. 3 is a flowchart of a control method for a 3D operation interface according to another embodiment of the present invention.
FIG. 4 is a flowchart of a control method for a 3D operation interface according to still another embodiment of the present invention.

[Description of Main Reference Numerals]

100: mobile electronic device
110: screen
120: selection detection module
130: 3D movement detection module
140: processing module
210-270: steps of the control method for the 3D operation interface according to an embodiment of the present invention
310-360: steps of the control method for the 3D operation interface according to another embodiment of the present invention
410-440: steps of the control method for the 3D operation interface according to still another embodiment of the present invention
Claims (1)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
TW098138088A TWI502468B (en) | 2009-11-10 | 2009-11-10 | Mobile electronic device and method for controlling 3d operation interface thereof |
Publications (2)
Publication Number | Publication Date |
---|---|
TW201117089A | 2011-05-16 |
TWI502468B TWI502468B (en) | 2015-10-01 |
Family ID: 44935121
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
TW098138088A TWI502468B (en) | 2009-11-10 | 2009-11-10 | Mobile electronic device and method for controlling 3d operation interface thereof |
Country Status (1)
Country | Link |
---|---|
TW (1) | TWI502468B (en) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
TWI486866B (en) * | 2012-06-29 | 2015-06-01 | Mediatek Singapore Pte Ltd | Method and device for icon displaying |
TWI560580B (en) * | 2014-12-04 | 2016-12-01 | Htc Corp | Virtual reality system and method for controlling operation modes of virtual reality system |
Families Citing this family (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10217186B2 (en) * | 2017-02-15 | 2019-02-26 | Htc Corporation | Method, virtual reality apparatus and recording medium for displaying fast-moving frames of virtual reality |
US10890979B2 (en) | 2018-04-23 | 2021-01-12 | Industrial Technology Research Institute | Controlling system and controlling method for virtual display |
Family Cites Families (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6992702B1 (en) * | 1999-09-07 | 2006-01-31 | Fuji Xerox Co., Ltd | System for controlling video and motion picture cameras |
US8174561B2 (en) * | 2008-03-14 | 2012-05-08 | Sony Ericsson Mobile Communications Ab | Device, method and program for creating and displaying composite images generated from images related by capture position |
- 2009-11-10: TW application TW098138088A (patent TWI502468B), active
Also Published As
Publication number | Publication date |
---|---|
TWI502468B (en) | 2015-10-01 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US11227446B2 (en) | Systems, methods, and graphical user interfaces for modeling, measuring, and drawing using augmented reality | |
EP3607418B1 (en) | Virtual object user interface display | |
US9507431B2 (en) | Viewing images with tilt-control on a hand-held device | |
Leibe et al. | The perceptive workbench: Toward spontaneous and natural interaction in semi-immersive virtual environments | |
US10341642B2 (en) | Display device, control method, and control program for stereoscopically displaying objects | |
Fu et al. | Multi-touch techniques for exploring large-scale 3D astrophysical simulations | |
JP6214981B2 (en) | Architectural image display device, architectural image display method, and computer program | |
US20120102438A1 (en) | Display system and method of displaying based on device interactions | |
US20170352188A1 (en) | Support Based 3D Navigation | |
TW201214266A (en) | Three dimensional user interface effects on a display by using properties of motion | |
US12062142B2 (en) | Virtual environment | |
TW201117089A (en) | Mobile electronic device and method for controlling 3D operation interface thereof | |
US20240062279A1 (en) | Method of displaying products in a virtual environment | |
Tseng et al. | EZ-Manipulator: Designing a mobile, fast, and ambiguity-free 3D manipulation interface using smartphones | |
JP5876600B1 (en) | Information processing program and information processing method | |
Kim et al. | Virtual object sizes for efficient and convenient mid-air manipulation | |
KR101520746B1 (en) | Input device and display device | |
Yoo et al. | 3D remote interface for smart displays | |
Chen et al. | An integrated framework for universal motion control | |
GB2533777A (en) | Coherent touchless interaction with steroscopic 3D images | |
WO2011150702A1 (en) | Method for displaying contacts in instant messenger and instant messaging client | |
JP2016224595A (en) | System, method, and program | |
Kwon et al. | A spatial user interface design using accordion metaphor for VR systems | |
WO2016057997A1 (en) | Support based 3d navigation | |
Takeyama et al. | PhoneCanvas: 3D Sketching System Using a Depth Camera-Equipped Smartphone as a Canvas |