TW201504925A - Method for operating user interface and electronic device - Google Patents


Info

Publication number
TW201504925A
TW201504925A
Authority
TW
Taiwan
Prior art keywords
touch
mapping
touch surface
motion sensor
coordinate
Prior art date
Application number
TW102126940A
Other languages
Chinese (zh)
Inventor
Ting-Chiang Huang
Wen-Neng Liao
Original Assignee
Acer Inc
Priority date
Filing date
Publication date
Application filed by Acer Inc filed Critical Acer Inc
Priority to TW102126940A priority Critical patent/TW201504925A/en
Publication of TW201504925A publication Critical patent/TW201504925A/en


Abstract

A method for operating a user interface and an electronic device are provided. The electronic device has a touch screen, and the touch screen displays the user interface. The method includes: defining a touch operating region and a motion mapping region on a touch surface of the touch screen; detecting, by the touch screen, a touch operation performed within the touch operating region, and detecting, by a motion sensor, a body motion that occurs above the touch surface without contacting it; and operating the user interface within the touch operating region according to the touch operation, and operating the user interface within the motion mapping region according to the body motion.

Description

Method for operating user interface and electronic device

The present invention relates to an electronic device, and more particularly, to a method for operating a user interface and to such an electronic device.

To meet users' demand for large screens, the touch screens of portable electronic devices such as smartphones and tablet computers keep growing. Taking smartphones as an example, many models now have touch screens exceeding 5 inches, and some tablet computers have touch screens approaching or even exceeding 10 inches.

However, for a portable electronic device with a large touch screen, unless the user has sufficiently long fingers (especially a long thumb), it is almost impossible for an average user to operate the device with a single hand.

In view of this, the present invention provides a method for operating a user interface and an electronic device that allow most users to fully control the user interface of the electronic device with a single hand.

The invention provides a method for operating a user interface, suitable for an electronic device that has a touch screen displaying the user interface. The method includes: defining a touch operating region and a motion mapping region on the touch surface of the touch screen; detecting, by the touch screen, a touch operation performed within the touch operating region, and detecting, by a motion sensor disposed on the electronic device, a body motion that occurs above the touch surface without contacting it; and operating the part of the user interface covered by the touch operating region according to the touch operation, and operating the part of the user interface covered by the motion mapping region according to the body motion.

The invention further provides an electronic device that includes a motion sensor, a touch screen, and a processor. The motion sensor is disposed on the electronic device. The touch screen displays a user interface. The processor, coupled to the touch screen and the motion sensor, defines a touch operating region and a motion mapping region on the touch surface of the touch screen. The touch screen further detects touch operations performed within the touch operating region, and the motion sensor further detects body motions that occur above the touch surface without contacting it. The processor then operates the part of the user interface covered by the touch operating region according to the touch operations, and operates the part of the user interface covered by the motion mapping region according to the body motions.

Based on the above, the invention provides a method for operating a user interface and an electronic device in which a motion sensor is additionally disposed on an electronic device having a touch screen. The motion sensor captures body motions that occur above the touch screen without touching it, and the captured body motions are used to operate the part of the touch screen that the user could not otherwise reach with one hand.

To make the above features and advantages of the invention more comprehensible, embodiments are described in detail below with reference to the accompanying drawings.

10, 50, 60‧‧‧electronic device

12, 222, 224, 562, 564‧‧‧motion sensor

14‧‧‧touch screen

16‧‧‧processor

21, 51‧‧‧touch surface

212, 214, 216, 218‧‧‧sides of the touch surface

23, 53‧‧‧touch operating region

232, 234‧‧‧sides of the touch operating region

24‧‧‧virtual line segment

25, 55‧‧‧motion mapping region

252, 254‧‧‧sides of the motion mapping region

31, 312, 314, 52‧‧‧virtual sensing plane

412, 414‧‧‧virtual sensing coordinates

452, 454‧‧‧mapping coordinates

63, 65‧‧‧regions

S702, S704, S706‧‧‧steps of the method for operating a user interface in an embodiment of the invention

FIG. 1 is a schematic diagram of an electronic device according to an embodiment of the invention.

FIG. 2 is a schematic diagram of defining the touch operating region and the motion mapping region and of arranging the motion sensors according to an embodiment of the invention.

FIG. 3 is a schematic diagram of a virtual sensing plane according to an embodiment of the invention.

FIG. 4 is a schematic diagram of converting virtual sensing coordinates into mapping coordinates according to an embodiment of the invention.

FIG. 5 is a schematic diagram of an electronic device according to another embodiment of the invention.

FIG. 6 is a schematic diagram of an electronic device according to still another embodiment of the invention.

FIG. 7 is a flowchart of a method for operating a user interface according to an embodiment of the invention.

FIG. 1 is a schematic diagram of an electronic device according to an embodiment of the invention. Referring to FIG. 1, the electronic device 10 is, for example, an electronic device that a user can hold with one hand, such as a smartphone, a handheld game console, or a tablet computer.

The electronic device 10 includes a motion sensor 12, a touch screen 14, and a processor 16. The motion sensor 12 is disposed on the electronic device 10 and has a sensing lens. The sensing lens of the motion sensor 12 may comprise a lens built from various photosensitive elements, an infrared sensor, and so on, so as to detect the position and movement trajectory of an object within its detection range. Moreover, the invention does not limit the number of motion sensors 12. For example, in the present embodiment the number of motion sensors 12 may be two, while in another embodiment it may be one, or three or more, depending on practical requirements.

The touch screen 14 is, for example, a resistive, capacitive, optical, acoustic-wave, or electromagnetic touch panel; its type is not limited here. The touch screen 14 has a touch surface. When the user performs a contact operation such as a tap, slide, or drag on this touch surface with an input device such as a finger or a stylus, the touch screen 14 generates a corresponding touch signal.

The processor 16 is coupled to the motion sensor 12 and the touch screen 14, and may be a micro-controller, an embedded controller, a central processing unit (CPU), or the like, without being limited to these. The processor 16 can run one or more operating systems, and the touch screen 14 can display the user interface provided by those operating systems. When the touch screen 14 generates a touch signal in response to a touch operation, the processor 16 determines, according to the touch signal, whether to generate a corresponding input command to operate the user interface.

In particular, in the present embodiment the processor 16 defines a touch operating region and a motion mapping region on the touch surface of the touch screen 14. The touch screen 14 is responsible for detecting touch operations performed within the touch operating region, while the motion sensor 12 is responsible for detecting body motions that occur above the touch surface of the touch screen 14 without contacting it. The processor 16 then operates the part of the user interface covered by the touch operating region according to the touch operations detected by the touch screen 14, and operates the part of the user interface covered by the motion mapping region according to the body motions detected by the motion sensor 12. Note that although the processor 16 can operate the user interface within the motion mapping region according to the detected body motions, the touch screen 14 itself can detect touch operations at almost any position on the touch surface; therefore the touch screen 14 can in fact also detect touch operations performed within the motion mapping region (i.e., outside the touch operating region), and the processor 16 can likewise operate the entire user interface, including the motion mapping region, according to touch signals from the touch screen 14.

For convenience, the electronic device 10 is first described for a user who habitually holds the electronic device with the left hand and operates its touch screen with the left thumb.

For example, FIG. 2 is a schematic diagram of defining the touch operating region and the motion mapping region and of arranging the motion sensors according to an embodiment of the invention. Referring to FIG. 2, the processor 16 may define a touch operating region 23 and a motion mapping region 25 on the touch surface 21 of the touch screen 14. In the present embodiment, for example, the processor 16 divides the touch surface 21 into the touch operating region 23 and the motion mapping region 25 along a virtual line segment 24. As shown in FIG. 2, assume the touch surface 21 has sides 212, 214, 216, and 218, the touch operating region 23 has sides 232 and 234, and the motion mapping region 25 has sides 252 and 254. The side 232 of the touch operating region 23 is adjacent to or abuts the side 212 of the touch surface 21, the side 252 of the motion mapping region 25 is adjacent to or abuts the side 214 of the touch surface 21, and the side 234 of the touch operating region 23 is adjacent to or abuts the side 254 of the motion mapping region 25.

In particular, in the present embodiment the touch operating region 23 and the motion mapping region 25 have exactly the same area and shape, so the virtual line segment 24 extends, for example, from the midpoint of side 216 of the touch surface 21 to the midpoint of side 218. However, for electronic devices with different touch screen sizes, the area and/or shape of the touch operating region 23 and the motion mapping region 25 can be adjusted accordingly. For example, in one embodiment, if the touch screen size is smaller than a lower size limit, or the distance between sides 212 and 214 of the touch surface 21 is smaller than a lower distance limit, the virtual line segment 24 can be placed closer to side 214 than to side 212, making the area of the touch operating region 23 larger than that of the motion mapping region 25. Conversely, in another embodiment, if the touch screen size is larger than an upper size limit, or the distance between sides 212 and 214 is larger than an upper distance limit, the virtual line segment 24 can be placed closer to side 212 than to side 214, making the area of the touch operating region 23 smaller than that of the motion mapping region 25.
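The partitioning scheme above can be sketched in code. This is a hypothetical illustration, not part of the patent: the function name, the coordinate convention (origin at side 212, rectangles as `(x, y, w, h)`), the 60/40 split ratios, and the threshold parameters are all assumptions made here for concreteness.

```python
def define_regions(width, height, min_height=None, max_height=None):
    """Split a (width x height) touch surface into a touch operating
    region (adjoining side 212) and a motion mapping region (adjoining
    side 214), with the split position adapted to the screen size.

    Heights are measured along the axis from side 212 toward side 214.
    """
    split = height / 2  # default: equal halves, as in the embodiment
    if min_height is not None and height < min_height:
        # small screen: move the line toward side 214, enlarging the
        # touch operating region (ratio is an assumed example value)
        split = height * 0.6
    elif max_height is not None and height > max_height:
        # large screen: move the line toward side 212, enlarging the
        # motion mapping region (ratio is an assumed example value)
        split = height * 0.4
    touch_operating_region = (0, 0, width, split)
    motion_mapping_region = (0, split, width, height - split)
    return touch_operating_region, motion_mapping_region
```

For a screen below the size limit, the returned operating region is taller than the mapping region, matching the adaptation described above.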

In addition, for convenience of description, the motion sensor 12 mentioned in FIG. 1 is exemplified by the motion sensors 222 and 224 in FIG. 2, but is not limited thereto. For example, in one embodiment the electronic device 10 may include only one of the motion sensors 222 and 224, or other motion sensors, depending on practical requirements.

In the present embodiment, the motion sensors 222 and 224 are disposed adjacent to the touch surface 21 of the touch screen 14. Because the touch operating region 23 adjoins side 212 of the touch surface 21 and the motion mapping region 25 adjoins side 214, the motion sensors 222 and 224 can be placed closer to side 214 than to side 212, with both sensing lenses facing side 212. In addition, to enlarge their detectable range, the motion sensor 224 is placed closer to side 218 than to side 216 of the touch surface 21, and the motion sensor 222 closer to side 216 than to side 218, although the invention is not limited thereto.

In the present embodiment, the sensing lenses of the motion sensors 222 and 224 are both oriented so as not to be parallel to the touch surface 21, so that they can separately or jointly capture body motions that occur above the touch surface 21 without contacting it. Such a body motion is evaluated, for example, with respect to at least one virtual sensing plane, which serves as the reference for determining coordinates and movement trajectories.

For example, FIG. 3 is a schematic diagram of a virtual sensing plane according to an embodiment of the invention. Referring to FIG. 2 and FIG. 3, in the present embodiment the sensing lenses of the motion sensors 222 and 224 both face the direction of side 212 of the touch surface 21 and are not parallel to the touch surface 21, so together they can form a virtual sensing plane 31 over the touch surface 21. For example, the virtual sensing plane 31 may be composed of a virtual sensing plane 312, closer to the motion sensor 222, and a virtual sensing plane 314, closer to the motion sensor 224, where the virtual sensing plane 312 is the sensing reference by which the motion sensor 222 senses body motions and the virtual sensing plane 314 is the sensing reference by which the motion sensor 224 senses body motions.

However, in one embodiment, if the electronic device 10 includes only one of the motion sensors 222 and 224, a complete or partial virtual sensing plane 31 can still be formed by adjusting the orientation or angle of that sensor's sensing lens. Moreover, because the virtual sensing plane 31 is not parallel to the touch surface 21, there is an inter-plane angle θ between them, and θ may take any value from 10 to 90 degrees, depending on practical requirements. In other words, the virtual sensing plane 31 is substantially perpendicular, or close to perpendicular, to the touch surface 21, depending on the orientation or angle of the sensing lenses of the motion sensors 222 and 224. In addition, the intersection of the virtual sensing plane 31 (or of its extension toward the touch surface 21) with the touch surface 21 coincides, for example, with the virtual line segment 24 used in FIG. 2 to divide the touch operating region 23 from the motion mapping region 25.

In detail, the processor 16 obtains the virtual sensing coordinates at which a body motion intersects the virtual sensing plane, and converts those virtual sensing coordinates into the mapping coordinates of the body motion on the part of the user interface covered by the motion mapping region. The processor 16 can then operate that part of the user interface according to the mapping coordinates.

For example, FIG. 4 is a schematic diagram of converting virtual sensing coordinates into mapping coordinates according to an embodiment of the invention. Referring to FIG. 4, assume the user holds the electronic device 10 with the left hand; in this grip, the user's left thumb can touch the part of the touch surface 21 covered by the touch operating region 23, but cannot reach the part covered by the motion mapping region 25. In this situation, to operate the user interface within the motion mapping region 25, the user can perform a body motion with the left thumb, such as moving it in the air, without touching the touch surface 21. For example, the user can move the left thumb so as to produce, on the virtual sensing plane 31, a body motion that first taps the position of virtual sensing coordinate 412 and then taps the position of virtual sensing coordinate 414, or one that moves from the position of coordinate 412 to the position of coordinate 414. As long as the left thumb stays within the detection range of the motion sensor 222 and/or the motion sensor 224 during this process, the two sensors can detect the body motion separately or jointly, and the processor 16 can derive from it body motion information such as the virtual sensing coordinates 412 and 414 on the virtual sensing plane 31 and the tap or movement trajectory of the left thumb on that plane.

Next, the processor 16 converts, according to a mapping algorithm, the obtained body motion information (the virtual sensing coordinates 412 and 414 and the tap or movement trajectory of the left thumb on the virtual sensing plane 31) into the corresponding mapping coordinates 452 and 454 on the part of the user interface covered by the motion mapping region 25, together with the corresponding tap or movement trajectory on the touch surface 21. In particular, as FIG. 4 makes clear, in the present embodiment the virtual sensing coordinate 412 is closer to the touch surface 21 than the virtual sensing coordinate 414 is; after the coordinate conversion, the processor 16 obtains the mapping coordinates 452 and 454 on the touch surface 21, where coordinate 452 is converted from coordinate 412, coordinate 454 from coordinate 414, and coordinate 452 lies closer to the touch operating region 23 than coordinate 454 does. In this way, without involving the user's right hand, the processor 16 can operate the user interface within the motion mapping region 25 according to the mapping coordinates 452 and 454.
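A minimal sketch of such a mapping algorithm follows. The patent does not specify the algorithm itself, so this assumes the simplest linear scaling from virtual-plane coordinates into the motion mapping region; the function name, parameter layout, and normalization are illustrative assumptions. It preserves the property described above: a point closer to the touch surface maps closer to the touch operating region.

```python
def map_virtual_to_screen(u, v, plane_len, plane_width,
                          region_y0, region_h, region_w):
    """Convert a point on the virtual sensing plane into a mapping
    coordinate inside the motion mapping region.

    u: distance along the plane away from the touch surface
       (0 .. plane_len); v: lateral position (0 .. plane_width).
    A small u (thumb near the touch surface) yields a y near region_y0,
    i.e. near the boundary with the touch operating region, matching
    the relation between coordinates 412/452 and 414/454 in FIG. 4.
    """
    # clamp and normalize the plane coordinates to [0, 1]
    fu = max(0.0, min(1.0, u / plane_len))
    fv = max(0.0, min(1.0, v / plane_width))
    # scale linearly into the rectangle of the motion mapping region
    x = fv * region_w
    y = region_y0 + fu * region_h
    return x, y
```

A tap trajectory on the virtual plane can then be replayed point by point as taps at the converted positions on the touch surface.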

It is worth mentioning, referring again to FIG. 1, that in one embodiment the user can choose whether to activate the motion sensor 12. For example, when the user wants to operate the electronic device 10 with both hands, the motion sensor 12 does not need to be activated. Conversely, if the user wants to operate the electronic device 10 with only one hand, the user can send a trigger signal to the processor 16 through the touch screen 14, or by pressing a physical button on the electronic device 10. When the processor 16 receives the trigger signal, it activates the motion sensor 12.

In addition, the invention also provides a corresponding embodiment for users who habitually hold the electronic device with the right hand and operate its touch screen with the right thumb.

For example, FIG. 5 is a schematic diagram of an electronic device according to another embodiment of the invention. Referring to FIG. 5, motion sensors 562 and 564 are disposed on the electronic device 50. The processor of the electronic device 50 can define a touch operating region 53 and a motion mapping region 55 on the touch surface 51 of the touch screen, and the motion sensors 562 and 564 can form, over the touch surface 51, a virtual sensing plane 52 corresponding to the motion mapping region 55. Thus, as shown in FIG. 5, when the user holds the electronic device 50 with the right hand and operates its touch screen with the right thumb, even though the thumb cannot reach the part of the touch surface 51 covered by the motion mapping region 55, the user can still issue operating commands directly through the virtual sensing plane 52, which forms the inter-plane angle θ with the touch surface 51, to operate the user interface within the motion mapping region 55.

In other words, the positions of the motion sensors, the touch operating region, and the motion mapping region in the embodiments of the invention can all be adjusted according to practical requirements and are not limited to the arrangements above. Furthermore, in one embodiment the motion sensors may also be disposed above and below the boundary between the touch operating region and the motion mapping region, and the orientation of their lenses can be adjusted to selectively detect the movement trajectory of the user's left or right thumb on the virtual sensing plane.

In addition, FIG. 6 is a schematic diagram of an electronic device according to still another embodiment of the invention. Referring to FIG. 6, motion sensors 222, 224, 562, and 564 are disposed on the electronic device 60. In the present embodiment, the processor of the electronic device 60 can enable or disable any of the motion sensors 222, 224, 562, and 564, or any combination of them, according to the user's settings. For example, the user can enable all four motion sensors at once, so as to detect body motions of the left thumb and the right thumb simultaneously. Alternatively, the user may enable only the motion sensors 222 and 224, or only the motion sensors 562 and 564; the invention imposes no limitation here. When the user operates the electronic device 60 with the left hand, the processor of the electronic device 60 can define the region 63 as the motion mapping region and operate the user interface within the region 63 according to the body motions detected by the motion sensors 222 and 224. When the user operates the electronic device 60 with the right hand, the processor can instead define the region 65 as the motion mapping region and operate the user interface within the region 65 according to the body motions detected by the motion sensors 562 and 564.

Alternatively, in an embodiment, the processor of the electronic device 60 may also detect and determine, through one of or a combination of the motion sensor 222, the motion sensor 224, the motion sensor 562, and the motion sensor 564, whether the user is currently holding the electronic device 60 with the left hand or the right hand, for example by determining which motion sensor detects the user's thumb. If the processor of the electronic device 60 determines that the user is currently holding and operating the electronic device 60 with the left hand, for example when the motion sensor 222 and/or the motion sensor 224 detects the user's thumb, the processor of the electronic device 60 can define the region 63 as the motion mapping region and operate the user interface covered within the region 63 according to the body motions detected by the motion sensor 222 and the motion sensor 224. In addition, when the processor of the electronic device 60 determines that the user is currently holding and operating the electronic device 60 with the left hand, the processor of the electronic device 60 may also selectively turn off the motion sensor 562 and the motion sensor 564.
If the processor of the electronic device 60 determines that the user is currently holding the electronic device 60 with the right hand, for example when the motion sensor 562 and/or the motion sensor 564 detects the user's thumb, the processor of the electronic device 60 can define the region 65 as the motion mapping region and operate the user interface covered within the region 65 according to the body motions detected by the motion sensor 562 and the motion sensor 564. In addition, when the processor of the electronic device 60 determines that the user is currently holding and operating the electronic device 60 with the right hand, the processor of the electronic device 60 may also selectively turn off the motion sensor 222 and the motion sensor 224.
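The grip-detection logic above can be sketched as a small selection function. This is a minimal illustrative sketch: the sensor and region identifiers below are assumptions chosen to mirror the reference numerals in the description, not names defined by the patent.

```python
# Hypothetical sketch of the grip-detection logic described above. The sensor
# and region identifiers are illustrative assumptions, not names from the patent.

LEFT_SENSORS = ("sensor_222", "sensor_224")    # sensors near the left edge
RIGHT_SENSORS = ("sensor_562", "sensor_564")   # sensors near the right edge

def select_mapping_region(thumb_detections):
    """Return (mapping_region, sensors_kept_on, sensors_turned_off) given the
    set of motion sensors that currently detect the user's thumb."""
    if any(s in thumb_detections for s in LEFT_SENSORS):
        # Left-hand grip: region 63 becomes the motion mapping region, and the
        # right-edge sensors may optionally be turned off.
        return "region_63", LEFT_SENSORS, RIGHT_SENSORS
    if any(s in thumb_detections for s in RIGHT_SENSORS):
        # Right-hand grip: region 65 becomes the motion mapping region, and the
        # left-edge sensors may optionally be turned off.
        return "region_65", RIGHT_SENSORS, LEFT_SENSORS
    # No thumb detected: keep every sensor on and define no mapping region yet.
    return None, LEFT_SENSORS + RIGHT_SENSORS, ()
```

For example, a thumb reported by `sensor_222` selects region 63 as the mapping region and marks the right-edge sensors as candidates for power-down.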

However, some implementation details of the embodiments of FIG. 5 and FIG. 6 can also be deduced by analogy from the descriptions of FIG. 1 through FIG. 4.

FIG. 7 is a flowchart of a method for operating a user interface according to an embodiment of the invention. Referring to FIG. 1 and FIG. 7 together, in step S702, the processor 16 defines a contact operation region and a motion mapping region on the touch surface of the touch screen 14. Next, in step S704, the processor 16 detects, through the touch screen 14, a touch operation performed within the contact operation region, and detects, through the motion sensor 12 disposed on the electronic device 10, a body motion that occurs above the touch surface without contacting the touch surface. Then, in step S706, the processor 16 operates the user interface covered within the contact operation region according to the touch operation, and operates the user interface covered within the motion mapping region according to the body motion.
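One pass of steps S702 to S706 can be sketched as a minimal loop body. Every callable below is an assumed stand-in for the processor 16, touch screen 14, and motion sensor 12 of the embodiment; none of these names come from the patent itself:

```python
# Minimal sketch of one pass of steps S702-S706; the callables are illustrative
# stand-ins for the processor, touch screen, and motion sensor.

def run_ui_pass(define_regions, read_touch, read_body_motion,
                apply_touch, apply_motion):
    # S702: define a contact operation region and a motion mapping region
    # on the touch surface.
    contact_region, mapping_region = define_regions()
    # S704: sample a touch operation inside the contact operation region and
    # a contactless body motion above the touch surface.
    touch = read_touch(contact_region)
    motion = read_body_motion()
    # S706: each region of the user interface is driven by its own input.
    results = []
    if touch is not None:
        results.append(apply_touch(contact_region, touch))
    if motion is not None:
        results.append(apply_motion(mapping_region, motion))
    return results
```

The point of the sketch is the split in S706: the touch input only ever drives the contact operation region, while the contactless body motion only ever drives the motion mapping region.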

However, the detailed implementation of the above method for operating a user interface has already been described in full above and is therefore not repeated here. In addition, the above method for operating a user interface may be performed by one or more hardware circuits in the processor (for example, the processor 16) of the electronic device 10, the electronic device 50, or the electronic device 60. Alternatively, in an embodiment, one or more software or firmware modules may be stored in a hard disk or memory of the electronic device 10, the electronic device 50, or the electronic device 60. When these software or firmware modules are loaded into the processor (for example, the processor 16) of the electronic device 10, the electronic device 50, or the electronic device 60, they can execute the steps of the above method.

In summary, the invention provides a method for operating a user interface and an electronic device, in which one or more motion sensors are additionally disposed on an electronic device having a touch screen to capture body motions that occur above the touch screen without touching it, and a portion of the touch screen that the user originally could not reach with one hand is operated according to the captured body motions.

Although the invention has been disclosed above by way of embodiments, they are not intended to limit the invention. Anyone with ordinary knowledge in the art may make some changes and refinements without departing from the spirit and scope of the invention; the scope of protection of the invention is therefore defined by the appended claims.

S702, S704, S706: steps of the method for operating the user interface

Claims (10)

1. A method for operating a user interface, adapted to an electronic device having a touch screen, the user interface being displayed through the touch screen, the method comprising: defining a contact operation region and a motion mapping region on a touch surface of the touch screen; detecting, through the touch screen, a touch operation performed within the contact operation region, and detecting, through at least one motion sensor disposed on the electronic device, a body motion located above the touch surface without contacting the touch surface; and operating the user interface covered within the contact operation region according to the touch operation, and operating the user interface covered within the motion mapping region according to the body motion.

2. The method for operating a user interface according to claim 1, wherein a first side of the contact operation region is adjacent to a first side of the touch surface, a first side of the motion mapping region is adjacent to a second side of the touch surface, and a second side of the contact operation region is adjacent to a second side of the motion mapping region, wherein the at least one motion sensor is closer to the second side of the touch surface than to the first side of the touch surface, the sensing lens of each of the at least one motion sensor faces the first side of the touch surface, and none of the sensing lenses of the at least one motion sensor is parallel to the touch surface.

3. The method for operating a user interface according to claim 2, wherein the at least one motion sensor comprises a first motion sensor and a second motion sensor, the first motion sensor is closer to a fourth side of the touch surface than to a third side of the touch surface, and the second motion sensor is closer to the third side of the touch surface than to the fourth side of the touch surface.

4. The method for operating a user interface according to claim 1, wherein the step of operating the user interface covered within the motion mapping region according to the body motion comprises: obtaining at least one virtual sensing coordinate of the body motion on at least one virtual sensing plane, wherein none of the at least one virtual sensing plane is parallel to the touch surface; converting the at least one virtual sensing coordinate into at least one mapping coordinate of the body motion on the part of the touch surface covered by the motion mapping region; and operating the user interface covered within the motion mapping region according to the at least one mapping coordinate.

5. The method for operating a user interface according to claim 4, wherein the step of converting the at least one virtual sensing coordinate into the at least one mapping coordinate comprises: converting a first virtual sensing coordinate and a second virtual sensing coordinate of the at least one virtual sensing coordinate respectively into a first mapping coordinate and a second mapping coordinate of the at least one mapping coordinate on the part of the touch surface covered by the motion mapping region, wherein the first virtual sensing coordinate is closer to the touch surface than the second virtual sensing coordinate, and the first mapping coordinate is closer to the contact operation region than the second mapping coordinate.

6. An electronic device, comprising: a touch screen; at least one motion sensor, disposed on the electronic device; and a processor, coupled to the touch screen and configured to display a user interface through the touch screen, wherein the processor is further configured to define a contact operation region and a motion mapping region on a touch surface of the touch screen, wherein the processor is further configured to detect, through the touch screen, a touch operation performed within the contact operation region, and to detect, through the at least one motion sensor, a body motion located above the touch surface without contacting the touch surface, and wherein the processor is further configured to operate the user interface covered within the contact operation region according to the touch operation, and to operate the user interface covered within the motion mapping region according to the body motion.

7. The electronic device according to claim 6, wherein a first side of the contact operation region is adjacent to a first side of the touch surface, a first side of the motion mapping region is adjacent to a second side of the touch surface, and a second side of the contact operation region is adjacent to a second side of the motion mapping region, wherein the at least one motion sensor is closer to the second side of the touch surface than to the first side of the touch surface, the sensing lens of each of the at least one motion sensor faces the first side of the touch surface, and none of the sensing lenses of the at least one motion sensor is parallel to the touch surface.

8. The electronic device according to claim 7, wherein the at least one motion sensor comprises a first motion sensor and a second motion sensor, the first motion sensor is closer to a fourth side of the touch surface than to a third side of the touch surface, and the second motion sensor is closer to the third side of the touch surface than to the fourth side of the touch surface.

9. The electronic device according to claim 6, wherein the processor is further configured to obtain at least one virtual sensing coordinate of the body motion on at least one virtual sensing plane, none of the at least one virtual sensing plane being parallel to the touch surface, wherein the processor is further configured to convert the at least one virtual sensing coordinate into at least one mapping coordinate of the body motion on the part of the touch surface covered by the motion mapping region, and wherein the processor is further configured to operate the user interface covered within the motion mapping region according to the at least one mapping coordinate.

10. The electronic device according to claim 9, wherein the processor is further configured to convert a first virtual sensing coordinate and a second virtual sensing coordinate of the at least one virtual sensing coordinate respectively into a first mapping coordinate and a second mapping coordinate of the at least one mapping coordinate on the part of the touch surface covered by the motion mapping region, wherein the first virtual sensing coordinate is closer to the touch surface than the second virtual sensing coordinate, and the first mapping coordinate is closer to the contact operation region than the second mapping coordinate.
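The coordinate conversion recited in claims 4 and 5 (and their device counterparts, claims 9 and 10) — taking a coordinate sensed on a virtual plane beside the device and producing a mapping coordinate inside the motion mapping region of the touch surface — can be sketched as a linear rescaling. All ranges and region geometry below are illustrative assumptions, not values from the patent:

```python
# Hedged sketch of converting a virtual sensing coordinate into a mapping
# coordinate. The virtual-plane range and mapping-region geometry are assumed
# for illustration only.

def to_mapping_coordinate(virtual_xy, virtual_range, region_origin, region_size):
    """Linearly rescale a point on the virtual sensing plane into the part of
    the touch surface covered by the motion mapping region."""
    vx, vy = virtual_xy
    (vx_min, vx_max), (vy_min, vy_max) = virtual_range
    origin_x, origin_y = region_origin
    width, height = region_size
    # Normalize each axis to [0, 1] on the virtual plane, then scale it into
    # the mapping region's extent on the touch surface.
    map_x = origin_x + (vx - vx_min) / (vx_max - vx_min) * width
    map_y = origin_y + (vy - vy_min) / (vy_max - vy_min) * height
    return (map_x, map_y)
```

Because the rescaling is monotonic, with a suitable choice of axes a virtual coordinate that lies closer to the touch surface yields a mapping coordinate closer to the contact operation region, which is the ordering property claim 5 requires.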
TW102126940A 2013-07-26 2013-07-26 Method for operating user interface and electronic device TW201504925A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
TW102126940A TW201504925A (en) 2013-07-26 2013-07-26 Method for operating user interface and electronic device


Publications (1)

Publication Number Publication Date
TW201504925A true TW201504925A (en) 2015-02-01

Family

ID=53018932


