TWI488068B - Gesture control method and apparatus - Google Patents

Gesture control method and apparatus

Info

Publication number
TWI488068B
TWI488068B
Authority
TW
Taiwan
Prior art keywords
gesture
plane
user
display
module
Prior art date
Application number
TW101109527A
Other languages
Chinese (zh)
Other versions
TW201339895A (en)
Inventor
Yan Lin Kuo
Original Assignee
Acer Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Acer Inc
Priority to TW101109527A
Publication of TW201339895A
Application granted
Publication of TWI488068B

Links

Landscapes

  • Position Input By Displaying (AREA)
  • User Interface Of Digital Computer (AREA)

Description

Gesture control method and apparatus

The present invention relates to a gesture control method and apparatus, and more particularly to a method and apparatus for performing gesture control in a three-dimensional space.

Today, consumers expect electronic devices to be light, thin, short, and small, which places considerable constraints on device size and weight. Touch screens have recently become increasingly popular. For portable devices such as smartphones and tablet computers, the touch screen serves as both the display and the input interface, eliminating the cost and the space that a conventional keyboard would occupy. For desktop and notebook computers, consumers may choose between a conventional keyboard and a touch screen for control input.

However, existing touch screens, whether capacitive or resistive, add a layer of touch glass or film in front of the liquid crystal display (LCD) panel, which increases the thickness and weight of the display screen. Consequently, when a notebook computer is open, touching the display screen with a finger tends to make the notebook wobble. Moreover, the display screen of a notebook computer sits at a fixed distance and angle from the user and cannot be moved freely like a tablet computer, so the notebook form factor is less suited to direct finger contact. In addition, directly touching the display screen with a finger leaves fingerprints on the screen.

In view of the above, the present invention provides a gesture control method and apparatus that control an object displayed on a display screen according to gesture operations performed by a user in the three-dimensional space in front of the display screen.

The invention provides a gesture control method suitable for an electronic device having a display screen. The method first detects a first gesture operation performed by the user in the three-dimensional space in front of the display screen to define an operation plane. Next, the ratio of the user's palm to the range covered by the operation plane is calculated. Then, according to this ratio, the operation plane is divided into a plurality of operation areas, and the display plane of the display screen is divided into a plurality of corresponding display areas. Finally, a second gesture operation performed by the user within the operation plane is detected, and the object displayed in the corresponding display area is controlled according to the operation area in which the second gesture operation is located.

In an embodiment of the invention, the step of controlling the object in the corresponding display area according to the operation area in which the second gesture operation is located further includes determining a gesture type of the second gesture operation and performing the control operation corresponding to that gesture type on the object.

In an embodiment of the invention, after the step of detecting the second gesture operation performed by the user within the operation plane, the method further includes displaying a gesture icon at the corresponding screen position on the display screen according to the operation position of the second gesture operation in the operation plane.

In an embodiment of the invention, the step of dividing the operation plane into the operation areas according to the ratio and dividing the display plane of the display screen into the corresponding display areas further includes displaying the object in one or more display areas according to the ratio, and defining a sensing area in each display area for sensing the second gesture operation.

In an embodiment of the invention, the step of controlling the object displayed in the corresponding display area according to the operation area in which the second gesture operation is located further includes determining, according to the ratio, whether the gesture area corresponding to the second gesture operation in the display area covers the sensing area, so as to decide whether to control the object displayed in the display area according to the second gesture operation.

In an embodiment of the invention, the step of detecting the first gesture operation performed by the user in the three-dimensional space in front of the display screen to define the operation plane includes detecting the viewing distance between the user and the display screen and the extension range of the user's hands, so as to determine the position and range of an operation plane suitable for the user to perform gesture operations.

In an embodiment of the invention, after the step of detecting the second gesture operation performed by the user within the operation plane, the method further includes detecting the distance between the second gesture operation and the operation plane and comparing it with a preset value. If the distance is greater than the preset value, a prompt message is displayed on the display screen to prompt the user to adjust the position of the second gesture operation so as to be suitable for performing gesture operations on the operation plane.

In an embodiment of the invention, the step of detecting the second gesture operation performed by the user within the operation plane further includes detecting whether the second gesture operation has moved to the lower edge of the operation plane; if so, the control mode of the electronic device is switched from the gesture control mode to a key input mode.

In an embodiment of the invention, after the step of switching the control mode of the electronic device from the gesture control mode to the key input mode, the method further includes detecting the user's key operations on a physical keyboard, or projecting a virtual keyboard and detecting the user's key operations on the virtual keyboard, to perform key input.

The invention further provides a gesture control apparatus, which includes a detection module, an operation plane definition module, a gesture operation mapping module, and a control module. The detection module detects gesture operations performed by the user in front of the display screen. The operation plane definition module defines an operation plane according to the first gesture operation detected by the detection module. The gesture operation mapping module calculates the ratio of the user's palm to the range covered by the operation plane, divides the operation plane into a plurality of operation areas according to this ratio, and divides the display plane of the display screen into a plurality of corresponding display areas. The control module determines the operation area of the operation plane in which the second gesture operation detected by the detection module is located, so as to control the object displayed in the corresponding display area.

In an embodiment of the invention, the detection module further detects the distance between the second gesture operation and the operation plane. The gesture control apparatus further includes a prompt module that compares the distance detected by the detection module with a preset value; if the distance is greater than the preset value, the prompt module displays a prompt message on the display screen to prompt the user to adjust the position of the second gesture operation so as to be suitable for performing gesture operations on the operation plane.

In an embodiment of the invention, the gesture control apparatus further includes a key input module for detecting, in the key input mode, the user's key operations on a physical keyboard or on a projected virtual keyboard, to perform the corresponding key input.

Based on the above, the gesture control method and apparatus provided by the invention control the object displayed in the corresponding display area by detecting gesture operations performed by the user in the three-dimensional space in front of the display screen. Direct finger contact with the display screen is avoided, which reduces the shaking of a notebook computer and keeps fingerprints off the display screen.

To make the above features and advantages of the invention more comprehensible, embodiments are described in detail below with reference to the accompanying drawings.

To allow the user to perform control input without directly touching the display screen of the electronic device with a finger, the invention provides a gesture control device on the electronic device that detects gesture operations performed by the user in the three-dimensional space in front of the display screen, and controls the objects displayed on the display screen according to the position and type of the detected gestures. To make the content of the invention clearer, the following embodiments are given as examples by which the invention can indeed be implemented.

FIG. 1 is a block diagram of a gesture control device according to an embodiment of the invention. Referring to FIG. 1, the gesture control device 100 of this embodiment is applicable to an electronic device having a display screen, such as a desktop computer or a notebook computer, and is not limited herein. The gesture control device 100 includes a detection module 110, an operation plane definition module 120, a gesture operation mapping module 130, and a control module 140, whose functions are described as follows.

The detection module 110 is, for example, an image sensor with a charge-coupled device (CCD) or complementary metal-oxide-semiconductor (CMOS) element, which captures images of the area in front of the display screen in order to detect the position and type of gesture operations performed in three-dimensional space by a user in front of the screen.

The operation plane definition module 120 is coupled to the detection module 110 and analyzes the images captured by the image sensor to obtain the viewing distance between the user and the display screen as well as the extension range of the user's hands, so as to determine the position and range of an operation plane suitable for the user's gesture operations. In one embodiment, the operation plane definition module 120 may further include a projection unit for projecting a virtual operation plane between the user and the display screen, allowing the user to perform gesture operations on this virtual operation plane.

The gesture operation mapping module 130 is, for example, an arithmetic unit composed of logic circuit elements. It calculates the ratio of the user's palm to the operation plane defined by the operation plane definition module 120, divides the operation plane into a plurality of operation areas according to this ratio, and divides the display plane of the display screen into a plurality of corresponding display areas.

The control module 140 is, for example, a processor or programmable controller with computing capability. Based on the gesture position detected by the detection module 110, it determines which operation area of the operation plane the user's hand is located in, and controls the object displayed in the corresponding display area accordingly.

FIG. 2 is a flowchart of a gesture control method according to an embodiment of the invention. Referring to FIG. 2, the method of this embodiment is applicable to the gesture control device 100 of the above embodiment. The detailed steps of the gesture control method are described below with reference to the modules of the gesture control device 100.

First, as described in step S210, the detection module 110 detects a first gesture operation performed by the user in the three-dimensional space in front of the display screen to define an operation plane. FIG. 3(a) and FIG. 3(b) are schematic diagrams of an application scenario of the gesture control method according to an embodiment of the invention. Referring to FIG. 3(a) and FIG. 3(b), the electronic device of this embodiment is, for example, a notebook computer 10, and the gesture control device 100 is disposed in the notebook computer 10. The detection module 110 is, for example, an image sensor disposed at position A, which captures multiple images of the user 20 performing gesture operations.

The first gesture operation in this step consists of the user simply waving both hands up, down, left, and right to define an appropriate operation plane. As shown in FIG. 3(a), the user 20 swings a hand up and down; as shown in FIG. 3(b), the user 20 stretches both hands left and right. The operation plane definition module 120 analyzes the images captured by the detection module 110 to obtain the viewing distance between the user 20 and the display screen (the viewing distance d1 in this embodiment is about 600 mm) and the extension range of the user 20's hands. In this embodiment, the operation plane definition module 120 accordingly defines the width d2 of the operation plane 30 as 470 mm, the length d3 of the operation plane 30 as 770 mm, and the distance d4 between the operation plane 30 and the display screen as 400 mm. The dimensions of the operation plane depend on the viewing distance and the extension range of the hands; they may be adjusted according to actual conditions and are not limited herein.
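
For illustration only, the following Python sketch reproduces the example dimensions above by scaling with the viewing distance and hand span. The proportional scaling rules and the function names are assumptions inferred from the quoted numbers, not part of the patent disclosure.

```python
from dataclasses import dataclass

@dataclass
class OperationPlane:
    width_mm: float    # d2: vertical extent of the plane
    length_mm: float   # d3: horizontal extent of the plane
    distance_mm: float # d4: distance from the display screen

def define_operation_plane(viewing_distance_mm: float,
                           hand_span_mm: float) -> OperationPlane:
    """Derive an operation plane from the viewing distance and the span of the
    user's outstretched hands; scaled so the embodiment's example values
    (600 mm viewing distance, ~770 mm span) reproduce the quoted dimensions."""
    length = hand_span_mm                               # plane spans the reachable width
    width = round(length * 470 / 770)                   # keep the example's 770:470 aspect
    distance = round(viewing_distance_mm * 400 / 600)   # plane sits between user and screen
    return OperationPlane(width, length, distance)

print(define_operation_plane(600, 770))
# OperationPlane(width_mm=470, length_mm=770, distance_mm=400)
```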

Next, in step S220, the gesture operation mapping module 130 calculates the ratio of the user's palm to the range covered by the operation plane. FIG. 4 is a schematic diagram of the proportional division of the operation plane and the display plane according to an embodiment of the invention. Referring to FIG. 4, assume the length d5 of the palm of the user 20 is about 100 mm and the width d6 of the palm is about 170 mm. The ratio of the palm length d5 to the length d3 of the operation plane 30 is therefore about 1:8, and the ratio of the palm width d6 to the width d2 of the operation plane 30 is about 1:3.

After the ratio is obtained, step S230 follows: the gesture operation mapping module 130 divides the operation plane 30 into a plurality of operation areas according to this ratio, and divides the display plane of the display screen into corresponding display areas. As shown in FIG. 4, the operation plane 30 is divided into 24 operation areas 30a~30x, and the display plane 102 of the notebook computer 10 is likewise divided into 24 display areas 102a~102x according to the same ratio.
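
As an illustration of steps S220 and S230, the sketch below computes the palm-to-plane ratio and cuts both planes into the resulting 8 x 3 grid of 24 areas. Rounding the ratio to whole cells and the 256 x 144 mm panel size are taken from the embodiment's numbers and are assumptions rather than a prescribed implementation.

```python
def palm_to_plane_ratio(palm_len_mm, palm_wid_mm, plane_len_mm, plane_wid_mm):
    """Ratio of palm size to operation plane size, rounded to whole cells
    (step S220). With the embodiment's numbers this gives 8 columns x 3 rows."""
    cols = round(plane_len_mm / palm_len_mm)   # 770 / 100 -> 8
    rows = round(plane_wid_mm / palm_wid_mm)   # 470 / 170 -> 3
    return cols, rows

def split_into_grid(width, height, cols, rows):
    """Cut a plane (operation plane or display plane) into cols x rows
    rectangular areas (step S230); returns one (x, y, w, h) tuple per area."""
    cw, ch = width / cols, height / rows
    return [(c * cw, r * ch, cw, ch) for r in range(rows) for c in range(cols)]

cols, rows = palm_to_plane_ratio(100, 170, 770, 470)
operation_areas = split_into_grid(770, 470, cols, rows)   # 24 operation areas
display_areas = split_into_grid(256, 144, cols, rows)     # 24 display areas (11.6" panel)
print(cols, rows, len(operation_areas), display_areas[0])
# 8 3 24 (0.0, 0.0, 32.0, 48.0)
```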

In addition, after the display areas 102a~102x of the display plane 102 have been divided, the gesture operation mapping module 130 displays one or more objects in the display areas 102a~102x according to the above ratio. The objects are, for example, application thumbnails or images, but are not limited thereto. FIG. 5 is an enlarged schematic diagram of the display plane 102 of FIG. 4. In this embodiment, the display plane 102 displays, for example, eight objects I1~I8; if each of the display areas 102a~102x displays one object, up to 24 objects can be displayed.

Finally, in step S240, the detection module 110 detects a second gesture operation performed by the user within the operation plane, and the control module 140 determines the operation area in which the second gesture operation is located, thereby controlling the object displayed in the corresponding display area. In detail, the gesture operation mapping module 130 first defines, according to the size of each display area, a sensing area within that display area for sensing the second gesture operation. The control module 140 then determines whether the gesture area corresponding to the second gesture operation in the display area covers the sensing area, so as to decide whether to control the object displayed in the display area according to the second gesture operation.

Taking FIG. 5 as an example, assume the display plane 102 shown in FIG. 5 measures 11.6 inches; the length d7 of the display plane 102 is then about 256 mm and the width d8 about 144 mm. Each display area 102a~102x is therefore about 32*48 mm, the sensing area Z is 24*36 mm, and each object is 16*24 mm. Note that a gap must be kept between the objects on the display plane 102, so that the gesture area does not cover two objects at once and cause the device to act incorrectly. For example, in this embodiment the gap g1 between object I2 and object I3 is 8 mm, and the gap g2 between object I3 and object I4 is 12 mm.
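
For illustration, the snippet below sizes the sensing area and the object within one display area. The 3/4 and 1/2 scale factors are inferred from the single example above (32x48 mm area, 24x36 mm sensing area, 16x24 mm object) and are not stated as a general rule in the patent.

```python
def layout_display_area(area_w_mm, area_h_mm):
    """Size the sensing area and the object inside one display area.
    The 3/4 and 1/2 factors reproduce the embodiment's example values."""
    sensing = (area_w_mm * 3 / 4, area_h_mm * 3 / 4)
    obj = (area_w_mm / 2, area_h_mm / 2)
    return sensing, obj

print(layout_display_area(32, 48))   # ((24.0, 36.0), (16.0, 24.0))
```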

In this embodiment, the gesture area GA covers more than half of the sensing area Z, so the control module 140 may, for example, highlight the object I7 corresponding to this position, prompting the user that the second gesture operation lies within the sensing range of object I7. Conversely, if the gesture area GA covers only part of the sensing area Z but less than half of it, the control module 140 determines that the position of the second gesture operation is not within the sensing range of object I7, and object I7 is not highlighted.
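
A minimal sketch of this coverage test is shown below. The one-half threshold follows the description above; the rectangle coordinates and function names are illustrative assumptions.

```python
def rect_overlap(a, b):
    """Overlap area of two axis-aligned rectangles given as (x, y, w, h)."""
    ax, ay, aw, ah = a
    bx, by, bw, bh = b
    w = max(0.0, min(ax + aw, bx + bw) - max(ax, bx))
    h = max(0.0, min(ay + ah, by + bh) - max(ay, by))
    return w * h

def gesture_hits_object(gesture_area, sensing_area, threshold=0.5):
    """True if the gesture area covers at least `threshold` of the sensing
    area, i.e. the object should be highlighted and become controllable."""
    sensing_size = sensing_area[2] * sensing_area[3]
    return rect_overlap(gesture_area, sensing_area) / sensing_size >= threshold

sensing = (4, 6, 24, 36)    # sensing area Z centred in a 32x48 mm display area
gesture = (0, 0, 24, 36)    # gesture area GA mapped onto the same display area
print(gesture_hits_object(gesture, sensing))   # True: covers ~69% of Z
```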

After determining that the position of the second gesture operation lies within the sensing range of the object, the control module 140 further determines the gesture type of the second gesture operation and performs the control operation corresponding to that gesture type on the object. Control operations include enlarging, shrinking, selecting, moving, and so on. FIG. 6 is a schematic diagram of gesture types according to an embodiment of the invention. Referring to FIG. 6, gestures 610~670 represent different control operations; for example, gesture 640 represents a "select object" control operation and gesture 670 represents a "confirm execution" control operation. The correspondence between gesture types and control operations can be set by the user in advance and is not limited herein.
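
One way to realize such a user-configurable correspondence is a simple lookup table from recognized gesture types to control operations, sketched below for illustration. The gesture labels and handler names are assumptions, not taken from the patent.

```python
# Illustrative mapping from gesture types to control operations.
def zoom_in(obj): obj["scale"] *= 1.25
def zoom_out(obj): obj["scale"] *= 0.8
def select(obj): obj["selected"] = True
def confirm(obj): obj["activated"] = True

gesture_actions = {
    "gesture_640": select,    # FIG. 6: "select object"
    "gesture_670": confirm,   # FIG. 6: "confirm execution"
    "open_palm": zoom_in,     # assumed additional mappings
    "fist": zoom_out,
}

def apply_gesture(gesture_type, obj):
    action = gesture_actions.get(gesture_type)
    if action is not None:
        action(obj)
    return obj

print(apply_gesture("gesture_640", {"scale": 1.0}))
# {'scale': 1.0, 'selected': True}
```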

It is worth mentioning that, in the above embodiment, the objects on the display plane 102 are thumbnails of several different applications; however, the display plane 102 may also display a single object, such as one image. If the display plane 102 displays only a single image, the control module 140 operates on the portion of the image located in the display area corresponding to the operation area of the second gesture operation, according to the gesture type. For example, if the control module 140 determines that the gesture type of the second gesture operation is a zoom-in gesture, the control module 140 enlarges the entire image centered on the image portion in that display area.
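
A minimal sketch of this single-image case follows: the whole image is scaled about the center of the display area that the gesture landed in. The zoom factor and the rectangle convention are illustrative assumptions.

```python
def zoom_about(image_rect, area_rect, factor=1.5):
    """image_rect and area_rect are (x, y, w, h) in screen coordinates; returns
    the enlarged image rectangle with the hit display area's center kept fixed."""
    cx = area_rect[0] + area_rect[2] / 2
    cy = area_rect[1] + area_rect[3] / 2
    x, y, w, h = image_rect
    return (cx + (x - cx) * factor, cy + (y - cy) * factor, w * factor, h * factor)

# Image filling a 256x144 mm display plane, zoomed about a 32x48 mm display area.
print(zoom_about((0, 0, 256, 144), (96, 48, 32, 48)))
# (-56.0, -36.0, 384.0, 216.0)
```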

In this way, the gesture control method and apparatus of the invention control the object displayed in the corresponding display area by detecting gesture operations performed by the user in the three-dimensional space in front of the display screen, avoiding direct finger contact with the display screen and thereby reducing the shaking of a notebook computer. Besides the gesture control mode, users still frequently need to perform key input with a keyboard; for this situation the invention also provides a corresponding adjustment, described in detail in the following embodiment.

FIG. 7 is a block diagram of a gesture control device according to another embodiment of the invention. The gesture control device 700 of this embodiment is likewise applicable to an electronic device having a display screen, such as a desktop computer or a notebook computer, and is not limited herein.

In this embodiment, in addition to the detection module 110, the operation plane definition module 120, the gesture operation mapping module 130, and the control module 140, the gesture control device 700 further includes a prompt module 750 and a key input module 760. The prompt module 750 can display a prompt message on the display screen (not shown). The key input module 760 detects the user's key operations and performs the corresponding key input when the control module 140 switches the control mode from the gesture control mode to the key input mode.

FIG. 8 is a flowchart of a gesture control method according to another embodiment of the invention. The operation of the gesture control device 700 is described below with reference to FIG. 8. Please refer to FIG. 7 and FIG. 8 together.

First, the detection module 110 detects a first gesture operation performed by the user in the three-dimensional space in front of the display screen to define an operation plane (step S810). Next, the gesture operation mapping module 130 calculates the ratio of the user's palm to the range covered by the operation plane (step S820). After the ratio is obtained, the gesture operation mapping module 130 divides the operation plane into a plurality of operation areas according to this ratio, and divides the display plane of the display screen into corresponding display areas (step S830). The details of steps S810~S830 are the same as or similar to steps S210~S230 of the preceding embodiment and are not repeated here.

Next, the detection module 110 detects the distance between the second gesture operation and the operation plane. The prompt module 750 compares the distance detected by the detection module 110 with a preset value (step S840). If the distance is greater than the preset value, the prompt module 750 displays a prompt message on the display screen to prompt the user to adjust the position of the second gesture operation so as to be suitable for gesture operation on the operation plane (step S850), and the flow returns to step S840, where the detection module 110 continues to detect the distance between the second gesture operation and the operation plane. The preset value is smaller than the distance between the operation plane and the display screen, and can be adjusted and set by those of ordinary skill in the art according to actual conditions.
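
A minimal sketch of the check in steps S840/S850 is shown below, assuming a helper that measures how far the hand is from the operation plane. The 80 mm preset and the function names are illustrative assumptions; the patent only requires the preset to be smaller than the plane-to-screen distance.

```python
PRESET_MM = 80  # assumed preset; must be smaller than the 400 mm plane-to-screen distance

def check_hand_near_plane(hand_to_plane_mm: float, preset_mm: float = PRESET_MM):
    """Return a prompt string when the hand is too far from the operation plane
    (step S850), or None when the gesture can be processed normally."""
    if hand_to_plane_mm > preset_mm:
        return "Please move your hand closer to the operation plane."
    return None

print(check_hand_near_plane(120))  # prompt shown, flow returns to S840
print(check_hand_near_plane(30))   # None: proceed to step S860
```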

If the distance between the second gesture operation and the operation plane is not greater than the preset value, the control module 140 further determines, from the images detected by the detection module 110, whether the second gesture operation has moved to the lower edge of the operation plane (step S860). If not, the control module 140 determines the operation position of the second gesture operation in the operation plane and displays a gesture icon at the corresponding screen position on the display screen (step S870). FIG. 9 is a schematic diagram of a gesture icon according to another embodiment of the invention. Referring to FIG. 9, in one embodiment, if the user's second gesture operation is located at a first position GP1 of the operation plane 30, the control module 140 displays a corresponding gesture icon at a first display position GP1' of the display plane 102 according to the corresponding ratio. Likewise, if the user's second gesture operation is located at a second position GP2 of the operation plane 30, the control module 140 displays a corresponding gesture icon at a second display position GP2' of the display plane 102 according to the corresponding ratio. In another embodiment, the gesture icon displayed on the display plane 102 at the position corresponding to the second gesture operation may instead be a fixed mouse cursor icon or another pattern, which is not limited herein.
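
For illustration, the proportional mapping from a hand position on the operation plane to the screen position of the gesture icon (step S870) can be sketched as below. The 1366x768 screen resolution is an assumed example value.

```python
def plane_to_screen(pos_mm, plane_size_mm=(770, 470), screen_px=(1366, 768)):
    """Map a hand position on the operation plane to the screen position where
    the gesture icon is drawn, using the same proportional correspondence as
    the area split (e.g. GP1 -> GP1', GP2 -> GP2')."""
    x_mm, y_mm = pos_mm
    px = x_mm / plane_size_mm[0] * screen_px[0]
    py = y_mm / plane_size_mm[1] * screen_px[1]
    return round(px), round(py)

print(plane_to_screen((385, 235)))   # centre of the plane -> centre of the screen
# (683, 384)
```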

After the gesture icon is displayed, the control module 140 may further determine the gesture type of the second gesture operation and perform the control operation corresponding to that gesture type.

Returning to step S860, if the control module 140 determines that the second gesture operation has indeed moved to the lower edge of the operation plane, the control module 140 switches the control mode of the electronic device from the gesture control mode to the key input mode (step S880). In one embodiment, the key input module 760 detects, in the key input mode, the user's key operations on the physical keyboard of the electronic device. In another embodiment, the detection module 110 further projects a virtual keyboard, and the key input module 760 detects the user's key operations on the projected virtual keyboard to perform the corresponding key input.

In summary, the invention controls the object displayed in the corresponding display area by detecting gesture operations performed by the user in the three-dimensional space in front of the display screen, avoiding direct finger contact with the display screen, which reduces the shaking of a notebook computer and also keeps fingerprints off the display screen. In addition, the invention provides a method for switching between the gesture control mode and the key input mode, allowing the user to select the appropriate control mode according to the actual operating situation.

Although the invention has been disclosed above by way of embodiments, they are not intended to limit the invention. Anyone of ordinary skill in the art may make modifications and refinements without departing from the spirit and scope of the invention; the scope of protection of the invention is therefore defined by the appended claims.

10...Notebook computer
100, 700...Gesture control device
110...Detection module
120...Operation plane definition module
130...Gesture operation mapping module
140...Control module
102a~102x...Display areas
20...User
30...Operation plane
30a~30x...Operation areas
610~670...Gestures
750...Prompt module
760...Key input module
A, GP1, GP2...Positions
GP1', GP2'...Display positions
GA...Gesture area
I1~I8...Objects
Z...Sensing area
d1~d8...Lengths
g1, g2...Gaps
S210~S240...Steps of the gesture control method of an embodiment
S810~S880...Steps of the gesture control method of another embodiment

FIG. 1 is a block diagram of a gesture control device according to an embodiment of the invention.

FIG. 2 is a flowchart of a gesture control method according to an embodiment of the invention.

FIG. 3(a) and FIG. 3(b) are schematic diagrams of an application scenario of the gesture control method according to an embodiment of the invention.

FIG. 4 is a schematic diagram of the proportional division of the operation plane and the display plane according to an embodiment of the invention.

FIG. 5 is an enlarged schematic diagram of the display plane 102 of FIG. 4.

FIG. 6 is a schematic diagram of gesture types according to an embodiment of the invention.

FIG. 7 is a block diagram of a gesture control device according to another embodiment of the invention.

FIG. 8 is a flowchart of a gesture control method according to another embodiment of the invention.

FIG. 9 is a schematic diagram of a gesture icon according to another embodiment of the invention.

S210~S240...Steps of the gesture control method of an embodiment

Claims (18)

1. A gesture control method, adapted for an electronic device having a display screen, the method comprising the following steps: detecting a first gesture operation performed by a user in a three-dimensional space in front of the display screen to define an operation plane; calculating a ratio of a palm of the user to a range covered by the operation plane; dividing the operation plane into a plurality of operation areas according to the ratio, and dividing a display plane of the display screen into a plurality of corresponding display areas; and detecting a second gesture operation performed by the user within the operation plane, and controlling an object displayed in the corresponding display area according to the operation area in which the second gesture operation is located, wherein the step of dividing the operation plane into the operation areas according to the ratio and dividing the display plane of the display screen into the corresponding display areas further comprises: displaying the object in one or more of the display areas according to the ratio.

2. The gesture control method of claim 1, wherein the step of controlling the object in the corresponding display area according to the operation area in which the second gesture operation is located further comprises: determining a gesture type of the second gesture operation; and performing a control operation corresponding to the gesture type on the object.

3. The gesture control method of claim 1, wherein after the step of detecting the second gesture operation performed by the user within the operation plane, the method further comprises: displaying a gesture icon at a corresponding screen position on the display screen according to an operation position of the second gesture operation in the operation plane.

4. The gesture control method of claim 1, wherein the step of dividing the operation plane into the operation areas according to the ratio and dividing the display plane of the display screen into the corresponding display areas further comprises: defining a sensing area in each of the display areas for sensing the second gesture operation.

5. The gesture control method of claim 4, wherein the step of controlling the object displayed in the corresponding display area according to the operation area in which the second gesture operation is located further comprises: determining, according to the ratio, whether a gesture area corresponding to the second gesture operation in the display area covers the sensing area, so as to decide whether to control the object displayed in the display area according to the second gesture operation.

6. The gesture control method of claim 1, wherein the step of detecting the first gesture operation performed by the user in the three-dimensional space in front of the display screen to define the operation plane comprises: detecting a viewing distance between the user and the display screen and an extension range of both hands of the user, so as to determine a position and a range of the operation plane suitable for the user to perform gesture operations.

7. The gesture control method of claim 1, wherein the step of detecting the second gesture operation performed by the user within the operation plane further comprises: detecting a distance between the second gesture operation and the operation plane, and comparing the distance with a preset value; and if the distance is greater than the preset value, displaying a prompt message on the display screen to prompt the user to adjust the position of the second gesture operation so as to be suitable for performing gesture operations on the operation plane.

8. The gesture control method of claim 1, wherein the step of detecting the second gesture operation performed by the user within the operation plane further comprises: detecting whether the second gesture operation moves to a lower edge of the operation plane; and if so, switching a control mode of the electronic device from a gesture control mode to a key input mode.

9. The gesture control method of claim 8, wherein after the step of switching the control mode of the electronic device from the gesture control mode to the key input mode, the method further comprises: detecting a key operation of the user on a physical keyboard, or projecting a virtual keyboard and detecting key operations of the user on the virtual keyboard, to perform a key input.

10. A gesture control apparatus, comprising: a detection module, detecting a gesture operation performed by a user in a three-dimensional space in front of a display screen; an operation plane definition module, defining an operation plane according to a first gesture operation detected by the detection module; a gesture operation mapping module, calculating a ratio of a palm of the user to a range covered by the operation plane, dividing the operation plane into a plurality of operation areas according to the ratio, and dividing a display plane of the display screen into a plurality of corresponding display areas; and a control module, determining the operation area of the operation plane in which a second gesture operation detected by the detection module is located, so as to control an object displayed in the corresponding display area, wherein the gesture operation mapping module displays the object in one or more of the display areas according to the ratio.

11. The gesture control apparatus of claim 10, wherein the control module further determines a gesture type of the second gesture operation and performs a control operation corresponding to the gesture type on the object.

12. The gesture control apparatus of claim 10, wherein the gesture operation mapping module further displays a gesture icon at a corresponding screen position on the display screen according to an operation position of the second gesture operation in the operation plane.

13. The gesture control apparatus of claim 10, wherein the gesture operation mapping module defines a sensing area in each of the display areas for sensing the second gesture operation.

14. The gesture control apparatus of claim 13, wherein the control module determines, according to the ratio, whether a gesture area corresponding to the second gesture operation in the display area covers the sensing area, so as to decide whether to control the object displayed in the display area according to the second gesture operation.

15. The gesture control apparatus of claim 10, wherein the detection module detects a viewing distance between the user and the display screen and an extension range of both hands of the user, and the operation plane definition module accordingly calculates a position and a range of the operation plane suitable for the user to perform gesture operations.

16. The gesture control apparatus of claim 10, wherein the detection module further detects a distance between the second gesture operation and the operation plane, and the gesture control apparatus further comprises: a prompt module, comparing the distance detected by the detection module with a preset value, and if the distance is greater than the preset value, displaying a prompt message on the display screen to prompt the user to adjust the position of the second gesture operation so as to be suitable for performing gesture operations on the operation plane.

17. The gesture control apparatus of claim 10, wherein the control module further determines whether the second gesture operation moves to a lower edge of the operation plane, and if so, switches a control mode from a gesture control mode to a key input mode.

18. The gesture control apparatus of claim 17, further comprising: a key input module, detecting, in the key input mode, a key operation of the user on a physical keyboard, or detecting key operations of the user on a projected virtual keyboard, to perform a corresponding key input.
TW101109527A 2012-03-20 2012-03-20 Gesture control method and apparatus TWI488068B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
TW101109527A TWI488068B (en) 2012-03-20 2012-03-20 Gesture control method and apparatus

Publications (2)

Publication Number Publication Date
TW201339895A TW201339895A (en) 2013-10-01
TWI488068B true TWI488068B (en) 2015-06-11

Family

ID=49770928

Family Applications (1)

Application Number Title Priority Date Filing Date
TW101109527A TWI488068B (en) 2012-03-20 2012-03-20 Gesture control method and apparatus

Country Status (1)

Country Link
TW (1) TWI488068B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TW201715472A (en) 2015-10-26 2017-05-01 原相科技股份有限公司 Image segmentation determining method, gesture determining method, image sensing system and gesture determining system

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TW561423B (en) * 2000-07-24 2003-11-11 Jestertek Inc Video-based image control system
US20080170748A1 (en) * 2007-01-12 2008-07-17 Albertson Jacob C Controlling a document based on user behavioral signals detected from a 3d captured image stream
TW200945174A (en) * 2008-04-14 2009-11-01 Pointgrab Ltd Vision based pointing device emulation
TW201019239A (en) * 2008-10-30 2010-05-16 Nokia Corp Method, apparatus and computer program product for providing adaptive gesture analysis
US20100150399A1 (en) * 2008-12-12 2010-06-17 Miroslav Svajda Apparatus and method for optical gesture recognition
TW201129918A (en) * 2009-10-13 2011-09-01 Pointgrab Ltd Computer vision gesture based control of a device

Also Published As

Publication number Publication date
TW201339895A (en) 2013-10-01

Similar Documents

Publication Publication Date Title
US8570283B2 (en) Information processing apparatus, information processing method, and program
TWI509497B (en) Method and system for operating portable devices
TWI608407B (en) Touch device and control method thereof
TWI455011B (en) Touch display device and method for conditionally varying display area
TWI471756B (en) Virtual touch method
RU2541852C2 (en) Device and method of controlling user interface based on movements
US8976140B2 (en) Touch input processor, information processor, and touch input control method
US20120068946A1 (en) Touch display device and control method thereof
TWI658396B (en) Interface control method and electronic device using the same
US20140267029A1 (en) Method and system of enabling interaction between a user and an electronic device
US9727147B2 (en) Unlocking method and electronic device
US20150149954A1 (en) Method for operating user interface and electronic device thereof
JP5713180B2 (en) Touch panel device that operates as if the detection area is smaller than the display area of the display.
TW201432557A (en) Touch screen with unintended input prevention
US9235293B2 (en) Optical touch device and touch sensing method
JP2009283013A (en) Information processing device and display control method
CN103365401B (en) Gestural control method and device
US9389704B2 (en) Input device and method of switching input mode thereof
US9727233B2 (en) Touch device and control method and method for determining unlocking thereof
WO2016029422A1 (en) Touchscreen gestures
US20160328077A1 (en) Touch sensor
TWI488068B (en) Gesture control method and apparatus
TW201504885A (en) Electronic device and human-computer interaction method
KR101265296B1 (en) Apparatus and Method for Recognizing User Input using Camera
TW201349046A (en) Touch sensing input system