TWI507918B - Gesture recognition method and interacting system - Google Patents
- Publication number
- TWI507918B (application TW102129836A)
- Authority
- TW
- Taiwan
- Prior art keywords
- angle
- angle threshold
- trajectory
- display
- hand
- Prior art date
Landscapes
- User Interface Of Digital Computer (AREA)
Description
The invention relates to a recognition method, and more particularly to a gesture recognition method and an interactive system.
When operating products such as computer systems, computer/video games, and information appliances, users issue commands to these products through a human-machine interface and receive the execution results the products produce through the same interface. With the advancement of technology, the modes of communication between users and such products have become increasingly diverse. In other words, the user input a human-machine interface can accept is no longer limited to commands generated by a keyboard or a mouse.
Whether the input device is a mouse, a keyboard, a touch panel, or a remote control, the design of electronic products has always emphasized convenience and functionality. In recent years, improvements in image-detection-based human-machine interface technology have given rise to non-contact gesture control interfaces, but however the technology evolves, the human-centered design philosophy remains unchanged. In other words, whether a gesture control interface is intuitive and convenient is a key factor in whether consumers accept the product.
The invention relates to a gesture recognition method and an interactive system.
According to the invention, a gesture recognition method is provided. The gesture recognition method comprises: capturing user images; analyzing the user images to find a three-dimensional hand position; analyzing a hand trajectory according to the three-dimensional hand position; determining whether the hand trajectory conforms to a polygonal trajectory continuously for a time threshold; and unlocking a display when the hand trajectory conforms to the polygonal trajectory continuously for the time threshold.
According to the invention, an interactive system is provided. The interactive system comprises a display, an image capture device, and a data processing device. The image capture device captures user images. The data processing device analyzes the user images to find a three-dimensional hand position and analyzes a hand trajectory according to the three-dimensional hand position. The data processing device determines whether the hand trajectory conforms to a polygonal trajectory continuously for a time threshold. When the hand trajectory conforms to the polygonal trajectory continuously for the time threshold, the display is unlocked.
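The claimed system can be modeled as a small data structure. The following is an illustrative Python sketch only, not the patent's implementation; the class and attribute names, and the callable interfaces standing in for the image capture device and the hand-position analysis, are all assumptions introduced here.

```python
from dataclasses import dataclass, field
from typing import Callable, List, Optional, Tuple

Position3D = Tuple[float, float, float]

@dataclass
class InteractiveSystem:
    """Sketch of the claimed system: a display (11), an image capture
    device (12), and a data processing device (13). The two callables are
    hypothetical stand-ins for the capture and analysis components."""
    capture_image: Callable[[], object]
    find_hand_position: Callable[[object], Optional[Position3D]]
    display_unlocked: bool = False
    trajectory: List[Position3D] = field(default_factory=list)

    def step(self) -> None:
        """One capture/analysis cycle: record the detected 3-D hand position."""
        pos = self.find_hand_position(self.capture_image())
        if pos is not None:
            self.trajectory.append(pos)

    def unlock(self) -> None:
        """Unlock the display once the trajectory test has been satisfied."""
        self.display_unlocked = True
```

A caller would repeatedly invoke `step()` to accumulate the hand trajectory that the trajectory test of the later steps consumes.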
For a better understanding of the above and other aspects of the invention, preferred embodiments are described in detail below with reference to the accompanying drawings:
1‧‧‧Interactive system
3‧‧‧User
11‧‧‧Display
12‧‧‧Image capture device
13‧‧‧Data processing device
21~25, 231~236‧‧‧Steps
31‧‧‧Three-dimensional hand position
111‧‧‧Graphical user interface
111a‧‧‧Mouse cursor
111b‧‧‧Graphic
112‧‧‧First graphical user interface
113‧‧‧Second graphical user interface
P1‧‧‧First position
P2‧‧‧Second position
P3‧‧‧Third position
V1‧‧‧First vector
V2‧‧‧Second vector
θ‧‧‧Included angle
FIG. 1 is a schematic diagram of the interactive system according to the first embodiment.
FIG. 2 is a flowchart of the gesture recognition method according to the first embodiment.
FIG. 3 is a partial schematic diagram of a hand trajectory.
FIG. 4 is a detailed flowchart of step 23.
FIG. 5 is a first schematic diagram of the display before and after unlocking.
FIG. 6 is a second schematic diagram of the display before and after unlocking.
FIG. 7 is a third schematic diagram of the display before and after unlocking.
FIG. 8 is a fourth schematic diagram of the display before and after unlocking.
Please refer to FIG. 1 and FIG. 2. FIG. 1 is a schematic diagram of the interactive system according to the first embodiment, and FIG. 2 is a flowchart of the gesture recognition method according to the first embodiment. The interactive system 1 comprises a display 11, an image capture device 12, and a data processing device 13; the data processing device 13 is, for example, a computer. The gesture recognition method is applicable to the interactive system 1 and comprises steps 21 to 25. In step 21, the image capture device 12 continuously captures a number of user images over a period of time. In step 22, the data processing device 13 analyzes the user images to find the three-dimensional hand position 31 of the user 3. In step 23, the data processing device 13 analyzes a hand trajectory of the user 3 according to the three-dimensional hand position 31. In step 24, the data processing device 13 determines whether the hand trajectory conforms to a polygonal trajectory continuously for a time threshold. In step 25, when the hand trajectory conforms to the polygonal trajectory continuously for the time threshold, the display 11 is unlocked. After the display 11 is unlocked, the user can further take control of the interactive system 1.
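Steps 21 to 25 can be sketched as a single processing loop. This is an illustrative Python sketch only, not the patent's implementation: the injected callables `capture_user_image`, `find_hand_position_3d`, and `matches_polygon_trajectory` are hypothetical placeholders for steps 21, 22, and 24, and the default value of the time threshold is assumed.

```python
import time

TIME_THRESHOLD = 1.5  # seconds; an assumed value for the patent's time threshold

def try_unlock(capture_user_image, find_hand_position_3d, matches_polygon_trajectory,
               time_threshold=TIME_THRESHOLD):
    """Steps 21-25: return True (unlock the display) once the hand trajectory
    matches a polygonal trajectory continuously for time_threshold seconds."""
    trajectory = []          # accumulated 3-D hand positions (steps 22-23)
    match_started_at = None  # when the trajectory first matched (step 24)
    while True:
        image = capture_user_image()           # step 21: capture a user image
        pos = find_hand_position_3d(image)     # step 22: locate the hand in 3-D
        if pos is None:                        # hand not found: start over
            trajectory, match_started_at = [], None
            continue
        trajectory.append(pos)                 # step 23: extend the trajectory
        if matches_polygon_trajectory(trajectory):   # step 24: shape check
            if match_started_at is None:
                match_started_at = time.monotonic()
            elif time.monotonic() - match_started_at >= time_threshold:
                return True                    # step 25: unlock the display
        else:
            trajectory, match_started_at = [], None  # match broken: reset
```

The reset on a broken match reflects the requirement that the polygonal trajectory hold *continuously* for the time threshold, which is what filters out brief unintentional movements.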
Because the data processing device 13 analyzes the hand trajectory of the user 3 according to the three-dimensional hand position 31 rather than a two-dimensional hand position, the interactive system 1 can avoid false triggering caused by other similar objects or unintentional movements. In addition, a polygonal trajectory is not only simple, intuitive, and easy for the user to perform, but also offers high recognition accuracy and a low false-positive rate. Moreover, a polygonal trajectory is easily distinguished from other moving objects, is not easily falsely triggered by people walking back and forth or by noise, and effectively shortens the time the user needs to activate the interactive system 1. Furthermore, the data processing device 13 uses the duration of the hand trajectory as an additional constraint, which filters out false triggering caused by the user's unintentional rapid movements.
Please refer to FIG. 1, FIG. 3, and FIG. 4. FIG. 3 is a partial schematic diagram of a hand trajectory, and FIG. 4 is a detailed flowchart of step 23. For convenience of explanation, the three-dimensional hand positions in FIG. 3 are illustrated by a first position P1, a second position P2, and a third position P3. Step 23 further comprises steps 231 to 236. In step 231, the data processing device 13 calculates a first vector V1 and a second vector V2 from the first position P1, the second position P2, and the third position P3. In step 232, the data processing device 13 calculates the included angle θ between the first vector V1 and the second vector V2.
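Steps 231 and 232 amount to forming two displacement vectors and taking their included angle via the dot product. A minimal sketch, assuming the three-dimensional positions are plain coordinate tuples (the function name is illustrative, not from the patent):

```python
import math

def vectors_and_angle(p1, p2, p3):
    """Step 231: V1 = P2 - P1 and V2 = P3 - P2.
    Step 232: the included angle of V1 and V2, in degrees (0 to 180)."""
    v1 = tuple(b - a for a, b in zip(p1, p2))
    v2 = tuple(b - a for a, b in zip(p2, p3))
    dot = sum(a * b for a, b in zip(v1, v2))
    norm1 = math.sqrt(sum(a * a for a in v1))
    norm2 = math.sqrt(sum(a * a for a in v2))
    cos_theta = max(-1.0, min(1.0, dot / (norm1 * norm2)))  # clamp rounding error
    return v1, v2, math.degrees(math.acos(cos_theta))
```

Note that the dot product alone yields an angle in [0°, 180°]; resolving the 180°-360° range used in steps 235 and 236 additionally requires the sense of rotation, e.g. from the sign of a cross product.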
In step 233, the data processing device 13 determines whether the included angle θ is greater than a first angle threshold T1 and less than a second angle threshold T2. The first angle threshold T1 and the second angle threshold T2 are greater than 0 degrees and less than 180 degrees; for example, T1 is 1 degree and T2 is 179 degrees. When T1 < θ < T2, step 234 is performed: the data processing device 13 increments a count.
Conversely, when the included angle θ is not between the first angle threshold T1 and the second angle threshold T2, step 235 is performed: the data processing device 13 determines whether θ is greater than a third angle threshold T3 and less than a fourth angle threshold T4. The third angle threshold T3 and the fourth angle threshold T4 are greater than 180 degrees and less than 360 degrees; for example, T3 is 181 degrees and T4 is 359 degrees. When T3 < θ < T4, step 236 is performed: the data processing device 13 decrements the count. Otherwise, when θ is not between T3 and T4, the detailed flow of step 23 ends.
By analogy, the data processing device 13 correspondingly increments or decrements the count according to the subsequent three-dimensional hand positions. When the count exceeds an upper limit, the data processing device 13 determines that the hand trajectory is a clockwise circling trajectory. Conversely, when the count falls below a lower limit, the data processing device 13 determines that the hand trajectory is a counterclockwise circling trajectory.
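Steps 233 to 236 and the final clockwise/counterclockwise decision can be sketched as follows. The patent does not say how an angle in the 0°-360° range is obtained, so this sketch assumes the trajectory is projected onto the camera's x-y plane and derives a signed turn angle from the 2-D cross and dot products; the threshold values repeat the examples in the text, while the count limits and the mapping of positive counts to "clockwise" (which flips with the y-axis convention) are assumptions.

```python
import math

T1, T2 = 1.0, 179.0    # first and second angle thresholds (example values from the text)
T3, T4 = 181.0, 359.0  # third and fourth angle thresholds (example values from the text)
UPPER, LOWER = 12, -12 # assumed upper and lower count limits

def signed_angle_deg(v1, v2):
    """Turn angle from v1 to v2 in the x-y projection, mapped to [0, 360)."""
    cross = v1[0] * v2[1] - v1[1] * v2[0]
    dot = v1[0] * v2[0] + v1[1] * v2[1]
    return math.degrees(math.atan2(cross, dot)) % 360.0

def classify_trajectory(positions):
    """Steps 233-236 over consecutive position triples: increment the count for
    turn angles in (T1, T2), decrement it for angles in (T3, T4), then compare
    the count with the upper and lower limits to decide the circling direction."""
    count = 0
    for p1, p2, p3 in zip(positions, positions[1:], positions[2:]):
        v1 = (p2[0] - p1[0], p2[1] - p1[1])   # first vector V1
        v2 = (p3[0] - p2[0], p3[1] - p2[1])   # second vector V2
        theta = signed_angle_deg(v1, v2)
        if T1 < theta < T2:
            count += 1     # step 234
        elif T3 < theta < T4:
            count -= 1     # step 236
    if count > UPPER:
        return "clockwise"
    if count < LOWER:
        return "counterclockwise"
    return "undecided"
```

Because straight-line motion yields turn angles near 0° or 360°, which fall outside both threshold ranges, walking people and other linear movers contribute nothing to the count, which matches the false-triggering argument made earlier.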
Please refer to FIG. 5, a first schematic diagram of the display before and after unlocking. Before the display 11 is unlocked, the graphical user interface 111 is not displayed; after the display 11 is unlocked by the gesture recognition method described above, the graphical user interface 111 is displayed. Please refer to FIG. 6, a second schematic diagram of the display before and after unlocking. Before the display 11 is unlocked, the graphical user interface 111 is not displayed; after unlocking, the graphical user interface 111 is displayed and may further include a mouse cursor 111a. Please refer to FIG. 7, a third schematic diagram of the display before and after unlocking. Before the display 11 is unlocked, the graphical user interface 111 is not displayed; after unlocking, the graphical user interface 111 is displayed and may further include a graphic 111b.
Please refer to FIG. 8, a fourth schematic diagram of the display before and after unlocking. The main difference between the second embodiment and the first embodiment is that the display 11 of the second embodiment displays a first graphical user interface 112 before unlocking and displays a second graphical user interface 113 after unlocking, the first graphical user interface 112 being different from the second graphical user interface 113.
The gesture recognition method and interactive system of the above embodiments use the three-dimensional hand position to avoid false triggering of the interactive system. In addition, recognizing the unlocking gesture by a polygonal trajectory not only improves convenience for the user but also increases recognition accuracy and reduces the false-positive rate. Furthermore, using the duration of the hand trajectory as a constraint further filters out false triggering caused by the user's unintentional rapid movements.
In summary, while the invention has been disclosed above by way of preferred embodiments, they are not intended to limit the invention. Those with ordinary knowledge in the technical field to which the invention pertains may make various changes and modifications without departing from the spirit and scope of the invention. Therefore, the scope of protection of the invention is defined by the appended claims.
21~25‧‧‧Steps
Claims (12)
Priority Applications (1)

| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| TW102129836A | 2013-08-20 | 2013-08-20 | Gesture recognition method and interacting system |
Publications (2)

| Publication Number | Publication Date |
|---|---|
| TW201508545A | 2015-03-01 |
| TWI507918B | 2015-11-11 |
Family
ID=53186237
Family Applications (1)

| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| TW102129836A | Gesture recognition method and interacting system | 2013-08-20 | 2013-08-20 |

Country Status (1)

| Country | Link |
|---|---|
| TW | TWI507918B (en) |
Cited By (1)

| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| TWI598809B | 2016-05-27 | 2017-09-11 | Hon Hai Precision Industry Co., Ltd. | Gesture control system and method |
Families Citing this family (1)

| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN104809387B | 2015-03-12 | 2017-08-29 | Shandong University | Contactless unlocking method and device based on video image gesture identification |
Citations (5)

| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US8386963B2 | 2009-05-28 | 2013-02-26 | Microsoft Corporation | Virtual inking using gesture recognition |
| TW201104513A | 2009-07-23 | 2011-02-01 | Ind Tech Res Inst | A trajectory-based control method and apparatus thereof |
| US8491135B2 | 2010-01-04 | 2013-07-23 | Microvision, Inc. | Interactive projection with gesture recognition |
| TW201322142A | 2011-11-22 | 2013-06-01 | Transcend Information Inc | Method of executing software functions using biometric detection and related electronic device |
| TW201331820A | 2011-12-01 | 2013-08-01 | Research In Motion Ltd | Electronic device and method of displaying information in response to a gesture |
2013-08-20: TW application TW102129836A granted as patent TWI507918B, active