TWI426420B - Hand track identification system and method thereof - Google Patents
Info
- Publication number
- TWI426420B
- Authority
- TW
- Taiwan
- Prior art keywords
- trajectory
- hand
- fingertip
- stroke
- pixel
- Prior art date
Links
Landscapes
- User Interface Of Digital Computer (AREA)
- Image Analysis (AREA)
Description
The present invention relates to a hand trajectory recognition system and method thereof, in particular one that provides a hand-based video interface, uses chain codes as trajectory features, and applies a confusion set for correction.
Conventional handwriting input technologies all require an additional input device, such as a handwriting tablet or a touch panel. Such devices are relatively expensive, which raises the cost of related products. In addition, most of these technologies require a dedicated stylus as the input tool; such styluses are typically thin, light, and short, are easily lost, and cause considerable inconvenience in operation.
Many touch technologies now allow writing with a finger instead of a stylus; however, these touch panels still suffer from problems such as excessive thickness, easily scratched contact surfaces, and poor light transmittance. Although touch panels have gradually spread to many devices, such as smartphones, automatic teller machines, and some touch-enabled personal computers, they are impractical for the goal of computerized home appliances. First, touch panels are relatively costly and cannot be deployed directly in the market for large panels such as televisions. Second, a touch interface on a large panel is awkward to use, because the user must move back and forth between the seat and the panel.
Therefore, there is a need for a hand trajectory recognition system and method that provides remote operation similar to a touch interface, so as to overcome the shortcomings of the prior art described above.
The main object of the present invention is to provide a hand trajectory recognition system and method comprising a video camera module, a moving object detection module, a finger detection module, a feature extraction module, and a trajectory recognition module. The video camera module captures images of hand motion; the moving object detection module combines background reconstruction with foreground color analysis to find the moving hand; the finger detection module locates the fingertip in the segmented hand image and records the trajectory as the fingertip moves; the feature extraction module uses chain codes as trajectory features; and the trajectory recognition module classifies the chain-code features with a radial basis function neural network and then applies a confusion set for correction, thereby recognizing the symbol or character represented by the finger-written trajectory.
The present invention therefore uses video signal detection to implement handwriting input, which can serve as the basis for computer actions such as computer control or text input, improving the convenience of human-machine communication. It can be applied to computerized home appliances, real-time computer-aided teaching, office meetings, data entry, article writing, or drawing. In particular, patients with speech impairments can converse and communicate directly by writing with a finger, without any external input device.
The embodiments of the present invention are described in more detail below with reference to the drawings and reference numerals, so that those skilled in the art can practice the invention after studying this specification.
Referring to the first figure, a schematic diagram of the hand trajectory recognition system of the present invention is shown. As shown in the first figure, the hand trajectory recognition system of the present invention comprises a video camera module 10, a moving object detection module 20, a finger detection module 30, a feature extraction module 40, and a trajectory recognition module 50, which together provide a hand-based video interface for trajectory recognition.
The video camera module 10 captures scenes to produce a plurality of consecutive frames of motion image data T1. The video camera module 10 may comprise a charge-coupled device (CCD) or complementary metal-oxide-semiconductor (CMOS) camera having a plurality of pixels. Each frame of motion image data T1 comprises a plurality of pixel values, one for each pixel.
The moving object detection module 20 receives the motion image data T1 to build a background image while detecting the moving object T2 in the data. As the scene moves and time passes, the pixel values change accordingly, so the module first accumulates the pixel values of the motion image data T1 into a histogram, then applies kernel density estimation (KDE) to each pixel to approximate its pixel-value distribution. Finally, for each pixel of the next frame of motion image data T1, the estimated distribution determines whether the pixel belongs to the stationary background or to the moving object T2: pixels with high probability belong to the background, and pixels with low probability belong to the desired moving object T2.
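The per-pixel KDE background model described above can be sketched as follows. The Gaussian kernel, the bandwidth, and the probability threshold are illustrative assumptions; the patent specifies only that KDE over a pixel-value histogram separates high-probability background pixels from low-probability moving pixels.

```python
import numpy as np

def kde_probability(history, value, bandwidth=10.0):
    """Estimate the probability density of `value` for one pixel from
    its recent history, using a Gaussian kernel (KDE)."""
    diff = (history - value) / bandwidth
    kernels = np.exp(-0.5 * diff ** 2) / (bandwidth * np.sqrt(2 * np.pi))
    return kernels.mean()

def classify_pixels(history, frame, threshold=1e-3):
    """Label each pixel of the next frame: True = background (high
    probability under the model), False = moving object."""
    h, w = frame.shape
    background = np.empty((h, w), dtype=bool)
    for y in range(h):
        for x in range(w):
            p = kde_probability(history[:, y, x], frame[y, x])
            background[y, x] = p > threshold
    return background

# Toy example: a static 4x4 scene in which one pixel suddenly changes.
rng = np.random.default_rng(0)
history = 100 + rng.normal(0, 2, size=(20, 4, 4))  # 20 past frames
frame = np.full((4, 4), 100.0)
frame[2, 2] = 200.0  # a moving object appears at this pixel
mask = classify_pixels(history, frame)
print(mask[0, 0], mask[2, 2])  # background pixel vs. moving pixel
```

A production implementation would vectorize the double loop and update the history buffer frame by frame; the loop form is kept here to mirror the per-pixel description in the text.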
The finger detection module 30 receives the moving object T2 and first locates the fingertip, i.e. the tip of the finger. Color analysis is used to filter out non-hand moving objects in the moving object T2 so as to find the moving hand 31 and segment its image, as shown in the second figure. The contour of the hand image is then framed by a bounding box 32, and at each of the intersections 33, 34, 35 where the hand 31 overlaps the bounding box 32, the angle between the contour segments before and after the intersection is computed. The intersection 33 with the smallest angle is the fingertip position, and the successive fingertip positions are connected to form the fingertip trajectory T3.
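The minimum-angle fingertip test can be sketched as below. The contour representation, the one-step neighbor spacing, and the toy coordinates are illustrative assumptions; the patent only states that the fingertip is the hand/bounding-box intersection with the smallest contour angle.

```python
import math

def angle_at(contour, i, step=1):
    """Interior angle at contour[i], formed with the points `step`
    positions before and after it along the closed contour."""
    x0, y0 = contour[(i - step) % len(contour)]
    x1, y1 = contour[i]
    x2, y2 = contour[(i + step) % len(contour)]
    v1 = (x0 - x1, y0 - y1)
    v2 = (x2 - x1, y2 - y1)
    dot = v1[0] * v2[0] + v1[1] * v2[1]
    cos = dot / (math.hypot(*v1) * math.hypot(*v2))
    return math.acos(max(-1.0, min(1.0, cos)))

def fingertip(contour, box):
    """Among contour points lying on the bounding box
    (xmin, ymin, xmax, ymax), return the index of the point with the
    smallest contour angle, i.e. the sharpest protrusion."""
    xmin, ymin, xmax, ymax = box
    on_box = [i for i, (x, y) in enumerate(contour)
              if x in (xmin, xmax) or y in (ymin, ymax)]
    return min(on_box, key=lambda i: angle_at(contour, i))

# Toy contour: a sharp tip at (5, 0) on the top edge of the box, plus
# three blunter points touching the other edges.
contour = [(5, 0), (6, 3), (10, 5), (7, 7), (5, 10), (3, 7), (0, 5), (4, 3)]
tip = fingertip(contour, (0, 0, 10, 10))
print(contour[tip])  # the sharpest box intersection
```

On real images the contour would come from the segmented hand mask (e.g. a border-following pass), and a larger `step` would smooth out pixel-level noise in the angle estimate.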
The feature extraction module 40 receives the fingertip trajectory T3, which represents the written strokes, performs stroke extraction and writing-feature extraction, and produces the trajectory feature T4. Stroke extraction exploits writing characteristics: the fingertip slows down when a stroke turns, when the pen is lifted, or when the pen is put down. Pause points are therefore extracted from the fingertip trajectory T3 by estimating the fingertip speed, and these pause points are used to distinguish the entering strokes, leaving strokes, and writing strokes in the fingertip trajectory T3; the entering and leaving strokes are deleted, and only the writing strokes are kept for recognition. For recognition, each writing stroke is first encoded as an N-direction chain code according to its writing direction, where N is a positive integer; a histogram then accumulates the probabilities of the stroke's movement directions, and kernel density estimation is applied to estimate the probability distribution. Because chain codes are cyclic, the function estimated by KDE can extend into regions below 0 or beyond N, so out-of-range probability must be merged into the opposite end of the range; for example, the probability in the region of direction N+1 must be added to the region of direction 0. This is called cyclic chain-code probability distribution estimation, and the estimated probability function serves as the trajectory feature T4 for recognition.
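The chain-code encoding and the cyclic folding of out-of-range probability can be sketched as follows. The 8-direction code, the Gaussian kernel, and the bandwidth are illustrative assumptions; only the N-direction chain code and the wrap-around merging are taken from the text.

```python
import math

N = 8  # 8-direction chain code (an illustrative choice of N)

def chain_code(points):
    """Quantize the direction of each step between successive
    trajectory points into one of N sectors."""
    codes = []
    for (x0, y0), (x1, y1) in zip(points, points[1:]):
        theta = math.atan2(y1 - y0, x1 - x0) % (2 * math.pi)
        codes.append(int(round(theta / (2 * math.pi / N))) % N)
    return codes

def cyclic_kde(codes, bandwidth=0.7):
    """Gaussian KDE over chain-code directions. Kernel mass that would
    fall below 0 or beyond N is folded back onto the opposite end by
    summing over wrapped copies (k = -1, 0, +1) of each grid point."""
    density = [0.0] * N
    for c in codes:
        for d in range(N):
            for k in (-1, 0, 1):
                u = (d + k * N - c) / bandwidth
                density[d] += math.exp(-0.5 * u * u)
    total = sum(density)
    return [v / total for v in density]

# A stroke moving mostly rightward, with one diagonal step.
stroke = [(0, 0), (1, 0), (2, 0), (3, 1), (4, 1)]
codes = chain_code(stroke)
feature = cyclic_kde(codes)
print(codes, [round(p, 3) for p in feature])
```

The resulting N-bin probability vector is normalized and peaks at the dominant writing direction, matching the role of the trajectory feature T4.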
The trajectory recognition module 50 receives the trajectory feature T4 and performs trajectory recognition to produce the written symbol T5. The trajectory feature T4 is first classified by a radial basis function neural network (RBFNN) to produce a preliminary classification result. Because different characters or symbols can have similar chain-code distributions in the trajectory feature T4, for example 0 and O, or 2 and Z, a confusing set is then used to correct the preliminary classification result and produce the desired written symbol T5, which may be a handwritten symbol or handwritten character and can serve as the basis for computer actions, enabling computer control or text input.
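A minimal RBF network of the kind named above might look like this. Placing one Gaussian unit on each training prototype and fitting the output layer by least squares are illustrative design choices, not details from the patent; the 2-D toy features stand in for the chain-code probability vectors.

```python
import numpy as np

class RBFNN:
    """Minimal radial basis function network: one Gaussian unit per
    training prototype, linear output layer fit by least squares."""
    def __init__(self, sigma=1.0):
        self.sigma = sigma

    def _phi(self, X):
        # Pairwise squared distances to the centers -> Gaussian activations.
        d2 = ((X[:, None, :] - self.centers[None, :, :]) ** 2).sum(-1)
        return np.exp(-d2 / (2 * self.sigma ** 2))

    def fit(self, X, y, n_classes):
        self.centers = X
        targets = np.eye(n_classes)[y]           # one-hot targets
        self.W = np.linalg.pinv(self._phi(X)) @ targets
        return self

    def predict(self, X):
        return (self._phi(X) @ self.W).argmax(axis=1)

# Toy 2-D "trajectory features" for three symbol classes.
X = np.array([[0.0, 0.0], [0.2, 0.1], [3.0, 3.0],
              [3.1, 2.9], [0.0, 3.0], [0.1, 3.2]])
y = np.array([0, 0, 1, 1, 2, 2])
net = RBFNN(sigma=0.8).fit(X, y, n_classes=3)
print(net.predict(np.array([[0.1, 0.0], [3.0, 3.1]])))  # -> [0 1]
```

The argmax over the output layer gives the preliminary classification result that the confusion set then corrects.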
Referring to the third figure, a schematic diagram of the hand trajectory recognition method of the present invention is shown. As shown in the third figure, the hand trajectory recognition method of the present invention begins with step S10, in which a video camera module captures scenes to produce a plurality of consecutive frames of motion image data, each frame comprising a plurality of pixel values, each pixel value corresponding to one pixel.
Next, in step S20, a moving object detection module receives the motion image data and approximates the pixel-value distribution of each pixel by kernel density estimation; pixels with high probability are assigned to the background and pixels with low probability to moving objects, thereby detecting the desired moving object. The method then proceeds to step S30, in which a finger detection module receives the moving object, filters out non-hand moving objects by color analysis to find the moving hand, frames the contour of the hand image with a bounding box, computes the angle between the contour segments before and after each intersection of the hand and the bounding box, takes the intersection with the smallest angle as the fingertip position, and connects the successive fingertip positions to form the fingertip trajectory.
Next, in step S40, a feature extraction module receives the fingertip trajectory, first extracts pause points by estimating the fingertip speed, and uses the pause points to distinguish the entering, leaving, and writing strokes in the fingertip trajectory. The writing strokes are encoded as N-direction chain codes according to their writing direction, where N is a positive integer; a histogram accumulates the probabilities of the strokes' movement directions, and kernel density estimation estimates the probability distribution. The estimated probability function is the trajectory feature.
Finally, in step S50, a trajectory recognition module receives the trajectory feature, first classifies it with a radial basis function neural network to produce a preliminary classification result, and then applies a confusion set to correct that result and produce the desired written symbol.
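The confusion-set correction of step S50 can be sketched as below. The membership of the sets follows the examples in the text (0/O, 2/Z), but the secondary cues used to break ties, such as whether the stroke contains a curve or whether the input context is numeric, are illustrative assumptions.

```python
# Hypothetical confusion sets: classes whose chain-code distributions
# are too similar for the RBFNN alone to separate reliably.
CONFUSION_SETS = [frozenset({"2", "Z"}), frozenset({"0", "O"})]

def correct(preliminary, cue):
    """Map the RBFNN's preliminary label plus secondary cues to a final
    label. `cue` is a dict of booleans computed from the raw trajectory
    rather than from the chain-code histogram."""
    for cs in CONFUSION_SETS:
        if preliminary in cs:
            if cs == frozenset({"2", "Z"}):
                # A '2' has a curved top; a 'Z' is all straight segments.
                return "2" if cue.get("has_curve") else "Z"
            if cs == frozenset({"0", "O"}):
                # Fall back on whether the surrounding input is numeric.
                return "0" if cue.get("numeric_context") else "O"
    return preliminary  # labels outside every confusion set pass through

print(correct("Z", {"has_curve": True}))  # corrected to "2"
print(correct("A", {}))                   # unaffected label passes through
```

Only labels that land inside a confusion set are re-decided; everything else keeps the network's preliminary result, matching the correction role described in the method.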
Since the video camera module, moving object detection module, finger detection module, feature extraction module, and trajectory recognition module of the above method correspond to the video camera module 10, moving object detection module 20, finger detection module 30, feature extraction module 40, and trajectory recognition module 50 described with reference to the first figure, they are not described again here.
A feature of the present invention is that it is not constrained by an input device such as a touch panel; the finger can be used directly as the input interface anytime and anywhere, so the invention is widely applicable. Another feature is that, for hand recognition, the invention locates the fingertip by the minimum angle between the bounding box and the hand's contour segments, which improves upon the common method of locating the fingertip at the highest point of the hand and thereby increases positioning accuracy. In addition, the invention estimates the probability distribution of the chain code by kernel density estimation to serve as the trajectory feature, classifies it with a radial basis function neural network, and corrects the output through a pre-built confusion set, achieving a better recognition rate.
The hand trajectory recognition system of the present invention can therefore be used to control computer operation or enter text, improving the convenience of human-machine communication. It can be applied to computerized home appliances, real-time computer-aided teaching, office meetings, data entry, article writing, or drawing; in particular, patients with speech impairments can converse and communicate directly by writing with a finger, without any external input device.
The foregoing is merely a preferred embodiment for explaining the present invention and is not intended to limit the invention in any way; any modification or alteration of the invention made in the same inventive spirit shall remain within the scope the invention intends to protect.
10 ... Video camera module
20 ... Moving object detection module
30 ... Finger detection module
31 ... Hand
32 ... Bounding box
33 ... Intersection
34 ... Intersection
35 ... Intersection
40 ... Feature extraction module
50 ... Trajectory recognition module
S10 ... Capturing scenes with the video camera module to produce consecutive frames of motion image data
S20 ... Receiving the motion image data with the moving object detection module and detecting the moving object by kernel density estimation
S30 ... Receiving the moving object with the finger detection module and finding the minimum-angle intersection to form the fingertip trajectory
S40 ... Receiving the fingertip trajectory with the feature extraction module and performing stroke extraction and writing-feature extraction to produce the trajectory feature
S50 ... Receiving the trajectory feature with the trajectory recognition module and performing trajectory recognition to produce the written symbol
T1 ... Motion image data
T2 ... Moving object
T3 ... Fingertip trajectory
T4 ... Trajectory feature
T5 ... Written symbol
The first figure is a schematic diagram of the hand trajectory recognition system of the present invention.
The second figure is a schematic diagram of the finger detection of the present invention.
The third figure is a schematic diagram of the hand trajectory recognition method of the present invention.
10 ... Video camera module
20 ... Moving object detection module
30 ... Finger detection module
40 ... Feature extraction module
50 ... Trajectory recognition module
T1 ... Motion image data
T2 ... Moving object
T3 ... Fingertip trajectory
T4 ... Trajectory feature
T5 ... Written symbol
Claims (8)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
TW99103565A TWI426420B (en) | 2010-02-05 | 2010-02-05 | Hand track identification system and method thereof |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
TW99103565A TWI426420B (en) | 2010-02-05 | 2010-02-05 | Hand track identification system and method thereof |
Publications (2)
Publication Number | Publication Date |
---|---|
TW201128463A TW201128463A (en) | 2011-08-16 |
TWI426420B (en) | 2014-02-11 |
Family
ID=45025229
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
TW99103565A TWI426420B (en) | 2010-02-05 | 2010-02-05 | Hand track identification system and method thereof |
Country Status (1)
Country | Link |
---|---|
TW (1) | TWI426420B (en) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11126832B2 (en) | 2012-03-16 | 2021-09-21 | PixArt Imaging Incorporation, R.O.C. | User identification system and method for identifying user |
Families Citing this family (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11551076B2 (en) | 2014-09-05 | 2023-01-10 | Qualcomm Incorporated | Event-driven temporal convolution for asynchronous pulse-modulated sampled signals |
CN112132039B (en) * | 2020-09-23 | 2023-08-08 | 深兰科技(上海)有限公司 | Method and system for realizing action classification based on LSTM and manual characteristics |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6633671B2 (en) * | 1998-01-28 | 2003-10-14 | California Institute Of Technology | Camera-based handwriting tracking |
TW200703118A (en) * | 2005-07-15 | 2007-01-16 | Inventec Appliances Corp | Hand-writting input system |
US20070273765A1 (en) * | 2004-06-14 | 2007-11-29 | Agency For Science, Technology And Research | Method for Detecting Desired Objects in a Highly Dynamic Environment by a Monitoring System |
TW200949717A (en) * | 2008-05-27 | 2009-12-01 | Ind Tech Res Inst | Method for recognizing writing motion and trajectory and apparatus for writing and recognizing system |
- 2010-02-05: TW TW99103565A patent/TWI426420B/en, not active (IP Right Cessation)
Patent Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6633671B2 (en) * | 1998-01-28 | 2003-10-14 | California Institute Of Technology | Camera-based handwriting tracking |
US20070273765A1 (en) * | 2004-06-14 | 2007-11-29 | Agency For Science, Technology And Research | Method for Detecting Desired Objects in a Highly Dynamic Environment by a Monitoring System |
TW200703118A (en) * | 2005-07-15 | 2007-01-16 | Inventec Appliances Corp | Hand-writting input system |
TW200949717A (en) * | 2008-05-27 | 2009-12-01 | Ind Tech Res Inst | Method for recognizing writing motion and trajectory and apparatus for writing and recognizing system |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11126832B2 (en) | 2012-03-16 | 2021-09-21 | PixArt Imaging Incorporation, R.O.C. | User identification system and method for identifying user |
Also Published As
Publication number | Publication date |
---|---|
TW201128463A (en) | 2011-08-16 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN109344793B (en) | Method, apparatus, device and computer readable storage medium for recognizing handwriting in the air | |
US8781230B2 (en) | Video-based biometric signature data collection | |
KR101033451B1 (en) | Video-based handwriting input method and apparatus | |
US20160154469A1 (en) | Mid-air gesture input method and apparatus | |
TWI382352B (en) | Video based handwritten character input device and method thereof | |
WO2013075466A1 (en) | Character input method, device and terminal based on image sensing module | |
EP3912338B1 (en) | Sharing physical writing surfaces in videoconferencing | |
TW201322058A (en) | Gesture recognition system and method | |
TWI625678B (en) | Electronic device and gesture recognition method applied therein | |
US20130111414A1 (en) | Virtual mouse driving apparatus and virtual mouse simulation method | |
Beg et al. | Text writing in the air | |
US20140369559A1 (en) | Image recognition method and image recognition system | |
TWI426420B (en) | Hand track identification system and method thereof | |
CA2806149C (en) | Method and system for gesture-based human-machine interaction and computer-readable medium thereof | |
JP2016099643A (en) | Image processing device, image processing method, and image processing program | |
Chen et al. | Air-writing for smart glasses by effective fingertip detection | |
CN103176603A (en) | Computer gesture input system | |
CN103914186A (en) | Image location recognition system | |
Dehankar et al. | Using AEPI method for hand gesture recognition in varying background and blurred images | |
CN104156067A (en) | Camera based handwriting inputting method | |
Pansare et al. | Comprehensive performance study of existing techniques in hand gesture recognition system for sign languages | |
US20200134341A1 (en) | Intelligent terminal | |
Meng et al. | Building smart cameras on mobile tablets for hand gesture recognition | |
Watanabe et al. | Recognition of character drawn on screen with laser pointer | |
Farhad et al. | Real-time numeric character recognition system based on finger movements |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
MM4A | Annulment or lapse of patent due to non-payment of fees |