CN101739122A - Method for recognizing and tracking gesture - Google Patents

Method for recognizing and tracking gesture

Info

Publication number
CN101739122A
CN101739122A (application CN200810177689A)
Authority
CN
China
Prior art keywords: gesture, image, tracking, block, moves
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN200810177689A
Other languages
Chinese (zh)
Inventor
陈水来
许哲豪
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shirong Science & Technology Co Ltd
Original Assignee
Shirong Science & Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shirong Science & Technology Co Ltd filed Critical Shirong Science & Technology Co Ltd
Priority to CN200810177689A priority Critical patent/CN101739122A/en
Publication of CN101739122A publication Critical patent/CN101739122A/en
Pending legal-status Critical Current

Abstract

The invention relates to a method for recognizing and tracking a gesture. In the method, a gesture image is captured by an image sensor and processed in order to recognize and track the gesture, and the actions corresponding to the gesture image are executed. The method comprises the following steps: first, preprocessing the image; detecting image motion; analyzing image features to judge the gesture state of the image; if the gesture image is a moving gesture, detecting and tracking the center coordinate of the moving gesture and outputting that center coordinate; and if the gesture image is a command gesture, outputting the action command corresponding to the command gesture. The method thereby achieves the purpose of recognizing and tracking natural gestures with a digital signal processor as the hardware platform.

Description

Method for gesture recognition and tracking
Technical field
The present invention relates to a method for gesture recognition and tracking, and in particular to a method that uses a digital signal processor to recognize and track a gesture.
Background art
With advances in computer science and technology, improving human-computer interaction has long been a focus of research. Early input devices such as the keyboard, mouse, and joystick were all designed to let users operate computers more conveniently. Many virtual reality and multimedia system applications, such as 3D object manipulation, 3D virtual product display systems, computer drawing systems, and action or sports video games, require input devices with three dimensions and a high degree of freedom. The keyboard, mouse, and joystick, however, cannot conveniently provide natural and direct interaction between the user and the system.
As human-computer interface applications become increasingly widespread, technologies such as gesture recognition, speech recognition, and body-language recognition are being widely studied and applied in daily life. Among these, using gestures as an input interface is the most natural and direct. Applying gesture recognition to fields such as machine vision and virtual reality has therefore become a new development trend.
In practical applications of gesture recognition and tracking as a computer input interface, the glove-based method provides accurate and fast sensing and recognition. In a glove-based method, the user wears a data glove fitted with contact sensors, which accurately capture the bending of the user's fingers and the motion of the hand and transmit the hand activity to the computer as electronic signals. By analyzing these signals, the system can quickly identify the operating state of the gesture. However, data gloves fitted with contact sensors are quite expensive to produce, and the range of glove sizes is limited, so users have few choices for a proper fit. Moreover, wearing a thick, heavy data glove easily tires the hand and restricts the user's movement, making such gloves inconvenient to use.
Therefore, designing a gesture recognition and tracking method that reduces development cost and simplifies the procedure, shortens the distance between user and machine, and moves the human-machine interface toward greater efficiency, ergonomics, and diversity is the problem the present invention seeks to overcome and solve.
Summary of the invention
In view of this, the present invention provides a method for gesture recognition and tracking that uses an image sensor to capture a gesture image and a digital signal processor to process that image, so as to recognize and track the gesture image and execute the action command corresponding to it. The method thereby achieves the purpose of recognizing and tracking natural gestures with the digital signal processor as the hardware platform.
To address the above problems, the present invention provides a method for gesture recognition and tracking comprising the following steps. First, the gesture image is preprocessed. Next, the largest motion block of the gesture image is detected and defined as a gesture block. The features of the gesture block are then analyzed to judge whether the block is a movement-confirming gesture, a command gesture, or another undefined gesture. Finally, if the gesture block is the movement-confirming gesture and is subsequently transformed into a moving gesture, and the moving gesture does not stop for longer than an actuation time while moving, the center coordinate of the moving gesture is detected, tracked, and output.
The beneficial effect of the present invention is that natural gesture recognition and tracking can be achieved with a digital signal processor as the hardware platform.
The present invention is described below with reference to the drawings and specific embodiments, which are not intended to limit the invention.
Description of drawings
Fig. 1 is a flow chart of the gesture recognition and tracking method of the present invention;
Fig. 2 is a schematic diagram of the dynamic image difference calculation of the present invention;
Fig. 3 is a schematic diagram of the horizontal and vertical projection calculation of the present invention;
Fig. 4A to Fig. 4C are schematic diagrams of the block labeling process of the present invention;
Fig. 5A to Fig. 5B are schematic diagrams of the process of continuously tracking the center coordinate of the moving gesture; and
Fig. 6 is a block diagram of the gesture recognition and tracking device of the present invention.
Reference numerals:
10 image sensor
20 digital signal processor
30 first memory
40 second memory
50 video output module
60 data interface
Ps model
S102-S502 steps
Embodiments
The technical content of the present invention is described in detail below with reference to the drawings:
Please refer to Fig. 1, the flow chart of the gesture recognition and tracking method of the present invention. The method uses an image sensor to capture a gesture image, which is then processed by a digital signal processor. The steps of the method are detailed below, taking the replacement of mouse operation by gesture motion as the application example.
First, the digital signal processor preprocesses the gesture image (S102). Raw images usually contain noise, which increases the probability of recognition errors, and excess useless information lowers overall processing efficiency, so every captured image is preprocessed before analysis. The preprocessing step (S102) proceeds in order: the gesture image is first resized to fit the computation range; color conversion then reduces the image from a full-color image (24-bit RGB) to a gray-level image (8-bit gray level); finally, an image low-pass filter removes point-like noise from the gesture image for subsequent processing (a minimal sketch follows below). Preprocessing thus not only increases recognition accuracy but also saves storage space and improves transmission speed.
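For illustration only, here is a minimal preprocessing sketch in Python with OpenCV; the target resolution and blur kernel size are assumed example values, not taken from the patent.

```python
import cv2

def preprocess(frame_bgr, target_size=(320, 240), blur_ksize=3):
    """Resize, reduce 24-bit color to 8-bit gray, and low-pass filter."""
    # Resize the frame to a scale suited to the computation range.
    small = cv2.resize(frame_bgr, target_size)
    # Color conversion: full-color (24-bit) image to gray-level (8-bit).
    gray = cv2.cvtColor(small, cv2.COLOR_BGR2GRAY)
    # A small Gaussian blur serves as the image low-pass filter
    # that removes point-like noise before further analysis.
    return cv2.GaussianBlur(gray, (blur_ksize, blur_ksize), 0)
```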
The largest motion block of the gesture image is then detected and defined as a gesture block (S104). The motion detection step (S104) proceeds in order: first, dynamic image differences are used to produce binary images, and a logical AND of those binary images yields all moving parts of the gesture image; next, the vertical and horizontal bright-spot counts of the gesture image are tallied, and the logical intersection of the maximal-projection regions on the vertical and horizontal axes locates the largest moving area of the image; dilation is then applied to fill in the fine broken details of that intersection region; finally, labeling is used to exclude non-gesture motion regions and keep the largest connected region, thereby detecting the largest motion block.
Please refer to Fig. 2, the schematic diagram of the dynamic image difference calculation of the present invention. As shown, three consecutive frames of the gesture image are used to compute the genuinely moving object. The current gray-level frame is denoted M2, the previous frame M1, and the frame before that M0. A threshold value is set as the basis for converting the gray-level images into binary images. First, the previous frame M1 is subtracted from the current frame M2 to obtain a new gray-level image, and each pixel of this image is compared with the threshold: pixels greater than or equal to the threshold are set as bright spots, and pixels below the threshold as dark spots, yielding a new binary image M3. Similarly, subtracting M0 from M1 gives another gray-level difference image, which after comparison with the threshold yields another binary image M4. Finally, a logical AND of the two binary images M3 and M4 produces the final binary image M5 of the moving parts.
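A sketch of this three-frame difference, assuming 8-bit gray frames and an illustrative threshold value; cv2.absdiff is used here in place of a signed subtraction:

```python
import cv2

def motion_mask(m0, m1, m2, threshold=25):
    """Three-frame difference: keep pixels that moved in both frame pairs.

    m0, m1, m2 are consecutive gray-level frames, oldest first.
    """
    # M2 - M1, thresholded into binary image M3 (bright/dark spots).
    _, m3 = cv2.threshold(cv2.absdiff(m2, m1), threshold, 255,
                          cv2.THRESH_BINARY)
    # M1 - M0, thresholded into binary image M4.
    _, m4 = cv2.threshold(cv2.absdiff(m1, m0), threshold, 255,
                          cv2.THRESH_BINARY)
    # Logical AND yields M5, the binary image of the moving parts.
    return cv2.bitwise_and(m3, m4)
```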
Please refer to Fig. 3, the schematic diagram of the horizontal and vertical projection calculation of the present invention. The vertical and horizontal bright-spot counts of the final moving-part binary image M5 are tallied to find the largest moving area. As shown, there are two large motion blocks, denoted X and Y. After calculation, the horizontal projection yields two large regions A and B, and the vertical projection yields two large regions C and D. The logical intersection of the maximal horizontal-projection region B and the maximal vertical-projection region C then gives the largest motion block X.
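A simplified sketch of this projection step: it tallies bright spots per column and per row, takes the strongest contiguous band on each axis, and intersects the two. Treating each projection peak as a single contiguous band is an assumption made for illustration.

```python
import numpy as np

def _strongest_band(projection):
    """Return (start, end) of the contiguous nonzero run with the largest sum."""
    best_sum, span, start, run_sum = 0, (0, 0), None, 0
    for i, v in enumerate(np.append(projection, 0)):  # sentinel closes last run
        if v > 0:
            if start is None:
                start, run_sum = i, 0
            run_sum += v
        elif start is not None:
            if run_sum > best_sum:
                best_sum, span = run_sum, (start, i)
            start = None
    return span

def max_motion_block(m5):
    """Intersect the maximal horizontal and vertical projection bands (block X)."""
    bright = (m5 > 0).astype(np.int32)
    x0, x1 = _strongest_band(bright.sum(axis=0))  # per-column bright spots
    y0, y1 = _strongest_band(bright.sum(axis=1))  # per-row bright spots
    return x0, x1, y0, y1
```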
Please refer to Fig. 4A to Fig. 4C, the schematic diagrams of the block labeling process of the present invention. After the fine broken details of the intersection region of the vertical- and horizontal-axis maximal projections are filled in, the bright and dark spots of the binary image are represented by binary values (denoted 0 and 1) as in Fig. 4A. Connected regions are then renumbered and their areas computed, and finally only the largest portion (the region denoted 2) is kept, excluding the non-gesture motion regions, as in Fig. 4B and Fig. 4C.
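A sketch of the dilation and labeling steps using OpenCV's connected-components routine; the kernel size is an assumed value.

```python
import cv2
import numpy as np

def largest_region(binary):
    """Dilate to fill fine broken details, then keep the largest labeled region.

    binary is a 0/255 motion mask restricted to the projection intersection.
    """
    filled = cv2.dilate(binary, np.ones((5, 5), np.uint8))  # dilation step
    n, labels, stats, _ = cv2.connectedComponentsWithStats(filled)
    if n <= 1:  # label 0 is the background; nothing else was found
        return np.zeros_like(binary)
    # Pick the label with the largest area, skipping the background row.
    biggest = 1 + int(np.argmax(stats[1:, cv2.CC_STAT_AREA]))
    return np.where(labels == biggest, 255, 0).astype(np.uint8)
```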
The features of the gesture block are then analyzed to judge whether the block is a movement-confirming gesture, a command gesture, or another undefined gesture (S106). The aspect ratio of the gesture block is compared one by one against the gesture image data in a database, using the positions of the relative extreme points produced by each gesture and the differences between the extreme values; the comparison results are stored in a memory buffer. For example, when the operator spreads five fingers, relative maxima appear by definition at the fingertips, and relative minima appear at the finger webs and on both sides of the palm, so the features of the gesture block comprise five maxima and six minima. After comparison with the gesture image data in the database, the operator's gesture is then recognized as the five-fingers-spread state (a simplified sketch of counting such extrema follows).
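One plausible way to count such extreme points, offered as a sketch rather than the patent's exact matching scheme: trace the block's contour, measure each contour point's distance from the centroid, and count local maxima and minima of the smoothed distance profile. The smoothing length is an assumed tuning value.

```python
import cv2
import numpy as np

def contour_extrema(gesture_mask, smooth_len=9):
    """Count maxima/minima of contour-to-centroid distance (open hand: ~5 and ~6).

    Real matching would also compare the extreme-value differences
    against a gesture database, as the patent describes.
    """
    contours, _ = cv2.findContours(gesture_mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_NONE)
    pts = max(contours, key=cv2.contourArea).reshape(-1, 2).astype(np.float64)
    radial = np.linalg.norm(pts - pts.mean(axis=0), axis=1)
    smooth = np.convolve(radial, np.ones(smooth_len) / smooth_len, mode="same")
    d = np.diff(smooth)
    maxima = int(np.sum((d[:-1] > 0) & (d[1:] <= 0)))  # rising then falling
    minima = int(np.sum((d[:-1] < 0) & (d[1:] >= 0)))  # falling then rising
    return maxima, minima
```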
Please refer again to Fig. 1. If the gesture block is the movement-confirming gesture (S108), the method judges whether it is subsequently transformed into a moving gesture (S200). If it is not, step (S102) is re-executed. If the operator's gesture is judged to have changed from the movement-confirming gesture into the moving gesture, a cursor can be controlled to move. The movement-confirming gesture may be defined as a V shape formed by the index and middle fingers: when the operator extends the index and middle fingers to form this V shape, comparison of the gesture block with the gesture image data in the database identifies the gesture as the extended index-and-middle-finger state, i.e., the movement-confirming gesture. When the moving gesture is in a moving state, the cursor is moved accordingly. The method then judges whether the moving gesture has stopped moving for longer than an actuation time (S300). If the operator's gesture stops moving beyond the actuation time, step (S102) is re-executed. The actuation time can be set to different lengths according to the operator's usage patterns or needs; for example, it can be set to 1 second. If the gesture has not stopped moving beyond the actuation time, the method judges whether the center coordinate of the moving gesture has been detected (S400). If the center coordinate has not been detected, it is detected again (S404), and step (S400) is then re-executed to rejudge whether the center coordinate has been detected. Because the moving gesture is defined as a fist formed by clenching the five fingers, when tracking of the center coordinate begins, a circular Hough transform is used to find, statistically, the point on which the most circle centers agree, and that point is judged to be the center coordinate.
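For the detection step, a minimal sketch using OpenCV's circular Hough transform; the radius bounds and accumulator parameters are assumed example values, not taken from the patent.

```python
import cv2

def fist_center(gray_block):
    """Estimate the fist center as the strongest circular Hough detection."""
    circles = cv2.HoughCircles(gray_block, cv2.HOUGH_GRADIENT, dp=1,
                               minDist=gray_block.shape[0],  # expect one fist
                               param1=100, param2=30,
                               minRadius=15, maxRadius=60)
    if circles is None:
        return None
    # Detections are ordered by accumulator votes: the first circle is the
    # point on which the most candidate centers statistically agree.
    x, y, _r = circles[0][0]
    return int(x), int(y)
```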
If the center coordinate of the moving gesture is detected, a fast look-up based on a sum-of-accumulator table (SAT) is used (S402) to judge whether the center coordinate of the moving gesture can be tracked (S500). If it cannot be tracked, step (S404) is executed again to re-detect the center coordinate. If it is tracked, the center coordinate of the moving gesture is output (S502), and step (S102) is then re-executed.
Please refer to Fig. 5A to Fig. 5B, the schematic diagrams of the process of continuously tracking the center coordinate of the moving gesture. When the center coordinate of the moving gesture is detected, 20 pixels are taken above, below, left, and right of the center coordinate to produce a model Ps of 40 × 40 pixels, and the sum of all its pixel gray-level values is computed. The SAT is then used to search, position by position, within a region of ±60 pixels around the center coordinate for the location whose summed gray value differs least from (or equals) the model's sum; that location is the new center of the moving gesture. A new coordinate is produced and stored in the memory buffer, and the search area is moved to the new center coordinate. The search area is scanned from upper left to lower right in this way, so that the square model Ps keeps tracking the center coordinate continuously. Finally, if the moving gesture stops moving for longer than the actuation time, tracking ends and step (S102) is re-executed.
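A sketch of this tracking step, interpreting the SAT as an integral image (summed-area table) so that each 40 × 40 window sum costs four table lookups; that interpretation and the function names are assumptions, with the template sum carried over from the previous frame.

```python
import cv2

def track_center(gray, template_sum, cx0, cy0, half=20, search=60):
    """Find the window whose gray-value sum best matches the stored model sum.

    gray is the current gray-level frame; template_sum is the 40x40 pixel
    sum recorded around the previous center (cx0, cy0).
    """
    sat = cv2.integral(gray)  # summed-area table, shape (H+1, W+1)

    def window_sum(cx, cy):
        x0, y0, x1, y1 = cx - half, cy - half, cx + half, cy + half
        return int(sat[y1, x1] - sat[y0, x1] - sat[y1, x0] + sat[y0, x0])

    h, w = gray.shape
    best_diff, best = None, (cx0, cy0)
    # Scan the +-search region from upper left to lower right, as in Fig. 5.
    for cy in range(max(half, cy0 - search), min(h - half, cy0 + search) + 1):
        for cx in range(max(half, cx0 - search), min(w - half, cx0 + search) + 1):
            diff = abs(window_sum(cx, cy) - template_sum)
            if best_diff is None or diff < best_diff:
                best_diff, best = diff, (cx, cy)
    # Return the new center and its window sum, stored for the next frame.
    return best, window_sum(*best)
```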
In addition, in step S106, if the gesture block is detected as the command gesture (S110), the action command corresponding to the command gesture is output (S112), and step (S102) is then re-executed. For example, the command gesture can be defined as a '1' shape formed by the index finger, with a click action as its corresponding action command. When the operator extends the index finger to form the '1' shape, comparison of the gesture block with the gesture image data in the database identifies the gesture as the extended-index-finger state, and a click action is therefore executed at the cursor's coordinate. The command gesture can be defined as another hand shape according to the operator's habits, and further valid command gestures can also be defined, each executing its own corresponding action command.
Also in step S106, if the gesture block is detected as an undefined gesture (S114), that is, a gesture that is neither the movement-confirming gesture (the V shape formed by the index and middle fingers), the moving gesture (the fist formed by clenching the five fingers), nor the command gesture (the '1' shape formed by the index finger), it is an invalid undefined gesture and step (S102) is re-executed.
Please refer to Fig. 6, the block diagram of the gesture recognition and tracking device of the present invention. The device comprises an image sensor 10, a digital signal processor 20, a first memory 30, a second memory 40, and a video output module 50.
The image sensor 10 captures a gesture image. The digital signal processor 20 is electrically connected to the image sensor 10 and provides an algorithm for processing the gesture image. The first memory 30 is electrically connected to the digital signal processor 20, stores the algorithm of the digital signal processor 20, and provides storage for large amounts of computation data; the first memory 30 can be a flash memory. The second memory 40 is electrically connected to the digital signal processor 20 and provides the buffer memory required during the processor's computation; the second memory 40 can be a random access memory. The video output module 50 is electrically connected to the digital signal processor 20 and outputs the image result computed by the digital signal processor 20. The image result may be output to an analog display device (not shown), such as a television or monitor, or to a digital display device (not shown), such as an LCD. The digital signal processor 20 can further be electrically connected to a data interface 60, which not only transmits the image result through different output interfaces to other devices (not shown), such as stand-alone computing devices like computers or arcade game machines, but also accepts external control commands to adjust the computation of the digital signal processor 20.
In summary, the present invention has the following advantages:
1. With the digital signal processor as the hardware platform, natural gestures are recognized and tracked without additionally wearing gloves or attaching special markers, colors, or light-emitting devices, which greatly reduces development cost and simplifies the procedure.
2. The hardware platform of the digital signal processor can connect to other external stand-alone devices, improving portability and flexible expansion.
Of course, the present invention may have various other embodiments. Without departing from the spirit and essence of the present invention, those of ordinary skill in the art can make various corresponding changes and variations according to the present invention, but all such changes and variations shall fall within the protection scope of the claims appended to the present invention.

Claims (13)

1. the method for gesture identification and tracking utilizes an image sensor to capture a gesture image, by this gesture image is handled, it is characterized in that again the step of this method comprises:
(a) this gesture image is carried out pre-process;
(b) detect a largest motion block of this gesture image and be defined as a gesture block;
(c) analyzing the feature of this gesture block, is to move to confirm a gesture or an order gesture to judge this gesture block;
(d) if this gesture block, judge then whether this gesture block continues for should move confirming gesture and be transformed to one and move gesture;
(e) if this gesture block continues and is transformed to this and moves gesture, and this mobile gesture does not stop then to detect and follow the trail of the centre coordinate that this moves gesture above an actuation time in moving process; And
(f) export this centre coordinate that this moves gesture, and re-execute step (a).
2. the method for gesture identification according to claim 1 and tracking is characterized in that, this step (a) comprises:
(a1) adjust this gesture image size for being fit to the calculation scope;
(a2) this gesture image is carried out color conversion; And
(a3) the point-like noise of this gesture image of filtering.
3. the method for gesture identification according to claim 1 and tracking is characterized in that, this step (b) comprises:
(b1) utilize the dynamic image difference, calculate all movable parts in this gesture image;
(b2) utilize the vertical and horizontal bright spot quantity of adding up this gesture image, find out the maximum moving area of this gesture image;
(b3) utilize expansion technique, calculate the broken image of thin portion and fill up; And
(b4) use tag number, calculate largest connected zone to detect this largest motion block.
4. the method for gesture identification according to claim 1 and tracking is characterized in that, this step (e) comprises:
(e1) utilize circular Hough conversion, detect this centre coordinate that this moves gesture; And
(e2) utilize the quick point table, follow the trail of this centre coordinate that this moves gesture.
5. the method for gesture identification according to claim 1 and tracking is characterized in that, in step (d), if this gesture block was confirmed gesture for moving, but the conversion that continues is non-for this mobile gesture, then re-executes step (a).
6. the method for gesture identification according to claim 1 and tracking is characterized in that, in step (d), if this gesture block is then exported the pairing action command of this order gesture, and re-executed step (a) for this order gesture.
7. the method for gesture identification according to claim 1 and tracking is characterized in that, in step (d), if this gesture block is a undefined gesture, then re-executes step (a).
8. the method for gesture identification according to claim 1 and tracking is characterized in that, in step (e), if this mobile gesture stops to surpass this actuation time, then re-executes step (a) in moving process.
9. the method for gesture identification according to claim 1 and tracking is characterized in that, in step (e), if can't detect side or track this centre coordinate that this moves gesture, this moves this centre coordinate of gesture then to detect tracking again.
10. the method for gesture identification according to claim 1 and tracking is characterized in that be set at 1 second this actuation time.
11. the method for gesture identification according to claim 1 and tracking is characterized in that, this moves confirms that gesture is defined as a V-shape of forefinger and middle finger formation.
12. the method for gesture identification according to claim 1 and tracking is characterized in that, this moves gesture and is defined as the fist shape that the five fingers are clenched fist and formed.
13. the method for gesture identification according to claim 1 and tracking is characterized in that, this order gesture is defined as one 1 word shapes that forefinger forms.
CN200810177689A 2008-11-24 2008-11-24 Method for recognizing and tracking gesture Pending CN101739122A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN200810177689A CN101739122A (en) 2008-11-24 2008-11-24 Method for recognizing and tracking gesture

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN200810177689A CN101739122A (en) 2008-11-24 2008-11-24 Method for recognizing and tracking gesture

Publications (1)

Publication Number Publication Date
CN101739122A true CN101739122A (en) 2010-06-16

Family

ID=42462678

Family Applications (1)

Application Number Title Priority Date Filing Date
CN200810177689A Pending CN101739122A (en) 2008-11-24 2008-11-24 Method for recognizing and tracking gesture

Country Status (1)

Country Link
CN (1) CN101739122A (en)

Cited By (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102452591A (en) * 2010-10-19 2012-05-16 由田新技股份有限公司 Elevator control system
CN104508680A (en) * 2012-08-03 2015-04-08 科智库公司 Object tracking in video stream
CN104508680B (en) * 2012-08-03 2018-08-17 科智库公司 Improved video signal is tracked
CN103699212A (en) * 2012-09-27 2014-04-02 纬创资通股份有限公司 Interactive system and movement detection method
CN103699212B (en) * 2012-09-27 2016-08-31 纬创资通股份有限公司 interactive system and movement detection method
CN107256089B (en) * 2012-10-17 2020-07-03 原相科技股份有限公司 Gesture recognition method by natural image
CN103778405B (en) * 2012-10-17 2017-07-04 原相科技股份有限公司 With the gesture identification that natural image is carried out
CN107256089A (en) * 2012-10-17 2017-10-17 原相科技股份有限公司 The gesture identification method carried out with natural image
CN103778405A (en) * 2012-10-17 2014-05-07 原相科技股份有限公司 Method for gesture recognition through natural images
CN103019389A (en) * 2013-01-12 2013-04-03 福建华映显示科技有限公司 Gesture recognition system and gesture recognition method
CN104461276A (en) * 2013-09-25 2015-03-25 联想(北京)有限公司 Switching method and information processing equipment
CN104461276B (en) * 2013-09-25 2019-02-05 联想(北京)有限公司 A kind of switching method and information processing equipment
CN112850406A (en) * 2015-04-03 2021-05-28 奥的斯电梯公司 Traffic list generation for passenger transport
CN107678551A (en) * 2017-10-19 2018-02-09 京东方科技集团股份有限公司 Gesture identification method and device, electronic equipment
CN107678551B (en) * 2017-10-19 2021-12-28 京东方科技集团股份有限公司 Gesture recognition method and device and electronic equipment
US11402918B2 (en) 2017-10-19 2022-08-02 Boe Technology Group Co., Ltd. Method for controlling terminal apparatus, apparatus for controlling terminal apparatus, and computer-program product

Similar Documents

Publication Publication Date Title
CN101739122A (en) Method for recognizing and tracking gesture
US8270670B2 (en) Method for recognizing and tracing gesture
TW201019241A (en) Method for identifying and tracing gesture
CN108845668B (en) Man-machine interaction system and method
CN102402289B (en) Mouse recognition method for gesture based on machine vision
CN101408824A (en) Method for recognizing mouse gesticulation
CN104808788A (en) Method for controlling user interfaces through non-contact gestures
CN102053702A (en) Dynamic gesture control system and method
US20130120250A1 (en) Gesture recognition system and method
CN103686283A (en) Smart television remote controller man-machine interaction method
CN104331154A (en) Man-machine interaction method and system for realizing non-contact mouse control
CN102073414A (en) Multi-touch tracking method based on machine vision
CN107229921A (en) Dynamic gesture identification method based on Hausdorff distances
Zhang et al. Handsense: smart multimodal hand gesture recognition based on deep neural networks
CN103713755A (en) Touch recognizing device and recognizing method
CN101192124B (en) System and method for automatic distinguishing and processing for touch screen input information
CN114792443A (en) Intelligent device gesture recognition control method based on image recognition
EP2204760A1 (en) Method for recognizing and tracing gesture
Aggarwal et al. An Approach to Control the PC with Hand Gesture Recognition using Computer Vision Technique
Shaker et al. Real-time finger tracking for interaction
JP4965590B2 (en) How to recognize and track gestures
CN114860060A (en) Method for hand mapping mouse pointer, electronic device and readable medium thereof
Feng et al. FM: Flexible mapping from one gesture to multiple semantics
KR101171239B1 (en) Non-touch data input and operating method using image processing
Agah et al. Human-machine interaction through an intelligent user interface based on contention architecture

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C02 Deemed withdrawal of patent application after publication (patent law 2001)
WD01 Invention patent application deemed withdrawn after publication

Open date: 20100616