WO2012058782A1 - Method and device for detecting gesture inputs - Google Patents
Method and device for detecting gesture inputs
- Publication number
- WO2012058782A1 (PCT/CN2010/001733)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- gesture
- movement
- sub
- reciprocating movement
- type
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/033—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
- G06F3/0346—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/002—Specific input/output arrangements not covered by G06F3/01 - G06F3/16
- G06F3/005—Input arrangements through a video camera
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/017—Gesture based interaction, e.g. based on a set of recognized hand gestures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/20—Analysis of motion
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/20—Movements or behaviour, e.g. gesture recognition
- G06V40/28—Recognition of hand or arm movements, e.g. recognition of deaf sign language
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2203/00—Indexing scheme relating to G06F3/00 - G06F3/048
- G06F2203/041—Indexing scheme relating to G06F3/041 - G06F3/045
- G06F2203/04102—Flexible digitiser, i.e. constructional details for allowing the whole digitising part of a device to be flexed or rolled like a sheet of paper
Definitions
- The present invention relates to user interfaces, and more particularly, to a method for detecting gesture inputs.
- Gesture recognition, especially hand gesture recognition, enables humans to interface with a machine and interact naturally without any mechanical devices.
- Gesture recognition can be conducted with techniques from computer vision and image processing. Using gesture recognition, it is possible to point a finger at the computer screen so that the cursor moves accordingly.
- Gestures include static gestures, e.g. a victory sign (a hand gesture in which the index finger and middle finger are extended and parted while the thumb and remaining fingers are clenched), and dynamic gestures, in which, within the period of a single valid gesture input, the shape (e.g. extending or clenching some fingers) and/or position of the user's hand changes so as to form a valid gesture input. Sometimes a consecutive hand-shape change and/or hand movement can be interpreted as two or more instructions for the device.
- Gesture input is applied in many fields.
- One application of hand gesture input is a book reading application. For example, upon a horizontal stroke of the user's hand before a camera (the trajectory of the hand movement is substantially horizontal and the movement distance exceeds a certain value, e.g. 15 centimeters), the device interprets this gesture input as an instruction to flip a page of the book.
- Whether the book flips backward or forward depends on the movement orientation.
- Fig. 1 is a diagram illustrating a rightward gesture signaling the backward flip of the book according to the prior art. As can be seen from Fig. 1, in order to make the rightward gesture, a user first puts his right hand in front of the camera and then moves it rightward for a certain distance.
- A method is provided for detecting gesture inputs in response to a consecutive reciprocating movement before a detecting device, wherein the consecutive reciprocating movement is made of a first type gesture and a second type gesture, each capable of being recognized by the detecting device to output a different control signal.
- The method comprises the steps of receiving the consecutive reciprocating movement, which starts with the first type gesture of the two types, wherein the first type gesture and the second type gesture occur alternately; and outputting control signals corresponding to the first type gesture a number of times equal to the number of first type gestures contained within the consecutive reciprocating movement.
- A device is also provided for detecting gesture inputs in response to a consecutive reciprocating movement, wherein the consecutive reciprocating movement is made of a first type gesture and a second type gesture, each capable of being recognized to output a different control signal.
- The device comprises a capturing module for capturing the consecutive reciprocating movement, which starts with the first type gesture of the two types, wherein the first type gesture and the second type gesture occur alternately; and a processing module for outputting, based on the captured consecutive reciprocating movement, control signals corresponding to the first type gesture a number of times equal to the number of first type gestures contained within the consecutive reciprocating movement.
- Fig. 1 is a diagram showing a hand moving before a camera according to the prior art.
- Figs. 2A and 2B are diagrams showing the trajectory of a single hand wave according to an embodiment of the present invention.
- Fig. 3 is a diagram showing two points corresponding to two adjacent image frames according to the embodiment of the present invention.
- Fig. 4 is a diagram showing the hand trajectory from the start according to the embodiment of the present invention.
- Fig. 5 is a diagram showing another hand trajectory from the start according to the embodiment of the present invention.
- Fig. 6 is a flow chart showing a method for detecting gesture inputs from a consecutive reciprocating movement according to the embodiment of the present invention.
- Fig. 7 is a block diagram showing a device for detecting gesture inputs from a consecutive reciprocating movement according to the embodiment of the present invention.
- The purpose of the invention is to provide an easy way to give repeated instructions by a consecutive reciprocating hand movement.
- The consecutive reciprocating movement can be recognized by the device as a set of sequential sub movements, in which any two adjacent sub movements have opposite orientations.
- The type of instruction resulting from the consecutive reciprocating movement is decided by the first sub movement within the set, and the number of resulting instructions equals the number of sub movements within the set that have the same orientation as the first one (including the first one itself), as sketched below.
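- As a minimal sketch of this counting rule (illustrative only; the function name signalCount and the Orientation enum are assumptions, not names from the patent):

```cpp
#include <vector>

enum class Orientation { Left, Right, Up, Down };

// The number of control signals to emit equals the number of sub movements
// whose orientation matches the first sub movement, including the first one.
int signalCount(const std::vector<Orientation>& subMovements) {
    if (subMovements.empty()) return 0;
    int count = 0;
    for (Orientation o : subMovements)
        if (o == subMovements.front()) ++count;
    return count;
}
```

- For a wave partitioned as {Left, Right, Left, Right, Left}, signalCount returns 3, i.e. three instructions of the leftward-gesture type.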
- The system comprises a camera used for capturing the consecutive images of the gesture input, and a processing device connected to the camera.
- The processing device uses the consecutive images to generate corresponding instructions.
- The trajectory of a basic gesture for gesture recognition is substantially a straight line (the actual gesture movement cannot be perfectly straight).
- A horizontal hand wave gesture is used below to illustrate the principle of the present invention.
- The principle of the present invention can also be applied to other consecutive reciprocating movements, e.g. a vertical hand wave, an oblique hand wave, and even the movement of extending and withdrawing the hand before a depth-detectable camera.
- The gesture recognition technique relates to image processing; therefore, gesture recognition can be carried out in real time based on real-time image frames, or offline, i.e. the camera captures a set of image frames and the set is analyzed afterwards.
- All static gesture recognition can be carried out in real time based on real-time image frames.
- For example, when detecting a victory sign posture, the device only needs to judge whether the posture is a victory sign based on an analysis of the convex contour of the hand shape.
- For a dynamic gesture, the device needs to capture a set of image frames, find the centroid of the hand in each frame, and judge what kind of gesture it is after analyzing the curve formed by this series of hand centroids.
- Fig. 2 is a diagram showing the trajectory of a single hand wave according to an embodiment of the present invention, wherein Fig. 2A shows a leftward hand movement and Fig. 2B shows a rightward hand movement.
- The camera image uses a top-left coordinate system, i.e. the origin is at the top-left corner.
- The solid curve, which we call a substantially straight line, represents the trajectory of the hand movement.
- The center of mass of the hand in each frame corresponds to a point on the curve. Therefore, across all image frames of a movement in either direction, for example a leftward hand movement, a series of consecutive points is recorded; for convenience of description, a line is used to link up all these points. The number of discrete points depends on the frame capture rate.
- Fig. 3 is a diagram showing two points corresponding to two adjacent image frames according to the embodiment of the present invention.
- Point A corresponds to the former of the two image frames, and point B corresponds to the latter.
- Two data arrays are used to store the hand trajectory for the analysis of the user's gestures. If the motion trend is not reversed, e.g. as Fig. 2A shows, we store the hand trajectory data in the first array. After detecting that the motion trend has reversed, we store the hand trajectory data since the reversal in the second array, and then use the data in the first array to determine what gesture happened and output a corresponding instruction. In addition, a reverse threshold and four direction counters are used to determine the occurrence of reverse motion.
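- One way to hold this state is sketched below (a reconstruction for illustration; the struct name and defaults are assumptions, while pPointBuf0/pPointBuf1, the four counters and the threshold value of 6 come from the description):

```cpp
#include <vector>

struct Point { int x, y; };

// Working state for recognizing one consecutive reciprocating movement.
struct TrackerState {
    std::vector<Point> pPointBuf0;  // first array: trajectory before the motion trend reverses
    std::vector<Point> pPointBuf1;  // second array: trajectory since the reversal of motion trend
    int orientation_l = 0, orientation_r = 0;   // leftward / rightward counters
    int orientation_u = 0, orientation_d = 0;   // upward / downward counters
    int orientation_threshold = 6;  // reverse threshold (the description sets it to 6 later on)
    bool reverse_happened = false;  // whether the motion trend has reversed
};
```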
- The user's consecutive gesture, such as a horizontal hand wave, has a certain regularity. For example, when a person waves, if it is leftward waving, the hand must first move leftward a distance and then move rightward a distance, repeated several times; if it is rightward waving, the hand must first move rightward a distance and then move leftward a distance, likewise repeated several times. We can therefore select the direction counters based on the previous sub gesture.
- Step 1: at a time point during the successive gesture input, the device determines the coordinate values of the last position point and the current position point of the hand.
- The pseudo-code is shown below:
- Last position x coordinate (old_x) = (hand trace trend reverse happened) ? last stored x in second array (pPointBuf1) : last stored x in first array (pPointBuf0);
- Last position y coordinate (old_y) = (hand trace trend reverse happened) ? last stored y in second array (pPointBuf1) : last stored y in first array (pPointBuf0);
- The current frame corresponding to the time point is used to determine the current position point's coordinate values.
- For the last position point, which corresponds to the image frame immediately preceding the time point, the device needs to first determine whether a reversal of the motion trend has happened. If it has, the data in the second array is used to determine the last position point's coordinate values; otherwise, the data in the first array is used.
- Step 2: the four orientation counters for determining the motion trend of the hand movement are updated based on the positional relation between the current position point and the last position point.
- For the horizontal hand wave of this example, the upward and downward orientation counters are redundant.
- Two orientation counters corresponding to opposite directions work as a group. For example, if the current position point is located to the left of the last position point, the leftward orientation counter is incremented by one, and at the same time, if the rightward orientation counter is not zero, it is decremented by one. The same principle applies to the upward and downward orientation counters.
- The related pseudo-code for updating the leftward and rightward counters (orientation_l and orientation_r) based on X-axis values is shown below.
- A similar principle can be applied to the determination of the upward and downward counters based on Y-axis values.
- orientation_r++; // RIGHT orientation counter increases by one
- orientation_l++; // LEFT orientation counter increases by one
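- Expanded into full statements, the grouped update might look as follows (a sketch; the function wrapper is an assumption, while the counter behavior follows the description):

```cpp
// Update the leftward/rightward orientation counters from the last and
// current x coordinates. Opposite counters work as a pair: incrementing
// one decrements the other if it is non-zero. The upward/downward pair
// is handled the same way using y coordinates.
void updateHorizontalCounters(int old_x, int cur_x,
                              int& orientation_l, int& orientation_r) {
    if (cur_x > old_x) {                         // hand moved rightward
        orientation_r++;                         // RIGHT counter increases by one
        if (orientation_l > 0) orientation_l--;  // LEFT counter decreases by one
    } else if (cur_x < old_x) {                  // hand moved leftward
        orientation_l++;                         // LEFT counter increases by one
        if (orientation_r > 0) orientation_r--;  // RIGHT counter decreases by one
    }
}
```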
- Fig. 4 is a diagram showing the hand trajectory from the start according to the present embodiment.
- The reverse point can be deemed the starting point of the reversing gesture. Therefore, according to the above manipulation method for orientation_l and orientation_r (in this example, we omit the manipulation of the upward and downward orientation counters), at the point just before the reverse point, i.e. the rightmost point, orientation_l is 0 and orientation_r is 13.
- At the reverse point, orientation_l is incremented by one and becomes 1, while orientation_r is decremented by one and becomes 12.
- Step 3: each time the device captures an image frame during the gesture input, it determines whether the motion trend has reversed.
- The device captures the gesture at a certain capture rate as a set of image frames.
- For each image frame, the device first determines the position point of the user's hand in the frame, and uses this position point and the preceding position point to update the relevant orientation counters. After the orientation counters are updated, the device uses them together with an orientation threshold value to determine whether a reversal of the motion trend has occurred.
- The orientation threshold value is predetermined; in this example, we set it to 6.
- The pseudo-code for determining whether the motion trend reverses is shown below.
- A set of sub gesture orientation indicators, i.e. gesture_l, gesture_r, gesture_u and gesture_d, is used to record the orientation of the last sub gesture within a consecutive gesture input, i.e. a consecutive reciprocating hand movement. Before a user inputs a consecutive gesture, these indicators are set false. After the device detects the first sub gesture of the consecutive gesture input, the corresponding indicator is set true.
- The maximum counter among the four is chosen for comparison with the orientation threshold value. If the maximum counter is bigger than the threshold value and the orientation counter with the opposite orientation to the maximum counter is equal to 1, then it is determined that a reversal has occurred, and the sub gesture orientation indicator corresponding to the opposite orientation of the first sub gesture is set true. According to a variant, a single indicator with four possible values can be used to replace the four true/false indicators.
- Afterwards, the device determines whether the following two conditions are met: 1) the orientation counter with the same orientation as the last sub gesture orientation indicator exceeds the threshold value; and 2) the orientation counter with the opposite orientation to the last sub gesture orientation indicator is equal to 1. If both are met, it is determined that a reversal has occurred. This reduces the complexity of the calculation.
- A pseudo-code for determining a reversal after gesture_l has been obtained is shown below; the pseudo-code for the other orientation indicators is similar.
- if (gesture_l) { ... } // last gesture is LEFT
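- Filling in the body from the two conditions stated above, the test might read as follows (a reconstruction, since the original body is not preserved in the text; the names orientation_threshold and reverse_happened are assumptions):

```cpp
// Reversal test once the last sub gesture is LEFT: the leftward counter
// still exceeds the threshold while the rightward counter has just
// reached 1. The other three orientation indicators are handled
// symmetrically.
if (gesture_l) {  // last gesture is LEFT
    if (orientation_l > orientation_threshold && orientation_r == 1) {
        reverse_happened = true;  // the motion trend has reversed
    }
}
```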
- The data is stored in a first array, pPointBuf0.
- The data in pPointBuf0 is used to interpret the gesture and to output a signal indicating the meaning of the gesture.
- Fig. 5 shows this situation. If the user is making a rightward sub gesture, it may also be accompanied by a downward move; although at the reversal point the downward orientation counter may be bigger than the rightward orientation counter, we cannot determine this sub gesture to be a downward gesture.
- The first array is used to store data of sub gesture movements having the same orientation as the first sub gesture movement.
- The second array is used to store data of sub gesture movements having the opposite orientation to the first sub gesture movement, because the gesture movements are reciprocating.
- The determination of the third, fifth, seventh etc. reversal is similar to the determination of the first reversal.
- The determination of the fourth, sixth, eighth etc. reversal is similar to the determination of the second reversal.
- nDiffx = pPointBuf0[nPointBufLength-1].x - pPointBuf0[0].x; // difference along the x-axis
- nDiffy = pPointBuf0[nPointBufLength-1].y - pPointBuf0[0].y; // difference along the y-axis
- nDiffx and nDiffy are compared to determine whether the major movement is horizontal or vertical, because the user's hand movement can never be exactly horizontal or vertical. Although some vertical displacement is inevitable when the user intends a horizontal movement, the absolute difference in the horizontal direction will be larger than that in the vertical direction. Once the movement is determined to be horizontal, nDiffx is used to determine whether it is a left gesture or a right gesture.
- gesture_l = true; // LEFT gesture happened
- gesture_d = true; // DOWN gesture happened
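- Put together, the classification might look as follows (a sketch; the sign conventions assume the top-left coordinate system mentioned earlier, where x grows rightward and y grows downward):

```cpp
#include <cstdlib>  // std::abs

// Classify the finished sub movement from its end-to-end displacement.
void classifyGesture(int nDiffx, int nDiffy,
                     bool& gesture_l, bool& gesture_r,
                     bool& gesture_u, bool& gesture_d) {
    if (std::abs(nDiffx) > std::abs(nDiffy)) {  // major movement is horizontal
        if (nDiffx < 0) gesture_l = true;       // LEFT gesture happened
        else            gesture_r = true;       // RIGHT gesture happened
    } else {                                    // major movement is vertical
        if (nDiffy > 0) gesture_d = true;       // DOWN gesture happened
        else            gesture_u = true;       // UP gesture happened
    }
}
```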
- Fig. 6 is a flow chart depicting a method for detecting gesture inputs from a consecutive reciprocating movement according to the present embodiment.
- The consecutive reciprocating movement is made of two types of device-recognizable gestures, each occurring several times.
- Step 601: the device receives a consecutive reciprocating movement by using a camera to capture the movement, and outputs the result as a sequence of image frames. It shall be noted that step 601 includes at least two scenarios: 1) the device keeps capturing the movement until the movement ends, and then the capturing module outputs the sequence of image frames; and 2) the capturing module outputs an image frame immediately in response to each image capture during the movement.
- Step 602: the device determines at least one reverse point by using the sequence of image frames. Specifically, this step further comprises step 6021, determining a coordinate value corresponding to the hand's position for each image frame so as to obtain a sequence of coordinate values. As to the determination of the coordinate value, the device can, for example, convert the captured RGB image to the HSV color space and do background subtraction based on skin color. If needed, it can apply some morphological operations, after which the contour of the hand can be found. The centroid of this contour is taken as the coordinate value corresponding to the hand's position for the frame. Step 6022: determining the reverse point based on the sequence of coordinate values by using steps 1 to 3 above.
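- As an illustration of step 6021, the centroid extraction could be realized with OpenCV as sketched below (one possible implementation, not mandated by the description; the HSV skin-color bounds are placeholder assumptions):

```cpp
#include <algorithm>
#include <vector>
#include <opencv2/imgproc.hpp>

// Find the hand centroid in one BGR frame: convert to HSV, threshold on an
// assumed skin-color range, clean up with a morphological opening, then
// take the centroid of the largest contour via image moments.
bool handCentroid(const cv::Mat& frameBgr, cv::Point& centroid) {
    cv::Mat hsv, mask;
    cv::cvtColor(frameBgr, hsv, cv::COLOR_BGR2HSV);
    cv::inRange(hsv, cv::Scalar(0, 30, 60), cv::Scalar(20, 150, 255), mask);  // placeholder skin range
    cv::morphologyEx(mask, mask, cv::MORPH_OPEN, cv::Mat());  // remove speckle noise

    std::vector<std::vector<cv::Point>> contours;
    cv::findContours(mask, contours, cv::RETR_EXTERNAL, cv::CHAIN_APPROX_SIMPLE);
    if (contours.empty()) return false;

    // Pick the largest contour as the hand.
    auto largest = std::max_element(contours.begin(), contours.end(),
        [](const std::vector<cv::Point>& a, const std::vector<cv::Point>& b) {
            return cv::contourArea(a) < cv::contourArea(b);
        });
    cv::Moments m = cv::moments(*largest);
    if (m.m00 == 0.0) return false;
    centroid = cv::Point(int(m.m10 / m.m00), int(m.m01 / m.m00));
    return true;
}
```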
- Step 603: the device partitions the consecutive reciprocating movement into sub movements based on the determined reverse point(s). Because the reverse point(s) have been determined and the sequence of image frames corresponds to a consecutive reciprocating movement, the sub movements are separated by the reverse point(s) within the sequence of image frames, and a reverse point is the starting point of every sub movement except the first. Besides, it shall be noted that all the partitioned sub movements are recognizable gestures.
- Step 604: the device outputs at least one signal based on the partitioned sub movements. Specifically, the device first determines the number of sub movements having the same orientation as the first sub movement (including the first sub movement itself), and then outputs signals whose meaning corresponds to the recognizable first sub movement and whose number equals the number of sub movements having the same orientation as the first sub movement.
- In the real-time scenario, the device may need to, in response to each input image frame, determine the first sub movement by finding the first reverse point and output a signal corresponding to the first sub movement. After determining the first sub movement, the device determines, in response to each input image frame, the second sub movement, third sub movement, fourth sub movement etc. in a sequential manner by finding the second, third, fourth etc. reverse points, comparing each of those sub movements separately to the first sub movement to determine whether they are of the same gesture type, and if they are, outputting the same signal as for the first sub movement.
- Because the movement is a consecutive reciprocating movement, it is needless to compare the second, fourth, sixth etc. sub movements. Thus, the device only needs to determine the first, third, fifth etc. sub movements. Of course, the device still needs to determine every reverse point in order to partition the consecutive reciprocating movement into sub movements. After obtaining the third sub movement, the fifth sub movement etc., the device compares them with the first sub movement and outputs, after each positive comparison, a signal corresponding to the first sub movement. In addition, because the movement is reciprocating, it may not even be necessary to compare the odd-numbered sub movements: instead, at every other reverse point starting from the first reverse point, the device outputs a signal corresponding to the first sub movement.
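- A sketch of this shortcut (hypothetical driver code; emitSignal is a stand-in for whatever control signal the application maps the first sub movement to):

```cpp
#include <cstdio>

enum class Gesture { Left, Right, Up, Down };

// Application-specific stand-in: issue the control signal for a gesture type.
void emitSignal(Gesture g) { std::printf("gesture signal %d\n", static_cast<int>(g)); }

// Called each time a reverse point is detected. Signals are emitted at the
// 1st, 3rd, 5th ... reverse points: each of those closes a sub movement of
// the same type as the first one, so no per-movement comparison is needed.
void onReversePoint(int& reverseCount, Gesture firstSubMovement) {
    if (++reverseCount % 2 == 1)
        emitSignal(firstSubMovement);
}
```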
- Fig. 7 is a block diagram showing a device for detecting user inputs. As can be seen from Fig. 7, a device is provided for detecting gesture inputs in response to a consecutive reciprocating movement, wherein the consecutive reciprocating movement is made of two types of recognizable gestures, i.e. a first type gesture (e.g. moving leftward) and a second type gesture (e.g. moving rightward).
- The device comprises a capturing module for capturing the consecutive reciprocating movement, which starts with the first type gesture of the two types, wherein the first type gesture and the second type gesture occur alternately; and a processing module for outputting, based on the captured consecutive reciprocating movement, control signals corresponding to the first type gesture a number of times equal to the number of first type gestures contained within the consecutive reciprocating movement.
- The processing module is configured to determine reverse points for the consecutive reciprocating movement.
- The processing module is further configured to partition the consecutive reciprocating movement into at least two sub movements by using the reverse points, each sub movement corresponding to one of the two types of gesture; to compare each sub movement to the first sub movement; and, in response to a determination that the sub movement under comparison is of the same type as the first sub movement, to output a control signal corresponding to the first type gesture.
- Alternatively, the processing module is configured to output a control signal corresponding to the first type gesture in response to each odd-numbered reverse point.
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- General Physics & Mathematics (AREA)
- Physics & Mathematics (AREA)
- Human Computer Interaction (AREA)
- Multimedia (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Social Psychology (AREA)
- Psychiatry (AREA)
- Health & Medical Sciences (AREA)
- General Health & Medical Sciences (AREA)
- User Interface Of Digital Computer (AREA)
- Image Analysis (AREA)
- Position Input By Displaying (AREA)
Priority Applications (9)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201080071032.0A CN103282858B (en) | 2010-11-01 | 2010-11-01 | Method and apparatus for detecting posture input |
PCT/CN2010/001733 WO2012058782A1 (en) | 2010-11-01 | 2010-11-01 | Method and device for detecting gesture inputs |
RU2013125228/08A RU2581013C2 (en) | 2010-11-01 | 2010-11-01 | Method and device for detecting input using gestures |
BR112013010520-8A BR112013010520B1 (en) | 2010-11-01 | 2010-11-01 | method and device for detecting gesture entries |
US13/882,592 US9189071B2 (en) | 2010-11-01 | 2010-11-01 | Method and device for detecting gesture inputs |
JP2013535232A JP5997699B2 (en) | 2010-11-01 | 2010-11-01 | Method and apparatus for detecting gesture input |
KR1020137014084A KR101760159B1 (en) | 2010-11-01 | 2010-11-01 | Method and device for detecting gesture inputs |
EP10859130.6A EP2635952B1 (en) | 2010-11-01 | 2010-11-01 | Method and device for detecting gesture inputs |
MX2013004805A MX2013004805A (en) | 2010-11-01 | 2010-11-01 | Method and device for detecting gesture inputs. |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/CN2010/001733 WO2012058782A1 (en) | 2010-11-01 | 2010-11-01 | Method and device for detecting gesture inputs |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2012058782A1 (en) | 2012-05-10 |
Family
ID=46023909
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/CN2010/001733 WO2012058782A1 (en) | 2010-11-01 | 2010-11-01 | Method and device for detecting gesture inputs |
Country Status (9)
Country | Link |
---|---|
US (1) | US9189071B2 (en) |
EP (1) | EP2635952B1 (en) |
JP (1) | JP5997699B2 (en) |
KR (1) | KR101760159B1 (en) |
CN (1) | CN103282858B (en) |
BR (1) | BR112013010520B1 (en) |
MX (1) | MX2013004805A (en) |
RU (1) | RU2581013C2 (en) |
WO (1) | WO2012058782A1 (en) |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2013250637A (en) * | 2012-05-30 | 2013-12-12 | Toshiba Corp | Recognition device |
US20140118244A1 (en) * | 2012-10-25 | 2014-05-01 | Pointgrab Ltd. | Control of a device by movement path of a hand |
JP2015127976A (en) * | 2015-03-04 | 2015-07-09 | 株式会社東芝 | Recognition device |
Families Citing this family (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10088924B1 (en) * | 2011-08-04 | 2018-10-02 | Amazon Technologies, Inc. | Overcoming motion effects in gesture recognition |
WO2016017956A1 (en) | 2014-07-30 | 2016-02-04 | Samsung Electronics Co., Ltd. | Wearable device and method of operating the same |
KR102397397B1 (en) * | 2014-07-30 | 2022-05-13 | 삼성전자주식회사 | Wearalble device and operating method for the same |
CN105278763B (en) * | 2015-05-28 | 2019-05-17 | 维沃移动通信有限公司 | The method and device of gesture identification false-touch prevention |
DE102016202455B4 (en) | 2016-02-17 | 2024-10-10 | Volkswagen Aktiengesellschaft | User interface, means of transport and method for classifying a user gesture executed freely in space |
CN108076365B (en) * | 2017-02-22 | 2019-12-31 | 解波 | Human body posture recognition device |
KR101971982B1 (en) * | 2017-04-20 | 2019-04-24 | 주식회사 하이딥 | Apparatus capable of sensing touch and touch pressure and control method thereof |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2001069365A1 (en) * | 2000-03-13 | 2001-09-20 | Ab In Credoble | Gesture recognition system |
US6369794B1 (en) * | 1998-09-09 | 2002-04-09 | Matsushita Electric Industrial Co., Ltd. | Operation indication outputting device for giving operation indication according to type of user's action |
CN1394325A (en) * | 2000-09-01 | 2003-01-29 | 美国索尼电脑娱乐公司 | User input device and method for interaction with graphic images |
JP2009211563A (en) * | 2008-03-05 | 2009-09-17 | Tokyo Metropolitan Univ | Image recognition device, image recognition method, image recognition program, gesture operation recognition system, gesture operation recognition method, and gesture operation recognition program |
WO2010088035A2 (en) * | 2009-01-30 | 2010-08-05 | Microsoft Corporation | Gesture recognizer system architecture |
US20100238137A1 (en) * | 2009-03-23 | 2010-09-23 | Samsung Electronics Co., Ltd. | Multi-telepointer, virtual object display device, and virtual object control method |
Family Cites Families (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP5048890B2 (en) | 1998-10-13 | 2012-10-17 | ソニー エレクトロニクス インク | Motion detection interface |
US6501515B1 (en) | 1998-10-13 | 2002-12-31 | Sony Corporation | Remote control system |
JP2001216069A (en) * | 2000-02-01 | 2001-08-10 | Toshiba Corp | Operation inputting device and direction detecting method |
RU2175143C1 (en) * | 2000-04-04 | 2001-10-20 | Свириденко Андрей Владимирович | Remote control technique |
US7259747B2 (en) * | 2001-06-05 | 2007-08-21 | Reactrix Systems, Inc. | Interactive video display system |
US20040001113A1 (en) * | 2002-06-28 | 2004-01-01 | John Zipperer | Method and apparatus for spline-based trajectory classification, gesture detection and localization |
US7932895B2 (en) * | 2005-05-24 | 2011-04-26 | Nokia Corporation | Control of an electronic device using a gesture as an input |
US20080040692A1 (en) | 2006-06-29 | 2008-02-14 | Microsoft Corporation | Gesture input |
TWI354225B (en) | 2008-04-29 | 2011-12-11 | Tsint | Touch free public browser |
US20090278801A1 (en) * | 2008-05-11 | 2009-11-12 | Kuo-Shu Cheng | Method For Executing Command Associated With Mouse Gesture |
WO2010011929A1 (en) * | 2008-07-25 | 2010-01-28 | Gesturetek, Inc. | Enhanced detection of waving engagement gesture |
CN101685343B (en) * | 2008-09-26 | 2011-12-28 | 联想(北京)有限公司 | Method, device and electronic aid for realizing gesture identification |
- 2010-11-01 WO PCT/CN2010/001733 patent/WO2012058782A1/en active Application Filing
- 2010-11-01 CN CN201080071032.0A patent/CN103282858B/en not_active Expired - Fee Related
- 2010-11-01 EP EP10859130.6A patent/EP2635952B1/en active Active
- 2010-11-01 MX MX2013004805A patent/MX2013004805A/en active IP Right Grant
- 2010-11-01 KR KR1020137014084A patent/KR101760159B1/en active IP Right Grant
- 2010-11-01 JP JP2013535232A patent/JP5997699B2/en not_active Expired - Fee Related
- 2010-11-01 BR BR112013010520-8A patent/BR112013010520B1/en not_active IP Right Cessation
- 2010-11-01 RU RU2013125228/08A patent/RU2581013C2/en active
- 2010-11-01 US US13/882,592 patent/US9189071B2/en not_active Expired - Fee Related
Patent Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6369794B1 (en) * | 1998-09-09 | 2002-04-09 | Matsushita Electric Industrial Co., Ltd. | Operation indication outputting device for giving operation indication according to type of user's action |
WO2001069365A1 (en) * | 2000-03-13 | 2001-09-20 | Ab In Credoble | Gesture recognition system |
CN1394325A (en) * | 2000-09-01 | 2003-01-29 | 美国索尼电脑娱乐公司 | User input device and method for interaction with graphic images |
JP2009211563A (en) * | 2008-03-05 | 2009-09-17 | Tokyo Metropolitan Univ | Image recognition device, image recognition method, image recognition program, gesture operation recognition system, gesture operation recognition method, and gesture operation recognition program |
WO2010088035A2 (en) * | 2009-01-30 | 2010-08-05 | Microsoft Corporation | Gesture recognizer system architecture |
US20100238137A1 (en) * | 2009-03-23 | 2010-09-23 | Samsung Electronics Co., Ltd. | Multi-telepointer, virtual object display device, and virtual object control method |
Non-Patent Citations (1)
Title |
---|
See also references of EP2635952A4 * |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2013250637A (en) * | 2012-05-30 | 2013-12-12 | Toshiba Corp | Recognition device |
US20140118244A1 (en) * | 2012-10-25 | 2014-05-01 | Pointgrab Ltd. | Control of a device by movement path of a hand |
JP2015127976A (en) * | 2015-03-04 | 2015-07-09 | 株式会社東芝 | Recognition device |
Also Published As
Publication number | Publication date |
---|---|
CN103282858B (en) | 2017-03-08 |
MX2013004805A (en) | 2013-07-02 |
JP5997699B2 (en) | 2016-09-28 |
BR112013010520A2 (en) | 2016-08-02 |
EP2635952A1 (en) | 2013-09-11 |
JP2013545183A (en) | 2013-12-19 |
BR112013010520B1 (en) | 2021-01-12 |
EP2635952B1 (en) | 2021-01-06 |
US20130215017A1 (en) | 2013-08-22 |
CN103282858A (en) | 2013-09-04 |
KR20130118895A (en) | 2013-10-30 |
US9189071B2 (en) | 2015-11-17 |
KR101760159B1 (en) | 2017-07-20 |
RU2013125228A (en) | 2014-12-10 |
EP2635952A4 (en) | 2014-09-17 |
RU2581013C2 (en) | 2016-04-10 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US9189071B2 (en) | Method and device for detecting gesture inputs | |
US8339359B2 (en) | Method and system for operating electric apparatus | |
US9405373B2 (en) | Recognition apparatus | |
US10366281B2 (en) | Gesture identification with natural images | |
US20110299737A1 (en) | Vision-based hand movement recognition system and method thereof | |
KR101631011B1 (en) | Gesture recognition apparatus and control method of gesture recognition apparatus | |
US20140101620A1 (en) | Method and system for gesture identification based on object tracing | |
CN111754571B (en) | Gesture recognition method and device and storage medium thereof | |
US11205066B2 (en) | Pose recognition method and device | |
US9489077B2 (en) | Optical touch panel system, optical sensing module, and operation method thereof | |
US20170083145A1 (en) | Electronic apparatus and control method thereof | |
EP2428870A1 (en) | Device and method for controlling gesture for mobile device | |
Siam et al. | Human computer interaction using marker based hand gesture recognition | |
KR101360322B1 (en) | Apparatus and method for controlling electric boards using multiple hand shape detection and tracking | |
KR102107182B1 (en) | Hand Gesture Recognition System and Method | |
US20120293432A1 (en) | Method for touch device to transmit coordinates, method for touch device to transmit displacement vector and computer-readable medium | |
Stang et al. | Development of a self-learning automotive comfort function: an adaptive gesture control with few-shot-learning | |
KR101348763B1 (en) | Apparatus and method for controlling interface using hand gesture and computer-readable recording medium with program therefor | |
US10936052B2 (en) | Method and device for determining head movement according to electrooculographic information | |
JP2013109444A (en) | Automatic control device | |
KR101506197B1 (en) | A gesture recognition input method using two hands | |
WO2020152878A1 (en) | Operation analysis device, operation analysis method, operation analysis program, and operation analysis system | |
JP5775023B2 (en) | Residence degree calculation device and operation method thereof | |
CN118092837A (en) | Display method and device of electronic equipment and electronic equipment | |
Huerta et al. | Hand Gesture Recognition With a Novel Particle Filter |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
 | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 10859130; Country of ref document: EP; Kind code of ref document: A1 |
 | ENP | Entry into the national phase | Ref document number: 2013535232; Country of ref document: JP; Kind code of ref document: A |
 | WWE | Wipo information: entry into national phase | Ref document number: MX/A/2013/004805; Country of ref document: MX |
 | WWE | Wipo information: entry into national phase | Ref document number: 13882592; Country of ref document: US |
 | NENP | Non-entry into the national phase | Ref country code: DE |
 | WWE | Wipo information: entry into national phase | Ref document number: 2010859130; Country of ref document: EP |
 | ENP | Entry into the national phase | Ref document number: 20137014084; Country of ref document: KR; Kind code of ref document: A |
 | ENP | Entry into the national phase | Ref document number: 2013125228; Country of ref document: RU; Kind code of ref document: A |
 | REG | Reference to national code | Ref country code: BR; Ref legal event code: B01A; Ref document number: 112013010520; Country of ref document: BR |
 | ENP | Entry into the national phase | Ref document number: 112013010520; Country of ref document: BR; Kind code of ref document: A2; Effective date: 20130429 |