WO2011059404A2 - Method and system for interactive gesture-based control - Google Patents

Method and system for interactive gesture-based control

Info

Publication number
WO2011059404A2
WO2011059404A2 (PCT/SG2009/000421)
Authority
WO
WIPO (PCT)
Prior art keywords
hand
gestures
stroke
strokes
gesture
Prior art date
Application number
PCT/SG2009/000421
Other languages
French (fr)
Other versions
WO2011059404A3 (en)
Inventor
Wing Kay Anthony Szeto
Original Assignee
Nanyang Polytechnic
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nanyang Polytechnic filed Critical Nanyang Polytechnic
Priority to PCT/SG2009/000421 priority Critical patent/WO2011059404A2/en
Publication of WO2011059404A2 publication Critical patent/WO2011059404A2/en
Publication of WO2011059404A3 publication Critical patent/WO2011059404A3/en

Links

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0346Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017Gesture based interaction, e.g. based on a set of recognized hand gestures
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/041Indexing scheme relating to G06F3/041 - G06F3/045
    • G06F2203/04101 2.5D-digitiser, i.e. digitiser detecting the X/Y position of the input means, finger or stylus, also when it does not touch, but is proximate to the digitiser's interaction surface and also measures the distance of the input means within a short range in the Z direction, possibly with a separate measurement setup

Definitions

  • the present invention generally relates to interactive technologies, and more particularly to a method for designing, representing and storing complex composite interactive hand gestures in a computer implementation, further to a database for designing the complex composite interactive hand gestures, and also to an interactive system for designing, representing and storing complex composite interactive hand gestures.
  • One embodiment of the present invention provides a method for designing, representing and storing complex composite interactive hand gestures in a computer implementation, wherein the computer is embedded with computer executable programs for performing the method.
  • the method comprises acquiring the composite interactive hand gesture performed by a user in front of an interactive interface window; pre-processing the acquired gestures by edge thinning and skeletonizing; dissecting a composite interactive hand gesture into a plurality of compass directional linear hand strokes based on the compass directions of the composite interactive hand gestures; determining the hand kinematics including speed and acceleration/deceleration of each stroke and the hand angular tilt during each stroke; representing the composite interactive hand gesture with a list of multi-tuple values including the plurality of compass directional linear hand strokes and the hand kinematics; designing the composite gesture according to the multi-tuple values of the acquired gesture; and codifying the acquired gestures using multi-tuple values of strokes.
  • it further comprises storing the codified composite interactive hand gesture into a database.
  • the gesture is any physical movement of an arm/hand that can be sensed and responded to by a digital system without the aid of a traditional device such as a mouse or stylus.
  • the compass directional linear hand stroke is selected from the group consisting of North, South, East, West, North-East, North-West, South-East, and South-West.
  • the direction of stroke movement of a single stroke is determined as follows: assuming the angle of 0° to be the origin reference/relative point with respect to the horizontal plane, increasing in a counter-clockwise direction; obtaining two linear vectors, namely, the reference horizontal position/plane and the stroke movement to be determined; determining the angular degree (with respect to the reference horizontal plane) using sine, cosine or tangent trigonometry calculations; and assigning its direction using the angular degree of the single stroke.
  • the step of assigning the direction of the stroke movement of a single stroke comprises determining whether the angular degree of the stroke falls within any of the compass direction points of 0°, 45°, 90°, 135°, 180°, 225°, 270°, and 315° with a degree of deviation, ±15°; if yes, a compass direction is assigned; if no, the stroke is ignored.
  • the angular tilt is categorized as left-tilt (LT), right-tilt (RT) or no-tilt (NT).
  • the speed is categorized as slow (S) (1-15 cm/sec), medium (M) (16-30 cm/sec), or fast (F) (>31 cm/sec).
  • the speed change is categorized as constant (C), acceleration (A), or deceleration (D), depending on the final recorded speed change.
  • Another embodiment of the present invention provides a database for designing the complex composite interactive hand gestures, wherein the database is stored in a computer-readable medium, said database comprising codification of compass directional linear hand strokes, kinematics (speed, acceleration/deceleration) of strokes, and angular-tilts of strokes; thereby the database enables a user to design a composite interactive hand gesture in a list of multi-tuple values, wherein the list of multi-tuple values includes codes for the compass directional linear hand strokes, kinematics of strokes, and angular-tilts of strokes.
  • the interactive system comprises an interface window for allowing a user to perform the gestures using hand movements so that the gestures of hand movements are captured; a hand gesture acquiring module electronically coupled with the interface window, for capturing the hand gestures when the user interacts with the interface window, wherein the hand gestures are comprised of a plurality of strokes with associated parameters; and a microprocessor embedded therein a set of computer programs for performing the designing, representing and storing complex composite interactive hand gestures, wherein the microprocessor comprises a memory for storing all the programs for performing the functions, intermediate results and database, wherein the microprocessor is electronically coupled with the hand gesture acquiring module and the interface window, and wherein the microprocessor comprises a pre-processing module for receiving the captured gestures from the hand gesture acquiring module, and identifying the compass directional linear hand strokes of the captured gestures, a processing module electronically coupled with the pre-processing module
  • the interactive window is a baseline location on the x and y axes for the width and height respectively to allow the gestures of hand movements to be captured in a pre-defined 2D space.
  • the interactive window includes z axis to capture the depth of the gestures.
  • the hand gesture acquiring module comprises a touchless (or touch) sensing device/equipment for capturing the hand gestures of the user, and optionally an angular tilt detecting means for the angular tilt of a gesture, a speed detecting means for detecting the speed of a gesture, and a speed change detecting means for the speed change of a gesture.
  • the microprocessor is a computer, notebook, or PDA.
  • the pre-processing module performs edge thinning and skeletonizing.
  • FIG. 1 shows linear compass directions in accordance with one embodiment of the present invention.
  • FIGs. 2A-2D illustrate eight single strokes or basic gestures.
  • FIGs. 3A-3B show two exemplary composite gestures consisting of three single strokes (A, B, and C).
  • FIG. 4 is an exemplary composite gesture that is codified and stored in accordance with the present invention.
  • FIG. 5 is a functional block diagram of the interactive system in accordance with one embodiment of the present invention.
  • FIG. 6 is a flowchart of the method for design, representation and storage of gestures in accordance with one embodiment of the present invention. Detailed Description of the Invention
  • the present invention provides a method for representing and storing complex composite interactive hand gestures in a computer implementation and also a database for designing the complex composite interactive hand gestures represented by the method of the present invention.
  • the method for representation and storage first dissects a composite interactive hand gesture into a plurality of compass directional linear hand strokes (more than one) based on the compass directions of the composite interactive hand gesture, then determines the hand kinematics including speed and acceleration/deceleration of each stroke and the hand angular tilt during each stroke, and then represents and stores the composite interactive hand gesture with a list of tuple values utilizing the strokes and the parameters of the strokes.
  • the database of the present invention comprises the codification of all possible compass directional linear hand strokes, the kinematics (speed, acceleration/deceleration) of strokes, and angular-tilts of strokes; thus the database enables the design of composite interactive hand gestures when a list of tuple values for a composite interactive hand gesture is known.
  • a gesture in the present invention refers to any physical movement of an arm/hand that can be sensed and responded to by a digital system without the aid of a traditional device such as a mouse or stylus.
  • a hand wave, an arm swing, a wrist movement, a finger tap or the like are examples of gestures.
  • Referring to FIG. 1, in accordance with one embodiment of the present invention, there are eight linear compass directions adopted for the movement of the hand: North, South, East, West, North-East, North-West, South-East, and South-West.
  • a single compass directional linear hand stroke will be represented by one of the eight linear compass directions; in other words, the eight linear compass directions will represent all compass directional linear hand strokes of a composite interactive hand gesture. It is to be noted that the principle of the linear compass directions is not so limited, and more directions may be employed to suit practical applications.
  • a basic gesture denotes the gestures that contain a single continuous stroke with no change in direction.
  • a composite gesture refers to the gestures that comprise two or more basic gestures; that is, the hand movement in a composite gesture can be separated into two or more segments, each of which consists of a single continuous stroke with a new direction change; thus a composite gesture can be viewed as a combination of single strokes or basic gestures.
  • the linear compass directions described in FIG. 1 correspond to the movement directions of distinct single strokes as shown in Table 1, and the movement directions of the single strokes are codified as shown in Table 1.
  • the direction of stroke movement of a single stroke is determined in accordance with one embodiment of the present invention.
  • the angle of 0° is assumed to be the origin reference/relative point with respect to the horizontal plane, increasing in a counter-clockwise direction.
  • the angular degree (with respect to the reference horizontal plane) can be determined using sine, cosine or tangent trigonometry calculations.
  • once the angular degree of the single stroke is obtained, its direction is assigned according to the following principle.
  • the eight compass direction points correspond to 0°, 45°, 90°, 135°, 180°, 225°, 270°, and 315°.
  • a range of angular degrees is used to assign one of the eight directions to a single stroke.
  • a degree of deviation, ±15° from the compass direction points of 0°, 45°, 90°, 135°, 180°, 225°, 270°, and 315°, is used, meaning that when the angular degree of a single stroke falls within the range of, for example, 345°-15° (wrapping through 0°), it will be assigned a direction of East (i.e., horizontal left-to-right stroke) with a code of A3 (see Table 1).
  • Table 2 shows the stroke movement and corresponding angular degree ranges.
  • In FIGs. 2A-2D, there is provided an illustration of the eight single strokes or basic gestures.
  • FIG. 2A shows a horizontal left-to-right gesture and a horizontal right-to-left gesture
  • FIG. 2B shows a diagonal left-to-right upward gesture and a diagonal right-to-left downward gesture
  • FIG. 2C shows a vertical upward gesture and a vertical downward gesture
  • FIG. 2D shows a diagonal left-to-right downward gesture and a diagonal right-to-left upward gesture.
  • In FIGs. 3A-3B, there are provided two exemplary composite gestures consisting of three single strokes (A, B, and C).
  • FIG. 3A shows a composite gesture consisting of a sequential three strokes of A->B->C; thus the hand encoded with the composite gesture performs the sequential movements of horizontal left-to-right, diagonal right-to-left downward, and vertical upward.
  • FIG. 3B shows a composite gesture consisting of a sequential three strokes of C->A->B; thus the hand encoded with the composite gesture performs the sequential movements of vertical upward, horizontal left-to-right, and diagonal right-to-left downward.
  • the hand movements during single strokes or basic gestures may include the angular tilt or inclination of the hand.
  • the angular tilt is categorized as left-tilt (LT), right-tilt (RT) or no-tilt (NT).
  • the angular tilt is recorded and associated with the respective single stroke or basic gesture. It is to be noted that the categorization of the angular tilt is not so limited; any suitable categorization shall be covered by the present invention.
  • kinematics information from the single stroke can be used to inject distinct characteristics into each single stroke and further into the composite gesture comprising single strokes.
  • the speed of the hand is categorized as slow (S) (1-15 cm/sec), medium (M) (16-30 cm/sec), or fast (F) (>31 cm/sec).
  • the speed is recorded and associated with the respective single stroke or basic gesture.
  • the speed change of the hand is categorized as constant (C), acceleration (A), or deceleration (D), depending on the final recorded speed change.
  • constant (C) refers to that the speed remains constant throughout the stroke.
  • acceleration (A) refers to that the speed increases either throughout or prior to the end of the stroke.
  • deceleration (D) refers to that the speed decreases either throughout or prior to the end of the stroke.
  • the speed change is recorded and associated with the respective single stroke or basic gesture. It is to be noted that the categorization of speed and speed change of the hand during performing the strokes is not so limited; any other suitable schemes for categorization is covered by the present invention.
  • the database for design or representation of composite gestures comprises the codes for the movement directions, angular tilt, speed, and speed change for the hand.
  • a composite gesture can be represented by a series of ordered codes and stored; and conversely a composite gesture can be generated from the database according to a series of ordered codes.
  • the exemplary composite gesture comprises the following ordered strokes: i) horizontal left-to-right stroke (A); ii) diagonal right-to-left downward stroke (B); iii) vertical upward stroke (C).
  • the sets of ordered paired (x, y) values representing the composite gesture (A)->(B)->(C) are:
  • the sets of ordered paired (x, y) values are used to generate linear vectors which are used to determine the directions of the gesture strokes by performing trigonometric calculations as described above.
  • the direction codes for each stroke can be derived and referenced from Table 1. Then, the composite gesture with only the directions available can be represented and stored as follows:
  • the composite gesture can be represented and stored by 3 sets of 4-tuple values as follows:
  • for example, one such 4-tuple set may represent a vertical upward direction gesture stroke in which the hand is not tilted, moving at medium speed with a slowing/decelerating motion.
  • each gesture is represented as a list of tuple values and stored as a record in the database of hand interactive composite gestures; how many tuple values are associated and stored for each composite gesture depends on the structure of the system as described below or the application requirements desired by a user.
  • the present invention also provides an interactive system using hand gestures.
  • FIG. 5 there is provided a functional block diagram of the interactive system in accordance with one embodiment of the present invention.
  • The interactive system 1 comprises an interface window 2, a hand gesture acquiring module 3, and a microprocessor 4 including a pre-processing module 5, a database 6, and a processing module 7. It is to be noted that the system implements many common and commercial elements; their features and specifics are known to one skilled in the art; thus no details for these elements will be provided herein.
  • the interface window 2 is the area of operation and comprises a baseline location on the x and y axes for the width and height respectively to capture the gestures of hand movements in a pre-defined 2D space.
  • the x and y axes determine the x and y coordinates respectively in the 2D space. With the inclusion of the z axis, the depth of the gesture movements can also be captured and utilized.
  • the interface window 2 also allows a user to perform the gestures using a single finger or the hand/palm as the interaction input point.
  • the hand input interaction point may comprise a finger, a fist, or the palm of the hand.
  • the interface window 2 also serves as a display device for displaying the gestures outputted from the processing module 7.
  • the interface window 2 provides direct manipulation to control the digital space in the immediate surrounding environment and trigger system responses. By using an interactive gestural interface, the user has the ability to interact and communicate with the system, intuitively using normal hand gestures.
  • the hand gesture acquiring module 3 is electronically coupled with the interface window 2, for capturing the hand gestures when a user interacts with the interface window 2.
  • the hand gesture acquiring module 3 comprises a touchless (or touch) sensing device/equipment for capturing the hand gestures of the user, and optionally an angular tilt detecting means for example gyroscope for the angular tilt of a gesture, a speed detecting means for example a speed gauge for detecting the speed of a gesture, and a speed change detecting means for example an accelerometer for the speed change of a gesture.
  • the configuration and use of these components are known to one skilled in the art.
  • the capture of a gesture involves the use of the coordinate values of the constituent pixels making up the gesture.
  • the "x", "y" coordinate values are represented as sets of paired values, ordered in a tuple list as a vector for each stroke of the gesture as described above.
  • the direction of the strokes within a gesture can be determined.
  • the strokes of a gesture are then codified in association with their directions.
  • the microprocessor 4 performs all the computing functions, and may be a computer, notebook, PDA or the like. It usually has a memory for storing all the programs for performing the functions, intermediate results and the database. It is electronically coupled with the hand gesture acquiring module 3 and the interface window 2.
  • the microprocessor 4 comprises a pre-processing module 5, a database 6, and a processing module 7, where all components are electronically coupled.
  • the pre-processing module 5 receives the acquired gestures from the hand gesture acquiring module 3, and identifies the strokes of the acquired gestures.
  • the pre-processing includes edge thinning and skeletonizing. Thick or broad stroke movements are reduced in thickness by applying thinning algorithms used in computer graphics or image processing. Algorithms such as 1) the classical "Hilditch" thinning algorithm; 2) the "One-Pass Parallel Asymmetric Thinning Algorithm (OPATA)"; and 3) the "Contour-Coherence Based Thinning Algorithm" are examples of thinning algorithms that can be used in the thinning and skeletonization of the acquired stroke movements.
  • the identified strokes from the pre-processing module are outputted to the processing module 7.
  • the processing module 7 extracts the values of the parameters associated with the strokes, where the parameters include movement directions, angular tilt, speed, and speed change; all have been described above.
  • the processing module 7 codifies the composite gestures using a series of multi- tuple values and stores the codified composite gestures in the database 6, and also presents the composite gestures designed according to the multi-tuple values to the interface window allowing the user to actively interact with the system.
  • the present invention also provides a method for representation, design and storage of composite gestures captured in an interactive system.
  • FIG. 6 there is provided a flowchart of the method for representation, design and storage of composite gestures in accordance with one embodiment of the present invention.
  • the method 10 comprises acquiring the input gestures performed by a user in front of an interactive interface window 20, pre-processing the acquired input gestures by e.g. edge thinning and skeletonizing so that the strokes embedded in the acquired gestures are identified 30, determining the directions and angular tilt of strokes 40, optionally determining the kinematics of the strokes, e.g. the speed and speed change of each stroke, codifying the gestures using multi-tuple values of the strokes, and storing the codified composite gestures into the database.
  • Typical existing techniques use image capture, representing a gesture by a camera image or video of the gesture movement.
  • the accuracy and effectiveness of this approach is dependent on numerous factors, including the speed of the gesture movement, the positioning of the arm/hand, the possible occlusion by other parts of the body, the visibility of the gesture action, the positions of the cameras, blind spots, and other influencing factors.
  • the proposed invention can accurately and completely capture the gesture stroke, with no missing or incorrectly recorded stroke sequence. This is achieved by directly recording the pixels involved in the gesture stroke, rather than indirectly through an image of the gesture stroke.
  • the proposed technique can capture any stroke movement without being constrained or affected by external factors, such as lighting, noise, occlusion, etc., that may be experienced by an image capturing system.
  • the proposed method of generating gestures allows a plurality of different gestures, from basic to complex, to be created simply by using an amalgamation of simple basic gesture strokes.
  • This method of creating gesture strokes facilitates their capture, aids in a non-complex, unambiguous representation, and helps in the efficient data storage of the gesture strokes.
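The dissection-and-codification principle described above can be sketched in a few lines of Python. The sketch below is illustrative only: the single-letter direction labels are not the codes of the patent's Table 1 (which is not reproduced here), and y is assumed to increase upward.

```python
import math

def snap_direction(dx, dy, dev=15):
    """Snap a movement vector to one of the eight compass directions
    (within +/-dev degrees of a compass point); return None otherwise."""
    angle = math.degrees(math.atan2(dy, dx)) % 360
    names = {0: "E", 45: "NE", 90: "N", 135: "NW",
             180: "W", 225: "SW", 270: "S", 315: "SE"}
    for point, name in names.items():
        # angular distance on the circle, handling the wrap at 0/360
        if min(abs(angle - point), 360 - abs(angle - point)) <= dev:
            return name
    return None

def dissect(points):
    """Dissect an ordered list of (x, y) samples of a composite gesture
    into compass directional linear strokes: consecutive segments with
    the same snapped direction merge into one stroke; segments that snap
    to no direction are ignored, as described in the text."""
    strokes = []
    for (x0, y0), (x1, y1) in zip(points, points[1:]):
        d = snap_direction(x1 - x0, y1 - y0)
        if d is None:
            continue
        if not strokes or strokes[-1] != d:
            strokes.append(d)
    return strokes

# An L-shaped composite gesture: rightward, then upward -> two strokes.
pts = [(0, 0), (5, 0), (10, 0), (10, 5), (10, 10)]
print(dissect(pts))  # ['E', 'N']
```

A real implementation would operate on the pixel coordinates captured by the acquiring module, but the segmentation logic is the same.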

Abstract

The present invention provides a method (10) for designing, representing and storing complex composite interactive hand gestures in a computer implementation. The present invention further provides a database (6) for designing the complex composite interactive hand gestures, wherein the database is stored in a computer-readable medium, said database (6) comprising codification of compass directional linear hand strokes, kinematics (speed, acceleration/deceleration) of strokes, and angular-tilts of strokes. The present invention also provides an interactive system (1) for designing, representing and storing complex composite interactive hand gestures.

Description

METHOD AND SYSTEM FOR INTERACTIVE GESTURE-BASED CONTROL
Field of the Invention
[0001] The present invention generally relates to interactive technologies, and more particularly to a method for designing, representing and storing complex composite interactive hand gestures in a computer implementation, further to a database for designing the complex composite interactive hand gestures, and also to an interactive system for designing, representing and storing complex composite interactive hand gestures.
Background of the Invention
[0002] Human-machine interactive technologies are expanding rapidly across a wide range of applications. Sign recognition systems and even more complex gesture recognition systems have been developed based on various methods to locate and track hands and their motions with respect to other body parts (e.g., arms, torso, head, and the like). However, the prior art fails to provide a human-machine interactive system that is accurate, efficient, and less demanding of computing power.
Summary of the Invention
[0003] One embodiment of the present invention provides a method for designing, representing and storing complex composite interactive hand gestures in a computer implementation, wherein the computer is embedded with computer executable programs for performing the method. In one embodiment, the method comprises acquiring the composite interactive hand gesture performed by a user in front of an interactive interface window; pre-processing the acquired gestures by edge thinning and skeletonizing; dissecting a composite interactive hand gesture into a plurality of compass directional linear hand strokes based on the compass directions of the composite interactive hand gestures; determining the hand kinematics including speed and acceleration/deceleration of each stroke and the hand angular tilt during each stroke; representing the composite interactive hand gesture with a list of multi-tuple values including the plurality of compass directional linear hand strokes and the hand kinematics; designing the composite gesture according to the multi-tuple values of the acquired gesture; and codifying the acquired gestures using multi-tuple values of strokes.
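The edge-thinning and skeletonizing pre-processing step mentioned above can be sketched with a classical thinning algorithm. The sketch below uses Zhang-Suen thinning purely as an illustrative stand-in for the Hilditch, OPATA and contour-coherence algorithms named later in the description; the 0/1 grid encoding is an assumption.

```python
def zhang_suen_thin(grid):
    """Thin a binary image in place (Zhang-Suen scheme).  `grid` is a
    list of lists of 0/1 pixels and must have an all-zero border, since
    only interior pixels are examined."""
    h, w = len(grid), len(grid[0])

    def neighbours(r, c):
        # p2..p9: clockwise ring starting from the pixel directly above.
        return [grid[r-1][c], grid[r-1][c+1], grid[r][c+1], grid[r+1][c+1],
                grid[r+1][c], grid[r+1][c-1], grid[r][c-1], grid[r-1][c-1]]

    changed = True
    while changed:
        changed = False
        for step in (0, 1):
            to_clear = []
            for r in range(1, h - 1):
                for c in range(1, w - 1):
                    if not grid[r][c]:
                        continue
                    p = neighbours(r, c)
                    b = sum(p)                       # nonzero neighbours
                    a = sum(p[i] == 0 and p[(i + 1) % 8] == 1
                            for i in range(8))       # 0->1 transitions
                    if step == 0:
                        cond = p[0]*p[2]*p[4] == 0 and p[2]*p[4]*p[6] == 0
                    else:
                        cond = p[0]*p[2]*p[6] == 0 and p[0]*p[4]*p[6] == 0
                    if 2 <= b <= 6 and a == 1 and cond:
                        to_clear.append((r, c))
            if to_clear:
                changed = True
            for r, c in to_clear:
                grid[r][c] = 0
    return grid

# A 3-pixel-wide vertical bar; thinning reduces it toward a 1-pixel line.
bar = [[0] * 5 for _ in range(7)]
for r in range(1, 6):
    for c in range(1, 4):
        bar[r][c] = 1
zhang_suen_thin(bar)
print(sum(map(sum, bar)))  # pixel count of the resulting skeleton
```

The patent's named algorithms differ in detail (e.g. OPATA is one-pass and parallel), but all serve the same purpose here: reducing a broad captured stroke to a one-pixel-wide path before direction analysis.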
[0004] In another embodiment of the method, it further comprises storing the codified composite interactive hand gesture into a database.
[0005] In another embodiment of the method, the gesture is any physical movement of an arm/hand that can be sensed and responded to by a digital system without the aid of a traditional device such as a mouse or stylus.
[0006] In another embodiment of the method, the compass directional linear hand stroke is selected from the group consisting of North, South, East, West, North-East, North-West, South-East, and South-West.
[0007] In another embodiment of the method, the direction of stroke movement of a single stroke is determined as follows: assuming the angle of 0° to be the origin reference/relative point with respect to the horizontal plane, increasing in a counter-clockwise direction; obtaining two linear vectors, namely, the reference horizontal position/plane and the stroke movement to be determined; determining the angular degree (with respect to the reference horizontal plane) using sine, cosine or tangent trigonometry calculations; and assigning its direction using the angular degree of the single stroke. In a further embodiment, the step of assigning the direction of the stroke movement of a single stroke comprises determining whether the angular degree of the stroke falls within any of the compass direction points of 0°, 45°, 90°, 135°, 180°, 225°, 270°, and 315° with a degree of deviation, ±15°; if yes, a compass direction is assigned; if no, the stroke is ignored.
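The direction-assignment step above can be sketched minimally in Python (assuming coordinates with y increasing upward; with screen coordinates where y increases downward, the sign of the y difference would flip):

```python
import math

# Eight compass directions at 45-degree intervals, East at 0 degrees,
# angles increasing counter-clockwise, as in the text.
COMPASS_POINTS = {
    0: "East", 45: "North-East", 90: "North", 135: "North-West",
    180: "West", 225: "South-West", 270: "South", 315: "South-East",
}
DEVIATION = 15  # the +/-15 degree tolerance described above

def stroke_direction(x0, y0, x1, y1):
    """Return the compass direction of a single linear stroke, or None
    when its angle falls outside every tolerance band (stroke ignored)."""
    angle = math.degrees(math.atan2(y1 - y0, x1 - x0)) % 360
    for point, name in COMPASS_POINTS.items():
        # angular distance on the circle, handling the wrap at 0/360
        if min(abs(angle - point), 360 - abs(angle - point)) <= DEVIATION:
            return name
    return None

print(stroke_direction(0, 0, 10, 0))   # East
print(stroke_direction(0, 0, 10, 11))  # North-East (about 47.7 degrees)
print(stroke_direction(0, 0, 10, 5))   # None: about 26.6 degrees, no band
```

`atan2` plays the role of the sine/cosine/tangent calculation in the text, since it resolves the correct quadrant directly from the two vector components.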
[0008] In another embodiment of the method, the angular tilt is categorized as left-tilt (LT), right-tilt (RT) or no-tilt (NT).
[0009] In another embodiment of the method, the speed is categorized as slow (S) (1-15 cm/sec), medium (M) (16-30 cm/sec), or fast (F) (>31 cm/sec).
[0010] In another embodiment of the method, the speed change is categorized as constant (C), acceleration (A), or deceleration (D), depending on the final recorded speed change.
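The speed and speed-change categories of paragraphs [0009] and [0010] map directly onto two small classifier functions. The tolerance used to decide what still counts as "constant" is an assumption, since the text only defines the three categories:

```python
def categorize_speed(speed_cm_per_s):
    """Map an average stroke speed to the codes used in the text:
    slow (S) 1-15 cm/sec, medium (M) 16-30 cm/sec, fast (F) above that."""
    if speed_cm_per_s <= 15:
        return "S"
    if speed_cm_per_s <= 30:
        return "M"
    return "F"

def categorize_speed_change(speed_samples, tolerance=1.0):
    """Classify the final recorded speed change of a stroke as constant
    (C), acceleration (A) or deceleration (D) from sampled speeds.  The
    tolerance (in cm/sec) is an illustrative assumption."""
    delta = speed_samples[-1] - speed_samples[0]
    if abs(delta) <= tolerance:
        return "C"
    return "A" if delta > 0 else "D"

print(categorize_speed(12))                    # S
print(categorize_speed(22))                    # M
print(categorize_speed_change([10, 14, 20]))   # A
print(categorize_speed_change([25, 18, 9]))    # D
```

In the system described below, the speed samples would come from the speed detecting means (e.g. a speed gauge) and the classification would be attached to each stroke's tuple.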
[0011] Another embodiment of the present invention provides a database for designing the complex composite interactive hand gestures, wherein the database is stored in a computer-readable medium, said database comprising codification of compass directional linear hand strokes, kinematics (speed, acceleration/deceleration) of strokes, and angular-tilts of strokes; thereby the database enables a user to design a composite interactive hand gesture in a list of multi-tuple values, wherein the list of multi-tuple values includes codes for the compass directional linear hand strokes, kinematics of strokes, and angular-tilts of strokes.
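A minimal sketch of such a gesture database, storing each composite gesture as an ordered list of 4-tuples (direction, tilt, speed, speed change). The schema, the gesture name and the concrete tuple values are illustrative; compass names stand in for the Table 1 direction codes, which are not reproduced in this text:

```python
import sqlite3

# In-memory store: one row per stroke, ordered by `seq` within a gesture.
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE gesture (
    name TEXT, seq INTEGER, direction TEXT,
    tilt TEXT, speed TEXT, speed_change TEXT)""")

def store_gesture(name, strokes):
    """Store a codified composite gesture as its ordered 4-tuples."""
    for i, (d, t, s, c) in enumerate(strokes):
        conn.execute("INSERT INTO gesture VALUES (?,?,?,?,?,?)",
                     (name, i, d, t, s, c))

def load_gesture(name):
    """Reconstruct the ordered tuple list for a stored gesture."""
    return conn.execute(
        "SELECT direction, tilt, speed, speed_change FROM gesture "
        "WHERE name=? ORDER BY seq", (name,)).fetchall()

# The three-stroke example from the description: horizontal left-to-right,
# diagonal right-to-left downward, then vertical upward (decelerating).
store_gesture("demo", [
    ("East", "NT", "M", "C"),
    ("South-West", "NT", "M", "C"),
    ("North", "NT", "M", "D"),
])
print(load_gesture("demo"))
```

Because each record is just an ordered tuple list, the same table supports both directions of use described above: storing a captured gesture, and regenerating (designing) a gesture from known tuple values.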
[0012] Another embodiment of the present invention provides an interactive system for designing, representing and storing complex composite interactive hand gestures. In one embodiment, the interactive system comprises an interface window for allowing a user to perform the gestures using hand movements so that the gestures of hand movements are captured; a hand gesture acquiring module electronically coupled with the interface window, for capturing the hand gestures when the user interacts with the interface window, wherein the hand gestures are comprised of a plurality of strokes with associated parameters; and a microprocessor embedded therein a set of computer programs for performing the designing, representing and storing complex composite interactive hand gestures, wherein the microprocessor comprises a memory for storing all the programs for performing the functions, intermediate results and database, wherein the microprocessor is electronically coupled with the hand gesture acquiring module and the interface window, and wherein the microprocessor comprises a pre-processing module for receiving the captured gestures from the hand gesture acquiring module, and identifying the compass directional linear hand strokes of the captured gestures, a processing module electronically coupled with the pre-processing module for receiving the identified strokes from the preprocessing module, extracting the values of the parameters associated with the strokes, wherein the parameters include movement directions, angular tilt, speed, and speed change, codifying the composite gestures using a series of multi-tuple values, and designing and outputting the codified composite gestures; and a database electronically coupled with the processing module for storing the codified composite gestures; whereby the outputted gestures are displayed on the interactive window or other displaying devices.
[0013] In another embodiment of the interactive system, the interactive window is a baseline location on the x and y axes for the width and height respectively to allow the gestures of hand movements to be captured in a pre-defined 2D space.
[0014] In another embodiment of the interactive system, the interactive window includes a z axis to capture the depth of the gestures.

[0015] In another embodiment of the interactive system, the hand gesture acquiring module comprises a touchless (or touch) sensing device/equipment for capturing the hand gestures of the user, and optionally an angular tilt detecting means for the angular tilt of a gesture, a speed detecting means for detecting the speed of a gesture, and a speed change detecting means for the speed change of a gesture.
[0016] In another embodiment of the interactive system, the microprocessor is a computer, notebook, or PDA.
[0017] In another embodiment of the interactive system, the pre-processing module performs edge thinning and skeletonizing.
[0018] The objectives and advantages of the invention will become apparent from the following detailed description of preferred embodiments thereof in connection with the accompanying drawings.
Brief Description of the Drawings
[0019] Preferred embodiments according to the present invention will now be described with reference to the Figures, in which like reference numerals denote like elements.
[0020] FIG. 1 shows linear compass directions in accordance with one embodiment of the present invention.
[0021] FIGs. 2A-2D illustrate eight single strokes or basic gestures.
[0022] FIGs. 3A-3B show two exemplary composite gestures consisting of three single strokes (A, B, and C).
[0023] FIG. 4 is an exemplary composite gesture that is codified and stored in accordance with the present invention.
[0024] FIG. 5 is a functional block diagram of the interactive system in accordance with one embodiment of the present invention.
[0025] FIG. 6 is a flowchart of the method for design, representation and storage of gestures in accordance with one embodiment of the present invention.

Detailed Description of the Invention
[0026] The present invention may be understood more readily by reference to the following detailed description of certain embodiments of the invention.
[0027] Throughout this application, where publications are referenced, the disclosures of these publications are hereby incorporated by reference, in their entireties, into this application in order to more fully describe the state of art to which this invention pertains.
[0028] The present invention provides a method for representing and storing complex composite interactive hand gestures in a computer implementation and also a database for designing the complex composite interactive hand gestures represented by the method of the present invention. Briefly, the method for representation and storage first dissects a composite interactive hand gesture into a plurality of compass directional linear hand strokes (more than one) based on the compass directions of the composite interactive hand gesture, then determines the hand kinematics including speed and acceleration/deceleration of each stroke and the hand angular tilt during each stroke, and then represents and stores the composite interactive hand gesture with a list of tuple values utilizing the strokes and the parameters of the strokes. The database of the present invention comprises the codification of all possible compass directional linear hand strokes, the kinematics (speed, acceleration/deceleration) of strokes, and angular-tilts of strokes; thus the database enables the design of composite interactive hand gestures when a list of tuple values for a composite interactive hand gesture is known.
[0029] A gesture in the present invention refers to any physical movement of an arm/hand that can be sensed and responded to by a digital system without the aid of a traditional device such as a mouse or stylus. A hand wave, an arm swing, a wrist movement, a finger tap or the like are examples of gestures.
[0030] Now referring to FIG. 1, there is provided linear compass directions in accordance with one embodiment of the present invention. As shown in FIG. 1, there are 8 linear compass directions adopted for the movement of the hand: North, South, East, West, North-East, North-West, South-East, South-West. A single compass directional linear hand stroke will be represented by one of the eight linear compass directions; in other words, the eight linear compass directions will represent all compass directional linear hand strokes of a composite interactive hand gesture. It is to be noted that the principle of the linear compass directions is not so limited, and more directions may be employed to suit practical applications.
[0031] The present invention covers both basic and composite gestures. A basic gesture denotes a gesture that contains a single continuous stroke with no change in direction. A composite gesture refers to a gesture that comprises two or more basic gestures; that is, the hand movement in a composite gesture can be separated into two or more segments, each consisting of a single continuous stroke with a new direction change. A composite gesture can thus be viewed as a combination of single strokes or basic gestures.
[0032] In one embodiment, the linear compass directions described in FIG. 1 correspond to the movement directions of distinct single strokes as shown in Table 1, and the movement directions of the single strokes are codified as shown in Table 1.
[0033] Table 1. Correspondence of movement orientation and stroke movement
[Table 1 appears as an image in the original publication. From the surrounding text, code A3 denotes the horizontal left-to-right (East) stroke, A1 the vertical upward (North) stroke, and D4 the diagonal right-to-left downward stroke; the remaining stroke codes are not reproduced in the text.]
[0034] Now the direction of stroke movement of a single stroke is determined in accordance with one embodiment of the present invention. The angle of 0° is assumed to be the origin reference/relative point with respect to the horizontal plane, increasing in a counter clock-wise direction. With two linear vectors known, namely, the reference horizontal position/plane and the stroke movement to be determined, the angular degree (with respect to the reference horizontal plane) can be determined using sine, cosine or tangent trigonometry calculations. When the angular degree of the single stroke is obtained, its direction is assigned according to the following principle.
[0035] From FIG. 1, the eight compass direction points correspond to 0°, 45°, 90°,
135°, 180°, 225°, 270°, and 315°. In reality, it is impossible for the angular degrees of single strokes to fall exactly on these points. Thus, a range of angular degrees is used to assign one of the eight directions to a single stroke. In one embodiment, a degree of deviation, ±15°, from the compass direction points of 0°, 45°, 90°, 135°, 180°, 225°, 270°, and 315° is used, meaning that when the angular degree of a single stroke falls into, for example, the range of 345°-15° (i.e., within ±15° of 0°), it will be assigned a direction of East (i.e., horizontal left-to-right stroke) with a code of A3 (see Table 1). Table 2 shows the stroke movement and corresponding angular degree ranges.
[0036] Table 2. Stroke movement and corresponding angular degree ranges
Stroke movement                                  Angular degree range
East (horizontal left-to-right)                  345° - 15°
North-East (diagonal left-to-right upward)       30° - 60°
North (vertical upward)                          75° - 105°
North-West (diagonal right-to-left upward)       120° - 150°
West (horizontal right-to-left)                  165° - 195°
South-West (diagonal right-to-left downward)     210° - 240°
South (vertical downward)                        255° - 285°
South-East (diagonal left-to-right downward)     300° - 330°

[Table 2 appears as an image in the original publication; the ranges above are reconstructed from the ±15° deviation described in paragraph [0035] and the uncovered ranges of Table 3.]
[0037] It is apparent that eight ranges of angular degrees are not covered by Table 2; the uncovered ranges are listed in Table 3.
[0038] Table 3. Uncovered ranges of angular degrees
15° < X < 30°
60° < X < 75°
105° < X < 120°
150° < X < 165°
195° < X < 210°
240° < X < 255°
285° < X < 300°
330° < X < 345°
[0039] When the angular degree of a single stroke falls in the uncovered ranges, it will be deemed undefined and ignored. This will avoid any ambiguous interpretation of the stroke movement. Hence, the accuracy and robustness of the present invention will not be compromised. It is to be noted that the setup of angular degree ranges for the direction of stroke movement is not so limited; one skilled in the art shall be able to use any ranges to suit their applications.
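The classification described above — a ±15° band around each of the eight compass points, with all other angles ignored as undefined — can be sketched as follows. The function and constant names are illustrative, not from the specification:

```python
import math

# Eight compass points as (centre angle, direction name); angles measured
# counter-clockwise from the horizontal, with East at 0°, as in FIG. 1.
COMPASS_POINTS = [
    (0, "E"), (45, "NE"), (90, "N"), (135, "NW"),
    (180, "W"), (225, "SW"), (270, "S"), (315, "SE"),
]

DEVIATION = 15  # ± tolerance around each compass point, per Table 2

def classify_direction(angle_deg):
    """Map an angular degree to a compass direction, or None when the
    angle falls in one of the uncovered ranges of Table 3."""
    angle = angle_deg % 360
    for centre, name in COMPASS_POINTS:
        # circular distance handles the wrap-around at 0°/360°
        diff = abs(angle - centre)
        if min(diff, 360 - diff) <= DEVIATION:
            return name
    return None  # undefined: the stroke is ignored

def stroke_angle(dx, dy):
    """Angle of a stroke vector, counter-clockwise from the horizontal,
    assuming the mathematical convention (y increases upward)."""
    return math.degrees(math.atan2(dy, dx)) % 360
```

For example, `classify_direction(10)` falls within ±15° of 0° and yields East, while `classify_direction(20)` lies in an uncovered range and yields `None`.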
[0040] Now referring to FIGs. 2A-2D, there is provided an illustration of the eight single strokes or basic gestures. FIG. 2A shows a horizontal left-to-right gesture and a horizontal right-to-left gesture; FIG. 2B shows a diagonal left-to-right upward gesture and a diagonal right-to-left downward gesture; FIG. 2C shows a vertical upward gesture and a vertical downward gesture; and FIG. 2D shows a diagonal left-to-right downward gesture and a diagonal right-to-left upward gesture.
[0041] Now referring to FIGs. 3A-3B, there are provided two exemplary composite gestures consisting of three single strokes (A, B, and C). FIG. 3A shows a composite gesture consisting of three sequential strokes A->B->C; thus the hand encoded with the composite gesture performs the sequential movements of horizontal left-to-right, diagonal right-to-left downward, and vertical upward. FIG. 3B shows a composite gesture consisting of three sequential strokes C->A->B; thus the hand encoded with the composite gesture performs the sequential movements of vertical upward, horizontal left-to-right, and diagonal right-to-left downward. It is evident that even though the two composite gestures are comprised of the same single strokes, they are totally different in their hand movements because the orders of the single strokes are not the same; this ordered combination of multiple single strokes or basic gestures allows the design or creation of various composite gestures based on the basic gesture types and their ordering sequence.
[0042] In addition to movement directions, the hand movements during single strokes or basic gestures may include the angular tilt or inclination of the hand. In one embodiment, the angular tilt is categorized as left-tilt (LT), right-tilt (RT) or no-tilt (NT). The angular tilt is recorded and associated with the respective single stroke or basic gesture. It is to be noted that the categorization of the angular tilt is not so limited; any suitable categorization shall be covered by the present invention.
[0043] Furthermore, kinematics information from the single stroke can be used to inject distinct characteristics into each single stroke and further into the composite gesture comprising single strokes. While performing a single stroke, the speed of the hand is categorized as slow (S) (1-15 cm/sec), medium (M) (16-30 cm/sec), or fast (F) (>31 cm/sec). The speed is recorded and associated with the respective single stroke or basic gesture. Similarly, the speed change of the hand during a single stroke is categorized as constant (C), acceleration (A), or deceleration (D), depending on the final recorded speed change. For example, i) acceleration denotes that speed increases throughout the stroke; ii) deceleration denotes that speed decreases either throughout or prior to the end of the stroke. The speed change is recorded and associated with the respective single stroke or basic gesture. It is to be noted that the categorization of speed and speed change of the hand during performing the strokes is not so limited; any other suitable scheme for categorization is covered by the present invention.
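The speed and speed-change categorizations above can be sketched as follows. Note that the boundary behaviour between 30 and 31 cm/sec, and the dead band used to judge "constant" speed, are assumptions, since the text does not specify them:

```python
def categorize_speed(cm_per_sec):
    """Speed classes per the embodiment: slow 1-15, medium 16-30,
    fast >31 cm/sec (speeds up to 30 are treated as medium here)."""
    if cm_per_sec <= 15:
        return "S"
    if cm_per_sec <= 30:
        return "M"
    return "F"

def categorize_speed_change(start_speed, end_speed, tolerance=1.0):
    """Constant (C), acceleration (A) or deceleration (D), judged from
    the final recorded speed change. `tolerance` (cm/sec) is an assumed
    dead band for declaring the speed constant."""
    delta = end_speed - start_speed
    if abs(delta) <= tolerance:
        return "C"
    return "A" if delta > 0 else "D"
```

Each returned code would then be recorded alongside the stroke it characterizes, as described in the paragraph above.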
[0044] In one embodiment, the database for design or representation of composite gestures comprises the codes for the movement directions, angular tilt, speed, and speed change for the hand. Thus, a composite gesture can be represented by a series of ordered codes and stored; and conversely a composite gesture can be generated from the database according to a series of ordered codes.
[0045] Now referring to FIG. 4, there is provided an exemplary composite gesture that is codified and stored in accordance with the present invention. The exemplary composite gesture comprises the following ordered strokes: i) horizontal left-to-right stroke (A); ii) diagonal right-to-left downward stroke (B); iii) vertical upward stroke (C). The sets of ordered paired (x, y) values representing the composite gesture (A)->(B)->(C) are:
[0046] {[(3,1)(4,1)(5,1)(6,1)][(5,2)(4,3)(3,4)(2,5)][(2,4)(2,3)(2,2)]}
[0047] The sets of ordered paired (x, y) values are used to generate linear vectors which are used to determine the directions of the gesture strokes by performing trigonometric calculations as described above. The direction codes for each stroke can be derived and referenced from Table 1. Then, the composite gesture with only the directions available can be represented and stored as follows:
[0048] {A3, D4, A1}

[0049] If other parameters of angular tilt, speed, and speed change for each stroke are known from each stroke as determined by the methods described above, the composite gesture can be represented and stored by 3 sets of 4-tuple values as follows:
[0050] {(A3, RT, S, C)(D4, LT, M, A)(A1, NT, M, D)}
[0051] This indicates:
[0052] a horizontal left-to-right direction gesture stroke, hand is right tilted, slow and constant speed motion; [followed by]
[0053] a diagonal right-to-left downward direction gesture stroke, hand is left tilted, medium speed and accelerating motion; [followed by]
[0054] a vertical upward direction gesture stroke, hand is not tilted, medium speed and slowing/decelerating motion.
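The worked example of paragraphs [0045]-[0054] can be reproduced in a short sketch. It assumes screen coordinates (y increasing downward), which is consistent with the sample (x, y) data, and uses only the three direction codes recoverable from the text (A3, A1, D4); all function names are illustrative:

```python
import math

# Direction codes recoverable from the text: A3 = East (horizontal
# left-to-right), A1 = North (vertical upward), D4 = diagonal
# right-to-left downward. The remaining codes of Table 1 are not
# reproduced in the text, so they are omitted here.
CODE_BY_DIRECTION = {"E": "A3", "N": "A1", "SW": "D4"}

def stroke_direction(points):
    """Direction of a single stroke from its ordered (x, y) pixels.
    Assumes screen coordinates (y grows downward), hence the -dy flip
    to recover the counter-clockwise angle convention of FIG. 1."""
    (x0, y0), (x1, y1) = points[0], points[-1]
    angle = math.degrees(math.atan2(-(y1 - y0), x1 - x0)) % 360
    centres = {0: "E", 45: "NE", 90: "N", 135: "NW",
               180: "W", 225: "SW", 270: "S", 315: "SE"}
    for centre, name in centres.items():
        diff = abs(angle - centre)
        if min(diff, 360 - diff) <= 15:
            return name
    return None  # uncovered range: stroke ignored

# The ordered pixel sets of the example composite gesture (A)->(B)->(C)
gesture = [
    [(3, 1), (4, 1), (5, 1), (6, 1)],   # stroke A
    [(5, 2), (4, 3), (3, 4), (2, 5)],   # stroke B
    [(2, 4), (2, 3), (2, 2)],           # stroke C
]
codes = [CODE_BY_DIRECTION[stroke_direction(s)] for s in gesture]
# codes == ["A3", "D4", "A1"], matching the stored record {A3, D4, A1}
```

Extending each entry with the tilt, speed, and speed-change codes would then yield the full 4-tuple record shown in paragraph [0050].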
[0055] Therefore, each gesture is represented as a list of tuple values and stored as a record in the database of hand interactive composite gestures; how many tuple values are associated and stored for each composite gesture depends on the structure of the system as described below or the application requirements desired by a user.
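As a sketch of how such records might be kept, the following stores each codified gesture as a named list of 4-tuples in a minimal in-memory database; the record name and helper functions are illustrative, not from the specification:

```python
# Each record maps a name to an ordered list of 4-tuples:
# (direction code, angular tilt, speed, speed change).
gesture_db = {}

def store_gesture(name, tuples):
    """Store a codified composite gesture as a database record."""
    gesture_db[name] = list(tuples)

def find_gesture(tuples):
    """Reverse lookup: recover the record whose tuple list matches an
    observed gesture exactly. Order matters, so the same strokes in a
    different sequence are a different gesture."""
    for name, stored in gesture_db.items():
        if stored == list(tuples):
            return name
    return None

store_gesture("demo", [("A3", "RT", "S", "C"),
                       ("D4", "LT", "M", "A"),
                       ("A1", "NT", "M", "D")])
```

Looking up the same three tuples in a different order fails, mirroring the point made about FIGs. 3A-3B: the ordering of strokes is part of the gesture's identity.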
[0056] The present invention also provides an interactive system using hand gestures. Referring to FIG. 5, there is provided a functional block diagram of the interactive system in accordance with one embodiment of the present invention. The interactive system 1 comprises an interface window 2, a hand gesture acquiring module 3, and a microprocessor 4 including a pre-processing module 5, a database 6, and a processing module 7. It is to be noted that the system implements many common and commercial elements; their features and specifics are known to one skilled in the art; thus no details for these elements will be provided herein.
[0057] The interface window 2 is the area of operation and comprises a baseline location on the x and y axes for the width and height respectively to capture the gestures of hand movements in a pre-defined 2D space. The x and y axes determine the x and y coordinates respectively in the 2D space. With the inclusion of the z axis, the depth of the gesture movements can also be captured and utilized. The interface window 2 also allows a user to perform the gestures using a single finger or hand/palm as the interaction input point. The hand input interaction point may comprise a finger, a fist, or the palm of the hand. The interface window 2 also serves as a display device for displaying the gestures outputted from the processing module 7. The interface window 2 provides direct manipulation to control the digital space in the immediate surrounding environment and trigger system responses. By using an interactive gestural interface, the user has the ability to interact and communicate with the system intuitively using normal hand gestures.
[0058] The hand gesture acquiring module 3 is electronically coupled with the interface window 2, for capturing the hand gestures when a user interacts with the interface window 2. The hand gesture acquiring module 3 comprises a touchless (or touch) sensing device/equipment for capturing the hand gestures of the user, and optionally an angular tilt detecting means, for example a gyroscope, for the angular tilt of a gesture, a speed detecting means, for example a speed gauge, for detecting the speed of a gesture, and a speed change detecting means, for example an accelerometer, for the speed change of a gesture. The configuration and use of these components are well known in the art.
[0059] The capture of a gesture involves the use of the coordinate values of the constituent pixels making up the gesture. The "x", "y" coordinate values are represented as sets of paired values, ordered in a tuple list as a vector for each stroke of the gesture as described above. By using trigonometric calculations with the ordered vector values, the direction of the strokes within a gesture can be determined. The strokes of a gesture are then codified in association with their directions.
[0060] The microprocessor 4 performs all the computing functions, and may be a computer, notebook, a PDA or the like. It usually has a memory for storing all the programs for performing the functions, intermediate results and the database. It is electronically coupled with the hand gesture acquiring module 3 and the interface window 2. In one embodiment, the microprocessor 4 comprises a pre-processing module 5, a database 6, and a processing module 7, where all components are electronically coupled.
[0061] The pre-processing module 5 receives the acquired gestures from the hand gesture acquiring module 3, and identifies the strokes of the acquired gestures. In one embodiment, the pre-processing includes edge thinning and skeletonizing. Thick or broad stroke movements are reduced in thickness by applying thinning algorithms used in computer graphics or image processing. Algorithms such as 1) the classical "Hilditch" thinning algorithm; 2) the "One-Pass Parallel Asymmetric Thinning Algorithm (OPATA)"; and 3) the "Contour-Coherence Based Thinning Algorithm" are examples of thinning algorithms that can be used in the thinning and skeletonization of the acquired stroke movements.

[0062] The identified strokes from the pre-processing module are outputted to the processing module 7. The processing module 7 extracts the values of the parameters associated with the strokes, where the parameters include movement directions, angular tilt, speed, and speed change; all have been described above. When all parameters are known, the processing module 7 codifies the composite gestures using a series of multi-tuple values and stores the codified composite gestures in the database 6, and also presents the composite gestures designed according to the multi-tuple values to the interface window, allowing the user to actively interact with the system.
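A much-simplified sketch of stroke identification — splitting an ordered pixel trail into single strokes wherever the quantized step direction changes — is shown below. A real implementation would first apply the thinning algorithms named above; the function names are illustrative:

```python
def sign(v):
    """-1, 0 or +1 according to the sign of v."""
    return (v > 0) - (v < 0)

def segment_strokes(points):
    """Split an ordered pixel trail into single continuous strokes,
    starting a new stroke whenever the quantized step direction changes.
    Corner pixels are attributed to the stroke that follows them in
    this sketch."""
    strokes = [[points[0]]]
    prev_dir = None
    for (x0, y0), (x1, y1) in zip(points, points[1:]):
        step = (sign(x1 - x0), sign(y1 - y0))
        if prev_dir is not None and step != prev_dir:
            strokes.append([(x0, y0)])  # direction change: new stroke
        strokes[-1].append((x1, y1))
        prev_dir = step
    return strokes

# The pixel trail of the FIG. 4 example splits into three strokes,
# matching the A -> B -> C composite gesture.
trail = [(3, 1), (4, 1), (5, 1), (6, 1),
         (5, 2), (4, 3), (3, 4), (2, 5),
         (2, 4), (2, 3), (2, 2)]
strokes = segment_strokes(trail)
```

Each resulting stroke could then be handed to the processing module's direction, tilt, and kinematics extraction as described above.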
[0063] The present invention also provides a method for representation, design and storage of composite gestures captured in an interactive system. Now referring to FIG. 6, there is provided a flowchart of the method for representation, design and storage of composite gestures in accordance with one embodiment of the present invention. The method 10 comprises acquiring the input gestures performed by a user in front of an interactive interface window 20, pre-processing the acquired input gestures by e.g. edge thinning and skeletonizing so that the strokes embedded in the acquired gestures are identified 30, determining the directions and angular tilt of strokes 40, optionally determining the kinematics of the strokes e.g. speed and speed change 50, designing composite gestures according to the multi-tuple values of the acquired gestures 60, codifying the acquired gestures using multi-tuple values of strokes 70, and storing the codified gestures into the database containing basic codes for directions, tilt, speed, speed change, and codified gestures 80. It is to be noted that the method does not necessarily require performing all steps in all applications. For instance, a user can input multi-tuple values directly into the system, and the composite gestures corresponding to the input values can be designed using the database.
[0064] Typical existing techniques use image capturing, where representation is by camera image or video of the gesture movement. The accuracy and effectiveness of this approach depend on numerous factors, including the speed of the gesture movement, the positioning of the arm/hand, possible occlusion by other parts of the body, the visibility of the gesture action, the positions of the cameras, blind spots, and other influencing factors.
[0065] The proposed invention can accurately and completely capture the gesture stroke, with no missing or incorrectly recorded stroke sequence. This is achieved through the direct recording of the pixels involved in the gesture stroke, rather than indirectly through an image of the gesture stroke. The proposed technique can capture any stroke movements without being constrained or affected by external factors, such as lighting, noise, or occlusion, that may be experienced by an image capturing system.
[0066] The proposed method of generating gestures allows a plurality of different gestures, from basic to complex, to be created simply by an amalgamation of simple basic gesture strokes. This method of creating gesture strokes facilitates their capture, aids unambiguous and non-complex representation, and enables efficient data storage of the gesture strokes.
[0067] While the present invention has been described with reference to particular embodiments, it will be understood that the embodiments are illustrative and that the invention scope is not so limited. Alternative embodiments of the present invention will become apparent to those having ordinary skill in the art to which the present invention pertains. Such alternate embodiments are considered to be encompassed within the scope of the present invention. Accordingly, the scope of the present invention is defined by the appended claims and is supported by the foregoing description.

CLAIMS:
1. A method for designing, representing and storing complex composite interactive hand gestures in a computer implementation, wherein the computer is embedded with computer executable programs for performing the method, said method comprising:
acquiring the composite interactive hand gesture performed by a user in front of an interactive interface window;
pre-processing the acquired gestures by edge thinning and skeletonizing;
dissecting a composite interactive hand gesture into a plurality of compass directional linear hand strokes based on the compass directions of the composite interactive hand gestures;
determining the hand kinematics including speed and acceleration/deceleration of each stroke and the hand angular tilt during each stroke;
representing the composite interactive hand gesture with a list of multi-tuple values including the plurality of compass directional linear hand strokes and the hand kinematics; designing the composite gesture according to the multi-tuple values of the acquired gesture; and
codifying the acquired gestures using multi-tuple values of strokes.
2. A method according to claim 1, further comprising:
storing the codified composite interactive hand gesture into a database.
3. A method according to claim 1 or 2, wherein the gesture is any physical movement of an arm/hand that is sensed and responded to by a digital system without the aid of a traditional device such as a mouse or stylus.
4. A method according to any one of claims 1-3, wherein the compass directional linear hand stroke is selected from the group consisting of: North, South, East, West, North-East, North-West, South-East, and South-West.
5. A method according to any one of claims 1-4, wherein the direction of stroke movement of a single stroke is determined as follows: assuming the angle of 0° to be the origin reference/relative point with respect to the horizontal plane, increasing in a counter clock-wise direction;
obtaining two linear vectors, namely, the reference horizontal position/plane and the stroke movement to be determined;
determining the angular degree (with respect to the reference horizontal plane) using sine, cosine or tangent trigonometry calculations; and
assigning its direction using the angular degree of the single stroke.
6. A method according to claim 5, wherein the step of assigning the direction of the stroke movement of a single stroke comprises:
determining whether the angular degree of the stroke is falling into any of the compass direction points of 0°, 45°, 90°, 135°, 180°, 225°, 270°, and 315° with a degree of deviation, ±15°; if yes, a compass direction is assigned; if no, the stroke is ignored.
7. A method according to any one of claims 1-6, wherein the angular tilt is categorized as left-tilt (LT), right-tilt (RT) or no-tilt (NT).
8. A method according to any one of claims 1-7, wherein the speed is categorized as slow (S) (1-15 cm/sec), medium (M) (16-30 cm/sec), or fast (F) (>31 cm/sec).
9. A method according to any one of claims 1-8, wherein the speed change is categorized as constant (C), acceleration (A), or deceleration (D), depending on the final recorded speed change.
10. A database for designing the complex composite interactive hand gestures, wherein the database is stored in a computer-readable medium, said database comprising codification of compass directional linear hand strokes, kinematics (speed, acceleration/deceleration) of strokes, and angular-tilts of strokes;
thereby the database enables a user to design a composite interactive hand gesture in a list of multi-tuple values, wherein the list of multi-tuple values include codes for the compass directional linear hand strokes, kinematics of strokes, and angular-tilts of strokes.
11. A database according to claim 10, wherein the compass directional linear hand stroke is selected from the group consisting of: North, South, East, West, North-East, North-West, South-East, and South-West.
12. A database according to claim 10 or 11, wherein the direction of a compass directional linear hand stroke is determined as follows:
assuming the angle of 0° to be the origin reference/relative point with respect to the horizontal plane, increasing in a counter clock-wise direction;
obtaining two linear vectors, namely, the reference horizontal position/plane and the stroke movement to be determined;
determining the angular degree (with respect to the reference horizontal plane) using sine, cosine or tangent trigonometry calculations; and
assigning its direction using the angular degree of the single stroke.
13. A database according to claim 12, wherein the direction of a compass directional linear hand stroke is determined as follows:
determining whether the angular degree of the stroke is falling into any of the compass direction points of 0°, 45°, 90°, 135°, 180°, 225°, 270°, and 315° with a degree of deviation, ±15°; if yes, a compass direction is assigned; if no, the stroke is ignored.
14. A database according to any one of claims 10-13, wherein the angular tilt is categorized as left-tilt (LT), right-tilt (RT) or no-tilt (NT).
15. A database according to any one of claims 10-14, wherein the speed is categorized as slow (S) (1-15 cm/sec), medium (M) (16-30 cm/sec), or fast (F) (>31 cm/sec).
16. A database according to any one of claims 10-15, wherein the speed change is categorized as constant (C), acceleration (A), or deceleration (D), depending on the final recorded speed change.
17. An interactive system for designing, representing and storing complex composite interactive hand gestures, the system comprising: an interface window for allowing a user to perform the gestures using hand movements so that the gestures of hand movements are captured;
a hand gesture acquiring module electronically coupled with the interface window, for capturing the hand gestures when the user interacts with the interface window, wherein the hand gestures are comprised of a plurality of strokes with associated parameters; and a microprocessor embedded therein a set of computer programs for performing the designing, representing and storing complex composite interactive hand gestures, wherein the microprocessor comprises a memory for storing all the programs for performing the functions, intermediate results and database, wherein the microprocessor is electronically coupled with the hand gesture acquiring module and the interface window, and wherein the microprocessor comprises:
a pre-processing module for receiving the captured gestures from the hand gesture acquiring module, and identifying the compass directional linear hand strokes of the captured gestures;
a processing module electronically coupled with the pre-processing module for receiving the identified strokes from the pre-processing module, extracting the values of the parameters associated with the strokes, wherein the parameters include movement directions, angular tilt, speed, and speed change, codifying the composite gestures using a series of multi-tuple values, and designing and outputting the codified composite gestures; and
a database electronically coupled with the processing module for storing the codified composite gestures;
whereby the outputted gestures are displayed on the interactive window or other displaying devices.
18. An interactive system according to claim 17, wherein the interactive window is a baseline location on the x and y axes for the width and height respectively to allow the gestures of hand movements to be captured in a pre-defined 2D space.
19. An interactive system according to claim 18, wherein the interactive window includes a z axis to capture the depth of the gestures.
20. An interactive system according to any one of claims 17-19, wherein the hand gesture acquiring module comprises a touchless (or touch) sensing device/equipment for capturing the hand gestures of the user.
21. An interactive system according to any one of claims 17-20, wherein the hand gesture acquiring module further comprises:
an angular tilt detecting means for capturing the angular tilt of a gesture;
a speed detecting means for capturing the speed of a gesture; and/or
a speed change detecting means for capturing the speed change of a gesture.
22. An interactive system according to any one of claims 17-21, wherein the microprocessor is a computer, notebook, or PDA.
23. An interactive system according to any one of claims 17-22, wherein the preprocessing module performs edge thinning and skeletonizing.
24. An interactive system according to any one of claims 17-23, wherein the compass directional linear hand strokes are selected from the group consisting of: North, South, East, West, North-East, North-West, South-East, and South-West.
25. An interactive system according to any one of claims 17-24, wherein the direction of the compass directional linear hand strokes is determined as follows:
assuming the angle of 0° to be the origin reference/relative point with respect to the horizontal plane, increasing in a counter-clockwise direction;
obtaining two linear vectors, namely, the reference horizontal position/plane and the stroke movement to be determined;
determining the angular degree (with respect to the reference horizontal plane) using sine, cosine or tangent trigonometry calculations; and
assigning its direction using the angular degree of the single stroke.
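The angle determination of claim 25 can be sketched as follows. This is an illustrative sketch, not the patented implementation: the claim permits sine, cosine, or tangent trigonometry, and `atan2` (the quadrant-aware form of the tangent calculation) is used here; the function name and point representation are assumptions.

```python
import math

def stroke_angle_deg(start, end):
    """Angle of a stroke vector relative to the horizontal reference plane.

    Follows the convention of claim 25: 0 degrees is the horizontal
    origin reference, increasing in a counter-clockwise direction.
    """
    dx = end[0] - start[0]
    dy = end[1] - start[1]
    # atan2 resolves the correct quadrant; normalize into [0, 360)
    return math.degrees(math.atan2(dy, dx)) % 360.0
```

For example, a stroke from (0, 0) to (0, 1) yields 90°, i.e. a vertically upward movement.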
26. An interactive system according to any one of claims 17-25, wherein the direction of the compass directional linear hand strokes is determined as follows: determining whether the angular degree of the stroke is falling into any of the compass direction points of 0°, 45°, 90°, 135°, 180°, 225°, 270°, and 315° with a degree of deviation, ±15°; if yes, a compass direction is assigned; if no, the stroke is ignored.
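The quantization rule of claim 26 (snap to one of the eight compass points within ±15°, otherwise ignore the stroke) could be sketched as below. The mapping of angles to compass labels (0° = East, increasing counter-clockwise, per the convention of claim 25) is an assumption; the claims do not state which compass point corresponds to 0°.

```python
# Assumed angle-to-compass mapping (0 deg = East, counter-clockwise)
COMPASS_POINTS = {0: "E", 45: "NE", 90: "N", 135: "NW",
                  180: "W", 225: "SW", 270: "S", 315: "SE"}

def compass_direction(angle_deg, tolerance=15.0):
    """Return the compass label if the angle falls within +/-tolerance
    of a compass point (claim 26); return None to ignore the stroke."""
    angle = angle_deg % 360.0
    for point, label in COMPASS_POINTS.items():
        # circular distance so angles near 360 still match the 0-deg point
        diff = min(abs(angle - point), 360.0 - abs(angle - point))
        if diff <= tolerance:
            return label
    return None
```

A 92° stroke would thus be assigned "N", while a 20° stroke falls outside every ±15° band and is ignored.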
27. An interactive system according to any one of claims 17-26, wherein the angular tilt is categorized as left-tilt (LT), right-tilt (RT) or no-tilt (NT).
28. An interactive system according to any one of claims 17-27, wherein the speed is categorized as slow (S) (1-15 cm/sec), medium (M) (16-30 cm/sec), or fast (F) (>31 cm/sec).
29. An interactive system according to any one of claims 17-28, wherein the speed change is categorized as constant (C), acceleration (A), or deceleration (D), depending on the final recorded speed change.
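The categorizations of claims 27-29 and the multi-tuple codification of claim 17 could be combined as in the sketch below. The speed bands follow claim 28 and the change codes follow claim 29; the tolerance used to distinguish "constant" from a genuine change, and all function names, are assumptions not specified in the claims.

```python
def speed_category(cm_per_sec):
    """Speed bands per claim 28: slow 1-15, medium 16-30, fast >31 cm/sec."""
    if cm_per_sec <= 15:
        return "S"
    if cm_per_sec <= 30:
        return "M"
    return "F"

def speed_change(initial, final, tol=1.0):
    """Change codes per claim 29; tol (cm/sec) is an assumed threshold."""
    if final > initial + tol:
        return "A"   # acceleration
    if final < initial - tol:
        return "D"   # deceleration
    return "C"       # constant

def codify_stroke(direction, tilt, v_start, v_end):
    """One stroke as a multi-tuple value (claim 17):
    (compass direction, tilt LT/RT/NT, speed band, speed-change code)."""
    return (direction, tilt, speed_category(v_end),
            speed_change(v_start, v_end))
```

For instance, a northward, untilted stroke that speeds up from 10 to 25 cm/sec would be codified as the tuple ("N", "NT", "M", "A"); a composite gesture is then a series of such tuples.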
PCT/SG2009/000421 2009-11-12 2009-11-12 Method and system for interactive gesture-based control WO2011059404A2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/SG2009/000421 WO2011059404A2 (en) 2009-11-12 2009-11-12 Method and system for interactive gesture-based control


Publications (2)

Publication Number Publication Date
WO2011059404A2 true WO2011059404A2 (en) 2011-05-19
WO2011059404A3 WO2011059404A3 (en) 2011-07-21

Family

ID=43992281

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/SG2009/000421 WO2011059404A2 (en) 2009-11-12 2009-11-12 Method and system for interactive gesture-based control

Country Status (1)

Country Link
WO (1) WO2011059404A2 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103019570A (en) * 2012-12-31 2013-04-03 上海华勤通讯技术有限公司 Hand gesture recognition method and mobile terminal

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5454043A (en) * 1993-07-30 1995-09-26 Mitsubishi Electric Research Laboratories, Inc. Dynamic and static hand gesture recognition through low-level image analysis
WO2000034942A1 (en) * 1998-12-11 2000-06-15 Sunhawk Corporation Method and system for recognizing musical notations using a compass-direction user interface
US20050212760A1 (en) * 2004-03-23 2005-09-29 Marvit David L Gesture based user interface supporting preexisting symbols
US20070176898A1 (en) * 2006-02-01 2007-08-02 Memsic, Inc. Air-writing and motion sensing input for portable devices
US20090195497A1 (en) * 2008-02-01 2009-08-06 Pillar Ventures, Llc Gesture-based power management of a wearable portable electronic device with display
WO2009124181A2 (en) * 2008-04-02 2009-10-08 Oblong Industries, Inc. Gesture based control using three-dimensional information extracted over an extended depth of field
US20090265671A1 (en) * 2008-04-21 2009-10-22 Invensense Mobile devices with motion gesture recognition
DE102008020340A1 (en) * 2008-04-18 2009-10-22 Hochschule Magdeburg-Stendal (Fh) Gesture-controlled MIDI instrument



Also Published As

Publication number Publication date
WO2011059404A3 (en) 2011-07-21

Similar Documents

Publication Publication Date Title
US10732725B2 (en) Method and apparatus of interactive display based on gesture recognition
US10761612B2 (en) Gesture recognition techniques
CN108431729B (en) Three-dimensional object tracking to increase display area
EP1460577B1 (en) Motion detection for handwriting recognition
US7598942B2 (en) System and method for gesture based control system
US10289214B2 (en) Method and device of controlling virtual mouse and head-mounted displaying device
US20090278915A1 (en) Gesture-Based Control System For Vehicle Interfaces
EP2790089A1 (en) Portable device and method for providing non-contact interface
US11047691B2 (en) Simultaneous localization and mapping (SLAM) compensation for gesture recognition in virtual, augmented, and mixed reality (xR) applications
US10438385B2 (en) Generating ink effects for a digital ink stroke
JP2017505965A (en) Real-time 3D gesture recognition and tracking system for mobile devices
WO2011146070A1 (en) System and method for reporting data in a computer vision system
WO2015051827A1 (en) Method of determining a similarity transformation between first and second coordinates of 3d features
CN114529691A (en) Window control method, electronic device and computer readable storage medium
Ahuja et al. TouchPose: hand pose prediction, depth estimation, and touch classification from capacitive images
US10318128B2 (en) Image manipulation based on touch gestures
CN109960404B (en) Data processing method and device
WO2011059404A2 (en) Method and system for interactive gesture-based control
Kim et al. Visual multi-touch air interface for barehanded users by skeleton models of hand regions
JP2015191250A (en) Information processing apparatus, control method thereof, program, and recording medium
CN109254671B (en) Interactive method, device and equipment for controlling object posture in AR/VR application
Maidi et al. Interactive media control using natural interaction-based Kinect
CN114327042B (en) Detection glove, gesture tracking method, AR equipment and key pressing method
Reimann et al. Computer vision based interaction techniques for mobile augmented reality

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 09851321

Country of ref document: EP

Kind code of ref document: A1

DPE1 Request for preliminary examination filed after expiration of 19th month from priority date (pct application filed from 20040101)
NENP Non-entry into the national phase in:

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 09851321

Country of ref document: EP

Kind code of ref document: A2