US20180011619A1 - Systems and methods for adaptive gesture recognition - Google Patents
- Publication number: US20180011619A1
- Application number: US15/714,957
- Authority: US (United States)
- Prior art keywords: parameters, user inputs, user, gesture, gestures
- Prior art date: 2010-12-27
- Legal status: Abandoned (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/26—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
- G01C21/34—Route searching; Route guidance
- G01C21/36—Input/output arrangements for on-board computers
- G01C21/3664—Details of the user input interface, e.g. buttons, knobs or sliders, including those provided on a touch screen; remote controllers; input using gestures
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B2219/00—Program-control systems
- G05B2219/30—Nc systems
- G05B2219/35—Nc in input of data, input till input file format
- G05B2219/35444—Gesture interface, controlled machine observes operator, executes commands
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2203/00—Indexing scheme relating to G06F3/00 - G06F3/048
- G06F2203/048—Indexing scheme relating to G06F3/048
- G06F2203/04808—Several contacts: gestures triggering a specific function, e.g. scrolling, zooming, right-click, when the user establishes several contacts with the surface simultaneously; e.g. using several fingers or a combination of fingers and pen
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/017—Gesture based interaction, e.g. based on a set of recognized hand gestures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N1/00—Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
- H04N1/0035—User-machine interface; Control console
- H04N1/00352—Input means
- H04N1/00381—Input by recognition or interpretation of visible user gestures
Abstract
Description
- This application is a continuation of U.S. application Ser. No. 12/978,949, filed Dec. 27, 2010, which is incorporated herein in its entirety.
- The present invention generally relates to user interfaces, and more particularly relates to systems and methods for recognizing gestures from user inputs.
- Many different computing devices such as portable or stationary computer systems, tablet computer systems, smart phones, personal digital assistants (PDAs), media players, electronic book (“ebook”) readers and the like have become extraordinarily popular in recent years. Many of these devices incorporate user interfaces that respond to the user's touch, or to other inputs received in a two dimensional space. Examples of input devices that process multi-directional inputs include touch screens, touch pads, directional pads and the like, as well as more conventional mice, joysticks, etc. Many smart phones and tablet computers, for example, have user interfaces that are primarily (if not entirely) designed around touch screens that recognize inputs applied to the display by a user's finger, a stylus, or some other pointing object.
- Often, multi-directional user inputs can be tracked over time or space and combined to form a single command commonly called a “gesture”. A horizontal finger swipe, for example, may be recognized within a web browser or ebook reader as a gesture input to change pages, or to move forward or backward in a browsing history. A media player may recognize the same horizontal movement to perform other tasks, such as changing channels, performing a fast forward/rewind operation, or any other tasks as desired. Other gestures may involve vertical movements, rotational movements, taps, presses, holds and/or other user inputs tracked over time or over the multi-directional sensing region of the input device.
- Often, however, certain devices may have difficulty in recognizing gestural inputs applied by certain users. While some users may intend to provide gestural inputs, their actual inputs applied to the sensing region of the device may be imperfect, and therefore difficult to recognize. If the user attempts to enter gestures that are not recognized by the input device, the user may become frustrated. Further, variation between the movements applied by different users means that gestural constraints can be overly restrictive for some users yet still fail to capture the gestures produced by others.
- It is therefore desirable to improve recognition of gestures in user inputs detected by touch screens, touch pads, directional pads and/or other multi-directional input devices. These and other desirable features and characteristics will become apparent from the subsequent detailed description and the appended claims, taken in conjunction with the accompanying drawings and this background section.
- According to various embodiments, systems and methods are described for adaptively recognizing gestures indicated by user inputs received from a touchpad, touchscreen, directional pad, mouse or other multi-directional input device. If a user's movement does not indicate a gesture using current gesture recognition parameters, additional processing can be performed to recognize the gesture using other factors. The gesture recognition parameters can then be adapted so that subsequent user inputs that are similar to the previously-rejected inputs will appropriately trigger gesture commands as desired by the user. In some embodiments, gestural data and/or parameters may be locally or remotely stored for further processing.
- Various embodiments provide a method to process user inputs received from a multi-directional input device. The method may be executed by an input device itself, by a device driver associated with an input device, by a processor or other component of a computing system that performs functions in response to software or firmware instructions, by a server that receives user input data via a network, and/or by any other processing device or logic. The method suitably comprises receiving at least one of the user inputs from the multi-directional input device; determining if the at least one user input forms a gesture based upon a set of parameters; if the at least one user input does not form the gesture based upon the set of parameters, attempting to recognize the gesture based upon other factors that are different from the set of parameters; and if the gesture is recognized based upon the other factors, adapting the set of parameters based upon the at least one user input.
- Other embodiments suitably provide a computing system that comprises an input device having a multi-directional sensing region and a processor. The input device is configured to sense user inputs relative to a multi-directional sensing region and to provide output signals indicative of the sensed user inputs. The processor is configured to receive the output signals from the input device and to initially recognize gestures from at least some of the sensed user inputs indicated in the output signals based upon a set of parameters, and to recognize subsequent gestures based upon an adapted set of parameters that is adapted based upon previously-received user inputs.
- Still other embodiments provide a method executable by a computing system, server or other data processing logic to process user inputs obtained from at least one multi-directional input device. The method suitably comprises receiving the user inputs; identifying at least some of the user inputs that are not recognized as gestures based upon a set of parameters, but that are recognized as gestures based upon other factors that are different than the set of parameters; and adapting the set of parameters based upon the identified user inputs to create an adapted set of parameters.
- Various other embodiments, aspects and other features are described in more detail below.
- Exemplary embodiments will hereinafter be described in conjunction with the following drawing figures, wherein like numerals denote like elements, and
- FIG. 1 is a block diagram showing various components of an exemplary processing system that includes adaptive gesture recognition; and
- FIG. 2 is a flowchart of an exemplary process for adaptively recognizing gestures from user inputs.
- The following detailed description of the invention is merely exemplary in nature and is not intended to limit the invention or the application and uses of the invention. Furthermore, there is no intention to be bound by any theory presented in the preceding background or the following detailed description.
- According to various embodiments, adaptive gesture recognition is applied within an application, a computer system, a hardware or software device driver and/or an input device to identify actual user behaviors and to adapt gestural recognition parameters based upon the actual behaviors of the user. If a particular user consistently makes shorter-than-expected movements for a particular gesture, for example, the parameters used to recognize that particular gesture can be adapted so that the user's actual behavior is recognized as producing the intended gesture. This can significantly reduce frustration by the user and greatly enhance user satisfaction with the adaptive product.
- Adaptive gesture recognition may be applied with respect to particular users by storing the adapted parameters for use during subsequent sessions with the same user. In some implementations, the adapted parameters are stored locally at the user's input device or computer system. Other implementations may additionally or alternately store the adapted parameters at a network server or other shared location so that the user's adapted parameters are accessible from different computing systems. Still other embodiments may use aggregate data compiled from any number of users to adjust default settings, or to otherwise adapt the parameters used to recognize gestures used by multiple users of the same system or application. Other features, enhancements and other variations are described in increasing detail below.
- Turning now to the drawing figures and with initial reference to FIG. 1, an exemplary system 100 could incorporate adaptive gestural recognition within an input device 102, a computing system 104, a device controller 121 and/or a software application 130, as appropriate. The system 100 illustrated in FIG. 1 shows a multi-dimensional input device 102 that provides signals 103 to a device controller 121 or other feature of computing system 104, which reports the received user inputs 132 to a gestural recognition module 131 or other feature of a software application 130. Gestural recognition module 131 suitably compares the user inputs 132 to gestural recognition parameters 133 to identify gestures in the user's movements 113, 115, 117. When a gesture is recognized, a corresponding command 134 is provided to a media player, web browser, productivity application or other data processing module 135 as appropriate. The various features and functions of gestural recognition module 131 could be equivalently incorporated within the input device 102 itself, within the input device controller 121, within the operating system 128 of computing system 104, or in any other hardware or software processing logic within system 100, as desired.
- Input device 102 is any device or component capable of sensing user inputs within a multi-dimensional input space 110. Input device 102 may be a touch pad, touch screen, touch stick, directional pad, joystick, mouse, trackball or the like, to name just a few examples. Typically, input device 102 detects user motions 113, 115, 117 within the sensing region 110, and provides corresponding output signals 103 to a device controller 121 or the like. Although FIG. 1 shows input device controller 121 as part of the hardware 120 associated with computing system 104, in practice device controller 121 may be implemented within a microcontroller or other processing circuitry that is physically and/or logically located within input device 102. Signals 103 may indicate the absolute (“X, Y”), relative (“ΔX, ΔY”) and/or other position of one or more inputs applied by the user. Such inputs may respond to a finger or stylus applied to a touchpad or touch screen, for example, or to the user's manipulation of a mouse, joystick, trackball or the like.
- Although not shown in FIG. 1, input device 102 may also include a “select” button or similar feature that allows for selection of objects, as appropriate. Various gestures may incorporate actual or virtual “button pushes”, such as presses or holds of actual mechanical buttons provided as part of input device 102. Other embodiments may also consider presses and/or holds of “virtual” buttons within sensing region 110. Any number of additional features, including presses, holds, clicks, multi-clicks, drags and/or the like may be similarly considered in any number of other embodiments.
- Computing system 104 is any data processing device capable of processing user inputs to perform desired tasks. Various types of computing systems 104 may include, without limitation, any sort of portable or stationary personal computer or workstation, tablet computer system, media player, personal digital assistant, game playing system, mobile telephone, e-book reader, television, television receiver, audio receiver, consumer electronic device, appliance and/or the like.
- Generally speaking, computing system 104 suitably includes conventional hardware 120 that processes data as directed by one or more software applications 130. FIG. 1, for example, shows computing system 104 as including a conventional processor 125 and memory 126, as well as any number of input/output interfaces, such as a network interface 122, disk or other storage interface 123, input device controller 121, and any additional input/output interfaces 124 (e.g., a display driver) as desired. Typically, applications 130 interact with the hardware 120 via any sort of conventional operating system 128. The exemplary features shown in FIG. 1 may be adapted or supplemented as desired to accommodate different hardware or software designs as may be appropriate for different types of computing systems 104.
- Computing system 104 may, in some embodiments, communicate with a remote server 106 via a network 105. Server 106 may provide remote storage and retrieval of information, and/or any other data processing services as appropriate. Network 105 is any sort of local area, wide area or other network that supports data communications using any conventional protocols. Network 105 may represent a conventional wired or wireless LAN/WAN, the Internet, a telephone network, a corporate or other private network, and/or any other network or combination of networks as desired.
- Gestures are recognized within system 100 in any manner. In various embodiments, a gesture is recognized when the user's movements applied within the sensing region 110 of input device 102 follow a track or pattern that is defined by one or more parameters 133, as appropriate. A horizontal swipe, for example, may be defined by parameters 133 specifying movement through sensing region 110 that proceeds from a starting position 111 and that remains within a confined region 112 for a particular distance 114. If signals 103 and/or user inputs 132 indicate that the user's motion follows path 113 from starting position 111 for a distance 114 in this example, then gestural recognition module 131 may appropriately recognize a “horizontal swipe” gesture. Any number of other gestures could be additionally or alternately considered, including any gestures correlating to movements in different directions (e.g., right-to-left, vertical movements, rotational movements and/or the like), movements of different velocities, movements applied at particular times or durations, taps, sequences of taps, tap and hold actions, tap and drag actions, tap-hold and drag actions, movements that begin or end at particular locations within sensing region 110, and/or the like.
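- To make the parameter check concrete, the following Python sketch tests whether a tracked movement qualifies as the horizontal swipe described above: every sample must stay inside the confined region 112, and the horizontal travel must reach distance 114. The description does not specify an implementation, so all names and values here are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class SwipeParams:
    """Hypothetical stand-in for the horizontal-swipe entries in parameters 133."""
    start_y: float       # vertical center of the confined region 112
    band_height: float   # total height of region 112
    min_distance: float  # required horizontal travel, distance 114

def is_horizontal_swipe(points, p: SwipeParams) -> bool:
    """Return True if the sampled (x, y) points follow path 113 closely enough."""
    if len(points) < 2:
        return False
    # Every sample must remain inside the confined region 112.
    if any(abs(y - p.start_y) > p.band_height / 2 for _, y in points):
        return False
    # The net horizontal travel must reach the required distance 114.
    return points[-1][0] - points[0][0] >= p.min_distance

# Example: a slightly wobbly left-to-right stroke is still recognized as a swipe.
params = SwipeParams(start_y=0.0, band_height=0.2, min_distance=1.0)
print(is_horizontal_swipe([(0.0, 0.00), (0.6, 0.04), (1.1, -0.03)], params))  # True
```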
- Gestures suitably correlate to commands 134 provided to a web browser, media player or other data processing module 135. An exemplary media player could respond to the horizontal swipe, for example, by changing television channels, performing forward or backward navigation, or by otherwise adjusting an output 136 presented to the user. Web browsers, productivity applications or other data processing modules 135 may respond to the horizontal swipe gesture in any other manner, as desired. Although the example illustrated in FIG. 1 shows gestural recognition module 131 providing user commands 134 to a single data processing module 135, alternate embodiments could provide simultaneous and/or sequential commands 134 to any number of data processing modules representing different applications, programs or other processes executing within application space 130.
- As noted above, however, the parameters 133 that define one or more gestures may not be ideally suited for some users. Some users may expect gestures to be recognized with longer or shorter movements, for example, or users may unwittingly provide imprecise movements within sensing region 110 that are not initially recognized as gestures. FIG. 1 shows an exemplary user movement 115 that lies within the confined region 112 for a horizontal swipe gesture, but that does not extend for the full distance 114 required to recognize the gesture based upon current parameters 133. FIG. 1 also shows an exemplary user movement 117 that extends for the expected distance 114, but that lies outside of the confined region 112. Motion 117 may result, for example, if the user's hand position differs from the expected position that would ordinarily provide properly horizontal movement. If the user moved along paths 115 or 117, then, such movements would not ordinarily fall within the parameters 133 that define the region 112 associated with the gesture, so the gesture intended by the user would not be recognized.
- By adapting parameters 133 based upon actual user inputs 132, however, the region 112 that defines one or more gestures can be modified to suit the actual behaviors of the user. If the user is observed to consistently move along path 115 without extending the full distance 114 typically needed to identify the gesture, for example, one or more parameters 133 could be adjusted so that movement for a shorter distance 116 triggered the gesture during subsequent operation. Similarly, consistent movement along path 117 could result in changes to parameters 133 that would place path 117 within the defined region 112 associated with the particular gesture. Other adaptations to parameters 133 may accommodate other user motions on sensing region 110 and/or other types of user inputs (e.g., button presses, holds, drags, etc.) as desired.
- FIG. 2 shows an exemplary method 200 to process adaptive recognition of gestures indicated by user movements on a multi-directional input device 102. The particular functions and features shown in FIG. 2 may be executed, for example, by a gestural recognition module 131 executing in software 130 or firmware on computing system 104. In such embodiments, software instructions stored in memory 126 or storage 123 can be executed by processor 125 to carry out the various functions shown. Equivalent embodiments may execute some or all of method 200 within hardware, software and/or firmware logic associated with device controller 121, within operating system 128, within input device 102 itself, at a remote server 106, and/or in any other location using any sort of data processing logic. Some embodiments may perform functions 201-206 at a local computing system 104, for example, while performing functions 208-212 at a server that is accessible to the local computing system 104 via network 105. The various functions and features shown in FIG. 2 may be supplemented or modified in any number of equivalent embodiments.
- Method 200 suitably includes receiving a user input 132 from the multi-directional input device 102 (function 202) and determining if a gesture is formed by the user input 132 based upon a set of parameters 133 (function 204). If the gesture is not formed based upon the set of parameters 133, the method may nevertheless attempt to recognize the gesture based upon other factors that are different from the set of parameters 133 (function 208). If the gesture is recognized based upon the other factors, one or more of the parameters 133 can be adapted (function 210) as desired. In some embodiments, the adapted parameters may be locally and/or remotely stored (function 212) for subsequent retrieval (function 201), or for any other purpose.
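- Sketched in Python, the control flow of method 200 might look like the skeleton below. This is only one plausible arrangement of functions 201-212; because the description leaves the individual steps open, each one is passed in as a caller-supplied callable.

```python
def run_method_200(read_inputs, recognize, recognize_by_other_factors, adapt, store):
    """Skeleton of method 200; every helper is a caller-supplied callable."""
    params = store.get("params", {})                 # function 201: obtain parameters 133
    for user_input in read_inputs():                 # function 202: receive user input 132
        gesture = recognize(user_input, params)      # function 204: check against parameters
        if gesture is None:
            # function 208: try other factors (approximation, context, retries)
            gesture = recognize_by_other_factors(user_input, params)
            if gesture is not None:
                params = adapt(params, gesture, user_input)  # function 210: adapt parameters
                store["params"] = params                     # function 212: persist for reuse
        if gesture is not None:
            yield gesture                            # function 206: issue command 134
```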
- The various parameters 133 that define gestural movements within sensing region 110 may be initially defined in any manner (function 201). In various settings, the parameters 133 are initially defined based upon default values, based upon expectations or behavior of average users, and/or upon any other factors. Some embodiments may initially define parameters 133 to be relatively stringent (e.g., to prevent false recognition of unintended gestures), or to be relatively permissive (e.g., to allow relatively imprecise movement by the user to trigger a gesture). In some embodiments, the permissivity of gesture recognition parameters 133 may be selected and adjusted by the user or an administrator using conventional user interface features, as desired. Some embodiments may initially obtain parameters 133 from a local source, such as from default values configured in module 131, from prior parameters 133 stored in memory 126 or storage 123, and/or the like. Other implementations may initially obtain parameters 133 from a remote service (e.g., server 106 via network 105), as described more fully below.
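- As a purely hypothetical illustration of such defaults, the initial parameter set could be a plain mapping with a single permissiveness factor that the user or an administrator adjusts; none of these names or values come from the description itself.

```python
DEFAULT_PARAMS = {
    "horizontal_swipe": {"min_distance": 1.0, "band_height": 0.2},
    "double_tap": {"max_interval_s": 0.4},
}

def tune_permissiveness(params, factor):
    """Scale tolerances uniformly: factor > 1 is more permissive, < 1 more stringent."""
    tuned = {}
    for gesture, settings in params.items():
        adjusted = dict(settings)
        if "band_height" in adjusted:
            adjusted["band_height"] *= factor   # widen or narrow region 112
        if "min_distance" in adjusted:
            adjusted["min_distance"] /= factor  # require less or more travel (distance 114)
        if "max_interval_s" in adjusted:
            adjusted["max_interval_s"] *= factor
        tuned[gesture] = adjusted
    return tuned

lenient = tune_permissiveness(DEFAULT_PARAMS, 1.5)  # e.g., a user-selected setting
```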
- User inputs 132 are received in any manner (function 202). User inputs 132 are any signals or data that can be processed to ascertain the user's motion with respect to sensing region 110 of input device 102. In the implementation described above, user inputs 132 are received at a gestural recognition module 131 from device controller 121, which suitably constructs the user input signals 132 from the raw motion signals 103 received from the input device 102 itself. Other embodiments may not provide signals 103 that are separately identifiable from user inputs 132. Still other embodiments may incorporate gesture recognition within device controller 121 so that gestural commands 134 issue directly from controller 121. Other embodiments may combine or otherwise process the information contained within signals 103, 132 and 134 in any other manner.
- If the user inputs 132 indicate movement within the parameters 133 that define an existing gestural region 112, then a gesture can be readily identified (function 204), and a user command 134 can be issued, or otherwise processed (function 206). If function 204 does not recognize a gesture based upon the then-current parameters 133, however, then additional processing may take place as desired. In some implementations, unrecognized gestures may be initially (and erroneously) processed as cursor movement, scrolling or other default behaviors if a gesture cannot be identified in any other manner.
- Various embodiments further attempt to recognize gestures from the received user inputs 132 using other factors (function 208). The other factors may include, for example, an approximation of the parameters 133, subsequent actions taken by the user, the setting in which the user's input was provided, and/or other factors as desired. Repetition and context may also be considered in various embodiments. Function 208 may be performed in real time as the user inputs are received in some embodiments. Equivalently, function 208 may be performed at a later time so that subsequent user inputs may also be considered, or for any other reason. As noted above, functions 208-212 may be processed at a server 106 or other location that is remote from the user input device 102. Such embodiments may support batch processing, or processing of inputs received from any number of users and/or devices on any temporal basis.
- Gesture recognition in function 208 may consider any appropriate information or other factors. To recognize an intended gesture based upon an approximation, for example, various embodiments could consider gestures having parameters 133 that most closely match the actual movements indicated by the user inputs 132. FIG. 1, for example, shows two movements 115 and 117 that may be close enough to movement 113 that the user's intentions can be inferred. Within the context of an application where a gesture input is expected, for example, the “closest” gesture can be more readily determined. A media player application 135, for example, may limit cursor movement or other non-gestural interface features in some settings and situations so that navigation or other gestural controls can be more readily isolated and recognized.
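- One way to implement such approximation, again as a hypothetical sketch reusing the SwipeParams shape from the earlier example, is to score how far a movement falls outside each candidate gesture's parameters and accept the nearest gesture only when that score is within some slack:

```python
def miss_score(points, p):
    """Penalty for how far a stroke falls outside swipe parameters p (0 = perfect fit)."""
    worst_y = max(abs(y - p.start_y) for _, y in points)
    band_excess = max(0.0, worst_y - p.band_height / 2)   # overshoot of region 112
    travel = points[-1][0] - points[0][0]
    shortfall = max(0.0, p.min_distance - travel)         # shortfall of distance 114
    return band_excess + shortfall

def closest_gesture(points, candidates, slack=0.3):
    """Return the best-matching gesture name, or None if nothing is close enough.

    candidates: mapping of gesture name -> SwipeParams-like parameter object.
    """
    if len(points) < 2 or not candidates:
        return None
    score, name = min((miss_score(points, p), name) for name, p in candidates.items())
    return name if score <= slack else None
```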
- Other embodiments may recognize intended gestures based upon subsequent user inputs 132. If a user tries unsuccessfully to trigger a gesture command 134, he or she will often try again to complete the same action. If a particular sequence of user movements is not initially recognized, then, subsequent user movements that result in a recognized gesture can be compared to the earlier movements to determine whether the earlier movements represented an attempt to provide gestural input.
- If the user makes multiple unsuccessful attempts to complete the same gesture, the data from the prior attempts may also be considered for both recognizing the gesture and modifying the parameters 133 associated with the gesture, as appropriate. Although the unsuccessful gesture attempts may not be recognized in time to complete the intended action, they may nevertheless provide additional data that can improve future gesture recognition in some embodiments. Other embodiments may consider any number of alternate or additional features or gesture recognition techniques, as desired.
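- A retry heuristic along these lines might, once a gesture is finally recognized, look back through recently rejected strokes and treat those with a similar net displacement as earlier attempts at the same gesture. The following sketch is hypothetical; the similarity test and time window are illustrative choices, not taken from the description.

```python
def prior_attempts(recognized, rejected_history, now, max_age_s=5.0, tol=0.5):
    """Pick rejected strokes that look like earlier tries at the recognized gesture.

    recognized: list of (x, y) samples for the stroke that finally matched.
    rejected_history: list of (timestamp, samples) for strokes that matched nothing.
    """
    def net(stroke):  # net displacement vector of a stroke
        return (stroke[-1][0] - stroke[0][0], stroke[-1][1] - stroke[0][1])

    dx, dy = net(recognized)
    attempts = []
    for ts, stroke in rejected_history:
        if now - ts > max_age_s or len(stroke) < 2:
            continue  # too old or too short to compare
        rx, ry = net(stroke)
        if abs(dx - rx) + abs(dy - ry) <= tol:  # crude similarity of displacement
            attempts.append(stroke)
    return attempts
```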
- When a gesture is recognized based upon the factors other than the then-current parameters 133 (function 208), then the parameters 133 may be adapted as desired (function 210) to more closely correspond with the user's intended actions. In the horizontal swipe gesture described with respect to FIG. 1, for example, a movement 115 that extends for a shorter distance 116 than the parameter distance 114 could result in changing the parameter distance to a value that is more in line with user expectations. Similarly, a movement along path 117 could result in changing the parameters 133 that define region 112, as appropriate.
- Various embodiments may not necessarily adapt parameters 133 upon each recognized gesture, but may instead make adaptations based upon repetitive behaviors observed over time. If a user is consistently observed to make shorter movements 115 instead of intended movements 113, for example, the distance parameter may be modified only after a threshold number of intended gestures have been recognized.
- The amount of adaptation may be similarly determined in any manner. Various embodiments may adjust spatial or temporal parameters (e.g., distance 114, or the layout of region 112, time between clicks or button presses, etc.) based upon averages or other combinations of data obtained from multiple gesture attempts. Some embodiments may consider data from successful as well as unsuccessful gesture recognition. Various embodiments may apply any sort of adaptation algorithms or heuristics that could modify the occurrence, magnitude and/or frequency of parameter updates as desired.
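- As one concrete, purely illustrative instance of such a heuristic, the distance parameter could be blended toward the user's average observed travel, and only after a threshold number of near-miss recognitions has accumulated:

```python
def adapt_min_distance(current, observed_travels, threshold=5, blend=0.5):
    """Relax distance 114 toward the user's average travel once evidence suffices.

    current: the present min_distance value from parameters 133.
    observed_travels: travels from strokes recognized only via other factors.
    """
    if len(observed_travels) < threshold:
        return current  # not enough repetition yet; leave the parameter unchanged
    average = sum(observed_travels) / len(observed_travels)
    # Move partway toward the observed average instead of jumping to it outright.
    return (1 - blend) * current + blend * average

print(adapt_min_distance(1.0, [0.8, 0.75, 0.82, 0.78, 0.8]))  # -> 0.895
```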
parameters 133 are appropriately stored for subsequent use or analysis (function 212). In some embodiments, the adaptedparameters 133 are stored locally at computing system 104 (e.g., inmemory 126, disk orother storage 123, or elsewhere) for retrieval and use during subsequent operations. In other embodiments, the adaptedparameters 133 are stored remotely (e.g., on server 106) so that the user may obtain his or her customized settings even while usingother computing systems 104, or to permit analysis of the gestures attempted by multiple users so that default or other settings ofparameters 133 can be improved. - To that end, some embodiments may additionally or alternately allow
- To that end, some embodiments may additionally or alternately allow server 106 to receive adjusted parameters 133 and/or user inputs 132 from any number of users, input devices 102 and/or computing systems 104. Server 106 may therefore analyze empirical data obtained from multiple users to thereby generate improved default or other values for parameters 133.
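- The following sketch illustrates how such a server might aggregate reported parameters into improved defaults; taking the per-parameter median is an assumed aggregation rule chosen for this example because it resists outliers, and a production service might instead weight users by how much gesture data each contributed.

```python
from statistics import median

def improved_defaults(per_user_params):
    """Derive new default values from parameter dictionaries reported by
    many users; parameters absent from a given report are simply skipped."""
    keys = set().union(*per_user_params)
    return {k: median(p[k] for p in per_user_params if k in p) for k in keys}

reports = [
    {"min_distance": 180.0, "double_click_ms": 320.0},
    {"min_distance": 210.0, "double_click_ms": 290.0},
    {"min_distance": 195.0},
]
print(improved_defaults(reports))
# {'min_distance': 195.0, 'double_click_ms': 305.0} (key order may vary)
```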
- As noted above, functions 208 and 210 may be performed at a remotely-located server 106 based upon user inputs 132 received from computing system 104. While such analysis may not necessarily support real-time recognition of intended gestures, it would nevertheless allow the initial parameters obtained at function 201 to be improved over time. The adapted parameters 133 could then be returned to computing system 104 as part of function 201, or otherwise as appropriate. The parameters 133 that are obtained in function 201 could be determined solely based upon inputs received from a particular user or a particular computing system, thereby optimizing gesture recognition for that user. Alternately, the obtained parameters 133 could be based upon information obtained from multiple users operating multiple computing systems 104, as appropriate. Parameters may be adapted based upon individual user behaviors and/or the shared behaviors of multiple users, as desired. The resulting adapted parameters 133 can then be shared from server 106 with any number of users, computing systems 104 and/or input devices 102 as desired. To that end, function 201 in some embodiments could involve obtaining or updating initial parameters 133 from server 106 or elsewhere on any regular or irregular temporal basis, or as otherwise needed.
- While the foregoing detailed description will provide those skilled in the art with a convenient road map for implementing various embodiments of the invention, it should be appreciated that the particular embodiments described above are only examples, and are not intended to limit the scope, applicability, or configuration of the invention in any way. To the contrary, various changes may be made in the function and arrangement of elements described without departing from the scope of the invention.
Claims (20)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US15/714,957 US20180011619A1 (en) | 2010-12-27 | 2017-09-25 | Systems and methods for adaptive gesture recognition |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US12/978,949 US9785335B2 (en) | 2010-12-27 | 2010-12-27 | Systems and methods for adaptive gesture recognition |
US15/714,957 US20180011619A1 (en) | 2010-12-27 | 2017-09-25 | Systems and methods for adaptive gesture recognition |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US12/978,949 Continuation US9785335B2 (en) | 2010-12-27 | 2010-12-27 | Systems and methods for adaptive gesture recognition |
Publications (1)
Publication Number | Publication Date |
---|---|
US20180011619A1 (en) | 2018-01-11
Family
ID=45464837
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US12/978,949 Active 2031-05-15 US9785335B2 (en) | 2010-12-27 | 2010-12-27 | Systems and methods for adaptive gesture recognition |
US15/714,957 Abandoned US20180011619A1 (en) | 2010-12-27 | 2017-09-25 | Systems and methods for adaptive gesture recognition |
Family Applications Before (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US12/978,949 Active 2031-05-15 US9785335B2 (en) | 2010-12-27 | 2010-12-27 | Systems and methods for adaptive gesture recognition |
Country Status (5)
Country | Link |
---|---|
US (2) | US9785335B2 (en) |
EP (1) | EP2659346A1 (en) |
CA (1) | CA2822812C (en) |
MX (1) | MX2013007206A (en) |
WO (1) | WO2012091862A1 (en) |
Families Citing this family (19)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10222974B2 (en) | 2011-05-03 | 2019-03-05 | Nokia Technologies Oy | Method and apparatus for providing quick access to device functionality |
US8751972B2 (en) * | 2011-09-20 | 2014-06-10 | Google Inc. | Collaborative gesture-based input language |
US9645733B2 (en) * | 2011-12-06 | 2017-05-09 | Google Inc. | Mechanism for switching between document viewing windows |
US8977967B2 (en) * | 2012-05-11 | 2015-03-10 | Microsoft Technology Licensing, Llc | Rules for navigating to next content in a browser |
US9696879B2 (en) | 2012-09-07 | 2017-07-04 | Google Inc. | Tab scrubbing using navigation gestures |
US10908929B2 (en) * | 2012-10-15 | 2021-02-02 | Famous Industries, Inc. | Human versus bot detection using gesture fingerprinting |
US10877780B2 (en) | 2012-10-15 | 2020-12-29 | Famous Industries, Inc. | Visibility detection using gesture fingerprinting |
US11386257B2 (en) | 2012-10-15 | 2022-07-12 | Amaze Software, Inc. | Efficient manipulation of surfaces in multi-dimensional space using energy agents |
US9501171B1 (en) * | 2012-10-15 | 2016-11-22 | Famous Industries, Inc. | Gesture fingerprinting |
FR2999316A1 (en) * | 2012-12-12 | 2014-06-13 | Sagemcom Broadband Sas | DEVICE AND METHOD FOR RECOGNIZING GESTURES FOR USER INTERFACE CONTROL |
JP5783385B2 (en) * | 2013-02-27 | 2015-09-24 | Casio Computer Co., Ltd. | Data processing apparatus and program |
US9699019B2 (en) | 2013-06-14 | 2017-07-04 | Microsoft Technology Licensing, Llc | Related content display associated with browsing |
US9542004B1 (en) * | 2013-09-24 | 2017-01-10 | Amazon Technologies, Inc. | Gesture-based flash |
US20150261659A1 (en) * | 2014-03-12 | 2015-09-17 | Bjoern BADER | Usability testing of applications by assessing gesture inputs |
US10613642B2 (en) * | 2014-03-12 | 2020-04-07 | Microsoft Technology Licensing, Llc | Gesture parameter tuning |
US20150277696A1 (en) * | 2014-03-27 | 2015-10-01 | International Business Machines Corporation | Content placement based on user input |
CN111417957B (en) * | 2018-01-03 | 2023-10-27 | Sony Semiconductor Solutions Corporation | Gesture recognition using mobile device |
CN112104915B (en) * | 2020-09-14 | 2022-08-26 | Tencent Technology (Shenzhen) Co., Ltd. | Video data processing method and device and storage medium |
CN115113751A (en) * | 2021-03-18 | 2022-09-27 | Huawei Technologies Co., Ltd. | Method and device for adjusting numerical range of recognition parameter of touch gesture |
Family Cites Families (35)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5471578A (en) * | 1993-12-30 | 1995-11-28 | Xerox Corporation | Apparatus and method for altering enclosure selections in a gesture based input system |
DE69426919T2 (en) * | 1993-12-30 | 2001-06-28 | Xerox Corp | Apparatus and method for performing many chaining command gestures in a gesture user interface system |
US5509114A (en) * | 1993-12-30 | 1996-04-16 | Xerox Corporation | Method and apparatus for correcting and/or aborting command gestures in a gesture based input system |
JPH086707A (en) * | 1993-12-30 | 1996-01-12 | Xerox Corp | Screen-directivity-display processing system |
US5880743A (en) * | 1995-01-24 | 1999-03-09 | Xerox Corporation | Apparatus and method for implementing visual animation illustrating results of interactive editing operations |
US7401299B2 (en) * | 2001-09-05 | 2008-07-15 | Autodesk, Inc. | Method and apparatus for providing a presumptive drafting solution |
JPH0991082A (en) | 1995-09-21 | 1997-04-04 | Canon Inc | Information processor and method therefor and storage medium |
US5966460A (en) | 1997-03-03 | 1999-10-12 | Xerox Corporation | On-line learning for neural net-based character recognition systems |
US6859909B1 (en) * | 2000-03-07 | 2005-02-22 | Microsoft Corporation | System and method for annotating web-based documents |
US7299424B2 (en) * | 2002-05-14 | 2007-11-20 | Microsoft Corporation | Lasso select |
US7532196B2 (en) * | 2003-10-30 | 2009-05-12 | Microsoft Corporation | Distributed sensing techniques for mobile devices |
US7454717B2 (en) * | 2004-10-20 | 2008-11-18 | Microsoft Corporation | Delimiters for selection-action pen gesture phrases |
US7636794B2 (en) * | 2005-10-31 | 2009-12-22 | Microsoft Corporation | Distributed sensing techniques for mobile devices |
WO2009072736A1 (en) | 2007-12-03 | 2009-06-11 | Electronics And Telecommunications Research Institute | User adaptive gesture recognition method and user adaptive gesture recognition system |
US8286080B2 (en) * | 2008-07-24 | 2012-10-09 | Cisco Technology, Inc. | User navigation via vectors dynamically mapped to distinct media attributes |
JP5100556B2 (en) * | 2008-07-30 | 2012-12-19 | キヤノン株式会社 | Information processing method and apparatus |
US8325978B2 (en) | 2008-10-30 | 2012-12-04 | Nokia Corporation | Method, apparatus and computer program product for providing adaptive gesture analysis |
US20100185949A1 (en) * | 2008-12-09 | 2010-07-22 | Denny Jaeger | Method for using gesture objects for computer control |
WO2010071630A1 (en) * | 2008-12-15 | 2010-06-24 | Hewlett-Packard Development Company, L.P. | Gesture based edit mode |
US8212788B2 (en) * | 2009-05-07 | 2012-07-03 | Microsoft Corporation | Touch input to modulate changeable parameter |
US20100315266A1 (en) | 2009-06-15 | 2010-12-16 | Microsoft Corporation | Predictive interfaces with usability constraints |
US9372614B2 (en) * | 2009-07-09 | 2016-06-21 | Qualcomm Incorporated | Automatic enlargement of viewing area with selectable objects |
US9311309B2 (en) * | 2009-08-05 | 2016-04-12 | Robert Bosch Gmbh | Entertainment media visualization and interaction method |
US8514188B2 (en) * | 2009-12-30 | 2013-08-20 | Microsoft Corporation | Hand posture mode constraints on touch input |
US8587422B2 (en) * | 2010-03-31 | 2013-11-19 | Tk Holdings, Inc. | Occupant sensing system |
DE102011006448A1 (en) * | 2010-03-31 | 2011-10-06 | Tk Holdings, Inc. | steering wheel sensors |
DE102011006649B4 (en) * | 2010-04-02 | 2018-05-03 | Tk Holdings Inc. | Steering wheel with hand sensors |
US20110291964A1 (en) * | 2010-06-01 | 2011-12-01 | Kno, Inc. | Apparatus and Method for Gesture Control of a Dual Panel Electronic Device |
US20110298720A1 (en) * | 2010-06-02 | 2011-12-08 | Rockwell Automation Technologies, Inc. | System and method for the operation of a touch screen |
DE112011101209T5 (en) * | 2010-09-24 | 2013-01-17 | Qnx Software Systems Ltd. | Alert Display on a portable electronic device |
DE112011101203T5 (en) * | 2010-09-24 | 2013-01-17 | Qnx Software Systems Ltd. | Portable electronic device and method for its control |
WO2012050251A1 (en) * | 2010-10-14 | 2012-04-19 | LG Electronics Inc. | Mobile terminal and method for controlling same |
US20120092286A1 (en) * | 2010-10-19 | 2012-04-19 | Microsoft Corporation | Synthetic Gesture Trace Generator |
US20120131513A1 (en) * | 2010-11-19 | 2012-05-24 | Microsoft Corporation | Gesture Recognition Training |
US8704792B1 (en) * | 2012-10-19 | 2014-04-22 | Google Inc. | Density-based filtering of gesture events associated with a user interface of a computing device |
- 2010
  - 2010-12-27 US US12/978,949 patent/US9785335B2/en active Active
- 2011
  - 2011-12-06 EP EP11806021.9A patent/EP2659346A1/en not_active Ceased
  - 2011-12-06 WO PCT/US2011/063575 patent/WO2012091862A1/en active Application Filing
  - 2011-12-06 CA CA2822812A patent/CA2822812C/en active Active
  - 2011-12-06 MX MX2013007206A patent/MX2013007206A/en active IP Right Grant
- 2017
  - 2017-09-25 US US15/714,957 patent/US20180011619A1/en not_active Abandoned
Patent Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20090231295A1 (en) * | 2008-03-14 | 2009-09-17 | France Telecom | System for classifying gestures |
US20130120279A1 (en) * | 2009-11-20 | 2013-05-16 | Jakub Plichta | System and Method for Developing and Classifying Touch Gestures |
US20120016641A1 (en) * | 2010-07-13 | 2012-01-19 | Giuseppe Raffa | Efficient gesture processing |
US20120056818A1 (en) * | 2010-09-03 | 2012-03-08 | Microsoft Corporation | Dynamic gesture parameters |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11190674B2 (en) | 2018-09-07 | 2021-11-30 | John Robert Mortensen | Remote camera trigger |
Also Published As
Publication number | Publication date |
---|---|
US20120167017A1 (en) | 2012-06-28 |
MX2013007206A (en) | 2013-11-22 |
CA2822812A1 (en) | 2012-07-05 |
WO2012091862A1 (en) | 2012-07-05 |
EP2659346A1 (en) | 2013-11-06 |
US9785335B2 (en) | 2017-10-10 |
CA2822812C (en) | 2017-01-03 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20180011619A1 (en) | Systems and methods for adaptive gesture recognition | |
US11314407B2 (en) | Device, method, and graphical user interface for providing feedback for changing activation states of a user interface object | |
US11221752B2 (en) | Character recognition on a computing device | |
US9298266B2 (en) | Systems and methods for implementing three-dimensional (3D) gesture based graphical user interfaces (GUI) that incorporate gesture reactive interface objects | |
US7849421B2 (en) | Virtual mouse driving apparatus and method using two-handed gestures | |
EP2756369B1 (en) | Soft keyboard interface | |
US20170329487A1 (en) | Computer with graphical user interface for interaction | |
US20200218356A1 (en) | Systems and methods for providing dynamic haptic playback for an augmented or virtual reality environments | |
US20100194702A1 (en) | Signal processing apparatus, signal processing method and selection method of uer interface icon for multi-touch panel | |
JP2013539580A (en) | Method and apparatus for motion control on device | |
EP2356553A1 (en) | Methods and apparatuses for facilitating interaction with touch screen apparatuses | |
CN108920066B (en) | Touch screen sliding adjustment method and device and touch equipment | |
US20140298223A1 (en) | Systems and methods for drawing shapes and issuing gesture-based control commands on the same draw grid | |
CN102768595B (en) | A kind of method and device identifying touch control operation instruction on touch-screen | |
CN107797722A (en) | Touch screen icon selection method and device | |
CN107390917A (en) | A kind of display methods, device, mobile terminal and the storage medium of touch-control bar | |
KR101699026B1 (en) | System and method for providing user interface | |
CN104516566A (en) | Handwriting input method and device | |
CN108132721B (en) | Method for generating drag gesture, touch device and portable electronic equipment | |
US10133346B2 (en) | Gaze based prediction device and method | |
JP5449088B2 (en) | Information input device | |
US10318147B2 (en) | Method and system of gesture recognition in touch display device | |
US9778822B2 (en) | Touch input method and electronic apparatus thereof | |
KR20130065331A (en) | Apparatus and method for controlling mobile device based on gesture recognition, and program storing medium for excuting the method | |
WO2014179948A1 (en) | Method and apparatus for distinguishing swipe gesture from character input provided via touch-sensitive keyboard |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| STPP | Information on status: patent application and granting procedure in general | Free format text: NON FINAL ACTION MAILED |
| STPP | Information on status: patent application and granting procedure in general | Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
| STPP | Information on status: patent application and granting procedure in general | Free format text: FINAL REJECTION MAILED |
| STPP | Information on status: patent application and granting procedure in general | Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |
| STPP | Information on status: patent application and granting procedure in general | Free format text: NON FINAL ACTION MAILED |
| STPP | Information on status: patent application and granting procedure in general | Free format text: FINAL REJECTION MAILED |
| STPP | Information on status: patent application and granting procedure in general | Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |
| STPP | Information on status: patent application and granting procedure in general | Free format text: NON FINAL ACTION MAILED |
| STPP | Information on status: patent application and granting procedure in general | Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
| STPP | Information on status: patent application and granting procedure in general | Free format text: FINAL REJECTION MAILED |
| STPP | Information on status: patent application and granting procedure in general | Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |
| STPP | Information on status: patent application and granting procedure in general | Free format text: NON FINAL ACTION MAILED |
| STPP | Information on status: patent application and granting procedure in general | Free format text: FINAL REJECTION MAILED |
| STCB | Information on status: application discontinuation | Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |