US20110043475A1 - Method and system of identifying a user of a handheld device - Google Patents
- Publication number
- US20110043475A1 (application US 12/988,745; US98874509A)
- Authority
- US
- United States
- Prior art keywords
- user
- electronic device
- handheld electronic
- trajectory
- user identification
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/41—Structure of client; Structure of client peripherals
- H04N21/422—Input-only peripherals, i.e. input devices connected to specially adapted client devices, e.g. global positioning system [GPS]
- H04N21/42204—User interfaces specially adapted for controlling a client device through a remote control device; Remote control devices therefor
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F3/04883—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
-
- G—PHYSICS
- G08—SIGNALLING
- G08C—TRANSMISSION SYSTEMS FOR MEASURED VALUES, CONTROL OR SIMILAR SIGNALS
- G08C17/00—Arrangements for transmitting signals characterised by the use of a wireless electrical link
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/41—Structure of client; Structure of client peripherals
- H04N21/422—Input-only peripherals, i.e. input devices connected to specially adapted client devices, e.g. global positioning system [GPS]
- H04N21/42204—User interfaces specially adapted for controlling a client device through a remote control device; Remote control devices therefor
- H04N21/42206—User interfaces specially adapted for controlling a client device through a remote control device; Remote control devices therefor characterized by hardware details
- H04N21/42208—Display device provided on the remote control
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/41—Structure of client; Structure of client peripherals
- H04N21/422—Input-only peripherals, i.e. input devices connected to specially adapted client devices, e.g. global positioning system [GPS]
- H04N21/42204—User interfaces specially adapted for controlling a client device through a remote control device; Remote control devices therefor
- H04N21/42206—User interfaces specially adapted for controlling a client device through a remote control device; Remote control devices therefor characterized by hardware details
- H04N21/42222—Additional components integrated in the remote control device, e.g. timer, speaker, sensors for detecting position, direction or movement of the remote control, microphone or battery charging device
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/41—Structure of client; Structure of client peripherals
- H04N21/422—Input-only peripherals, i.e. input devices connected to specially adapted client devices, e.g. global positioning system [GPS]
- H04N21/42204—User interfaces specially adapted for controlling a client device through a remote control device; Remote control devices therefor
- H04N21/42206—User interfaces specially adapted for controlling a client device through a remote control device; Remote control devices therefor characterized by hardware details
- H04N21/42224—Touch pad or touch panel provided on the remote control
-
- G—PHYSICS
- G08—SIGNALLING
- G08C—TRANSMISSION SYSTEMS FOR MEASURED VALUES, CONTROL OR SIMILAR SIGNALS
- G08C2201/00—Transmission systems of control signals via wireless link
- G08C2201/30—User interface
Definitions
- the present invention relates generally to a method and system for identifying a user of a handheld device, e.g. remote control systems. Many systems would benefit from easy, non-intrusive user identification. For reference only, many aspects of the invention and the background relating to the invention are described in relation to a remote control system, suitable for control of consumer electronic products and home appliances, that includes a touch-sensitive handheld remote control unit that detects holding and grabbing patterns of the user, as well as other characteristics such as the trajectory along which the user raises the remote and the way the user first touches the remote, to identify the user.
- Handheld remote control units, typically featuring a large plurality of push buttons, are now quite commonplace on coffee tables throughout the world. With most consumer electronic products, it is customary for the manufacturer to furnish such a handheld remote control with each unit. Thus, most consumers own a collection of different remote control units, each associated with a particular product or appliance.
- a touch-sensitive remote control unit that features a reduced number of push buttons and one or more touch-sensitive touchpads that may be manipulated by the user's fingers or thumb to interact with information on a display screen.
- the touch pads may be manipulated, for example, to move a selection indicator (such as a cursor or other graphical element) across a control region upon a display screen.
- the display screen will be separate from the handheld remote control unit, and thus the user manipulates the selection indicator by watching the display screen while manipulating the keypad with a finger or thumb.
- the touchpad or touchpads are disposed on the remote control unit so that they can be manipulated by the user's thumb while the user is holding the unit in one hand.
- the remote control has touch sensitive sensors on its outer casing sensitive to a user's touch.
- the methods range from the grab/hold patterns by which the user holds a remote, to the trajectory that the remote follows when grabbed by the user.
- the present invention relates to a system and method for identifying a user of a handheld device.
- the handheld electronic device comprises a housing and a sensor system disposed along a periphery of the housing.
- the sensor system is responsive to a plurality of simultaneous points of contact between a user's hand and the device to generate observation signals indicative of the plurality of contact points between the user's hand and the device.
- the handheld electronic device further includes a user identification database storing data corresponding to attributes of a plurality of known users, wherein the attributes of the plurality of known users are used to identify a user.
- the device further comprises a user identification module configured to receive the observation signals from the sensor system and identify the user from the observation signals and the attributes of the plurality of users.
- the present invention relates to a handheld electronic device comprising a housing and a touchpad responsive to a finger movement of a user that generates a touchpad signal corresponding to the finger movement.
- the device further includes a touchpad processing module that receives the touchpad signal and generates finger movement data based on said touchpad signal.
- the handheld electronic device further includes a user identification database that stores user identification data corresponding to physical attributes of a plurality of known users, wherein the physical attributes include finger movement of a user drawing a predetermined shape.
- the device is further comprised of a user identification module that receives finger movement data of the user and identifies the user based on the finger movement data and the user identification data, wherein the finger movement data is the user drawing the predefined shape.
- a handheld electronic device is comprised of a housing and a touch sensor system disposed along a periphery of the housing.
- the touch sensor system is responsive to a plurality of simultaneous points of contact between a user's hand and the device to generate observation signals indicative of the plurality of contact points between the user's hand and the device.
- the device also includes a touch sensor processing module configured to receive the observation signals from the touch sensor system and determine a user's holding pattern.
- the device is further comprised of an inertial sensor embedded in the housing which is responsive to movement of the device by the user's hand to generate inertial signals and a trajectory module configured to receive the inertial signals from the inertial sensor and determine a trajectory for the movement of the device.
- the device also includes a touchpad located along an external surface of the housing that is responsive to the user's finger movement along the external surface of the touchpad to generate touchpad signals and a touchpad processing module that receives the touchpad signals and determines user finger movement data.
- the device further includes a user identification database storing data corresponding to attributes of a plurality of known users, wherein the attributes of the plurality of the known users are used to identify a user, and wherein the attributes include holding patterns of the plurality of known users, trajectories corresponding to movement of the device by each of the plurality of known users, and user finger movement data of the plurality of known users.
- the device is further comprised of a user identification module configured to receive identification information of the user and identify the user based on the identification information and the attributes of the plurality of known users, wherein the identification information includes the user's holding pattern, the user's trajectory for movement of the device, and the user's finger movement data.
- FIG. 1 illustrates an exemplary remote control system for an electronic product having a display screen and having a handheld remote control unit that includes at least one touchpad disposed for actuation by a user's thumb;
- FIGS. 2A and 2B are exemplary views of a touchpad surface, useful in understanding how a user's hand size can affect usability of the touchpad surface;
- FIG. 3 is a schematic representation of a remote control unit having plural touchpads and an array of capacitive sensors about the periphery of the remote control unit;
- FIG. 4 is a system level architecture of a user identification system
- FIG. 5 is a diagram of the architecture of the user identification module
- FIG. 6 is an exemplary view of the points of contact between the array of touch sensitive sensors on the device and a user's hand;
- FIG. 7 is a flow diagram of an exemplary method for identifying a user based on the way the user grabs the handheld device
- FIG. 8 is an exemplary view of two different trajectories corresponding to two different users
- FIG. 9 is a flow diagram of an exemplary method for identifying a user based on the trajectory corresponding to the movement of the remote control
- FIG. 10 is an exemplary view of a user touching the touchpad of the remote control and the corresponding thumb vector
- FIG. 11 is a flow diagram of an exemplary method for identifying a user based on the user's first touch of the touchpad
- FIG. 12 is an exemplary view of a user drawing a circle on the touchpad of the remote control
- FIG. 13 is a flow diagram of an exemplary method for identifying a user based on the user drawing a shape on the touchpad;
- FIG. 14 is a diagram illustrating the combination of various user identification methods to arrive at a user identification;
- FIG. 15 is a flow diagram depicting various statistical learning and data mining techniques used in performing user identification
- FIG. 16 is a flow diagram depicting an exemplary method of performing unsupervised learning of users.
- FIG. 17 is an exemplary view of two clusters corresponding to two different users, and a user to be identified in relation to the two clusters.
- a system and method for user identification is herein disclosed.
- the system combines one or more user identification techniques to authenticate and/or identify a user.
- the techniques may be passive techniques such as identifying a user by the way which the user grabs the handheld device, the trajectory that the device follows when the user picks up the remote, the user's first touch of the device, and the user's heartbeat.
- the techniques may also be semi-passive, such as having the user draw a shape, e.g. a circle, on a touch-sensitive surface of the handheld device.
- the techniques are first explained as applied to a remote control 12 that may be used with a television, a set-top box, a computer, an entertainment center or other host device. It will be apparent that the techniques also apply to other handheld devices where user identification would benefit the user. Such applications are described in greater detail below.
- the remote control system includes a handheld remote control 12 that sends control instructions, preferably wirelessly, to an electronic product 14 having a display screen 16 .
- the remote control 12 includes a complement of push buttons 18 and a pair of touchpads 20 .
- the remote control 12 unit is bilaterally symmetrical so that it will function in the same way regardless of which touchpad is proximate the user's thumb.
- the handheld remote control 12 has an orientation sensor (not shown) to detect in what orientation the unit is being held.
- any type of communication interface between the handheld remote control 12 unit and the electronic product can be utilized.
- a wireless transmitting device shown diagrammatically at 24 and a wireless receiving device, shown diagrammatically at 22 , are illustrated.
- wireless communication can be accomplished using infrared, ultrasonic and radio frequencies, and further utilizing a variety of different communication protocols, including infrared communication protocols, Bluetooth, WiFi, and the like.
- Communication can be unilateral (from remote control unit 12 to electronic product 14 ) or bilateral.
- a control region is defined on the screen, within which a user-controlled selection indicator may be visually displayed.
- a visual facsimile of the remote control 12 unit itself is displayed on a display screen 16 as at 26 .
- a user-controlled selection indicator in the form of a graphical depiction of the user's thumb 30 is displayed. Movement of the user's thumb upon touchpad 20 causes corresponding movement of the selection indicator 30 .
- Regions on the touchpad 20 are mapped one-to-one onto the control region of the screen.
- the typical computer track pad does not employ such one-to-one relationship, but rather it uses a relative mapping to mimic performance of a computer mouse which can be lifted and then repositioned.
- the system herein disclosed may be used, for example, to identify a mapping for the remote based on the user identification.
- although the illustrated embodiment uses a one-to-one mapping between the touchpad surface and the control region, this mapping may be altered to accommodate the hand size characteristics of the user.
- In FIGS. 2A and 2B , an exemplary pattern of numbers and letters has been illustrated on the touchpad, in the mapped positions where they would be most easily accessible to a person with a small hand ( FIG. 2A ) and a large hand ( FIG. 2B ). Compare these mapped locations with the corresponding locations on the control region 26 ( FIG. 1 ).
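- The mapping adjustment described above can be illustrated with a short sketch. The code below is purely hypothetical (the function name, screen size and reach parameters are not from the patent); it shows one way a one-to-one touchpad-to-control-region mapping might be rescaled once the identified user's reachable thumb area is known.

```python
# Hypothetical sketch: rescale the one-to-one touchpad mapping so the reachable
# area of the pad (smaller for small hands) still covers the full control region.

def map_touch_to_control(x, y, reach_w, reach_h, screen_w=1920, screen_h=1080):
    """Map a touchpad contact (x, y) to control-region coordinates.

    reach_w / reach_h: the portion of the pad the identified user's thumb can
    comfortably cover (estimated from hand size); assumed to start at (0, 0).
    """
    # Clamp the contact to the reachable area, then scale it to the screen.
    x = min(max(x, 0.0), reach_w)
    y = min(max(y, 0.0), reach_h)
    return (x / reach_w) * screen_w, (y / reach_h) * screen_h

# A small-handed user (reach 30x20 mm) and a large-handed user (reach 45x30 mm)
# both reach the far corner of the control region at full thumb extension.
print(map_touch_to_control(30, 20, reach_w=30, reach_h=20))   # (1920.0, 1080.0)
print(map_touch_to_control(30, 20, reach_w=45, reach_h=30))   # (1280.0, 720.0)
```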
- the user identification may be used to configure other aspects of the host device controlled by the remote control. For example, if the host device is a set-top box for a television, the user identification may be used to restrict access to certain channels for certain users. Additionally, a list of pre-programmed favorite channels or settings may be loaded onto the set-top box. More examples are provided below.
- the remote control 12 is diagrammatically depicted with two touchpads 20 .
- a capacitive touch array is depicted at 40 . It is envisioned that other touch sensitive sensors may be used in combination with or instead of the capacitive sensors in a capacitive touch array.
- a sensor that provides a resistance relative to the contact points between the user and the device may be used.
- Such sensors are currently being developed and provide a higher dimensional data set, which is advantageous when identifying a user out of many users.
- These electrode matrix sensors have one sensor that transmits a signal, e.g. an electric current, and a plurality of receptors that receive the signal via the user's hand (or other body part).
- the transmitter and the plurality of receptors are placed along the exterior surface of the remote control 12 .
- the plurality of receptors are oriented spatially around the transmitter. The distance between each receptor and the transmitter is known, as is the current and voltage of the transmitted electrical signal.
- each transmission increases the dimensionality of the data by a factor of X, where X is the ratio of receptors to a transmitter, as each receptor will generate a resistance value.
- transfer functions may be performed on the transmitted signals resulting in the communications between the transmitters and the corresponding receptors to further increase the dimensionality.
- the remote may also include acoustic (not shown) and optical sensors (not shown), inertial sensors (not shown), a pulse oximeter (not shown) and thermal sensors (not shown).
- the sensors may receive signals from a user holding the remote control 12 with one hand or with two hands.
- the user may hold the remote control 12 in an operative position with one hand.
- the user may also hold the remote control 12 with both hands, much like a video game controller, for purposes of identification.
- FIG. 4 illustrates possible identification inputs and possible data used to identify a user.
- the remote control 12 or handheld device may identify a user based on a number of inputs.
- the remote control 12 uses data from sensors 52 - 60 that may be used for other functional applications to identify the users.
- the data is received and processed by the user identification module 50 .
- the user identification module will access a user identification database 64 containing data specific to each known user.
- the types of data may include one or more of the following: hold/grab patterns 64 received from touch sensitive sensors 52 ; trajectory data 66 received from inertial/motion sensors such as accelerometers and gyroscopes; heartbeat data 68 received from an acoustic sensor 58 or other types of sensors; face or torso data 70 received from an optical sensor 56 ; first touch data 72 received from a touch pad sensor 20 ; and arcuate data 74 received from the touchpad sensor 20 . It is understood that the lists of sensors and data types are not limiting; it is envisioned that other types of data may be received from the specified sensors and that other types of sensors may receive the listed data or inputs. Furthermore, it is envisioned that as little as one input type may be used to identify a user, or any combination of inputs may be used to identify the user.
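- The patent does not specify how the user identification database is organized; the following is a minimal, hypothetical sketch of per-user attribute records covering the data types listed above (all names are illustrative).

```python
from dataclasses import dataclass, field

@dataclass
class UserRecord:
    """Hypothetical per-user entry in the user identification database."""
    name: str
    hold_patterns: list = field(default_factory=list)      # touch-array vectors
    trajectories: list = field(default_factory=list)       # pickup trajectories
    first_touches: list = field(default_factory=list)      # thumb vectors
    arcuate_samples: list = field(default_factory=list)    # circle-drawing traces
    heartbeat_stats: list = field(default_factory=list)    # (frequency, amplitude) pairs

# The database itself can be as simple as a name-keyed mapping.
user_db = {
    "alice": UserRecord("alice"),
    "bob": UserRecord("bob"),
}
user_db["alice"].hold_patterns.append([0, 1, 1, 0, 1, 0, 0, 1])  # example sensor bitmap
```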
- FIG. 5 is a detailed depiction of an exemplary user identification module 50 .
- User identification module 50 may have a processing module for each type of input.
- user identification module 50 may include a touch processing module 80 for processing data from the touch sensors 52 ; a motion processing module 82 for processing data from the motion and inertial sensors 54 ; an optical processing module 84 for processing data from the optical sensors 56 ; an acoustic processing module 86 for processing data from the acoustic sensors 58 and a touchpad processing module 88 for processing data from the touchpad 60 . Greater detail of each type of data and its respective processing module are described below.
- remote control 12 may not have the processing power to handle such calculations. If this is the case, then remote control 12 may communicate the input data to the recognition module 50 residing on the host device.
- the various processing modules receive raw data from the respective sensor and process the data to a form that may be used by recognition module 90 , which may use one or more of a k-means clustering method, a support vector machines (SVM) method, a hidden Markov model method, and a linear regression method to identify a user based on the processed data.
- the various sensors will produce high dimensional data sets, which is beneficial for identification.
- User identification module 50 may also perform feature extraction on the input data set, so that the high dimensional data set can be more easily processed.
- Dimensionality reduction methods such as principal component analysis (PCA) and Isomap may be implemented by the recognition module.
- the recognition module 90 uses the processed data and the datasets contained in user identification database 62 to classify the user to be identified via various statistical learning and/or clustering methods. The recognition module 90 will then determine the user whose feature data, e.g. hold/grab, trajectory, or first touch, most resembles the input data received from the various sensors. A confidence score may also be associated with the user identification. Furthermore, recognition module 90 may generate a list of the n-nearest matches in the user identification database 62 . Additionally, as is described later, recognition module 90 may generate a user identification.
- Recognition module 90 may take as parameters, a data set and data type. Based on the data type, recognition module 90 may select which learning or clustering technique to use and which data types to access in user identification database 62 .
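- A hedged sketch of that dispatch follows, reusing the hypothetical UserRecord layout above. The nearest-neighbor matcher and inverse-distance confidence are stand-ins chosen for brevity, not the statistical methods named in the patent (k-means, SVM, hidden Markov models, linear regression).

```python
import math

def nearest_match(sample, reference):
    """Return (user, confidence) for the stored vector closest to `sample`.
    Confidence is a simple inverse-distance score, purely illustrative."""
    best_user, best_dist = None, float("inf")
    for user, vectors in reference.items():
        for vec in vectors:
            dist = math.dist(sample, vec)
            if dist < best_dist:
                best_user, best_dist = user, dist
    return best_user, 1.0 / (1.0 + best_dist)

# Hypothetical dispatch: data type -> which database attribute to compare against.
DISPATCH = {
    "hold": "hold_patterns",
    "trajectory": "trajectories",
    "first_touch": "first_touches",
}

def recognize(sample, data_type, user_db):
    """Select the relevant slice of the database based on the data type, then match."""
    attr = DISPATCH[data_type]
    reference = {name: getattr(rec, attr) for name, rec in user_db.items()}
    return nearest_match(sample, reference)
```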
- FIG. 6 illustrates an example of a user grabbing a remote.
- some of the individual elements of the touch array 102 and 104 are activated (those in close proximity to the touching portions of the user's hand).
- This holding pattern gives some measure of the user's identity.
- no user will hold the remote control 12 unit in exactly the same way each time he or she picks it up.
- each user's touch array observation data can be expected to vary from use to use and even from moment to moment.
- a presently preferred embodiment uses a model-based pattern classification system to convert the holding pattern observation data into user identification, hand size and holding position information. As can be seen, the user's palm and fingers result in pressure against the capacitance sensors 102 and 104 .
- the capacitance sensors collectively transmit signals to the touch sensor processing module 80 , ( FIG. 5 ), in the form of raw data. These observation signals represent which sensors are currently contact points between the user's hand and the remote control 12 .
- the touch sensor processing module 80 may transform the raw data into a format usable by the recognition module 90 .
- the data may be structured in a vector or matrix whose elements represent the various sensors.
- the transformed data may be used to find a match in the user identification database 64 .
- touch sensor processing module 80 may extrapolate additional data from the raw data. For example, it is discernable which hand (left or right) is grabbing the remote. It is also discernable which sensors were activated by the palm and which sensors were activated by the fingers, due to the fact that the palm is continuous and the fingers have gaps between them. Based on which hand is grabbing the remote and the position of the fingers and the palm, the touch sensor processing module 80 may extrapolate an estimated hand size. Additional feature data may be extrapolated from the initial grab, such as a holding pattern. For example, some users will use three fingers to grab the remote on the side, while other users will use a four-finger hold. Information relating to the pressure applied to each sensor may also be included. The collection of extrapolated feature data may be used to find a match in the user identification database 64 .
- step S 110 the raw data corresponding to the activated touch sensors is received by touch sensor processing module 80 .
- Touch sensor processing module 80 will determine whether the remote was grabbed by a left hand or right hand at step S 112 .
- step S 114 touch processing module 70 will extract features relevant to the grab/hold position, by separating the data into palm and finger data.
- step S 116 touch processing module 70 will determine the palm occlusion patterns. Information such as pressure and points of higher pressure may be extrapolated based on the signals received from the touch sensors. Furthermore, the number of activated sensors may be used to extract the width of the palm occlusion patterns.
- touch processing module 70 may estimate a hand position for portions of the palm that are not in contact with the sensors, based on the palm positions that are known to touch processing module 70 . Learned models or known physiological models may be used to estimate the hand position. Based on steps S 116 and S 118 , a user hand size may be calculated at S 120 .
- the finger occlusion patterns are calculated. Similar to the palm occlusion patterns, pressure and pressure points may be determined from the received touch sensor data. From this data, touch processing module 80 can determine the number of fingers used to grab the remote and the spacing between the fingers. The finger occlusion patterns may be combined with the results of steps S 118 and S 116 , which provide an estimate of the palm portion of the hand, to determine a hold pattern. This data, along with the hand size data, may be communicated to recognition module 90 and used to match a user in the user identification database at step S 126 . It is envisioned that many different methods of matching a user may be used. For example, a k-means clustering may be performed on the processed data and the user identification data.
- a confidence score may be attached to the identification, or an n-nearest match list of possible users. In the event a confidence score is used, the system may require a confidence score to exceed a predetermined threshold to identify a user. In the event an n-nearest match list is produced, other identification techniques may be used to pare down the list.
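- The grab/hold flow of FIG. 7 might be approximated as follows. This is a simplified, assumption-laden sketch: the ring of peripheral sensors is treated as a linear array, and a handful of run-length statistics stand in for the palm/finger occlusion analysis and hand-size estimate of steps S 112 -S 124 .

```python
import math

def extract_grab_features(sensor_values, on_threshold=0.1):
    """Hypothetical grab/hold feature extraction from a ring of touch sensors.

    sensor_values: per-sensor pressure readings around the housing periphery
    (treated here as a simple linear array). Returns a small feature vector:
    (number of contact runs, widest run, total activated sensors, mean pressure).
    """
    active = [v > on_threshold for v in sensor_values]
    runs, current = [], 0
    for a in active:
        if a:
            current += 1
        elif current:
            runs.append(current)
            current = 0
    if current:
        runs.append(current)
    pressures = [v for v in sensor_values if v > on_threshold]
    mean_p = sum(pressures) / len(pressures) if pressures else 0.0
    return [len(runs), max(runs, default=0), sum(active), mean_p]

def match_grab(features, templates):
    """Nearest stored template wins; confidence is inverse distance (illustrative)."""
    best, best_d = None, float("inf")
    for user, vecs in templates.items():
        for vec in vecs:
            d = math.dist(features, vec)
            if d < best_d:
                best, best_d = user, d
    return best, 1.0 / (1.0 + best_d)
```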
- the method of identifying a user may not initiate until the remote has reached a resting point.
- the user will grab the remote control 12 , pick up the remote control 12 , and then reach the hold position of the remote control 12 .
- once the inertial sensors indicate that the remote control 12 has reached a steady position, e.g. acceleration or velocity are below a predetermined threshold, the user identification process may commence.
- variations in the initial grab of the remote are entirely ignored, as the grab pattern may be equally dependent on the location of the remote control and the user, e.g. a user will grab a remote differently if it is behind the user or lodged between two couch cushions.
- Hold/grab pattern matching may be used as a sole means of user identification, a primary means of user identification or a partial means of user identification.
- Preliminary research reveals that in a small user group (5 users or the size of a family), hold/grab patterns result in about an 87.5% accuracy in user identification.
- 87.5% may be a sufficient identification accuracy.
- hand/grab pattern matching may be used as one of a number of matching techniques.
- FIG. 8 depicts an example of a user trajectory that may be used to identify a user.
- two trajectories 136 A and 136 B corresponding to two users 134 A and 134 B are depicted.
- the remote control 12 is depicted in a resting state on a coffee table 138 .
- the remote control 12 has been grabbed by user 134 A and moved to position 132 A by following trajectory 136 A.
- the remote control 12 has been grabbed by user 134 B and moved to position 132 B by following trajectory 136 B.
- the two users may be differentiated based on the trajectories of remote control 12 .
- the remote control 12 will know when it is held by a user and when it is at rest based on the activation of the touch sensors.
- the remote control 12 may have one or more accelerometers and/or one or more gyroscopes. It should be appreciated any type of inertial sensors, such as the various types of gyroscopes and various types of accelerometers may be used.
- FIG. 9 illustrates an exemplary method of identifying a user using trajectory data.
- the motion processing module 82 receives the sensor inputs from the gyroscope and the accelerometer.
- the motion processing module 82 must determine a starting location.
- Motion processing module 82 may execute steps 146 and/or 142 and 144 to determine a starting location.
- the resting state and the hold position are the respective start point and end points to the trajectory.
- the remote control 12 may implement one or more techniques for estimating a starting location.
- the remote control 12 may keep track of where the remote was placed into the resting position.
- the inertial data from the sensors may be used to determine a resting location.
- the accelerometer and gyroscope should be continuously outputting accelerations and velocities to the motion sensor processing module 72 .
- the motion processing module 72 will use the most recent known location, e.g. the previous resting position, and dead reckoning to determine a location.
- the motion processing unit 82 may also receive timing data for purposes of calculating a position based on acceleration and velocity vectors.
- motion processing module 82 may store the new resting position as the last location. When a last known location is recorded, then motion processing module 82 may retrieve the last known location at step S 146 upon a user grabbing the remote control 12 .
- the trajectory processing module 82 may increase prediction accuracy if a starting location is known. It is appreciated that one underlying reason is that the starting location and the trajectory are dependent on one another. The dependency, however, is not necessarily the exact geographic location, but rather the relative location of the remote. For example, a user picking up the remote from the far right end will likely take a similar trajectory when picking the remote up off of the center of the coffee table. The trajectory will differ, however, when the user picks up the remote control 12 off of the couch. Thus, step S 148 , described above, does not require a pinpoint location. Rather a general location, or a cluster of locations may be used as the starting location. Trajectory processing module may use a k-means clustering algorithm of known locations and an estimated starting location to determine the general starting location.
- the remote control 12 may ask a user to verify a location periodically.
- there may be a location registration phase where the user preprograms the n-most likely locations of a remote.
- the user could enter a coffee table, a couch, a side table, the floor, the entertainment center, etc. In such a registration process, the user would have to define the locations with respect to each other.
- the motion processing module 82 will determine a location based on the motion itself.
- the motion processing module 82 receives the sensor data and determines a reference trajectory using dead reckoning techniques at step S 142 .
- the trajectory is a reference trajectory because it assumes a starting point of (0, 0, 0) and further is used as a reference to determine a starting location.
- the motion processing module 82 may use a k-means cluster algorithm, using the received trajectory as input, to determine the most likely starting location.
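- A minimal dead-reckoning sketch is shown below. It assumes the inertial data have already been rotated into a world frame and gravity-compensated (work the gyroscope would contribute), and it replaces the k-means starting-location step with a crude nearest-offset guess; none of the names come from the patent.

```python
def dead_reckon(samples):
    """Hypothetical dead-reckoning sketch: integrate accelerometer samples into a
    reference trajectory starting at (0, 0, 0).

    samples: iterable of (dt, ax, ay, az) tuples with accelerations already in a
    world frame and gravity removed.
    """
    pos, vel = [0.0, 0.0, 0.0], [0.0, 0.0, 0.0]
    path = [tuple(pos)]
    for dt, ax, ay, az in samples:
        for i, a in enumerate((ax, ay, az)):
            vel[i] += a * dt                  # v = v0 + a*dt
            pos[i] += vel[i] * dt             # simple Euler integration
        path.append(tuple(pos))
    return path

def nearest_start_location(end_offset, known_locations):
    """Guess the resting location whose typical pickup displacement best matches
    the reference trajectory's end point (a crude stand-in for the k-means step).

    known_locations: {"coffee table": (dx, dy, dz), "couch": (dx, dy, dz), ...}
    """
    return min(known_locations,
               key=lambda name: sum((a - b) ** 2
                                    for a, b in zip(end_offset, known_locations[name])))
```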
- the motion processing module may use the resting location (starting point), the hold position (the end point), and the sensor data, to determine a trajectory.
- the motion processing module 82 will then attempt to find a match in the user identification database 62 for the trajectory, based on the starting point and the trajectory itself.
- the reason that both parameters may be of importance is that the categorization of a trajectory is dependent on the starting point. For example, two identical trajectories may be reported to motion sensor processing module 72 , despite one trajectory beginning on the coffee table and one trajectory beginning on the center of the floor. Without a starting location, it may be very difficult to differentiate the two trajectories.
- the motion processing module 72 may differentiate between the trajectories because one started from the floor, while the other started from the coffee table. Thus, it may be determined, for example, that a shorter user (e.g. a child) picked up the remote from the floor in a standing position, and that a taller (e.g. an adult) user picked up the remote from the coffee table. It is envisioned that learning methods such as support vector machines or k-means clustering may be used to determine a user identification based on the calculated trajectory and starting point. It should be noted that the starting point, e.g. couch or coffee table, may be used to pare down the set of trajectories that the input trajectory is compared with.
- if the motion processing module 82 determines that the user picked the remote control 12 up from the coffee table, i.e. the input trajectory originated from the coffee table, then only the set of trajectories originating from the coffee table is used to generate a user identification.
- the starting point of the trajectory is ignored. Rather, a vector representing the relative motion of the remote control is used to identify the user. Thus, the trajectory is assumed to always begin at a (0,0,0) position.
- trajectory matching may be used as a sole means of user identification, a primary means of user identification or a partial means of user identification. Depending on the scale of the system and the application of the underlying system, trajectory matching may provide sufficient identification accuracy. However, for more sensitive login environments more accuracy may be needed and thus, trajectory matching may be used as one of a number of identification techniques.
- FIG. 10 illustrates a user's first touch of the touchpad 20 of the remote control 12 .
- a user when using the remote control 12 will do three things: 1) grab the remote; 2) pick up the remote; and 3) touch the touchpad 20 .
- a user may be identified from the extracted data without having to actively enter user identification information.
- a third way of identifying a user is based on a first touch of a user.
- Based on the hold pattern associated with a user, the user will have a fairly unique first touch position due to the fact that users generally have different bone and joint structures in the hand. Thus, a user may be further identified based on the hold position and the first touch of the touch pad.
- FIG. 11 is a flow diagram depicting an exemplary method of identifying a user based on a first touch of the remote control 12 .
- a user's hold position is detected and determined. The process is described in greater detail above.
- the user's first touch is determined.
- the user's first touch may be an (x,y) coordinate on the touchpad 20 .
- touchpad processing module 88 may extrapolate additional information relating to the user's thumb. For example, touchpad processing module 88 determines the angle at which the thumb holds/curves around the remote control 12 . Also, a thumb length may be determined.
- a vector corresponding to the user thumb 160 may be calculated at step S 154 .
- the thumb vector 160 may be a four dimensional vector having an x value, a y value, an x offset and a y offset, wherein one of the corners of the touchpad is used as the origin. This vector may be communicated to and used by recognition module 90 to find a match in user identification database 62 at step S 156 .
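- A hypothetical sketch of that four-dimensional thumb vector follows. The choice of the hold-derived reference point and the nearest-template matcher are assumptions of the sketch, not details given in the patent.

```python
import math

def thumb_vector(first_touch, hold_reference):
    """Hypothetical four-dimensional thumb vector for the first touch.

    first_touch:    (x, y) of the first contact on the touchpad.
    hold_reference: (x, y) point on the pad edge nearest the thumb's base,
                    inferred from the hold pattern (an assumption of this sketch).
    Returns (x, y, x_offset, y_offset) with the touchpad corner (0, 0) as origin.
    """
    x, y = first_touch
    bx, by = hold_reference
    return (x, y, x - bx, y - by)

def match_thumb(vector, templates):
    """Nearest stored thumb vector; purely illustrative matching.

    templates: {user: [thumb vectors from enrollment]}
    """
    return min(templates,
               key=lambda user: min(math.dist(vector, t) for t in templates[user]))
```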
- first touch data may be used as a sole means of user identification, a primary means of user identification or a partial means of user identification. Depending on the scale of the system and the application of the underlying system, first touch data may provide sufficient identification accuracy. However, for more sensitive login environments more accuracy may be needed and thus, first touch data may be used as one of a number of identification techniques.
- FIG. 12 illustrates a user drawing a circle on the touchpad 20 of the remote control 12 .
- a semi-passive approach for identifying a user wherein the user traces a shape, preferably a circle, on the touchpad 20 .
- the user traces a circle 162 on the touchpad 20 .
- any shape may be used.
- the purpose of having the user trace a shape on the touchpad is to extract kinematics data from the user's motion. For example, when the user traces a counterclockwise circle, there are four strokes that will typically occur.
- the first is from 12 to 9, the second from 9 to 6, the third from 6 to 3, and the last from 3 to 12.
- a user may slide the thumb one position to the next or may slightly bend the thumb, which will result in different arcuate trajectories.
- the user may make small strokes or large strokes.
- the user may draw the circle clock-wise or counter-clockwise.
- timing data may also be extrapolated and used to identify the user. It should be apparent that there are a great many permutations of different stroke attributes.
- the number of permutations corresponding to a user-drawn circle provides for a high accuracy rate in identifying a user.
- FIG. 13 describes an exemplary method of identifying a user by having the user draw a shape on the touchpad 20 .
- touchpad processing module 88 receives the arcuate trajectory data corresponding to the drawn circle.
- the arcuate trajectory data may come in the form of triples (x, y, t), wherein every x,y coordinate is given a time stamp.
- the set of triples may undergo linear stretching.
- the arcuate trajectory data may be stretched to 130% of the median length.
- the stretched arcuate trajectory data may undergo a principal component analysis (PCA) to reduce the dimensionality of the data set.
- the reduced data sets may then be clustered using k-means clustering, where k is selected as the number of users.
- a matching user may be identified by the cluster that the transformed arcuate trajectory data falls into.
- the timing data is initially removed and only the coordinate data, i.e. the (x,y) components of the data are used in steps S 172 -S 178 .
- the results of the k-means clustering may be rescored using the timing data at step S 180 . Once rescored, a user may be identified at step S 182 .
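- The pipeline of FIG. 13 (stretch to 130% of the median length, PCA, cluster per user) might look roughly like the following numpy sketch. It substitutes a nearest per-user centroid in PCA space for the k-means step and omits the timing rescore of step S 180 ; all names and the component count are illustrative.

```python
import numpy as np

def stretch(trace_xy, target_len):
    """Linearly resample an (N, 2) coordinate trace to target_len points."""
    trace_xy = np.asarray(trace_xy, dtype=float)
    old = np.linspace(0.0, 1.0, len(trace_xy))
    new = np.linspace(0.0, 1.0, target_len)
    return np.column_stack([np.interp(new, old, trace_xy[:, i]) for i in range(2)])

def identify_by_circle(new_trace, user_traces, n_components=8):
    """Hypothetical circle-drawing identification: stretch, PCA, nearest user mean.

    user_traces: {user: [(N_i, 2) coordinate traces]} collected during enrollment.
    """
    # Stretch every trace to 130% of the median enrollment length.
    target = int(1.3 * np.median([len(t) for ts in user_traces.values() for t in ts]))
    rows, labels = [], []
    for user, traces in user_traces.items():
        for t in traces:
            rows.append(stretch(t, target).ravel())
            labels.append(user)
    X = np.vstack(rows)
    mean = X.mean(axis=0)
    # PCA via SVD of the centered enrollment data.
    _, _, vt = np.linalg.svd(X - mean, full_matrices=False)
    basis = vt[:n_components].T
    Z = (X - mean) @ basis
    z_new = (stretch(new_trace, target).ravel() - mean) @ basis
    # Nearest per-user centroid in PCA space stands in for the k-means step.
    users = sorted(user_traces)
    centroids = {u: Z[[i for i, l in enumerate(labels) if l == u]].mean(axis=0)
                 for u in users}
    return min(users, key=lambda u: np.linalg.norm(z_new - centroids[u]))
```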
- shape drawing may be used as a sole means of user identification, a primary means of user identification or a partial means of user identification. In fact, shape drawing typically provides very high identification accuracy rates. Shape drawing, however, is not a passive approach and may, therefore, be implemented as a back up method when the system is unsure of the user's identity after using the passive identification techniques. Depending on the scale of the system and the application of the underlying system, shape drawing may be an advantageous means of protecting more sensitive login environments.
- an acoustic sensor 58 may be used to detect a user's heartbeat.
- the acoustic sensors 58 may be strategically placed alongside the outer covering of the remote control, whereby the entire remote control 12 acts as an acoustic antenna.
- the acoustic sensors detect the heartbeat and transmit the data to an acoustic processing module 86 .
- Acoustic processing module 86 may process the received data so that a frequency and amplitude of the heartbeat may be determined.
- the recognition module 90 may then use one or more of the statistical learning or data mining techniques described above to determine if there exists a matching user in the user identification database 62 based on the user's heartbeat characteristics.
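- A small sketch of the frequency/amplitude extraction is given below; the band limits and sampling rate are assumptions, and a synthetic sine wave stands in for real acoustic data.

```python
import numpy as np

def heartbeat_features(signal, sample_rate_hz, band=(0.7, 3.5)):
    """Hypothetical sketch: dominant frequency (Hz) and amplitude of a heartbeat
    signal, taken from the FFT magnitude inside a plausible heart-rate band
    (0.7-3.5 Hz is roughly 42-210 beats per minute)."""
    signal = np.asarray(signal, dtype=float)
    signal = signal - signal.mean()                    # remove DC offset
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / sample_rate_hz)
    mask = (freqs >= band[0]) & (freqs <= band[1])
    if not mask.any():
        return 0.0, 0.0
    idx = np.argmax(spectrum * mask)                   # in-band peak
    return float(freqs[idx]), float(spectrum[idx])

# Example: a synthetic 1.2 Hz (72 bpm) "heartbeat" sampled at 100 Hz.
t = np.arange(0, 10, 0.01)
freq, amp = heartbeat_features(np.sin(2 * np.pi * 1.2 * t), 100.0)
print(round(freq, 2))   # ~1.2
```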
- sensors may be used to monitor a user's heartbeat or related statistics.
- a pulse oximeter may be used to measure a patient's pulse or blood-oxygen levels.
- an ultra-sensitive accelerometer may be used to detect vibrations resulting from the user's pulse.
- an impulse response system may further be used.
- An impulse response system is essentially comprised of a speaker and a microphone. The speaker emits a high-frequency sound wave that reverberates through the user's hand. The sound wave is deflected back to the microphone, where the sensor is able to discern the augmentation of the sound wave.
- the impulse-response sensors may also be used to measure a user's pulse.
- the optical sensor 56 may be located on the remote control or on the host device.
- the optical sensor 56 may be used to receive image data of the user.
- the image processing module 86 may perform face-recognition or torso recognition on the user for purposes of identifying the user.
- a thermal sensor placed on the outside covering of the remote control 12 .
- the thermal sensors may be used to determine a user's body temperature. Typically, an identification based solely on body temperature may not be reliable. Body temperature data, however, may be useful in increasing the dimensionality of the data sets so that greater separation results in the collection of user attribute data sets.
- each of the individual techniques may have a confidence score associated with an identification.
- an n-nearest match list may also be generated for each identification technique. Based on either the confidence scores or the n-nearest neighbors of multiple user identification efforts, a more accurate user identification may be realized.
- the grab/hold identification method resulted in an 87.5% accuracy rate
- the accelerometer-only based trajectory identification resulted in a 77.5% accuracy rate
- the gyroscope-only based trajectory identification resulted in a 65% accuracy rate.
- the circle-drawing based identification resulted in a 97.5% authentication accuracy.
- Combined identification module 194 may combine the individual user identifications to come to a more robust user identification. Each method may further produce an n-nearest list of matches, each entry in the list having its own confidence score. For each user, a weighted average of each of the confidence scores may be calculated by combined identification module 194 . The combined identification module 194 may determine a user identification 196 based on the user having the highest weighted average. It is envisioned that other methods of determining a user based on a combination of various identification methods may also be used.
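- One possible (hypothetical) form of that weighted combination is sketched below, with the reported per-method accuracy rates reused as example weights.

```python
def combine_identifications(method_results, weights):
    """Hypothetical combination of several identification methods.

    method_results: {method: [(user, confidence), ...]}  n-nearest lists per method
    weights:        {method: weight}  e.g. higher weight for circle drawing (97.5%)
                    than for gyroscope-only trajectories (65%)
    Returns the user with the highest weighted-average confidence.
    """
    totals, counts = {}, {}
    for method, matches in method_results.items():
        w = weights.get(method, 1.0)
        for user, conf in matches:
            totals[user] = totals.get(user, 0.0) + w * conf
            counts[user] = counts.get(user, 0.0) + w
    return max(totals, key=lambda u: totals[u] / counts[u])

best = combine_identifications(
    {"grab": [("alice", 0.80), ("bob", 0.55)],
     "trajectory": [("bob", 0.70), ("alice", 0.65)],
     "circle": [("alice", 0.95)]},
    weights={"grab": 0.875, "trajectory": 0.775, "circle": 0.975},
)
print(best)   # alice
```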
- FIG. 15 depicts an exemplary method for processing the sensor data.
- various sensors will provide input data 200 a - 200 n .
- the data provided may be received from a variety of sensors, e.g. touch sensors, inertial sensors, touchpad, acoustic, etc., or it may be received from one sensor type that produces a large amount of data, e.g. many touch sensors or a stream of inertial data. In either case, the data set will be large.
- the input data 200 a - 200 n may first undergo feature extraction 202 .
- Various techniques for dimensionality reduction may be used. For example, the data set may undergo a principal component analysis or a linear discriminant analysis.
- a feature vector 204 representing the data set but in a lower dimensionality is generated and communicated to a segmentation module 208 .
- a segmentation module 208 separates portions of the feature vector 204 into segments representing different states. For example, in the example of the remote control being raised in FIG. 8 , the remote control was first on a table top, at rest. Next, the remote control was grabbed, but remained on or near the table. The remote control then quickly accelerates through the pick up phase. The remote control then reaches a hold position. At the hold position, the remote control will likely have velocity and acceleration, but not at the magnitude observed during the initial pickup. Finally, the remote control will be placed back onto a stable surface such as the couch or the table top. Additionally, the user may drop the remote control or may move the remote control to transmit a command to the host device. It should be apparent that not all of these segments are relevant for purposes of user identification.
- the inertial sensors may be continuously transmitting data. Thus, it may be beneficial to further reduce the data that needs to be analyzed, by segmenting the data.
- the segmentation of data occurs by comparing the data against various segment models 210 .
- the segment models 210 may be in the form of hidden Markov models representing the various states. By comparing chunks of data against the segment models 210 , the state of the data can be determined with a reasonable probability.
- the feature vector can then be classified according to at least one of the various segments 212 a - 212 n . Moreover, if only a certain portion of the feature vector is relevant, the segment selection module 214 may reduce the feature vector so that only the relevant segment is classified. It is appreciated, however, that the feature vector does not need to be reduced.
- the segment selection module 214 will select the state of the feature data, and may select the relevant segments 212 a - 212 n for classification.
- the segment selection module may be configured to only select trajectories of the remote when picked up and when at the rest position.
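- As a rough stand-in for the hidden Markov segment models 210, the sketch below scores fixed windows of a one-dimensional feature stream against simple per-state Gaussian models and keeps only the windows labeled as the pickup phase; the window size, states and scoring are all assumptions.

```python
import math

def label_segments(feature_stream, state_models, window=10):
    """Hypothetical segment labeling: score fixed windows of a 1-D feature stream
    (e.g. acceleration magnitude) against per-state Gaussian models.

    state_models: {state: (mean, std)}  e.g. "rest", "pickup", "hold"
    Returns a list of (state, window_of_samples).
    """
    def log_likelihood(x, mean, std):
        return -0.5 * math.log(2 * math.pi * std ** 2) - (x - mean) ** 2 / (2 * std ** 2)

    labeled = []
    for start in range(0, len(feature_stream), window):
        chunk = feature_stream[start:start + window]
        best = max(state_models,
                   key=lambda s: sum(log_likelihood(x, *state_models[s]) for x in chunk))
        labeled.append((best, chunk))
    return labeled

def select_segments(labeled, wanted=("pickup",)):
    """Keep only the segments relevant to identification (e.g. the pickup phase)."""
    return [chunk for state, chunk in labeled if state in wanted]
```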
- the feature vector 204 or the selected segments of the feature vector 204 are communicated to a classifier 216 .
- the classifier 216 may use a clustering analysis to determine a user identification. In the cluster analysis, the selected segments are analyzed with user models 220 a - 220 n .
- the user models 220 a - 220 n represent the attributes of the various users.
- the segment selection module 214 can also communicate the selected segment to classifier 224 for purposes of classifying the feature vector with the most relevant data. For example, when segment selection module 214 determines that the feature vector 204 is primarily trajectory data corresponding to a remote control pickup, the classifier, when accessing the user models, will only retrieve the segments of user models 220 a - 220 n that correspond to trajectory data. Because the user models have been previously classified, the classifier only needs to determine the cluster, i.e. which user model 220 a - 220 n , that the feature vector or selected segment of the feature vector belongs to. As previously mentioned, classifier 224 may execute a clustering algorithm, such as k-means clustering, to determine the cluster that the feature vector most belongs to. The determined cluster will correspond with the user's identity. Thus, a user identification 222 may be made by the system.
- the various methods of identification all rely on at least one matching, learning or classification method, such as support vector machines, k-means clustering, hidden Markov models, etc.
- the data sets such as grab/hold data, trajectory data, first touch data, arcuate trajectory data, and heartbeat data, may be collected using unsupervised or supervised learning techniques.
- a first method of collecting the various data sets is by implementing a training session.
- Each user may register with the system, e.g. the host device.
- the registering user will be asked to repeatedly perform various tasks such as grabbing the remote control, picking up the remote control, or drawing a circle on the touch pad of the remote control.
- the collected training data are used to define a user's tendencies for purposes of identification.
- the input data, used for identification may be added to the training data upon each successful identification.
- the user can verify a correct user identification and correct an improper identification to increase the robustness of the system.
- a second method of collecting the various data sets is by implementing an unsupervised learning process that differentiates the users over the course of the remote control's usage. Take for example a family that owns a Digital Video Recorder (DVR). The father has large hands and grabs the remote control using three fingers. The mother has small hands and grabs the remote with four fingers. The child has small hands and grabs the remote with three fingers. Furthermore, the father records sports-related programming and reality television. The mother records sitcoms and police dramas. The child records cartoons and animal shows. Over the course of an initial period, the system will differentiate the three users based on the differences in hand size and hold patterns. Over the initial period, the identification system will also learn that the user with large hands and a three-finger grab is associated with the sports and reality television programming.
- the system can then map the user preferences or profile to the extrapolated hold pattern data.
- a user may grab the remote control and have his or her preferences readily set based only on past usage and the initial picking up of the remote. Using this method, the user will never actually engage in training the system, but user identification will be realized over the course of time.
- FIG. 16 depicts a method used to identify a user and train the user identification database. It is noted that FIG. 16 contains most of the components found in FIG. 15 . The primary difference is that FIG. 16 contains a generic model database 224 .
- the background model database contains preprogrammed user templates, so that the system has background models to analyze alongside the user models. When a user first uses the system (i.e. the remote control's first use), there will be no user models 220 a - 220 n .
- the learning module may recognize this and automatically register the new user.
- the input data is processed as shown in FIG. 15 , so that all relevant segments and corresponding attributes are stored in the new user model.
- the system will receive the user input data, reduce the dimensionality, select the relevant segments of data, and run the selected segments against the user models and the background models.
- the user identification module will likely identify the user.
- the identification will have a corresponding probability, indicating a confidence in the user identification. If the probability does not exceed a predetermined threshold, then the user identification module assumes that the user is a new user, and creates a new user model for the user, using the input data as the attributes of the new user model. If the corresponding probability, i.e. the confidence score, exceeds the threshold, then the user identification module identifies the user, and adds the input features into the user's user model.
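- The threshold logic just described might be sketched as follows; the threshold value, scoring function and auto-generated user names are hypothetical.

```python
def identify_or_enroll(features, user_models, score_fn, threshold=0.6):
    """Hypothetical open-set identification loop (in the spirit of FIG. 16).

    user_models: {user: [feature vectors]}
    score_fn(features, vectors) -> confidence in [0, 1]
    If no known user scores above `threshold`, a new user model is created.
    """
    best_user, best_score = None, 0.0
    for user, vectors in user_models.items():
        score = score_fn(features, vectors)
        if score > best_score:
            best_user, best_score = user, score

    if best_user is None or best_score < threshold:
        new_user = f"user_{len(user_models) + 1}"        # auto-registered identity
        user_models[new_user] = [features]
        return new_user, best_score
    user_models[best_user].append(features)              # reinforce the matched model
    return best_user, best_score
```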
- the confidence scores associated with the identification of the exemplary user will also increase.
- FIG. 16 is now described in greater detail. Components found in both FIGS. 15 and 16 have been numbered as such. Similar to FIG. 15 , input data 200 a - 200 n is received and undergoes feature extraction 202 . The result is a feature vector 204 , which is then segmented by the segmentation module 208 , using segment models 210 as models for determining the segmentation of the data. The feature vector 204 is then broken down into segments. The segment selection module 214 selects the relevant segments and communicates the relevant segments to the classifier 224 . The classifier 224 operates slightly differently than the classifier 216 of FIG. 15 .
- the classifier 224 receives the background models 226 a - 226 n in addition to the user models 220 a - 220 n .
- the classifier determines a user identification using a clustering algorithm.
- the user identification will have a probability associated with it. If the probability of the user identification exceeds a threshold, classifier 224 generates user identification 222 . If the probability does not exceed the threshold or if the identified user is a background model, then the classifier passes the data to a model generation module 230 , which generates a new model based on the relevant attributes.
- the new model 232 is communicated to the user model database 218 .
- FIG. 17 depicts two hypothetical sets 230 and 232 of user identification data represented by black dots for a first user and circles for a second user.
- the six point star 234 represents a user identification attempt.
- the six point star 234 clearly falls within the first user's data set 230 .
- the system can predict that the user to be identified is the first user with a high probability based on the cluster to which the identification attempt is closest. It is appreciated that the more biometric features that are used in an authentication event, the greater the dimensionality of the data sets used for identification. When data sets of higher dimensionality are used for identification, a greater amount of separation will be realized between the clusters of data.
- the sensors described above, as well as the identification methods described above will be used in various handheld devices such as cell phones, portable phones, mp3 players, personal DVD players, PDAs, and computer mice.
- the phone may determine that a first user or a plurality of users is using the phone. Based on this, specific settings such as a phonebook, saved text messages, saved emails, volume settings, screen settings, wall paper and saved files such as photos will become available to the user.
- specific settings such as a phonebook, saved text messages, saved emails, volume settings, screen settings, wall paper and saved files such as photos will become available to the user.
- the first user's music library may be accessible to the user after identification.
- schedules and contacts personal to the first user are made available to the user, only after the user grabs the PDA and is identified.
- the methods disclosed above may be used to identify and authenticate the user.
- the user may then be automatically logged onto her user profile. Further, the user may leave the computer and upon another user touching the mouse or mouse pad, the device will be able to determine that the user has changed. At this point, the second user may be locked out of the first user's profile until an explicit override instruction is provided by the first user.
- The term module may refer to, be part of, or include an Application Specific Integrated Circuit (ASIC), an electronic circuit, a processor (shared, dedicated, or group) and/or memory (shared, dedicated, or group) that execute one or more software or firmware programs, a combinational logic circuit, and/or other suitable components that provide the described functionality. It should be understood that when describing a software or firmware program, the term module may refer to machine readable instructions residing on an electronic memory.
Landscapes
- Engineering & Computer Science (AREA)
- Human Computer Interaction (AREA)
- Signal Processing (AREA)
- Multimedia (AREA)
- General Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Computer Networks & Wireless Communication (AREA)
- Position Input By Displaying (AREA)
- User Interface Of Digital Computer (AREA)
- Details Of Television Systems (AREA)
- Selective Calling Equipment (AREA)
- Telephone Function (AREA)
- Measuring Pulse, Heart Rate, Blood Pressure Or Blood Flow (AREA)
Abstract
A system and method for identifying a user of a handheld device is herein disclosed. The device implementing the method and system may attempt to identify a user based on signals that are incidental to the user's handling of the device. The signals are generated by a variety of sensors dispersed along the periphery of the housing or within it. The sensors may include touch sensors, inertial sensors, acoustic sensors, pulse oximeters, and a touchpad. Based on the sensors and corresponding signals, identification information is generated. The identification information is used to identify the user of the handheld device. The handheld device may implement various statistical learning and data mining techniques to increase the robustness of the system. The device may also authenticate the user based on the user drawing a circle or other shape.
Description
- This application claims the benefit of U.S. Provisional Application No. 61/046,578, filed on Apr. 21, 2008, the entire disclosure of which is incorporated herein by reference.
- The present invention relates generally to a method and system for identifying a user of a handheld device, e.g. remote control systems. Many systems would benefit from easy, non-intrusive user identification. For reference only, many aspects of the invention and the background relating to the invention are described in relation to a remote control system, suitable for control of consumer electronic products and home appliances, that includes a touch-sensitive handheld remote control unit that detects holding and grabbing patterns of the user, as well as other characteristics such as the trajectory along which the user raises the remote and the way the user first touches the remote, to identify the user.
- Handheld remote control units, typically featuring a large plurality of push buttons, are now quite commonplace on coffee tables throughout the world. With most consumer electronic products, it is customary for the manufacturer to furnish such a handheld remote control with each unit. Thus, most consumers own a collection of various different remote control units, each associated with a particular product or appliance.
- In an effort to simplify matters, the Applicants' assignee has developed several different embodiments of a touch-sensitive remote control unit that features a reduced number of push buttons and one or more touch-sensitive touchpads that may be manipulated by the user's fingers or thumb to interact with information on a display screen. The touchpads may be manipulated, for example, to move a selection indicator (such as a cursor or other graphical element) across a control region upon a display screen. In some applications, the display screen will be separate from the handheld remote control unit, and thus the user manipulates the selection indicator by watching the display screen while manipulating the touchpad with a finger or thumb. Preferably, the touchpad or touchpads are disposed on the remote control unit so that they can be manipulated by the user's thumb while the user is holding the unit in one hand. Furthermore, the remote control has touch-sensitive sensors on its outer casing that respond to a user's touch.
- As multiple users may use a single remote control and corresponding host device, there is a growing demand for a means of identifying which user is using the remote control. Methods such as logging in with a user name and password are time-consuming and annoying to users. Thus, there is a need for a means of identifying a user that requires minimal user interaction. Optimally, it would be beneficial to allow a user to be identified without the user having to enter or perform any identification tasks. Such a passive means of identification would require virtually no effort on the part of the user.
- We have therefore developed a remote control system that implements various passive and semi-passive user identification methods. The methods range from the grab/hold patterns by which the user holds a remote, to the trajectory that the remote follows when grabbed by the user.
- This section provides a general summary of the disclosure, and is not a comprehensive disclosure of its full scope or all of its features.
- In one sense, the present invention relates to a system and method for identifying a user of a handheld device. The handheld electronic device comprises a housing and a sensor system disposed along a periphery of the housing. The sensor system is responsive to a plurality of simultaneous points of contact between a user's hand and the device to generate observation signals indicative of the plurality of contact points between the user's hand and the device. The handheld electronic device further includes a user identification database storing data corresponding to attributes of a plurality of known users, wherein the attributes of the plurality of known users are used to identify a user. The device further comprises a user identification module configured to receive the observation signals from the sensor system and identify the user from the observation signals and the attributes of the plurality of users.
- In a second sense, the present invention relates to a handheld electronic device comprising a housing and a touchpad responsive to a finger movement of a user that generates a touchpad signal corresponding to the finger movement. The device further includes a touchpad processing module that receives the touchpad signal and generates finger movement data based on said touchpad signal. The handheld electronic device further includes a user identification database that stores user identification data corresponding to physical attributes of a plurality of known users, wherein the physical attributes include finger movement of a user drawing a predetermined object. The device is further comprised of a user identification module that receives finger movement data of the user and identifies the user based on the finger movement data and the user identification data, wherein the finger movement data corresponds to the user drawing the predefined shape.
- In a third sense, a handheld electronic device is comprised of a housing and a touch sensor system disposed along a periphery of the housing. The touch sensor system is responsive to a plurality of simultaneous points of contact between a user's hand and the device to generate observation signals indicative of the plurality of contact points between the user's hand and the device. The device also includes a touch sensor processing module configured to receive the observation signals from the touch sensor system and determine a user's holding pattern. The device is further comprised of an inertial sensor embedded in the housing which is responsive to movement of the device by the user's hand to generate inertial signals and a trajectory module configured to receive the inertial signals from the inertial sensor and determine a trajectory for the movement of the device. The device also includes a touchpad located along an external surface of the housing that is responsive to the user's finger movement along the external surface of the touchpad to generate touchpad signals and a touchpad processing module that receives the touchpad signals and determines user finger movement data. The device further includes a user identification database storing data corresponding to attributes of a plurality of known users, wherein the attributes of the plurality of the known users are used to identify a user, and wherein the attributes include holding patterns of the plurality of known users, trajectories corresponding to movement of the device by each of the plurality of known users, and user finger movement data of the plurality of known users. The device is further comprised of a user identification module configured to receive identification information of the user and identify the user based on the identification information and the attributes of the plurality of known users, wherein the identification information includes the user's holding pattern, the user's trajectory for movement of the device, and the user's finger movement data.
- Further areas of applicability will become apparent from the description provided herein. The description and specific examples in this summary are intended for purposes of illustration only and are not intended to limit the scope of the present disclosure.
- The drawings described herein are for illustrative purposes only of selected embodiments and not all possible implementations, and are not intended to limit the scope of the present disclosure.
- FIG. 1 illustrates an exemplary remote control system for an electronic product having a display screen and a handheld remote control unit that includes at least one touchpad disposed for actuation by a user's thumb;
- FIGS. 2A and 2B are exemplary views of a touchpad surface, useful in understanding how a user's hand size can affect usability of the touchpad surface;
- FIG. 3 is a schematic representation of a remote control unit having plural touchpads and an array of capacitive sensors about the periphery of the remote control unit;
- FIG. 4 is a system-level architecture of a user identification system;
- FIG. 5 is a diagram of the architecture of the user identification module;
- FIG. 6 is an exemplary view of the points of contact between the array of touch-sensitive sensors on the device and a user's hand;
- FIG. 7 is a flow diagram of an exemplary method for identifying a user based on the way the user grabs the handheld device;
- FIG. 8 is an exemplary view of two different trajectories corresponding to two different users;
- FIG. 9 is a flow diagram of an exemplary method for identifying a user based on the trajectory corresponding to the movement of the remote control;
- FIG. 10 is an exemplary view of a user touching the touchpad of the remote control and the corresponding thumb vector;
- FIG. 11 is a flow diagram of an exemplary method for identifying a user based on the user's first touch of the touchpad;
- FIG. 12 is an exemplary view of a user drawing a circle on the touchpad of the remote control;
- FIG. 13 is a flow diagram of an exemplary method for identifying a user based on the user drawing a shape on the touchpad;
- FIG. 14 is a diagram illustrating the combination of various user identification methods to arrive at a user identification;
- FIG. 15 is a flow diagram depicting various statistical learning and data mining techniques used in performing user identification;
- FIG. 16 is a flow diagram depicting an exemplary method of performing unsupervised learning of users; and
- FIG. 17 is an exemplary view of two clusters corresponding to two different users, and a user to be identified in relation to the two clusters.
- Corresponding reference numerals indicate corresponding parts throughout the several views of the drawings.
- Example embodiments will now be described more fully with reference to the accompanying drawings.
- A system and method for user identification is herein disclosed. The system combines one or more user identification techniques to authenticate and/or identify a user. The techniques may be passive techniques such as identifying a user by the way in which the user grabs the handheld device, the trajectory that the device follows when the user picks up the remote, the user's first touch of the device, and the user's heartbeat. The techniques may also be semi-passive, such as having the user draw a shape, e.g. a circle, on a touch-sensitive surface of the handheld device. For simplicity, the techniques are first explained as applied to a
remote control 12 that may be used with a television, a set-top box, a computer, an entertainment center or other host device. It will be apparent that the techniques are also applied to all handheld devices where user identification would benefit the user. Such applications are described in greater detail below. - Referring to
FIG. 1 , a remote control system for an exemplary electronic product is illustrated generally at 10. The remote control system includes a handheldremote control 12 that sends control instructions, preferably wirelessly, to anelectronic product 14 having adisplay screen 16. Theremote control 12 includes a complement ofpush buttons 18 and a pair oftouchpads 20. Note that in the illustrated embodiment, theremote control 12 unit is bilaterally symmetrical so that it will function in the same way regardless of which touchpad is proximate the user's thumb. The handheldremote control 12 has an orientation sensor (not shown) to detect in what orientation the unit is being held. - Any type of communication interface between the handheld
remote control 12 unit and the electronic product can be utilized. For purposes of illustration, a wireless transmitting device, shown diagrammatically at 24 and a wireless receiving device, shown diagrammatically at 22, are illustrated. It will be appreciated that wireless communication can be accomplished using infrared, ultrasonic and radio frequencies, and further utilizing a variety of different communication protocols, including infrared communication protocols, Bluetooth, WiFi, and the like. Communication can be unilateral (fromremote control unit 12 to electronic product 14) or bilateral. - In the illustrated embodiment, a control region is defined on the screen, within which a user-controlled selection indicator may be visually displayed. In
FIG. 1 , a visual facsimile of theremote control 12 unit itself, is displayed on adisplay screen 16 as at 26. A user-controlled selection indicator, in the form of a graphical depiction of the user'sthumb 30 is displayed. Movement of the user's thumb upontouchpad 20 causes corresponding movement of theselection indicator 30. Although similar to movement of a computer screen cursor by track pad, there is this difference. Regions on thetouchpad 20 are mapped one-to-one onto the control region of the screen. The typical computer track pad does not employ such one-to-one relationship, but rather it uses a relative mapping to mimic performance of a computer mouse which can be lifted and then repositioned. - The system herein disclosed may be used, for example, to identify a mapping for the remote based on the user identification. Although the illustrated embodiment uses a one-to-one mapping between the touchpad surface and the control region, this mapping is altered to accommodate the hand size characteristics of the user. Referring to
FIGS. 2A and 2B , an exemplary pattern of numbers and letters have been illustrated on the touchpad, in the mapped positions where they would be most easily accessible to a person with a small hand (FIG. 2A ) and a large hand (FIG. 2B ). Compare these mapped locations with the corresponding locations on the control region 26 (FIG. 1 ). Although the image displayed on the screen (FIG. 1 ) would remain the same for all users, regardless of hand size, the portion of the touchpad that actually maps to the control region is adjusted. Thus, the user with a small hand does not have to reach as far to selectnumeral 1. Conversely, the user with a large hand will find it easier to select numeral 5 without simultaneously selecting an adjacent numeral, such asnumeral 4. In effect, only a portion of the touchpad is used when the hand is small (FIG. 2A ) and this portion is then scaled up to match the entire control region shown on the display screen. - The user identification may be used to configure other aspects of the host device controlled by the remote control. For example, if the host device is a set-top box for a television, the user identification may be used to restrict access to certain channels for certain users. Additionally, a list of pre-programmed favorite channels or settings may be loaded onto the set-top box. More examples are provided below.
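- For illustration only (this sketch is not part of the original disclosure), the hand-size-dependent mapping described above can be pictured as scaling a reduced "active window" of the touchpad onto the full control region. The function name and the `reach` parameterization below are assumptions.

```python
def map_touch_to_control(x, y, pad_w, pad_h, screen_w, screen_h, reach=1.0, origin=(0.0, 0.0)):
    """Map an absolute touchpad coordinate to the on-screen control region.

    `reach` < 1.0 restricts the active window to the fraction of the pad a
    smaller hand can comfortably cover; that window is then scaled up so it
    still spans the full control region (hypothetical parameterization).
    """
    ox, oy = origin                                   # corner of the active window on the pad
    active_w, active_h = pad_w * reach, pad_h * reach
    # Clamp the touch into the active window, then scale one-to-one to the screen.
    u = min(max(x - ox, 0.0), active_w) / active_w
    v = min(max(y - oy, 0.0), active_h) / active_h
    return u * screen_w, v * screen_h

# A small-handed user (reach 0.6) and a large-handed user (reach 1.0) touching
# the same physical spot land on different control-region positions.
print(map_touch_to_control(30, 20, pad_w=100, pad_h=60, screen_w=800, screen_h=480, reach=0.6))
print(map_touch_to_control(30, 20, pad_w=100, pad_h=60, screen_w=800, screen_h=480, reach=1.0))
```

A smaller `reach` lets the small-handed user of FIG. 2A cover the whole control region without stretching, while the large-handed user of FIG. 2B keeps the full pad.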
- Referring to
FIG. 3 , theremote control 12 is diagrammatically depicted with twotouchpads 20. A capacitive touch array is depicted at 40. It is envisioned that other touch sensitive sensors may be used in combination with or instead of the capacitive sensors in a capacitive touch array. - For example, a sensor that provides a resistance relative to the contact points between the user and the device may be used. Such sensors are currently being developed and provide a higher dimensional data set, which is advantageous when identifying a user out of many users. These electrode matrix sensors have one sensor that transmits a signal, e.g. an electric current, and a plurality of receptors that receive the signal via the user's hand (or other body part). The transmitter and the plurality of receptors are placed along the exterior surface of the
remote control 12. The plurality of receptors are oriented spatially around the transmitter. The distance between each receptor and the transmitter is known, as is the current and voltage of the transmitted electrical signal. When the electrical signal is transmitted and subsequently received by the receptors, the resistance of the user's hand at the contact points may be determined. As can be appreciated each transmission increases the dimensionality of the data by a factor of X, where X is the ratio of receptors to a transmitter, as each receptor will generate a resistance value. These sensors are particularly helpful if datasets of higher dimensionality are preferred. Furthermore, transfer functions may be performed on the transmitted signals resulting in the communications between the transmitters and the corresponding receptors to further increase the dimensionality. - The remote may also include acoustic (not shown) and optical sensors (not shown), inertial sensors (not shown), a pulse oximiter (not shown) and thermal sensors (not shown).
- It is appreciated that the sensors may receive signals from a user holding the
remote control 12 with one hand or with two hands. For example, the user may hold theremote control 12 in an operative position with one hand. The user may also hold theremote control 12 with both hands, much like a video game controller, for purposes of identification. -
FIG. 4 illustrates possible identification inputs and possible data used to identify a user. As mentioned, theremote control 12 or handheld device may identify a user based on a number of inputs. Theremote control 12 uses data from sensors 52-60 that may be used for other functional applications to identify the users. The data is received and processed by theuser identification module 50. The user identification module will access auser identification database 64 containing data specific to each known user. The types of data may include one or more of the following: hold/grabpatterns 64 received from touch sensitive sensors 52;trajectory data 66 received from inertial/motion sensors such as accelerometers and gyroscopes;heartbeat data 68 received from anacoustic sensor 58 or other types of sensors; face ortorso data 70 received from anoptical sensor 56;first touch data 72 received from atouch pad sensor 20 and arcuate data 74 received from thetouchpad sensor 20. It is understood that the lists of sensors and data types are not limiting, it is envisioned that other types of data may be received from the specified sensors and that other types of sensors may receive the listed data or inputs. Furthermore, it is envisioned that as little as one input type may be used to identify a user or any combination of inputs may be used to identify the user. -
FIG. 5 is a detailed depiction of an exemplaryuser identification module 50.User identification module 50 may have a processing module for each type of input. For example,user identification module 50 may include atouch processing module 80 for processing data from the touch sensors 52; amotion processing module 82 for processing data from the motion andinertial sensors 54; anoptical processing module 84 for processing data from theoptical sensors 56; anacoustic processing module 86 for processing data from theacoustic sensors 58 and atouchpad processing module 88 for processing data from the touchpad 60. Greater detail of each type of data and its respective processing module are described below. - It is appreciated that
user identification module 50 may reside on theremote control 12 or the host device. Due to the fact that the system implements powerful learning techniques,remote control 12 may not have the processing power to handle such calculations. If this is the case, thenremote control 12 may communicate the input data to therecognition module 50 residing on the host device. - The various processing modules receive raw data from the respective sensor and process the data to a form that may be used by
recognition module 90, which may use one or more of a k-means clustering method, a support vector machines (SVM) method, a hidden Markov model method, and a linear regression method to identify a user based on the processed data. As can be appreciated, the various sensors will produce high dimensional data sets, which is beneficial for identification.User identification module 50 may also perform feature extraction on the input data set, so that the high dimensional data set can be more easily processed. Dimensionality reduction methods such as principle component analysis (PCA) and isomap may be implemented by recognition module. Therecognition module 90 uses the processed data and the datasets contained inuser identification database 62 to classify the user to be identified via various statistical learning and/or clustering methods. Therecognition module 90 will then determine the user whose feature data, e.g. hold/grab, trajectory, or first touch, most resembles the input data received from the various sensors. A confidence score may also be associated with the user identification. Furthermore,recognition module 90 may generate a list of the n-nearest matches in theuser identification database 62. Additionally, as is described later,recognition module 90 may generate a user identification. -
Recognition module 90 may take as parameters, a data set and data type. Based on the data type,recognition module 90 may select which learning or clustering technique to use and which data types to access inuser identification database 62. -
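- One plausible, simplified realization of the kind of recognition module described above, using PCA for dimensionality reduction and a nearest-template match with a soft confidence score and an n-nearest list, is sketched below. It is an illustration only, not the patented method; the class name, the toy 6-dimensional features, and the distance-to-confidence mapping are assumptions.

```python
import numpy as np
from sklearn.decomposition import PCA

class RecognitionModule:
    """Sketch: PCA to reduce dimensionality, then distance to per-user templates."""

    def __init__(self, n_components=0.98):
        self.pca = PCA(n_components=n_components)   # keep ~98% of the variance
        self.templates = {}                          # user name -> mean vector in PCA space

    def fit(self, samples_by_user):
        X = np.vstack([s for samples in samples_by_user.values() for s in samples])
        self.pca.fit(X)
        for user, samples in samples_by_user.items():
            self.templates[user] = self.pca.transform(np.asarray(samples)).mean(axis=0)

    def identify(self, sample, n_nearest=3):
        z = self.pca.transform(np.asarray(sample).reshape(1, -1))[0]
        d = {u: np.linalg.norm(z - t) for u, t in self.templates.items()}
        w = np.exp(-np.array(list(d.values())))      # crude distance-to-confidence mapping
        conf = w / w.sum()
        ranked = sorted(zip(d.keys(), conf), key=lambda p: -p[1])
        return ranked[:n_nearest]                    # [(user, confidence), ...]

# Toy usage with hypothetical 6-D feature vectors for two enrolled users.
rng = np.random.default_rng(0)
module = RecognitionModule()
module.fit({"user_a": rng.normal(0.0, 0.1, size=(10, 6)),
            "user_b": rng.normal(1.0, 0.1, size=(10, 6))})
print(module.identify(rng.normal(1.0, 0.1, size=6)))
```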
FIG. 6 illustrates an example of a user grabbing a remote. When grasped in the user's hand, some of the individual elements of thetouch array 102 and 104 are activated (those in close proximity to the touching portions of the user's hand). This holding pattern gives some measure of the user's identity. Of course, no user will hold theremote control 12 unit in exactly the same way each time he or she picks it up. Thus, each user's touch array observation data can be expected to vary from use to use and even from moment to moment. Thus, a presently preferred embodiment uses a model-based pattern classification system to convert the holding pattern observation data into user identification, hand size and holding position information. As can be seen the user's palm and fingers result in pressure against thecapacitance sensors 102 and 104. The capacitance sensors collectively transmit signals to the touchsensor processing module 80, (FIG. 5 ), in the form of raw data. These observation signals represent which sensors are currently contact points between the user's hand and theremote control 12. The touchsensor processing module 80 may transform the raw data into a format usable by therecognition module 90. For example, the data may be structured in a vector or matrix whose elements represent the various sensors. The transformed data may be used to find a match in theuser identification database 64. - Alternatively, touch
sensor processing module 80 may extrapolate additional data from the raw data. For example, it is discernable which hand (left or right) is grabbing the remote. It is also discernable which sensors were activated by the palm and which sensors are activated by the fingers, due to the fact that the palm is continuous and the fingers have gaps between them. Based on which hand is grabbing the remote and the position of the fingers and the palm, the touchsensor processing module 80 may extrapolate an estimated hand size. Additional feature data may be extrapolated from the initial grab, such as a holding pattern. For example, some users will use three fingers to grab the remote on the side, while other user will use a four finger hold. Also, information relating to the pressure applied to each sensor may also be included. The collection of extrapolated feature data may be used to find a match in theuser identification database 64. - Referring now to
FIG. 7 , an exemplary technique for identifying a user based on the grab/hold position is now described in greater detail. At step S110, the raw data corresponding to the activated touch sensors is received by touchsensor processing module 80. Touchsensor processing module 80 will determine whether the remote was grabbed by a left hand or right hand at step S112. At step S114,touch processing module 70 will extract features relevant to the grab/hold position, by separating the data into palm and finger data. At step S116,touch processing module 70 will determine the palm occlusion patterns. Information such as pressure and points of higher pressure may be extrapolated based on the signals received from the touch sensors. Furthermore, the amount of activated sensors may be used to extract the width of the palm occlusion patterns. At step S118,touch processing module 70 may estimate a hand position for portions of the palm that are not in contact with the sensors, based on the palm positions that are known to touchprocessing module 70. Learned models or known physiological models may be used to estimate the hand position. Based on steps S116 and S118, a user hand size may be calculated at S120. - At step S122, the finger occlusion patterns are calculated. Similar to the palm occlusion patterns, pressure and pressure points may be determined from the received touch sensor data. From this data,
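- Before continuing with steps S122 through S126 below, the feature-extraction idea just described can be illustrated with a minimal sketch (not from the original disclosure). It collapses a toy touch-sensor map into a few hold-pattern features and matches them with a nearest-centroid rule as a stand-in for the clustering-based matching discussed next; the sensor layout, features, and labels are hypothetical.

```python
import numpy as np
from sklearn.neighbors import NearestCentroid

def hold_features(sensor_map):
    """Collapse a 2-D map of activated touch sensors (1 = touched) into a few
    hold-pattern features: touched-sensor count, column span (palm-width proxy),
    row span, and centroid column (a crude left/right cue)."""
    m = np.asarray(sensor_map)
    rows, cols = np.nonzero(m)
    return np.array([m.sum(), cols.ptp() + 1, rows.ptp() + 1, cols.mean()])

# Two enrolled users, two example grabs each, on a toy 4x6 sensor array.
grab_a = [[0, 1, 1, 0, 0, 0], [0, 1, 1, 1, 0, 0], [0, 0, 1, 1, 0, 0], [0, 0, 0, 0, 0, 0]]
grab_b = [[0, 0, 0, 1, 1, 1], [0, 0, 1, 1, 1, 1], [0, 0, 0, 1, 1, 0], [0, 0, 0, 1, 1, 0]]
X = np.array([hold_features(grab_a), hold_features(grab_a),
              hold_features(grab_b), hold_features(grab_b)])
y = np.array(["user_small_hand", "user_small_hand", "user_large_hand", "user_large_hand"])

clf = NearestCentroid().fit(X, y)
print(clf.predict(hold_features(grab_b).reshape(1, -1)))   # -> ['user_large_hand']
```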
touch processing module 80 can determine the amount of fingers used to grab the remote and the spacing between fingers. The finger occlusion patterns may be combined with the results of step S118 and S116, which provide an estimate of the palm portion of the hand, to determine a hold pattern. This data, along with the hand size data, may be communicated torecognition module 90 and used to match a user in the user identification database at step S126. It is envisioned that many different methods of matching a user may be used. For example, a k-means clustering may be performed on the processed data and the user identification data. Other data mining techniques and statistical learning techniques may also be used to determine a user identification. Furthermore, a confidence score may be attached to the identification, or an n-nearest match list of possible users. In the event a confidence score is used, the system may require a confidence score to exceed a predetermined threshold to identify a user. In the event an n-nearest match list is produced, other identification techniques may be used to pare down the list. - In an alternative embodiment, the method of identifying a user may not initiate until the remote has reached a resting point. Thus, the user will grab the
remote control 12, pick up theremote control 12, and then reach the hold position of theremote control 12. Once the inertial sensors indicate that theremote control 12 has reached a steady position, e.g. acceleration or velocity are below a predetermined threshold, then the user identification process may commence. In this embodiment, variations in the initial grab of the remote are entirely ignored, as the grab pattern may be equally dependent on the location of the remote control and the user, e.g. user will grab a remote differently if behind the user or lodged between two couch cushions. - Hold/grab pattern matching may be used as a sole means of user identification, a primary means of user identification or a partial means of user identification. Preliminary research reveals that in a small user group (5 users or the size of a family), hold/grab patterns result in about an 87.5% accuracy in user identification. Thus, depending on the scale and the application of the underlying system, 87.5% may be a sufficient identification accuracy. However, for more sensitive login environments more accuracy may be needed and thus, hand/grab pattern matching may be used as one of a number of matching techniques.
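- A minimal sketch of the steady-position trigger described above (illustrative only; the 0.15 g threshold, window, and sample values are assumptions):

```python
import numpy as np

def is_steady(accel_samples, threshold=0.15):
    """Return True once the magnitude of recent acceleration (gravity removed,
    in g) stays below a hypothetical threshold, signalling the hold position."""
    mags = np.linalg.norm(np.asarray(accel_samples), axis=1)
    return bool(np.all(mags < threshold))

# Trigger identification only after the pickup motion has settled.
window = [(0.02, -0.01, 0.03), (0.01, 0.00, -0.02), (0.03, 0.01, 0.02)]
if is_steady(window):
    print("remote is steady - start hold-pattern identification")
```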
-
FIG. 8 depicts an example of a user trajectory that may be used to identify a user. For exemplary purposes, two trajectories 136A and 136B corresponding to two users 134A and 134B are depicted. At position 130, the remote control 12 is depicted in a resting state on a coffee table 138. At position 132A, the remote control 12 has been grabbed by user 134A and moved to position 132A by following trajectory 136A. At position 132B, the remote control 12 has been grabbed by user 134B and moved to position 132B by following trajectory 136B. Thus, the two users may be differentiated based on the trajectories of remote control 12. As is apparent from the disclosure, the remote control 12 will know when it is held by a user and when it is at rest based on the activation of the touch sensors. As described earlier, the remote control 12 may have one or more accelerometers and/or one or more gyroscopes. It should be appreciated that any type of inertial sensor, such as the various types of gyroscopes and the various types of accelerometers, may be used. -
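- Before walking through the FIG. 9 method below, the following sketch (not part of the original text) shows the kind of dead-reckoning step it relies on: double-integrating gravity-compensated accelerations to recover the rough shape of a pickup trajectory such as 136A or 136B. The sampling interval and sample values are assumptions.

```python
import numpy as np

def dead_reckon(accels, dt, start=(0.0, 0.0, 0.0)):
    """Very rough dead reckoning: double-integrate accelerations that have
    already been rotated into the room frame and gravity-compensated (which
    the gyroscope data would support). Drift grows quickly, so only the
    coarse shape of the short pickup motion is useful."""
    a = np.asarray(accels, dtype=float)
    v = np.cumsum(a * dt, axis=0)                        # velocity
    p = np.asarray(start) + np.cumsum(v * dt, axis=0)    # position track
    return p

track = dead_reckon([(0, 0, 9.0), (0, 0, 6.0), (0, 0, 1.0), (0, 0, -2.0)], dt=0.05)
print(track[-1])   # approximate end point of the pickup, relative to the last resting spot
```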
FIG. 9 illustrates an exemplary method of identifying a user using trajectory data. At step S140, themotion processing module 82 receives the sensor inputs from the gyroscope and the accelerometer. At step 148, themotion processing module 82 must determine a starting location.Motion processing module 82 may executesteps 146 and/or 142 and 144 to determine a starting location. The resting state and the hold position are the respective start point and end points to the trajectory. In order for a reliable trajectory match, it may be beneficial to determine the actual starting location. For example, the same user may follow a different trajectory if he is picking the remote up off the floor instead of picking the remote up off the coffee table. Thus, theremote control 12 may implement one or more techniques for estimating a starting location. First, theremote control 12 may keep track of where the remote was placed into the resting position. Thus, when the touch sensors are disengaged by a user, the inertial data from the sensors may be used to determine a resting location. To enable this type of determination, the accelerometer and gyroscope should be continuously outputting accelerations and velocities to the motionsensor processing module 72. Themotion processing module 72 will use the most recent known location, e.g. the previous resting position, and dead reckoning to determine a location. To enable dead reckoning, themotion processing unit 82 may also receive timing data for purposes of calculating a position based on acceleration and velocity vectors. Once the touch sensors are disengaged by a user,motion processing module 82 may store the new resting position as the last location. When a last known location is recorded, thenmotion processing module 82 may retrieve the last known location at step S146 upon a user grabbing theremote control 12. - As mentioned, the
trajectory processing module 82 may increase prediction accuracy if a starting location is known. It is appreciated that one underlying reason is that the starting location and the trajectory are dependent on one another. The dependency, however, is not necessarily the exact geographic location, but rather the relative location of the remote. For example, a user picking up the remote from the far right end will likely take a similar trajectory when picking the remote up off of the center of the coffee table. The trajectory will differ, however, when the user picks up theremote control 12 off of the couch. Thus, step S148, described above, does not require a pinpoint location. Rather a general location, or a cluster of locations may be used as the starting location. Trajectory processing module may use a k-means clustering algorithm of known locations and an estimated starting location to determine the general starting location. - It should be noted that in certain embodiments the
remote control 12 may ask a user to verify a location periodically. Furthermore, in some embodiments, there may be a location registration phase, where the user preprograms the n-most likely locations of a remote. In this embodiment, the user could enter a coffee table, a couch, a side table, the floor, the entertainment center, etc. In such a registration process, the user would have to define the locations with respect to each other. - In other embodiments, the
motion processing module 82 will determine a location based on the motion itself. In these embodiments, themotion processing module 82 receives the sensor data and determines a reference trajectory using dead reckoning techniques at step S142. It should be noted that the trajectory is a reference trajectory because it assumes a starting point of (0, 0, 0) and further is used as a reference to determine a starting location. At step S144, themotion processing module 82 may use a k-means cluster algorithm, using the received trajectory as input, to determine the most likely starting location. - Once a starting location is determined, the motion processing module may use the resting location (starting point), the hold position (the end point), and the sensor data, to determine a trajectory. The
module processing module 82 will then attempt to find a match in theuser identification database 62 for the trajectory, based on the starting point and the trajectory itself. The reason that both parameters may be of importance is that the categorization of a trajectory is dependent on the starting point. For example, two identical trajectories may be reported to motionsensor processing module 72, despite one trajectory beginning on the coffee table and one trajectory beginning on the center of the floor. Without a starting location, it may be very difficult to differentiate the two trajectories. However, with an estimated or known starting location, themotion processing module 72 may differentiate between the trajectories because one started from the floor, while the other started from the coffee table. Thus, it may be determined, for example, that a shorter user (e.g. a child) picked up the remote from the floor in a standing position, and that a taller (e.g. an adult) user picked up the remote from the coffee table. It is envisioned that learning methods such as support vector machines or k-means clustering may be used to determine a user identification based on the calculated trajectory and starting point. It should be noted that the starting point, e.g. couch or coffee table, may be used to pare down the set of trajectories that the input trajectory is compared with. For example, if themotion processing module 82 determines that the user picked theremote control 12 up from the coffee table, i.e. the input trajectory originated from the coffee table, only the set of trajectories originating from the coffee table are used to generate a user identification. - In an alternative embodiment, the starting point of the trajectory is ignored. Rather, a vector representing the relative motion of the remote control is used to identify the user. Thus, the trajectory is assumed to always begin at a (0,0,0) position.
- It should be noted that trajectory matching may be used as a sole means of user identification, a primary means of user identification or a partial means of user identification. Depending on the scale of the system and the application of the underlying system, trajectory matching may provide sufficient identification accuracy. However, for more sensitive login environments more accuracy may be needed and thus, trajectory matching may be used as one of a number of identification techniques.
- In embodiments of the
remote control 12 that include a touchpad, additional identification methods may be enabled.FIG. 10 illustrates a user's first touch of thetouchpad 20 of theremote control 12. As discussed above, a user when using theremote control 12 will do three things: 1) grab the remote; 2) pick up the remote; and 3) touch thetouchpad 20. By extracting data of these events, a user may be identified from the extracted data without having to actively enter user identification information. Thus far, identifying a user from grab/hold patterns and trajectories associated with picking up theremote control 12 have been described. A third way of identifying a user is based on a first touch of a user. Based on the hold pattern associated with a user, a user will have a fairly unique first touch position due to the fact that users generally have different bone and joint structures in the hand. Thus, a user may be further identified based on the hold position and the first touch of the touch pad. -
FIG. 11 is a flow diagram depicting an exemplary method of identifying a user based on a first touch of theremote control 12. At step S150, a user's hold position is detected and determined. The process is described in greater detail above. At step S152, the user's first touch is determined. The user's first touch may be an (x,y) coordinate on thetouchpad 20. Based on the hold position/pattern and the first touch point,touchpad processing module 88 may extrapolate additional information relating to the user's thumb. For example,touchpad processing module 88 determines the angle at which the thumb holds/curves around theremote control 12. Also, a thumb length may be determined. Based on the first touch data associated with the thumb, a vector corresponding to the user thumb 160 (FIG. 10 ) may be calculated at step S154. Thethumb vector 160 may be a four dimensional vector having an x value, a y value, an x offset and a y offset, wherein one of the corners of the touchpad is used as the origin. This vector may be communicated to and used byrecognition module 90 to find a match inuser identification database 62 at step S156. - It should be noted that first touch data may be used as a sole means of user identification, a primary means of user identification or a partial means of user identification. Depending on the scale of the system and the application of the underlying system, first touch data may provide sufficient identification accuracy. However, for more sensitive login environments more accuracy may be needed and thus, first touch data may be used as one of a number of identification techniques.
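- The four-dimensional thumb vector of step S154 can be illustrated with a small sketch (assumed parameterization; the grip-edge reference point is a hypothetical stand-in for whatever the hold pattern supplies):

```python
import numpy as np

def thumb_vector(first_touch, grip_edge_point):
    """Build a 4-D first-touch descriptor: the touch point itself plus its
    offset from a reference point on the gripping edge of the pad, with a
    touchpad corner as the origin (a sketch of the vector described above)."""
    x, y = first_touch
    ex, ey = grip_edge_point
    return np.array([x, y, x - ex, y - ey])

v = thumb_vector(first_touch=(42.0, 17.5), grip_edge_point=(60.0, 0.0))
print(v)   # [x, y, x_offset, y_offset] - compared against stored vectors per user
```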
-
FIG. 12 illustrates a user drawing a circle on thetouchpad 20 of theremote control 12. Thus far, wholly passive approaches of identifying a user have been described. The following describes a semi-passive approach for identifying a user, wherein the user traces a shape, preferably a circle, on thetouchpad 20. As can be seen from the figure, the user traces acircle 162 on thetouchpad 20. It is envisioned, however, that any shape may be used. The purpose of having the user trace a shape on the touchpad is to extract kinematics data from the user's motion. For example, when the user traces a counterclockwise circle, there are four strokes that will typically occur. The first is from 12 to 9, the second from 9 to 6, the third from 6 to 3, and the last from 3 to 12. A user may slide the thumb one position to the next or may slightly bend the thumb, which will result in different arcuate trajectories. The user may make small strokes or large strokes. The user may draw the circle clock-wise or counter-clockwise. Furthermore, timing data may also be extrapolated and used to identify the user. It should be apparent that the permutations of different stroke attributes are great. The amount of permutations corresponding to a user drawn circle provides for a high accuracy rate for identifying a user. -
FIG. 13 describes an exemplary method of identifying a user by having the user draw a shape on thetouchpad 20. At step S170,touchpad processing module 88 receives the arcuate trajectory data corresponding to the drawn circle. The arcuate trajectory data may come in the form of triples (x, y, t), wherein every x,y coordinate is given a time stamp. At step S174, the set of triples may undergo linear stretching. For example, the arcuate trajectory data may be stretched to 130% the median length. At step S174, the stretched arcuate trajectory data may undergo a principle component analysis (PCA) to reduce the dimensionality of the data set. In a preferred embodiment, the principle components accounting for 98% of the variance are chosen. It is understood, however, that other variance thresholds may be chosen. At step S176, the reduced data sets may then be clustered using k means clustering, where k is selected as the number of users. At step S178 a matching user may be identified by the cluster that the transformed arcuate trajectory data falls into. - In some embodiments, the timing data is initially removed and only the coordinate data, i.e. the (x,y) components of the data are used in steps S172-S178. In these embodiments, the results of the k-means clustering may be rescored using the timing data at step S180. Once rescored, a user may be identified at step S182.
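- A compact sketch of steps S172 through S178 is shown below for illustration. It resamples each (x, y, t) stroke to a fixed length rather than stretching to 130% of the median length, keeps roughly 98% of the variance with PCA, and clusters with k-means; the timing-based rescoring described in the next paragraph is omitted. The resample length and synthetic circles are assumptions.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans

def resample_stroke(xyt, n_points=64):
    """Linearly stretch/compress an (x, y, t) stroke to a fixed number of
    samples so every drawn circle yields a feature vector of equal length."""
    xyt = np.asarray(xyt, dtype=float)
    s = np.linspace(0, 1, len(xyt))
    s_new = np.linspace(0, 1, n_points)
    cols = [np.interp(s_new, s, xyt[:, i]) for i in range(2)]   # keep x and y; timing handled separately
    return np.concatenate(cols)

def cluster_circles(strokes, n_users):
    """PCA keeping ~98% of the variance, then k-means with k = number of users."""
    X = np.vstack([resample_stroke(s) for s in strokes])
    Xr = PCA(n_components=0.98).fit_transform(X)
    return KMeans(n_clusters=n_users, n_init=10, random_state=0).fit_predict(Xr)

# Two users drawing circles of different size (synthetic strokes for the demo).
t = np.linspace(0, 2 * np.pi, 40)
small = np.c_[10 * np.cos(t), 10 * np.sin(t), t]
large = np.c_[25 * np.cos(t), 25 * np.sin(t), t]
print(cluster_circles([small, small, large, large], n_users=2))   # e.g. [0 0 1 1]
```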
- It should be noted that shape drawing may be used as a sole means of user identification, a primary means of user identification or a partial means of user identification. In fact, shape drawing typically provides very high identification accuracy rates. Shape drawing, however, is not a passive approach and may, therefore, be implemented as a back up method when the system is unsure of the user's identity after using the passive identification techniques. Depending on the scale of the system and the application of the underlying system, shape drawing may be an advantageous means of protecting more sensitive login environments.
- It is envisioned that additional sensors may also be used to identify a user. For example, an
acoustic sensor 58 may be used to detect a user's heartbeat. Theacoustic sensors 58 may be strategically placed along-side the outer covering of the remote control, whereby the entireremote control 12 acts as an acoustic antenna. When a user holds the device tightly, the acoustic sensors detect the heartbeat and transmit the data to anacoustic processing module 86.Acoustic processing module 86 may process the received data so that a frequency and amplitude of the heartbeat may be determined. Therecognition module 90 may then use one or more of the statistical learning or data mining techniques described above to determine if there exists a matching user in theuser identification database 62 based on the user's heartbeat characteristics. It is envisioned that other types of sensors may be used to monitor a user's heartbeat or related statistics. For example, a pulse oximeter may be used to measure a patient's pulse or blood-oxygen levels. Additionally, an ultra-sensitive accelerometer may be used to detect vibrations resulting from the user's pulse. Finally, an impulse response system may further be used. An impulse response system is essentially comprised of a speaker and a microphone. The microphone emits a high-frequency sound wave that reverberates through the user's hand. The sound wave may be deflected back to the microphone, where the sensor is able to discern the augmentation of the sound wave. The impulse-response sensors may also be used to measure a user's pulse. - Another additional sensor is an
optical sensor 56. Theoptical sensor 56 may be located on the remote control or on the host device. Theoptical sensor 56 may be used to receive image data of the user. Theimage processing module 86 may perform face-recognition or torso recognition on the user for purposes of identifying the user. Yet another sensor is a thermal sensor placed on the outside covering of theremote control 12. The thermal sensors may be used to determine a user's body temperature. Typically, an identification based solely on body temperature may not be reliable. Body temperature data, however, may be useful in increasing the dimensionality of the data sets so that greater separation results in the collection of user attribute data sets. - Individual methods for user identification have been disclosed. All the methods are either passive or semi-passive, as they do not require the user to remember or enter a username, passcode, or other unique identifier. Rather, the techniques rely on a user's natural kinematic tendencies when performing subconscious tasks. In an alternative embodiment, a combination of two or more of the above-described techniques may be used to increase the accuracy of a user identification. As mentioned earlier, each of the individual techniques may have a confidence score associated with an identification. Additionally, an n-nearest match list may also be generated for each identification technique. Based on either the confidence scores or the n-nearest neighbors of multiple user identification efforts, a more accurate user identification may be realized. Taking a sample size of five users, the grab/hold identification method resulted in an 87.5% accuracy rate, the accelerometer-only based trajectory identification resulted in a 77.5% accuracy rate and the gyroscope-only based trajectory identification resulted in a 65% accuracy rate. Taking the three passive identification methods in combination, however, results in a 90% accuracy rate. It should be noted that the circle-drawing based identification resulted in a 97.5% authentication accuracy.
- As can be seen in
FIG. 14 , there are n various user identifications 190 a-190 n, each having a confidence score 192 a-192 n.Combined identification module 194 may combine the individual user identifications to come to a more robust user identification. Each method may further produce an n-nearest list of matches, each entry in the list having its own confidence score. For each user, a weighted average of each of the confidence scores may be calculated by combinedidentification module 194. The combinedidentification module 194 may determine auser identification 196 based on the user having the highest weighted average. It is envisioned that other methods of determining a user based on a combination of various identification methods may also be used. - General reference has been made to the processing of data.
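- The weighted-average combination performed by combined identification module 194 can be sketched as follows (illustration only; the per-method weights below simply reuse the reported accuracy figures, and the user names are hypothetical):

```python
def combine_identifications(method_results, method_weights):
    """Combine per-method n-nearest lists [(user, confidence), ...] into one
    identification by taking a weighted average of each user's confidences."""
    totals, weights = {}, {}
    for method, results in method_results.items():
        w = method_weights.get(method, 1.0)
        for user, conf in results:
            totals[user] = totals.get(user, 0.0) + w * conf
            weights[user] = weights.get(user, 0.0) + w
    averaged = {u: totals[u] / weights[u] for u in totals}
    return max(averaged, key=averaged.get), averaged

best, scores = combine_identifications(
    {"hold": [("alice", 0.70), ("bob", 0.30)],
     "trajectory": [("alice", 0.55), ("bob", 0.45)],
     "first_touch": [("bob", 0.60), ("alice", 0.40)]},
    method_weights={"hold": 0.875, "trajectory": 0.775, "first_touch": 0.65})
print(best, scores)   # alice wins on the weighted average
```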
FIG. 15 depicts an exemplary method for processing the sensor data. As mentioned earlier, various sensors will provide input data 200 a-200 n. The data provided may be received from a variety of sensors, e.g. touch sensors, inertial sensors, touchpad, acoustic, etc., or it may be received from one sensor type that produces lots of data, e.g. many touch sensors or lots of inertial data. In either case, the data set will be large. Thus the input data 200 a-200 n may first undergofeature extraction 202. Various techniques for dimensionality reduction may be used. For example, the data set may undergo a principle component analysis or a linear discriminate analysis. Afeature vector 204 representing the data set but in a lower dimensionality is generated and communicated to asegmentation module 208. - A
segmentation module 208 separates portions of thefeature vector 204 into segments representing different states. For example, in the example of the remote control being raised inFIG. 8 , the remote control was first on a table top, at rest. Next, the remote control was grabbed, but remained on/or near the table. The remote control then quickly accelerates through the pick up phase. The remote control then reaches a hold position. At the hold position, the remote control will likely have velocity and acceleration but not at the magnitude observed during the initial pickup. Finally, the remote control will be placed back onto a stable surface such as the couch or the table top. Additionally, the user may drop the remote control or may move the remote control to transmit a command to the host device. It should be apparent that not all of these segments are relevant for purposes of user identification. The inertial sensors (and all other sensors), however, may be continuously transmitting data. Thus, it may be beneficial to further reduce the data that needs to be analyzed, by segmenting the data. The segmentation of data occurs by comparing the data againstvarious segment models 210. Thesegment models 210 may be in the form of hidden Markov models representing the various states. By comparing chunks of data against thesegment models 210, it can be determined with a reasonable probability the state of the data. The feature vector can then be classified according to at least one of the various segments 212 a-212 n. Moreover, if only a certain portion of the feature vector is relevant, the segment selection module may 214 reduce the feature vector so that only the relevant segment is classified. It is appreciated, however, that the feature vector does not need to be reduced. - As mentioned, the
segment selection module 214 will select the state of the feature data, and may select the relevant segments 212 a-212 n for classification. Referring back to the previous example, the segment selection module may be configured to only select trajectories of the remote when picked up and when at the rest position. Thefeature vector 204 or the selected segments of thefeature vector 204 are communicated to a classifier 216. The classifier 216 may use a clustering analysis to determine a user identification. In the cluster analysis, the selected segments are analyzed with user models 220 a-220 n. The user models 220 a-220 n represent the attributes of the various users. Thesegment selection module 214 can also communicate the selected segment to classifier 224 for purposes of classifying the feature vector with the most relevant data. For example, whensegment selection module 214 determines that thefeature vector 204 is primarily trajectory data corresponding to a remote control pickup, classifier, when accessing the user models, will only retrieve the segments of user models 220 a-220 n, that correspond to trajectory data. Because the user models have been previously classified, the classifier only needs to determine the cluster, i.e. which user model 220 a-220 n, that the feature vector or selected segment of the feature vector belongs to. As previously mentioned, classifier 224 may execute a clustering algorithm, such as k-means clustering, to determine the cluster that the feature vector most belongs to. The determined cluster will correspond with the user's identify. Thus, auser identification 222 may be made by the system. - As is apparent from the disclosure, the various methods of identification all rely on at least one matching, learning or classification method, such as support vector machines, k-means clustering, hidden Markov models, etc. Thus, it has been assumed that the various data types in
user identification database 62 are actually present in said database. The data sets, such as grab/hold data, trajectory data, first touch data, arcuate trajectory data, and heartbeat data, may be collected using unsupervised or supervised learning techniques. - A first method of collecting the various data sets is by implementing a training session. Each user may register with the system, e.g. the host device. The registering user will be asked to repeatedly perform various tasks such as grabbing the remote control, picking up the remote control, or drawing a circle on the touch pad of the remote control. The collected training data are used to define a user's tendencies for purposes of identification. When the system is in an operational mode, the input data, used for identification, may be added to the training data upon each successful identification. Furthermore, the user can verify a correct user identification and correct an improper identification to increase the robustness of the system.
- A second method of collecting the various data sets is by implementing an unsupervised learning process that differentiates the users over the course of the remote control's usage. Take for example a family that owns a Digital Video Recorder (DVR). The father has large hands and grabs the remote control using three fingers. The wife has a small hands and grabs the remote with four fingers. The child has small hands and the remote with three fingers. Furthermore, the father records sports-related programming and reality television. The mother records sitcoms and police dramas. The child records cartoons and animal shows. Over the course of an initial period, the system will differentiate the three users based on the differences of hand size and hold patterns. Over the initial period, the identification system will also learn that the user with large hands and a three finger grab is associated with the sports and reality television programming. The system can then map the user preferences or profile to the extrapolated hold pattern data. Thus, a user may grab the remote control and have his or her preferences readily set based only on past usage and the initial picking up of the remote. Using this method, the user will never actually engage in training the system, but user identification will be realized over the course of time.
-
FIG. 16 depicts a method used to identify a user and train the user identification database. It is noted thatFIG. 16 contains most of the components found inFIG. 15 . The primary difference is thatFIG. 16 contains a generic model database 224. The background model database contains preprogrammed user templates, so that the system has background models to analyze alongside the user models. When a user first uses the system (i.e. the remote controls first use), there will be no user models 220 a-220 n. The learning module may recognize this and automatically register the new user. The input data, is processed as shown inFIG. 15 , so that all relevant segments and corresponding attributes are stored in the new user model. The second time the user picks up the remote, the system will receive the user input data, reduce the dimensionality, select the relevant segments of data, and run the selected segments against the user models and the background models. The user identification module will likely identify the user as the user. The identification will have a corresponding probability, indicating a confidence in the user identification. If the probability does not exceed a predetermined threshold, then the user identification module assumes that the user is a new user, and creates a new user model for the user, using the input data as the attributes of the new user model. If the corresponding probability, i.e. the confidence score, exceeds the threshold, then the user identification module identifies the user, and adds the input features into the user's user model. As can be appreciated, as the user picks up the remote and is successfully identified, the user model associated with that user will increase in richness. Thus, the confidence scores associated with the identification of the exemplary user will also increase. -
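- The open-set behaviour described above, matching against user and background models, enrolling a new user model when the confidence stays below the threshold, and enriching a matched model otherwise, is sketched below for illustration. The threshold, the distance-to-confidence mapping, and the naming scheme are assumptions, not details from the disclosure.

```python
import numpy as np

class OpenSetIdentifier:
    """Sketch: match against user models and generic background models; a
    below-threshold or background match triggers creation of a new user model."""

    def __init__(self, background_models, threshold=0.6):
        self.user_models = {}                     # name -> list of feature vectors
        self.background = list(background_models)
        self.threshold = threshold

    def _score(self, sample, vectors):
        d = min(np.linalg.norm(sample - v) for v in vectors)
        return np.exp(-d)                         # crude distance-to-confidence mapping

    def identify_or_enroll(self, sample):
        sample = np.asarray(sample, dtype=float)
        scores = {name: self._score(sample, vecs) for name, vecs in self.user_models.items()}
        bg_score = self._score(sample, self.background)
        if scores and max(scores.values()) > max(self.threshold, bg_score):
            best = max(scores, key=scores.get)
            self.user_models[best].append(sample)         # enrich the matched model
            return best, scores[best]
        new_name = f"user_{len(self.user_models) + 1}"    # hypothetical naming scheme
        self.user_models[new_name] = [sample]
        return new_name, None

ident = OpenSetIdentifier(background_models=[np.zeros(3)])
print(ident.identify_or_enroll([0.90, 0.10, 0.20]))   # first grab -> enrols user_1
print(ident.identify_or_enroll([0.88, 0.12, 0.18]))   # second grab -> matches user_1
```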
- FIG. 16 is now described in greater detail. Components found in both FIGS. 15 and 16 are numbered accordingly. Similar to FIG. 15, input data 200a-200n is received and undergoes feature extraction 202. The result is a feature vector 204, which is then segmented by the segmentation module 208, using segment models 210 as models for determining the segmentation of the data. The feature vector 204 is thus broken down into segments. The segment selection module 214 selects the relevant segments and communicates them to the classifier 224. The classifier 224 operates slightly differently than the classifier 216 of FIG. 15: it receives the background models 226a-226n in addition to the user models 220a-220n. The classifier then determines a user identification using a clustering algorithm. The user identification will have a probability associated with it. If the probability of the user identification exceeds a threshold, the classifier 224 generates a user identification 222. If the probability does not exceed the threshold, or if the identified user corresponds to a background model, then the classifier passes the data to a model generation module 230, which generates a new model based on the relevant attributes. The new model 232 is communicated to the user model database 218.
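To make the classifier stage concrete, here is a self-contained sketch in which each user model and background model is reduced to a single centroid and a simple distance-based confidence is computed; the nearest-centroid comparison and the 1/(1+d) confidence mapping are stand-ins chosen for this sketch, not the specific clustering algorithm of the patent.

```python
# Compare the selected segment attributes against user models and background
# models, and report the closest model together with a crude confidence score.
import math

def nearest_model(attributes, user_models, background_models):
    """Return (index, confidence, is_background) for the closest model centroid."""
    candidates = [(i, c, False) for i, c in enumerate(user_models)] + \
                 [(i, c, True) for i, c in enumerate(background_models)]
    best = None
    for idx, centroid, is_background in candidates:
        dist = math.dist(attributes, centroid)
        if best is None or dist < best[0]:
            best = (dist, idx, is_background)
    confidence = 1.0 / (1.0 + best[0])   # crude monotone mapping from distance to confidence
    return best[1], confidence, best[2]

# Two enrolled user centroids plus one generic background template (illustrative values).
index, confidence, is_background = nearest_model(
    [0.90, 3.0], user_models=[[0.94, 3.0], [0.55, 4.0]], background_models=[[0.70, 3.5]])
```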
- For example only, FIG. 17 depicts two hypothetical sets of user data, in which a six point star 234 represents a user identification attempt. As can be seen, the six point star 234 clearly falls within the first user's data set 230. Thus, the system can predict with high probability that the user to be identified is the first user, based on the cluster to which the identification attempt is closest. It is appreciated that the more biometric features that are used in an authentication event, the greater the dimensionality of the data sets used for identification. When data sets of higher dimensionality are used for identification, a greater amount of separation will be realized between the clusters of data.
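A toy numerical illustration of the dimensionality point, using made-up feature values: adding a second biometric feature (here a hypothetical trajectory-duration value) increases the distance between two users' data.

```python
import math

# Hold-pattern feature only: the two users are nearly indistinguishable.
user_a_1d, user_b_1d = [0.50], [0.55]
print(math.dist(user_a_1d, user_b_1d))      # 0.05

# Adding a second (hypothetical) trajectory feature pulls the clusters apart.
user_a_2d, user_b_2d = [0.50, 1.2], [0.55, 2.9]
print(math.dist(user_a_2d, user_b_2d))      # ~1.70
```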
- While reference has been made to a remote control 12, it should be appreciated that the sensors and identification methods described above may be used in various handheld devices such as cell phones, portable phones, MP3 players, personal DVD players, PDAs, and computer mice. For example, with a cell phone or portable phone, using any of the above techniques, the phone may determine that a first user of a plurality of users is using the phone. Based on this, specific settings such as a phonebook, saved text messages, saved emails, volume settings, screen settings, wallpaper, and saved files such as photos will become available to that user. Similarly, in a device such as an MP3 player, the first user's music library may be accessible to the user after identification. With a PDA, schedules and contacts personal to the first user are made available only after the user grabs the PDA and is identified. - With a computer mouse or a laptop mouse pad, the methods disclosed above may be used to identify and authenticate the user. The user may then be automatically logged onto her user profile. Further, if the user leaves the computer and another user touches the mouse or mouse pad, the device will be able to determine that the user has changed. At this point, the second user may be locked out of the first user's profile until an explicit override instruction is provided by the first user.
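A minimal sketch, assuming an invented SharedDeviceSession state holder, of the lock-out behavior described for a shared mouse or touchpad; the patent does not prescribe this structure.

```python
# When a change of identified user is detected, the active profile is locked
# until the original user issues an explicit override.
class SharedDeviceSession:
    def __init__(self):
        self.active_user = None
        self.locked = False

    def on_identified(self, user_id):
        if self.active_user is None:
            self.active_user = user_id          # log the first user into her profile
        elif user_id != self.active_user:
            self.locked = True                  # a second user has touched the device

    def override(self, requesting_user):
        if requesting_user == self.active_user:
            self.locked = False                 # only the first user may unlock her profile
```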
- It should be apparent that the disclosed methods and devices allow the sharing of devices once thought to be personal, without risking the privacy or intimacy typically associated with these devices.
- As used herein, the term module may refer to, be part of, or include an Application Specific Integrated Circuit (ASIC), an electronic circuit, a processor (shared, dedicated, or group) and/or memory (shared, dedicated, or group) that execute one or more software or firmware programs, a combinational logic circuit, and/or other suitable components that provide the described functionality. It should be understood that, when describing a software or firmware program, the term module may refer to machine readable instructions residing on an electronic memory.
- The foregoing description of the embodiments has been provided for purposes of illustration and description. It is not intended to be exhaustive or to limit the invention. Individual elements or features of a particular embodiment are generally not limited to that particular embodiment, but, where applicable, are interchangeable and can be used in a selected embodiment, even if not specifically shown or described. The same may also be varied in many ways. Such variations are not to be regarded as a departure from the invention, and all such modifications are intended to be included within the scope of the invention. The method steps, processes, and operations described herein are not to be construed as necessarily requiring their performance in the particular order discussed or illustrated, unless specifically identified as an order of performance. It is also to be understood that additional or alternative steps may be employed.
Claims (21)
1. A handheld electronic device, comprising:
a housing;
a touch sensor system disposed along a periphery of the housing and responsive to a plurality of simultaneous points of contact between a user's hand and the handheld electronic device to generate observation signals indicative of the plurality of points of contact between the user's hand and the handheld electronic device;
a touch sensor processing module configured to receive the observation signals from the touch sensor system and determine a user's holding pattern;
an inertial sensor embedded in the housing and responsive to movement of the handheld electronic device by the user's hand to generate inertial signals;
a trajectory module configured to determine a trajectory for the movement of the handheld electronic device, based on the inertial signals from the inertial sensor, a starting position, and an end position, the starting position being a location where the handheld electronic device in a resting position is grabbed by a user, the end position being a location at which the handheld electronic device is held;
a touchpad located along an external surface of the housing that is responsive to the user's finger movement along the external surface of the touchpad to generate touchpad signals;
a touchpad processing module configured to receive the touchpad signals and determine user finger movement data;
a user identification database storing data corresponding to attributes of a plurality of known users, wherein the attributes of the plurality of the known users are used to identify a user, and wherein the attributes include holding patterns of the plurality of known users, trajectories for the movement of the handheld electronic device of the plurality of known users, and user finger movement data of the plurality of known users; and
a user identification module configured to receive identification information of the user and identify the user based on the identification information and the attributes of the plurality of known users by accessing said user identification database, wherein the identification information includes the user's holding pattern, the user's trajectory for movement of the handheld electronic device, and the user's finger movement data, wherein the user identification module is configured to identify the user by detecting a trajectory that matches the user's trajectory for movement of the handheld electronic device from the trajectories for the movement of the handheld electronic device of the plurality of known users.
2. The handheld electronic device of claim 1 wherein the touch sensor system is further defined as an array of capacitive sensors integrated into and spatially separated from each other along an exterior surface of the housing.
3. The handheld electronic device of claim 1 wherein the inertial sensor is an accelerometer.
4. The handheld electronic device of claim 1 wherein the inertial sensor is a gyroscope.
5. The handheld electronic device of claim 1 wherein the user identification module is configured to implement machine learning to identify a user.
6. The handheld electronic device of claim 5 wherein the user identification module uses a k-means clustering algorithm to determine a user identification.
7. The handheld electronic device of claim 1 wherein the user identification module is configured to determine a plurality of preliminary user identifications, wherein each of the preliminary user identifications is based on one of the attributes.
8. The handheld electronic device of claim 7 wherein each of the preliminary user identifications has a corresponding confidence score, wherein the confidence score indicates a probability that the preliminary user identification is correct.
9. The handheld electronic device of claim 1 wherein the user identification module is configured to determine a plurality of preliminary user identifications, wherein each of the preliminary user identifications has a list of possible users and wherein each entry in the list of possible users has a confidence score indicating a probability that the possible user is actually the user.
10. The handheld electronic device of claim 1 wherein the trajectory module is configured to determine a starting location of the handheld electronic device, wherein the user identification module further bases user identification on the starting location of the handheld electronic device.
11. A handheld electronic device, comprising:
a housing;
a sensor system disposed along a periphery of the housing and responsive to a plurality of simultaneous points of contact between a user's hand and the handheld electronic device to generate observation signals indicative of the plurality of points of contact between the user's hand and the handheld electronic device;
a user identification database storing data corresponding to attributes of a plurality of known users, wherein the attributes of the plurality of known users are used to identify a user;
a user identification module configured to receive the observation signals from the sensor system and identify the user from the observation signals and the attributes of the plurality of users by accessing said user identification database;
an inertial sensor embedded in the housing and responsive to movement of the handheld electronic device by the user's hand to generate inertial signals; and
a trajectory module configured to receive the inertial signals from the inertial sensor and determine a trajectory for the movement of the handheld electronic device, wherein the user identification module is configured to receive the trajectory from the trajectory module and identify the user based in part from the trajectory.
12-22. (canceled)
23. The handheld electronic device of claim 11 wherein the inertial sensor is an accelerometer.
24. The handheld electronic device of claim 11 wherein the inertial sensor is a gyroscope.
25. The handheld electronic device of claim 11 wherein the trajectory module is configured to determine a starting location of the handheld electronic device and communicate said starting location to the user identification module.
26. The handheld electronic device of claim 25 wherein the user identification module further bases user identification on the starting location of the handheld electronic device.
27-28. (canceled)
29. The handheld electronic device of claim 25 wherein the user identification module is configured to receive finger movement data corresponding to a first point of contact between one of the user's digits and the touchpad and use said finger movement data corresponding to the first point of contact to identify the user.
30-46. (canceled)
47. The handheld electronic device of claim 11 further comprising:
a motion processing module configured to determine the trajectory of the handheld electronic device by employing dead reckoning, the motion processing module supplying signals indicative of the trajectory as additional observation signals to said user identification module.
48. The handheld electronic device of claim 1 wherein a cluster of locations is used as the starting position.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US12/988,745 US20110043475A1 (en) | 2008-04-21 | 2009-04-21 | Method and system of identifying a user of a handheld device |
Applications Claiming Priority (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US4657808P | 2008-04-21 | 2008-04-21 | |
PCT/US2009/041227 WO2009131987A2 (en) | 2008-04-21 | 2009-04-21 | Method and system of identifying a user of a handheld device |
US12/988,745 US20110043475A1 (en) | 2008-04-21 | 2009-04-21 | Method and system of identifying a user of a handheld device |
Publications (1)
Publication Number | Publication Date |
---|---|
US20110043475A1 true US20110043475A1 (en) | 2011-02-24 |
Family
ID=41200729
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US12/235,862 Expired - Fee Related US8031175B2 (en) | 2007-10-24 | 2008-09-23 | Touch sensitive remote control system that detects hand size characteristics of user and adapts mapping to screen display |
US12/988,745 Abandoned US20110043475A1 (en) | 2008-04-21 | 2009-04-21 | Method and system of identifying a user of a handheld device |
Family Applications Before (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US12/235,862 Expired - Fee Related US8031175B2 (en) | 2007-10-24 | 2008-09-23 | Touch sensitive remote control system that detects hand size characteristics of user and adapts mapping to screen display |
Country Status (5)
Country | Link |
---|---|
US (2) | US8031175B2 (en) |
EP (1) | EP2255270A2 (en) |
JP (1) | JP2011523730A (en) |
CN (1) | CN102016765A (en) |
WO (1) | WO2009131987A2 (en) |
Cited By (81)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20100296707A1 (en) * | 2009-05-25 | 2010-11-25 | Kabushiki Kaisha Toshiba | Method and apparatus for information processing |
US20110019105A1 (en) * | 2009-07-27 | 2011-01-27 | Echostar Technologies L.L.C. | Verification of symbols received through a touchpad of a remote control device in an electronic system to allow access to system functions |
US20110118027A1 (en) * | 2009-11-16 | 2011-05-19 | Broadcom Corporation | Altering video game operations based upon user id and-or grip position |
US20110118026A1 (en) * | 2009-11-16 | 2011-05-19 | Broadcom Corporation | Hand-held gaming device that identifies user based upon input from touch sensitive panel |
US20110157696A1 (en) * | 2009-12-31 | 2011-06-30 | Broadcom Corporation | Display with adaptable parallax barrier |
US20110164188A1 (en) * | 2009-12-31 | 2011-07-07 | Broadcom Corporation | Remote control with integrated position, viewer identification and optical and audio test |
US20110226864A1 (en) * | 2010-03-19 | 2011-09-22 | Samsung Electronics Co. Ltd. | Mobile device and method for emitting fragrance |
US20120062387A1 (en) * | 2010-09-10 | 2012-03-15 | Daniel Vik | Human interface device input filter based on motion |
US20120066212A1 (en) * | 2010-03-03 | 2012-03-15 | Waldeck Technology, Llc | Monitoring hashtags in micro-blog posts to provide one or more crowd-based features |
US20120242701A1 (en) * | 2011-03-25 | 2012-09-27 | Apple Inc. | Accessory dependent display orientation |
WO2012166979A2 (en) * | 2011-05-31 | 2012-12-06 | Cleankeys Inc. | System for detecting a user on a sensor-based surface |
US20130021241A1 (en) * | 2010-04-01 | 2013-01-24 | Funai Electric Co., Ltd. | Portable Information Display Terminal |
US20130100043A1 (en) * | 2011-10-24 | 2013-04-25 | General Electric Company | Method for determining valid touch screen inputs |
US20130147602A1 (en) * | 2011-12-12 | 2013-06-13 | Cisco Technology, Inc. | Determination of user based on electrical measurement |
US20130181902A1 (en) * | 2012-01-17 | 2013-07-18 | Microsoft Corporation | Skinnable touch device grip patterns |
US20130201155A1 (en) * | 2010-08-12 | 2013-08-08 | Genqing Wu | Finger identification on a touchscreen |
US20130243242A1 (en) * | 2012-03-16 | 2013-09-19 | Pixart Imaging Incorporation | User identification system and method for identifying user |
US20130293360A1 (en) * | 2012-05-02 | 2013-11-07 | Shuen-Fu Lo | All new Ui-E1-Stroke operation control devices |
US20130300668A1 (en) * | 2012-01-17 | 2013-11-14 | Microsoft Corporation | Grip-Based Device Adaptations |
US20140047465A1 (en) * | 2012-08-07 | 2014-02-13 | WebTuner, Corporation | Multi-media ad targeting and content recommendation with viewer identity detection system |
US20140184922A1 (en) * | 2012-12-28 | 2014-07-03 | Echostar Technologies L.L.C. | Determining remote control state and user via accelerometer |
US20140210728A1 (en) * | 2013-01-25 | 2014-07-31 | Verizon Patent And Licensing Inc. | Fingerprint driven profiling |
US8854531B2 (en) | 2009-12-31 | 2014-10-07 | Broadcom Corporation | Multiple remote controllers that each simultaneously controls a different visual presentation of a 2D/3D display |
US9069390B2 (en) | 2008-09-19 | 2015-06-30 | Typesoft Technologies, Inc. | Systems and methods for monitoring surface sanitation |
US9104260B2 (en) | 2012-04-10 | 2015-08-11 | Typesoft Technologies, Inc. | Systems and methods for detecting a press on a touch-sensitive surface |
US9110590B2 (en) | 2007-09-19 | 2015-08-18 | Typesoft Technologies, Inc. | Dynamically located onscreen keyboard |
US9146631B1 (en) * | 2013-02-11 | 2015-09-29 | Amazon Technologies, Inc. | Determining which hand is holding a device |
US9247286B2 (en) | 2009-12-31 | 2016-01-26 | Broadcom Corporation | Frame formatting supporting mixed two and three dimensional video data communication |
US9261964B2 (en) | 2005-12-30 | 2016-02-16 | Microsoft Technology Licensing, Llc | Unintentional touch rejection |
US20160085961A1 (en) * | 2014-09-19 | 2016-03-24 | Kabushiki Kaisha Toshiba | Authentication system, authentication device, and authentication method |
US20160162100A1 (en) * | 2014-12-03 | 2016-06-09 | Samsung Display Co., Ltd. | Display device and driving method for display device using the same |
US20160182950A1 (en) * | 2014-12-17 | 2016-06-23 | Lenovo (Singapore) Pte. Ltd. | Identification of a user for personalized media content presentation |
US9454270B2 (en) | 2008-09-19 | 2016-09-27 | Apple Inc. | Systems and methods for detecting a press on a touch-sensitive surface |
US9477337B2 (en) | 2014-03-14 | 2016-10-25 | Microsoft Technology Licensing, Llc | Conductive trace routing for display and bezel sensors |
US9489086B1 (en) | 2013-04-29 | 2016-11-08 | Apple Inc. | Finger hover detection for improved typing |
US9582122B2 (en) | 2012-11-12 | 2017-02-28 | Microsoft Technology Licensing, Llc | Touch-sensitive bezel techniques |
US9641393B2 (en) | 2009-02-02 | 2017-05-02 | Waldeck Technology, Llc | Forming crowds and providing access to crowd data in a mobile environment |
WO2017100458A1 (en) | 2015-12-11 | 2017-06-15 | Roku, Inc. | User identification based on the motion of a device |
US20170266329A1 (en) * | 2014-11-14 | 2017-09-21 | Echostar Technologies L.L.C. | Systems and methods for disinfecting a remote control using ultraviolet light |
US9804864B1 (en) * | 2011-10-07 | 2017-10-31 | BlueStack Systems, Inc. | Method of mapping inputs and system thereof |
US9927917B2 (en) | 2015-10-29 | 2018-03-27 | Microsoft Technology Licensing, Llc | Model-based touch event location adjustment |
EP3223119A4 (en) * | 2014-11-19 | 2018-07-25 | Baidu Online Network Technology (Beijing) Co., Ltd. | Method and device for adjusting object attribute information |
US20180217719A1 (en) * | 2017-02-01 | 2018-08-02 | Open Tv, Inc. | Menu modification based on controller manipulation data |
US10078377B2 (en) | 2016-06-09 | 2018-09-18 | Microsoft Technology Licensing, Llc | Six DOF mixed reality input by fusing inertial handheld controller with hand tracking |
US10108854B2 (en) | 2015-05-18 | 2018-10-23 | Sstatzz Oy | Method and system for automatic identification of player |
US10126942B2 (en) | 2007-09-19 | 2018-11-13 | Apple Inc. | Systems and methods for detecting a press on a touch-sensitive surface |
US10203873B2 (en) | 2007-09-19 | 2019-02-12 | Apple Inc. | Systems and methods for adaptively presenting a keyboard on a touch-sensitive display |
US10262324B2 (en) | 2010-11-29 | 2019-04-16 | Biocatch Ltd. | System, device, and method of differentiating among users based on user-specific page navigation sequence |
US10289302B1 (en) | 2013-09-09 | 2019-05-14 | Apple Inc. | Virtual keyboard animation |
US10298614B2 (en) * | 2010-11-29 | 2019-05-21 | Biocatch Ltd. | System, device, and method of generating and managing behavioral biometric cookies |
US10397262B2 (en) | 2017-07-20 | 2019-08-27 | Biocatch Ltd. | Device, system, and method of detecting overlay malware |
US10404729B2 (en) | 2010-11-29 | 2019-09-03 | Biocatch Ltd. | Device, method, and system of generating fraud-alerts for cyber-attacks |
US10474815B2 (en) | 2010-11-29 | 2019-11-12 | Biocatch Ltd. | System, device, and method of detecting malicious automatic script and code injection |
US10523680B2 (en) * | 2015-07-09 | 2019-12-31 | Biocatch Ltd. | System, device, and method for detecting a proxy server |
US10579784B2 (en) | 2016-11-02 | 2020-03-03 | Biocatch Ltd. | System, device, and method of secure utilization of fingerprints for user authentication |
US10586036B2 (en) | 2010-11-29 | 2020-03-10 | Biocatch Ltd. | System, device, and method of recovery and resetting of user authentication factor |
US10614201B2 (en) | 2014-08-07 | 2020-04-07 | Alibaba Group Holding Limited | Method and device for identity authentication |
US10621585B2 (en) | 2010-11-29 | 2020-04-14 | Biocatch Ltd. | Contextual mapping of web-pages, and generation of fraud-relatedness score-values |
US10685355B2 (en) * | 2016-12-04 | 2020-06-16 | Biocatch Ltd. | Method, device, and system of detecting mule accounts and accounts used for money laundering |
US10719765B2 (en) | 2015-06-25 | 2020-07-21 | Biocatch Ltd. | Conditional behavioral biometrics |
US10728761B2 (en) | 2010-11-29 | 2020-07-28 | Biocatch Ltd. | Method, device, and system of detecting a lie of a user who inputs data |
US10735080B2 (en) | 2016-08-10 | 2020-08-04 | Huawei Technologies Co., Ltd. | Transmission scheme indication method, and data transmission method, apparatus, and system |
US10747305B2 (en) | 2010-11-29 | 2020-08-18 | Biocatch Ltd. | Method, system, and device of authenticating identity of a user of an electronic device |
GB2581494A (en) * | 2019-02-19 | 2020-08-26 | Sony Interactive Entertainment Inc | Apparatus, system and method of authentication |
US10776476B2 (en) | 2010-11-29 | 2020-09-15 | Biocatch Ltd. | System, device, and method of visual login |
US10834590B2 (en) | 2010-11-29 | 2020-11-10 | Biocatch Ltd. | Method, device, and system of differentiating between a cyber-attacker and a legitimate user |
US10884497B2 (en) * | 2018-11-26 | 2021-01-05 | Center Of Human-Centered Interaction For Coexistence | Method and apparatus for motion capture interface using multiple fingers |
US10897482B2 (en) | 2010-11-29 | 2021-01-19 | Biocatch Ltd. | Method, device, and system of back-coloring, forward-coloring, and fraud detection |
US10917431B2 (en) | 2010-11-29 | 2021-02-09 | Biocatch Ltd. | System, method, and device of authenticating a user based on selfie image or selfie video |
US10949757B2 (en) | 2010-11-29 | 2021-03-16 | Biocatch Ltd. | System, device, and method of detecting user identity based on motor-control loop model |
US10949514B2 (en) | 2010-11-29 | 2021-03-16 | Biocatch Ltd. | Device, system, and method of differentiating among users based on detection of hardware components |
US10970394B2 (en) | 2017-11-21 | 2021-04-06 | Biocatch Ltd. | System, device, and method of detecting vishing attacks |
US11055395B2 (en) | 2016-07-08 | 2021-07-06 | Biocatch Ltd. | Step-up authentication |
US20210329030A1 (en) * | 2010-11-29 | 2021-10-21 | Biocatch Ltd. | Device, System, and Method of Detecting Vishing Attacks |
US11210674B2 (en) | 2010-11-29 | 2021-12-28 | Biocatch Ltd. | Method, device, and system of detecting mule accounts and accounts used for money laundering |
US11223619B2 (en) | 2010-11-29 | 2022-01-11 | Biocatch Ltd. | Device, system, and method of user authentication based on user-specific characteristics of task performance |
US11269977B2 (en) | 2010-11-29 | 2022-03-08 | Biocatch Ltd. | System, apparatus, and method of collecting and processing data in electronic devices |
US11281292B2 (en) * | 2016-10-28 | 2022-03-22 | Sony Interactive Entertainment Inc. | Information processing apparatus, control method, program, and storage media |
US11606353B2 (en) | 2021-07-22 | 2023-03-14 | Biocatch Ltd. | System, device, and method of generating and utilizing one-time passwords |
EP4293543A1 (en) * | 2022-06-17 | 2023-12-20 | NOS Inovação, S.A. | System for identifying a user of an electronic device |
US20240080339A1 (en) * | 2010-11-29 | 2024-03-07 | Biocatch Ltd. | Device, System, and Method of Detecting Vishing Attacks |
Families Citing this family (67)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8823645B2 (en) | 2010-12-28 | 2014-09-02 | Panasonic Corporation | Apparatus for remotely controlling another apparatus and having self-orientating capability |
KR20100039017A (en) * | 2008-10-07 | 2010-04-15 | 한국전자통신연구원 | Remote control apparatus using menu markup language |
US20110148762A1 (en) * | 2009-12-22 | 2011-06-23 | Universal Electronics Inc. | System and method for multi-mode command input |
JP5370144B2 (en) * | 2009-12-28 | 2013-12-18 | ソニー株式会社 | Operation direction determination device, remote operation system, operation direction determination method and program |
US8432368B2 (en) * | 2010-01-06 | 2013-04-30 | Qualcomm Incorporated | User interface methods and systems for providing force-sensitive input |
US9542097B2 (en) | 2010-01-13 | 2017-01-10 | Lenovo (Singapore) Pte. Ltd. | Virtual touchpad for a touch device |
US8803655B2 (en) | 2010-05-11 | 2014-08-12 | Universal Electronics Inc. | System and methods for enhanced remote control functionality |
US9357024B2 (en) * | 2010-08-05 | 2016-05-31 | Qualcomm Incorporated | Communication management utilizing destination device user presence probability |
DE102010041957A1 (en) * | 2010-10-04 | 2012-04-05 | Ident Technology Ag | Electrode configuration for touch detection and method for detecting a touch of a hand-held device |
US20120131490A1 (en) * | 2010-11-22 | 2012-05-24 | Shao-Chieh Lin | Touch-controlled device and method for displaying a virtual keyboard on the touch-controlled device thereof |
US8836640B2 (en) | 2010-12-30 | 2014-09-16 | Screenovate Technologies Ltd. | System and method for generating a representative computerized display of a user's interactions with a touchscreen based hand held device on a gazed-at screen |
JP5917805B2 (en) * | 2011-01-05 | 2016-05-18 | ソニー株式会社 | Information processing apparatus, information processing method, and computer program |
CN102724038B (en) * | 2011-03-30 | 2016-08-03 | 阿里巴巴集团控股有限公司 | Identity identifying method, information collecting device and ID authentication device |
JP5782857B2 (en) * | 2011-06-18 | 2015-09-24 | ブラザー工業株式会社 | Parameter input apparatus and image forming apparatus |
EP2724667B1 (en) * | 2011-06-24 | 2020-11-18 | Murata Manufacturing Co., Ltd. | Mobile apparatus with biosensors |
US9274642B2 (en) * | 2011-10-20 | 2016-03-01 | Microsoft Technology Licensing, Llc | Acceleration-based interaction for multi-pointer indirect input devices |
US9658715B2 (en) | 2011-10-20 | 2017-05-23 | Microsoft Technology Licensing, Llc | Display mapping modes for multi-pointer indirect input devices |
US8750852B2 (en) * | 2011-10-27 | 2014-06-10 | Qualcomm Incorporated | Controlling access to a mobile device |
US9213482B2 (en) * | 2011-11-11 | 2015-12-15 | Elan Microelectronics Corporation | Touch control device and method |
US9389679B2 (en) | 2011-11-30 | 2016-07-12 | Microsoft Technology Licensing, Llc | Application programming interface for a multi-pointer indirect touch input device |
JP5858228B2 (en) * | 2011-12-27 | 2016-02-10 | 株式会社国際電気通信基礎技術研究所 | Identification apparatus, identification method, and identification program |
KR101890624B1 (en) * | 2012-04-12 | 2018-08-22 | 엘지전자 주식회사 | Remote controller equipped with touch pad and method for controlling the same |
US20130335576A1 (en) * | 2012-06-19 | 2013-12-19 | Martin GOTSCHLICH | Dynamic adaptation of imaging parameters |
JP5923394B2 (en) * | 2012-06-20 | 2016-05-24 | 株式会社Nttドコモ | Recognition device, recognition method, and recognition system |
CN103546776A (en) * | 2012-07-13 | 2014-01-29 | 珠海扬智电子科技有限公司 | Method for providing multimedia service, remote control device, set top box and multimedia system |
US8964127B2 (en) * | 2012-07-27 | 2015-02-24 | TCL Research America Inc. | User-sensing remote control system and method |
JP5904049B2 (en) * | 2012-08-02 | 2016-04-13 | 日本電気株式会社 | Mobile phone, control method and program |
CN103917978A (en) * | 2012-09-24 | 2014-07-09 | 华为技术有限公司 | User login method, and terminal |
ITTO20120864A1 (en) * | 2012-10-04 | 2014-04-05 | St Microelectronics Srl | PROCEDURE AND SYSTEM FOR RECOGNITION OF FORMS OF THE TOUCH, SCREEN EQUIPMENT AND ITS RELATED PRODUCT |
JP6150141B2 (en) * | 2012-11-20 | 2017-06-21 | 日本電気株式会社 | Portable electronic device, its control method and program |
US9549323B2 (en) | 2012-12-03 | 2017-01-17 | Samsung Electronics Co., Ltd. | Method and mobile terminal for controlling screen lock |
CN102970606B (en) * | 2012-12-04 | 2017-11-17 | 深圳Tcl新技术有限公司 | The TV programme suggesting method and device of identity-based identification |
CN103870041B (en) * | 2012-12-14 | 2017-09-22 | 联想(北京)有限公司 | terminal device and its user identification method |
DE112013006349T5 (en) * | 2013-01-31 | 2015-09-17 | Hewlett Packard Development Company, L.P. | Touch screen with prevention of accidental input |
JP6117562B2 (en) * | 2013-02-13 | 2017-04-19 | ソニー株式会社 | Information processing apparatus, information processing method, and information processing system |
CN103529762B (en) * | 2013-02-22 | 2016-08-31 | Tcl集团股份有限公司 | A kind of intelligent home furnishing control method based on sensor technology and system |
CN103248933A (en) * | 2013-04-03 | 2013-08-14 | 深圳Tcl新技术有限公司 | Interactive method and remote control device based on user identity for interaction |
CN103246448A (en) * | 2013-04-03 | 2013-08-14 | 深圳Tcl新技术有限公司 | Interactive method and remote device for acquiring user identities to perform interaction |
US20140320420A1 (en) * | 2013-04-25 | 2014-10-30 | Sony Corporation | Method and apparatus for controlling a mobile device based on touch operations |
JP6232857B2 (en) * | 2013-09-02 | 2017-11-22 | 富士通株式会社 | Operation analysis device, operation analysis method, and operation analysis program |
CN103472917B (en) * | 2013-09-06 | 2016-07-06 | 浙江大学 | The unrelated motion recognition method of a kind of modes of emplacement with acceleration transducer and position |
US20150082890A1 (en) * | 2013-09-26 | 2015-03-26 | Intel Corporation | Biometric sensors for personal devices |
US9310933B2 (en) | 2014-02-26 | 2016-04-12 | Qualcomm Incorporated | Optimization for host based touch processing |
CN104020845B (en) * | 2014-03-27 | 2017-02-15 | 浙江大学 | Acceleration transducer placement-unrelated movement recognition method based on shapelet characteristic |
CN105025362B (en) * | 2014-04-29 | 2018-05-08 | 深圳Tcl新技术有限公司 | Identify the method and device of user identity |
US9798322B2 (en) * | 2014-06-19 | 2017-10-24 | Skydio, Inc. | Virtual camera interface and other user interaction paradigms for a flying digital assistant |
US9678506B2 (en) | 2014-06-19 | 2017-06-13 | Skydio, Inc. | Magic wand interface and other user interaction paradigms for a flying digital assistant |
US12007763B2 (en) | 2014-06-19 | 2024-06-11 | Skydio, Inc. | Magic wand interface and other user interaction paradigms for a flying digital assistant |
CN105630204A (en) * | 2014-10-29 | 2016-06-01 | 深圳富泰宏精密工业有限公司 | Mouse simulation system and method |
JP6404692B2 (en) * | 2014-12-01 | 2018-10-10 | 株式会社Nttドコモ | Personal authentication device and personal authentication method |
US9548865B2 (en) | 2014-12-01 | 2017-01-17 | International Business Machines Corporation | Token authentication for touch sensitive display devices |
US20160239138A1 (en) * | 2015-02-17 | 2016-08-18 | Htc Corporation | Mobile device, press detection method and computer-readable recording medium |
KR20160109304A (en) * | 2015-03-10 | 2016-09-21 | 삼성전자주식회사 | Remotely controller and method for controlling a screen of display apparatus |
CN106157036A (en) * | 2015-04-14 | 2016-11-23 | 广州杰赛科技股份有限公司 | A kind of authentication device based on heart beating feature |
JP6262177B2 (en) * | 2015-09-03 | 2018-01-17 | 株式会社東芝 | Wearable terminal, method and system |
CN105242851A (en) * | 2015-10-21 | 2016-01-13 | 维沃移动通信有限公司 | Method for switching operation modes of electronic device and electronic device |
KR101774391B1 (en) * | 2016-07-26 | 2017-09-05 | (주)다울디엔에스 | Motion effect encoding method using Gyro seonsor |
US10520943B2 (en) | 2016-08-12 | 2019-12-31 | Skydio, Inc. | Unmanned aerial image capture platform |
US20180157814A1 (en) * | 2016-12-01 | 2018-06-07 | Electronics And Telecommunications Research Institute | Personal authentication method and apparatus based on recognition of fingertip gesture and identification of fake pattern |
US11295458B2 (en) | 2016-12-01 | 2022-04-05 | Skydio, Inc. | Object tracking by an unmanned aerial vehicle using visual sensors |
CN106909237A (en) * | 2017-01-11 | 2017-06-30 | 深圳市赛亿科技开发有限公司 | A kind of mouse and mouse pad for monitoring human body mood |
JP6825397B2 (en) | 2017-02-07 | 2021-02-03 | 富士通株式会社 | Biometric device, biometric method and biometric program |
DE102017121456A1 (en) * | 2017-09-15 | 2019-03-21 | Fm Marketing Gmbh | Interactionless user recognition |
KR20200072493A (en) * | 2017-10-13 | 2020-06-22 | 코닌클리케 필립스 엔.브이. | Methods and systems for characterizing a user of a personal care device |
JP7053893B6 (en) | 2018-05-10 | 2022-06-02 | コーニンクレッカ フィリップス エヌ ヴェ | Methods and systems for adjusting the behavior of personal care devices based on the condition of the oral cavity |
DE102018215066A1 (en) * | 2018-09-05 | 2020-03-05 | Brendel Holding Gmbh & Co. Kg | Control panel for remote control, comprising an activation sensor with a variable effective sensor range |
US10948980B2 (en) * | 2019-05-10 | 2021-03-16 | Apple Inc. | Electronic device system with controllers |
Citations (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5068619A (en) * | 1987-10-27 | 1991-11-26 | Nihon System Research Institute Inc. | Conductivity measuring device |
US20020028997A1 (en) * | 1998-12-07 | 2002-03-07 | Yoshitoshi Ito | Device for controlling equipment by using signals from a living body |
US6421453B1 (en) * | 1998-05-15 | 2002-07-16 | International Business Machines Corporation | Apparatus and methods for user recognition employing behavioral passwords |
US6456275B1 (en) * | 1998-09-14 | 2002-09-24 | Microsoft Corporation | Proximity sensor in a computer input device |
US6501515B1 (en) * | 1998-10-13 | 2002-12-31 | Sony Corporation | Remote control system |
US20030179246A1 (en) * | 2002-03-22 | 2003-09-25 | Koninklijke Philips Electronics N.V. | Low cost interactive program control system and method |
US20050212911A1 (en) * | 2004-03-23 | 2005-09-29 | Marvit David L | Gesture identification of controlled devices |
US20060197753A1 (en) * | 2005-03-04 | 2006-09-07 | Hotelling Steven P | Multi-functional hand-held device |
US20060197750A1 (en) * | 2005-03-04 | 2006-09-07 | Apple Computer, Inc. | Hand held electronic device with multiple touch sensing devices |
US20070050138A1 (en) * | 2005-08-30 | 2007-03-01 | Honeywell International Inc. | Enhanced inertial system performance |
US7236156B2 (en) * | 2004-04-30 | 2007-06-26 | Hillcrest Laboratories, Inc. | Methods and devices for identifying users based on tremor |
US20080082339A1 (en) * | 2006-09-29 | 2008-04-03 | Nellcor Puritan Bennett Incorporated | System and method for secure voice identification in a medical device |
Family Cites Families (21)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5459489A (en) * | 1991-12-05 | 1995-10-17 | Tv Interactive Data Corporation | Hand held electronic remote control device |
US5453758A (en) * | 1992-07-31 | 1995-09-26 | Sony Corporation | Input apparatus |
US5469194A (en) * | 1994-05-13 | 1995-11-21 | Apple Computer, Inc. | Apparatus and method for providing different input device orientations of a computer system |
GB9415627D0 (en) | 1994-08-01 | 1994-09-21 | Marshall James | Verification apparatus |
US5652630A (en) * | 1995-05-31 | 1997-07-29 | International Business Machines Corporation | Video receiver display, three axis remote control, and microcontroller for executing programs |
US6323846B1 (en) * | 1998-01-26 | 2001-11-27 | University Of Delaware | Method and apparatus for integrating manual input |
US6346891B1 (en) * | 1998-08-31 | 2002-02-12 | Microsoft Corporation | Remote control system with handling sensor in remote control device |
US6396523B1 (en) * | 1999-07-29 | 2002-05-28 | Interlink Electronics, Inc. | Home entertainment device remote control |
US6429543B1 (en) | 1999-10-01 | 2002-08-06 | Siemens Vdo Automotive Corporation | Innovative switch for remote control applications |
US6765557B1 (en) * | 2000-04-10 | 2004-07-20 | Interlink Electronics, Inc. | Remote control having touch pad to screen mapping |
US20040236699A1 (en) | 2001-07-10 | 2004-11-25 | American Express Travel Related Services Company, Inc. | Method and system for hand geometry recognition biometrics on a fob |
WO2003071410A2 (en) | 2002-02-15 | 2003-08-28 | Canesta, Inc. | Gesture recognition system using depth perceptive sensors |
TWM240050U (en) * | 2003-04-02 | 2004-08-01 | Elan Microelectronics Corp | Capacitor touch panel with integrated keyboard and handwriting function |
JP3930482B2 (en) * | 2004-01-19 | 2007-06-13 | ファナック株式会社 | 3D visual sensor |
US20050162402A1 (en) | 2004-01-27 | 2005-07-28 | Watanachote Susornpol J. | Methods of interacting with a computer using a finger(s) touch sensing input device with visual feedback |
US20050185788A1 (en) * | 2004-02-23 | 2005-08-25 | Daw Sean P. | Keypad adapted for use in dual orientations |
TW200539031A (en) * | 2004-05-20 | 2005-12-01 | Elan Microelectronics Corp | A capacitor type touch pad with integrated graphic input function |
US20060016868A1 (en) | 2004-07-01 | 2006-01-26 | American Express Travel Related Services Company, Inc. | Method and system for hand geometry recognition biometrics on a smartcard |
US20060077179A1 (en) | 2004-10-08 | 2006-04-13 | Inventec Corporation | Keyboard having automatic adjusting key intervals and a method thereof |
US20060227030A1 (en) * | 2005-03-31 | 2006-10-12 | Clifford Michelle A | Accelerometer based control system and method of controlling a device |
US7889175B2 (en) * | 2007-06-28 | 2011-02-15 | Panasonic Corporation | Touchpad-enabled remote controller and user interaction methods |
-
2008
- 2008-09-23 US US12/235,862 patent/US8031175B2/en not_active Expired - Fee Related
-
2009
- 2009-04-21 EP EP09735558A patent/EP2255270A2/en not_active Withdrawn
- 2009-04-21 JP JP2011506389A patent/JP2011523730A/en active Pending
- 2009-04-21 WO PCT/US2009/041227 patent/WO2009131987A2/en active Application Filing
- 2009-04-21 US US12/988,745 patent/US20110043475A1/en not_active Abandoned
- 2009-04-21 CN CN2009801139729A patent/CN102016765A/en active Pending
Patent Citations (13)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5068619A (en) * | 1987-10-27 | 1991-11-26 | Nihon System Research Institute Inc. | Conductivity measuring device |
US6421453B1 (en) * | 1998-05-15 | 2002-07-16 | International Business Machines Corporation | Apparatus and methods for user recognition employing behavioral passwords |
US6456275B1 (en) * | 1998-09-14 | 2002-09-24 | Microsoft Corporation | Proximity sensor in a computer input device |
US6501515B1 (en) * | 1998-10-13 | 2002-12-31 | Sony Corporation | Remote control system |
US20020028997A1 (en) * | 1998-12-07 | 2002-03-07 | Yoshitoshi Ito | Device for controlling equipment by using signals from a living body |
US6831664B2 (en) * | 2002-03-22 | 2004-12-14 | Koninklijke Philips Electronics N.V. | Low cost interactive program control system and method |
US20030179246A1 (en) * | 2002-03-22 | 2003-09-25 | Koninklijke Philips Electronics N.V. | Low cost interactive program control system and method |
US20050212911A1 (en) * | 2004-03-23 | 2005-09-29 | Marvit David L | Gesture identification of controlled devices |
US7236156B2 (en) * | 2004-04-30 | 2007-06-26 | Hillcrest Laboratories, Inc. | Methods and devices for identifying users based on tremor |
US20060197753A1 (en) * | 2005-03-04 | 2006-09-07 | Hotelling Steven P | Multi-functional hand-held device |
US20060197750A1 (en) * | 2005-03-04 | 2006-09-07 | Apple Computer, Inc. | Hand held electronic device with multiple touch sensing devices |
US20070050138A1 (en) * | 2005-08-30 | 2007-03-01 | Honeywell International Inc. | Enhanced inertial system performance |
US20080082339A1 (en) * | 2006-09-29 | 2008-04-03 | Nellcor Puritan Bennett Incorporated | System and method for secure voice identification in a medical device |
Cited By (152)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9261964B2 (en) | 2005-12-30 | 2016-02-16 | Microsoft Technology Licensing, Llc | Unintentional touch rejection |
US9946370B2 (en) | 2005-12-30 | 2018-04-17 | Microsoft Technology Licensing, Llc | Unintentional touch rejection |
US9952718B2 (en) | 2005-12-30 | 2018-04-24 | Microsoft Technology Licensing, Llc | Unintentional touch rejection |
US9594457B2 (en) | 2005-12-30 | 2017-03-14 | Microsoft Technology Licensing, Llc | Unintentional touch rejection |
US10019080B2 (en) | 2005-12-30 | 2018-07-10 | Microsoft Technology Licensing, Llc | Unintentional touch rejection |
US10126942B2 (en) | 2007-09-19 | 2018-11-13 | Apple Inc. | Systems and methods for detecting a press on a touch-sensitive surface |
US10908815B2 (en) | 2007-09-19 | 2021-02-02 | Apple Inc. | Systems and methods for distinguishing between a gesture tracing out a word and a wiping motion on a touch-sensitive keyboard |
US10203873B2 (en) | 2007-09-19 | 2019-02-12 | Apple Inc. | Systems and methods for adaptively presenting a keyboard on a touch-sensitive display |
US9110590B2 (en) | 2007-09-19 | 2015-08-18 | Typesoft Technologies, Inc. | Dynamically located onscreen keyboard |
US9069390B2 (en) | 2008-09-19 | 2015-06-30 | Typesoft Technologies, Inc. | Systems and methods for monitoring surface sanitation |
US9454270B2 (en) | 2008-09-19 | 2016-09-27 | Apple Inc. | Systems and methods for detecting a press on a touch-sensitive surface |
US9641393B2 (en) | 2009-02-02 | 2017-05-02 | Waldeck Technology, Llc | Forming crowds and providing access to crowd data in a mobile environment |
US20100296707A1 (en) * | 2009-05-25 | 2010-11-25 | Kabushiki Kaisha Toshiba | Method and apparatus for information processing |
US8744142B2 (en) * | 2009-05-25 | 2014-06-03 | Kabushiki Kaisha Toshiba | Presenting information based on whether a viewer corresponding to information is stored is present in an image |
US20110019105A1 (en) * | 2009-07-27 | 2011-01-27 | Echostar Technologies L.L.C. | Verification of symbols received through a touchpad of a remote control device in an electronic system to allow access to system functions |
US20110115604A1 (en) * | 2009-11-16 | 2011-05-19 | Broadcom Corporation | Remote control for multimedia system having touch sensitive panel for user id |
US8614621B2 (en) * | 2009-11-16 | 2013-12-24 | Broadcom Corporation | Remote control for multimedia system having touch sensitive panel for user ID |
US8754746B2 (en) * | 2009-11-16 | 2014-06-17 | Broadcom Corporation | Hand-held gaming device that identifies user based upon input from touch sensitive panel |
US20110118028A1 (en) * | 2009-11-16 | 2011-05-19 | Broadcom Corporation | Hand-held gaming device with configurable touch sensitive panel(s) |
US20110118029A1 (en) * | 2009-11-16 | 2011-05-19 | Broadcom Corporation | Hand-held gaming device with touch sensitive panel(s) for gaming input |
US8449393B2 (en) * | 2009-11-16 | 2013-05-28 | Broadcom Corporation | Hand-held gaming device with configurable touch sensitive panel(s) |
US20110118026A1 (en) * | 2009-11-16 | 2011-05-19 | Broadcom Corporation | Hand-held gaming device that identifies user based upon input from touch sensitive panel |
US20110118027A1 (en) * | 2009-11-16 | 2011-05-19 | Broadcom Corporation | Altering video game operations based upon user id and-or grip position |
US20110164115A1 (en) * | 2009-12-31 | 2011-07-07 | Broadcom Corporation | Transcoder supporting selective delivery of 2d, stereoscopic 3d, and multi-view 3d content from source video |
US9979954B2 (en) | 2009-12-31 | 2018-05-22 | Avago Technologies General Ip (Singapore) Pte. Ltd. | Eyewear with time shared viewing supporting delivery of differing content to multiple viewers |
US9204138B2 (en) | 2009-12-31 | 2015-12-01 | Broadcom Corporation | User controlled regional display of mixed two and three dimensional content |
US9247286B2 (en) | 2009-12-31 | 2016-01-26 | Broadcom Corporation | Frame formatting supporting mixed two and three dimensional video data communication |
US9654767B2 (en) | 2009-12-31 | 2017-05-16 | Avago Technologies General Ip (Singapore) Pte. Ltd. | Programming architecture supporting mixed two and three dimensional displays |
US9124885B2 (en) | 2009-12-31 | 2015-09-01 | Broadcom Corporation | Operating system supporting mixed 2D, stereoscopic 3D and multi-view 3D displays |
US9143770B2 (en) | 2009-12-31 | 2015-09-22 | Broadcom Corporation | Application programming interface supporting mixed two and three dimensional displays |
US8687042B2 (en) | 2009-12-31 | 2014-04-01 | Broadcom Corporation | Set-top box circuitry supporting 2D and 3D content reductions to accommodate viewing environment constraints |
US9066092B2 (en) | 2009-12-31 | 2015-06-23 | Broadcom Corporation | Communication infrastructure including simultaneous video pathways for multi-viewer support |
US20110164188A1 (en) * | 2009-12-31 | 2011-07-07 | Broadcom Corporation | Remote control with integrated position, viewer identification and optical and audio test |
US8767050B2 (en) | 2009-12-31 | 2014-07-01 | Broadcom Corporation | Display supporting multiple simultaneous 3D views |
US20110157339A1 (en) * | 2009-12-31 | 2011-06-30 | Broadcom Corporation | Display supporting multiple simultaneous 3d views |
US20110157697A1 (en) * | 2009-12-31 | 2011-06-30 | Broadcom Corporation | Adaptable parallax barrier supporting mixed 2d and stereoscopic 3d display regions |
US8823782B2 (en) * | 2009-12-31 | 2014-09-02 | Broadcom Corporation | Remote control with integrated position, viewer identification and optical and audio test |
US20110157696A1 (en) * | 2009-12-31 | 2011-06-30 | Broadcom Corporation | Display with adaptable parallax barrier |
US8854531B2 (en) | 2009-12-31 | 2014-10-07 | Broadcom Corporation | Multiple remote controllers that each simultaneously controls a different visual presentation of a 2D/3D display |
US8922545B2 (en) | 2009-12-31 | 2014-12-30 | Broadcom Corporation | Three-dimensional display system with adaptation based on viewing reference of viewer(s) |
US8964013B2 (en) | 2009-12-31 | 2015-02-24 | Broadcom Corporation | Display with elastic light manipulator |
US8988506B2 (en) | 2009-12-31 | 2015-03-24 | Broadcom Corporation | Transcoder supporting selective delivery of 2D, stereoscopic 3D, and multi-view 3D content from source video |
US9013546B2 (en) | 2009-12-31 | 2015-04-21 | Broadcom Corporation | Adaptable media stream servicing two and three dimensional content |
US9019263B2 (en) | 2009-12-31 | 2015-04-28 | Broadcom Corporation | Coordinated driving of adaptable light manipulator, backlighting and pixel array in support of adaptable 2D and 3D displays |
US9049440B2 (en) | 2009-12-31 | 2015-06-02 | Broadcom Corporation | Independent viewer tailoring of same media source content via a common 2D-3D display |
US8566309B2 (en) * | 2010-03-03 | 2013-10-22 | Waldeck Technology, Llc | Monitoring hashtags in micro-blog posts to provide one or more crowd-based features |
US20120066212A1 (en) * | 2010-03-03 | 2012-03-15 | Waldeck Technology, Llc | Monitoring hashtags in micro-blog posts to provide one or more crowd-based features |
US20110226864A1 (en) * | 2010-03-19 | 2011-09-22 | Samsung Electronics Co. Ltd. | Mobile device and method for emitting fragrance |
US20130021241A1 (en) * | 2010-04-01 | 2013-01-24 | Funai Electric Co., Ltd. | Portable Information Display Terminal |
US20130201155A1 (en) * | 2010-08-12 | 2013-08-08 | Genqing Wu | Finger identification on a touchscreen |
US20120062387A1 (en) * | 2010-09-10 | 2012-03-15 | Daniel Vik | Human interface device input filter based on motion |
US11314849B2 (en) | 2010-11-29 | 2022-04-26 | Biocatch Ltd. | Method, device, and system of detecting a lie of a user who inputs data |
US10949757B2 (en) | 2010-11-29 | 2021-03-16 | Biocatch Ltd. | System, device, and method of detecting user identity based on motor-control loop model |
US10728761B2 (en) | 2010-11-29 | 2020-07-28 | Biocatch Ltd. | Method, device, and system of detecting a lie of a user who inputs data |
US10776476B2 (en) | 2010-11-29 | 2020-09-15 | Biocatch Ltd. | System, device, and method of visual login |
US10834590B2 (en) | 2010-11-29 | 2020-11-10 | Biocatch Ltd. | Method, device, and system of differentiating between a cyber-attacker and a legitimate user |
US10621585B2 (en) | 2010-11-29 | 2020-04-14 | Biocatch Ltd. | Contextual mapping of web-pages, and generation of fraud-relatedness score-values |
US12101354B2 (en) * | 2010-11-29 | 2024-09-24 | Biocatch Ltd. | Device, system, and method of detecting vishing attacks |
US10586036B2 (en) | 2010-11-29 | 2020-03-10 | Biocatch Ltd. | System, device, and method of recovery and resetting of user authentication factor |
US10897482B2 (en) | 2010-11-29 | 2021-01-19 | Biocatch Ltd. | Method, device, and system of back-coloring, forward-coloring, and fraud detection |
US20240080339A1 (en) * | 2010-11-29 | 2024-03-07 | Biocatch Ltd. | Device, System, and Method of Detecting Vishing Attacks |
US11838118B2 (en) * | 2010-11-29 | 2023-12-05 | Biocatch Ltd. | Device, system, and method of detecting vishing attacks |
US11269977B2 (en) | 2010-11-29 | 2022-03-08 | Biocatch Ltd. | System, apparatus, and method of collecting and processing data in electronic devices |
US10917431B2 (en) | 2010-11-29 | 2021-02-09 | Biocatch Ltd. | System, method, and device of authenticating a user based on selfie image or selfie video |
US11580553B2 (en) | 2010-11-29 | 2023-02-14 | Biocatch Ltd. | Method, device, and system of detecting mule accounts and accounts used for money laundering |
US10474815B2 (en) | 2010-11-29 | 2019-11-12 | Biocatch Ltd. | System, device, and method of detecting malicious automatic script and code injection |
US10747305B2 (en) | 2010-11-29 | 2020-08-18 | Biocatch Ltd. | Method, system, and device of authenticating identity of a user of an electronic device |
US10404729B2 (en) | 2010-11-29 | 2019-09-03 | Biocatch Ltd. | Device, method, and system of generating fraud-alerts for cyber-attacks |
US10949514B2 (en) | 2010-11-29 | 2021-03-16 | Biocatch Ltd. | Device, system, and method of differentiating among users based on detection of hardware components |
US10298614B2 (en) * | 2010-11-29 | 2019-05-21 | Biocatch Ltd. | System, device, and method of generating and managing behavioral biometric cookies |
US10262324B2 (en) | 2010-11-29 | 2019-04-16 | Biocatch Ltd. | System, device, and method of differentiating among users based on user-specific page navigation sequence |
US20210329030A1 (en) * | 2010-11-29 | 2021-10-21 | Biocatch Ltd. | Device, System, and Method of Detecting Vishing Attacks |
US11425563B2 (en) | 2010-11-29 | 2022-08-23 | Biocatch Ltd. | Method, device, and system of differentiating between a cyber-attacker and a legitimate user |
US11250435B2 (en) | 2010-11-29 | 2022-02-15 | Biocatch Ltd. | Contextual mapping of web-pages, and generation of fraud-relatedness score-values |
US11330012B2 (en) * | 2010-11-29 | 2022-05-10 | Biocatch Ltd. | System, method, and device of authenticating a user based on selfie image or selfie video |
US11210674B2 (en) | 2010-11-29 | 2021-12-28 | Biocatch Ltd. | Method, device, and system of detecting mule accounts and accounts used for money laundering |
US11223619B2 (en) | 2010-11-29 | 2022-01-11 | Biocatch Ltd. | Device, system, and method of user authentication based on user-specific characteristics of task performance |
US20120242701A1 (en) * | 2011-03-25 | 2012-09-27 | Apple Inc. | Accessory dependent display orientation |
WO2012166979A3 (en) * | 2011-05-31 | 2013-03-28 | Cleankeys Inc. | System for detecting a user on a sensor-based surface |
WO2012166979A2 (en) * | 2011-05-31 | 2012-12-06 | Cleankeys Inc. | System for detecting a user on a sensor-based surface |
US9804864B1 (en) * | 2011-10-07 | 2017-10-31 | BlueStack Systems, Inc. | Method of mapping inputs and system thereof |
US20130100043A1 (en) * | 2011-10-24 | 2013-04-25 | General Electric Company | Method for determining valid touch screen inputs |
US20130147602A1 (en) * | 2011-12-12 | 2013-06-13 | Cisco Technology, Inc. | Determination of user based on electrical measurement |
CN104054043A (en) * | 2012-01-17 | 2014-09-17 | 微软公司 | Skinnable touch device grip patterns |
US20130181902A1 (en) * | 2012-01-17 | 2013-07-18 | Microsoft Corporation | Skinnable touch device grip patterns |
US9519419B2 (en) * | 2012-01-17 | 2016-12-13 | Microsoft Technology Licensing, Llc | Skinnable touch device grip patterns |
US20130300668A1 (en) * | 2012-01-17 | 2013-11-14 | Microsoft Corporation | Grip-Based Device Adaptations |
US20130243242A1 (en) * | 2012-03-16 | 2013-09-19 | Pixart Imaging Incorporation | User identification system and method for identifying user |
US9280714B2 (en) * | 2012-03-16 | 2016-03-08 | PixArt Imaging Incorporation, R.O.C. | User identification system and method for identifying user |
US11126832B2 (en) * | 2012-03-16 | 2021-09-21 | PixArt Imaging Incorporation, R.O.C. | User identification system and method for identifying user |
US10832042B2 (en) * | 2012-03-16 | 2020-11-10 | Pixart Imaging Incorporation | User identification system and method for identifying user |
US20160140385A1 (en) * | 2012-03-16 | 2016-05-19 | Pixart Imaging Incorporation | User identification system and method for identifying user |
US20190303659A1 (en) * | 2012-03-16 | 2019-10-03 | Pixart Imaging Incorporation | User identification system and method for identifying user |
US9104260B2 (en) | 2012-04-10 | 2015-08-11 | Typesoft Technologies, Inc. | Systems and methods for detecting a press on a touch-sensitive surface |
US20130293360A1 (en) * | 2012-05-02 | 2013-11-07 | Shuen-Fu Lo | All new Ui-E1-Stroke operation control devices |
US20140047465A1 (en) * | 2012-08-07 | 2014-02-13 | WebTuner, Corporation | Multi-media ad targeting and content recommendation with viewer identity detection system |
US9582122B2 (en) | 2012-11-12 | 2017-02-28 | Microsoft Technology Licensing, Llc | Touch-sensitive bezel techniques |
US10656750B2 (en) | 2012-11-12 | 2020-05-19 | Microsoft Technology Licensing, Llc | Touch-sensitive bezel techniques |
US20160112668A1 (en) * | 2012-12-28 | 2016-04-21 | Echostar Technologies L.L.C. | Determining remote control state and user via accelerometer |
US20140184922A1 (en) * | 2012-12-28 | 2014-07-03 | Echostar Technologies L.L.C. | Determining remote control state and user via accelerometer |
US11477511B2 (en) * | 2012-12-28 | 2022-10-18 | DISH Technologies L.L.C. | Determining remote control state and user via accelerometer |
US9237292B2 (en) * | 2012-12-28 | 2016-01-12 | Echostar Technologies L.L.C. | Determining remote control state and user via accelerometer |
US20140210728A1 (en) * | 2013-01-25 | 2014-07-31 | Verizon Patent And Licensing Inc. | Fingerprint driven profiling |
US9471154B1 (en) * | 2013-02-11 | 2016-10-18 | Amazon Technologies, Inc. | Determining which hand is holding a device |
US9146631B1 (en) * | 2013-02-11 | 2015-09-29 | Amazon Technologies, Inc. | Determining which hand is holding a device |
US9489086B1 (en) | 2013-04-29 | 2016-11-08 | Apple Inc. | Finger hover detection for improved typing |
US10289302B1 (en) | 2013-09-09 | 2019-05-14 | Apple Inc. | Virtual keyboard animation |
US11314411B2 (en) | 2013-09-09 | 2022-04-26 | Apple Inc. | Virtual keyboard animation |
US9946383B2 (en) | 2014-03-14 | 2018-04-17 | Microsoft Technology Licensing, Llc | Conductive trace routing for display and bezel sensors |
US9477337B2 (en) | 2014-03-14 | 2016-10-25 | Microsoft Technology Licensing, Llc | Conductive trace routing for display and bezel sensors |
US10614201B2 (en) | 2014-08-07 | 2020-04-07 | Alibaba Group Holding Limited | Method and device for identity authentication |
US10795978B2 (en) | 2014-08-07 | 2020-10-06 | Alibaba Group Holding Limited | Method and device for identity authentication |
US20160085961A1 (en) * | 2014-09-19 | 2016-03-24 | Kabushiki Kaisha Toshiba | Authentication system, authentication device, and authentication method |
US9852281B2 (en) * | 2014-09-19 | 2017-12-26 | Kabushiki Kaisha Toshiba | Authentication system, authentication device, and authentication method |
US20170266329A1 (en) * | 2014-11-14 | 2017-09-21 | Echostar Technologies L.L.C. | Systems and methods for disinfecting a remote control using ultraviolet light |
US10507255B2 (en) * | 2014-11-14 | 2019-12-17 | DISH Technologies L.L.C. | Systems and methods for disinfecting a remote control using ultraviolet light |
US11271413B2 (en) | 2014-11-14 | 2022-03-08 | DISH Technologies L.L.C. | Systems and methods for disinfecting a remote control using ultraviolet light |
US12003126B2 (en) | 2014-11-14 | 2024-06-04 | DISH Technologies L.L.C. | Systems and methods for activating an ultraviolet light emitter |
EP3223119A4 (en) * | 2014-11-19 | 2018-07-25 | Baidu Online Network Technology (Beijing) Co., Ltd. | Method and device for adjusting object attribute information |
US10241611B2 (en) * | 2014-11-19 | 2019-03-26 | Baidu Online Network Technology (Beijing) Co., Ltd. | Method and device for adjusting object attribute information |
US10241608B2 (en) * | 2014-12-03 | 2019-03-26 | Samsung Display Co., Ltd. | Display device and driving method for display device using the same |
US20160162100A1 (en) * | 2014-12-03 | 2016-06-09 | Samsung Display Co., Ltd. | Display device and driving method for display device using the same |
US10452194B2 (en) * | 2014-12-03 | 2019-10-22 | Samsung Display Co., Ltd. | Display device and driving method for display device using the same |
US10969890B2 (en) * | 2014-12-03 | 2021-04-06 | Samsung Display Co., Ltd. | Display device and driving method for display device using the same |
US20160182950A1 (en) * | 2014-12-17 | 2016-06-23 | Lenovo (Singapore) Pte. Ltd. | Identification of a user for personalized media content presentation |
US10108854B2 (en) | 2015-05-18 | 2018-10-23 | Sstatzz Oy | Method and system for automatic identification of player |
US10719765B2 (en) | 2015-06-25 | 2020-07-21 | Biocatch Ltd. | Conditional behavioral biometrics |
US11238349B2 (en) | 2015-06-25 | 2022-02-01 | Biocatch Ltd. | Conditional behavioural biometrics |
US10523680B2 (en) * | 2015-07-09 | 2019-12-31 | Biocatch Ltd. | System, device, and method for detecting a proxy server |
US11323451B2 (en) | 2015-07-09 | 2022-05-03 | Biocatch Ltd. | System, device, and method for detection of proxy server |
US10834090B2 (en) * | 2015-07-09 | 2020-11-10 | Biocatch Ltd. | System, device, and method for detection of proxy server |
US9927917B2 (en) | 2015-10-29 | 2018-03-27 | Microsoft Technology Licensing, Llc | Model-based touch event location adjustment |
US10922400B2 (en) | 2015-12-11 | 2021-02-16 | Roku, Inc. | User identification based on the motion of a device |
WO2017100458A1 (en) | 2015-12-11 | 2017-06-15 | Roku, Inc. | User identification based on the motion of a device |
EP3387572A4 (en) * | 2015-12-11 | 2019-07-10 | Roku, Inc. | User identification based on the motion of a device |
US10078377B2 (en) | 2016-06-09 | 2018-09-18 | Microsoft Technology Licensing, Llc | Six DOF mixed reality input by fusing inertial handheld controller with hand tracking |
US11055395B2 (en) | 2016-07-08 | 2021-07-06 | Biocatch Ltd. | Step-up authentication |
US10735080B2 (en) | 2016-08-10 | 2020-08-04 | Huawei Technologies Co., Ltd. | Transmission scheme indication method, and data transmission method, apparatus, and system |
US11281292B2 (en) * | 2016-10-28 | 2022-03-22 | Sony Interactive Entertainment Inc. | Information processing apparatus, control method, program, and storage media |
US10579784B2 (en) | 2016-11-02 | 2020-03-03 | Biocatch Ltd. | System, device, and method of secure utilization of fingerprints for user authentication |
US10685355B2 (en) * | 2016-12-04 | 2020-06-16 | Biocatch Ltd. | Method, device, and system of detecting mule accounts and accounts used for money laundering |
US20180217719A1 (en) * | 2017-02-01 | 2018-08-02 | Open Tv, Inc. | Menu modification based on controller manipulation data |
US11042262B2 (en) * | 2017-02-01 | 2021-06-22 | Opentv, Inc. | Menu modification based on controller manipulation data |
US10397262B2 (en) | 2017-07-20 | 2019-08-27 | Biocatch Ltd. | Device, system, and method of detecting overlay malware |
US10970394B2 (en) | 2017-11-21 | 2021-04-06 | Biocatch Ltd. | System, device, and method of detecting vishing attacks |
US10884497B2 (en) * | 2018-11-26 | 2021-01-05 | Center Of Human-Centered Interaction For Coexistence | Method and apparatus for motion capture interface using multiple fingers |
US11325047B2 (en) | 2019-02-19 | 2022-05-10 | Sony Interactive Entertainment Inc. | Apparatus, system and method of authentication |
GB2581494B (en) * | 2019-02-19 | 2021-08-04 | Sony Interactive Entertainment Inc | Apparatus, system and method of authentication |
EP3698858A1 (en) * | 2019-02-19 | 2020-08-26 | Sony Interactive Entertainment Inc. | Apparatus, system and method of authentication |
GB2581494A (en) * | 2019-02-19 | 2020-08-26 | Sony Interactive Entertainment Inc | Apparatus, system and method of authentication |
US11606353B2 (en) | 2021-07-22 | 2023-03-14 | Biocatch Ltd. | System, device, and method of generating and utilizing one-time passwords |
EP4293543A1 (en) * | 2022-06-17 | 2023-12-20 | NOS Inovação, S.A. | System for identifying a user of an electronic device |
Also Published As
Publication number | Publication date |
---|---|
JP2011523730A (en) | 2011-08-18 |
WO2009131987A3 (en) | 2009-12-30 |
CN102016765A (en) | 2011-04-13 |
EP2255270A2 (en) | 2010-12-01 |
US20090262073A1 (en) | 2009-10-22 |
US8031175B2 (en) | 2011-10-04 |
WO2009131987A2 (en) | 2009-10-29 |
Similar Documents
Publication | Title |
---|---|
US20110043475A1 (en) | Method and system of identifying a user of a handheld device |
Liu et al. | uWave: Accelerometer-based personalized gesture recognition and its applications | |
Wu et al. | Orientation independent activity/gesture recognition using wearable motion sensors | |
US9575570B2 (en) | 3D pointing devices and methods | |
CN107643828B (en) | Vehicle and method of controlling vehicle | |
US9122456B2 (en) | Enhanced detachable sensory-interface device for a wireless personal communication device and method | |
JP4481663B2 (en) | Motion recognition device, motion recognition method, device control device, and computer program | |
KR100580647B1 (en) | Motion-based input device being able to classify input modes and method therefor | |
KR102143574B1 (en) | Method and apparatus for online signature verification using proximity touch |
US9336374B2 (en) | Method, module, and computer program product for identifying user of mobile device | |
CN107004124B (en) | Method and apparatus for recognizing user using bio-signal | |
CN108196668B (en) | Portable gesture recognition system and method | |
US9760758B2 (en) | Determining which hand is being used to operate a device using a fingerprint sensor | |
CN105843500A (en) | Electronic device with fingerprint sensor operating in vector mode | |
Cenedese et al. | Home automation oriented gesture classification from inertial measurements | |
KR20180027502A (en) | How to use the capacitance to detect touch pressure | |
US20160357301A1 (en) | Method and system for performing an action based on number of hover events | |
CN107533371A (en) | Controlled using the user interface for influencing gesture |
Le et al. | The Internet-of-Things based hand gestures using wearable sensors for human machine interaction | |
Guna et al. | Intuitive gesture based user identification system | |
CN111684762B (en) | Terminal device management method and terminal device | |
CN111145891A (en) | Information processing method and device and electronic equipment | |
KR20140112316A (en) | Control apparatus method of smart device using motion recognition |
Yamagishi et al. | A system for controlling personal computers by hand gestures using a wireless sensor device | |
US20220138625A1 (en) | Information processing apparatus, information processing method, and program |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
STCB | Information on status: application discontinuation |
Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |