US20140085220A1 - Determining a dominant hand of a user of a computing device
Determining a dominant hand of a user of a computing device
- Publication number
- US20140085220A1 (Application US13/658,632)
- Authority
- US
- United States
- Prior art keywords
- user
- computing device
- dominant hand
- input
- display
- Prior art date
- Legal status
- Granted
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
- G06F3/0482—Interaction with lists of selectable items, e.g. menus
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F3/04886—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/48—Other medical applications
Definitions
- Computing devices provide users with the ability to interact with processes and data using input and output devices. For example, a user may provide a user input to a computing device using a presence-sensitive display that displays a graphical user interface (GUI). The user input may cause the computing device to modify the execution of a process and/or data. Such processes may provide a user with the ability to access the Internet, play games, and play videos and music, as well as providing other various types of functionality.
- the computing device may be a mobile computing device, such as a mobile phone (e.g., a smartphone) or tablet computer that the user may hold in his or her hand.
- a user may hold a mobile computing device in the user's right hand, and may provide user input gestures at a presence-sensitive display of the mobile computing device using the left hand of the user.
- Advancements in computing devices have enabled such devices to provide users with richer user experiences that include increasingly complex graphical user interfaces.
- a method includes determining, by a computing device, a plurality of features. Each feature from the plurality of features may be usable to determine a dominant hand of a user of the computing device. The method also includes receiving, by the computing device, a plurality of input values, each input value from the plurality of input values corresponding to a respective feature from the plurality of features, and determining, using a probabilistic model and based at least in part on at least one input value from the plurality of input values corresponding to the respective feature from the plurality of features, a hand of the user as a dominant hand of the user. The method also includes generating, based at least in part on the determined dominant hand of the user, a graphical user interface for display at a presence-sensitive display operatively coupled to the computing device.
- a computer-readable storage medium is encoded with instructions that, when executed, cause one or more processors of a computing device to perform operations including determining a plurality of features. Each feature from the plurality of features may be usable to determine a dominant hand of a user of the computing device.
- the computer-readable storage medium may be further encoded with instructions that, when executed, cause the one or more processors to perform operations including receiving a plurality of input values, each input value from the plurality of input values corresponding to a respective feature from the plurality of features, determining, using a probabilistic model and based at least in part on at least one input value from the plurality of input values corresponding to the respective feature from the plurality of features, a hand of the user as a dominant hand of the user, and generating, based at least in part on the determined dominant hand of the user, a graphical user interface for display at a presence-sensitive display operatively coupled to the computing device.
- a computing device includes one or more processors, a presence-sensitive display that is operatively coupled to the computing device, and one or more sensors.
- the one or more processors may be configured to determine a plurality of features. Each feature from the plurality of features may be usable to determine a dominant hand of a user of the computing device.
- the one or more processors may be further configured to receive, from the one or more sensors, a plurality of input values, each input value from the plurality of input values corresponding to a respective feature from the plurality of features, and determine, using a probabilistic model and based at least in part on at least one input value from the plurality of input values corresponding to the respective feature from the plurality of features, a hand of the user as a dominant hand of the user.
- the one or more processors may be further configured to generate, based at least in part on the determined dominant hand of the user, a graphical user interface for display at the presence-sensitive display.
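- The overall flow described above (determine usable features, collect corresponding sensor input values, classify a dominant hand with a probabilistic model, then generate a handedness-aware GUI) can be summarized in a minimal sketch. The helper names below (`read_sensor_values`, `classify_dominant_hand`, `build_gui`) and the feature list are illustrative assumptions, not elements defined by this disclosure.

```python
# Minimal sketch of the described pipeline; all names and values are illustrative.

FEATURES = [
    "orientation_angle",      # physical orientation, e.g., from a gyroscope
    "net_lateral_motion",     # acceleration profile, e.g., from an accelerometer
    "touch_region_bias",      # user inputs detected at a presence-sensitive display
]

def read_sensor_values(features):
    """Hypothetical stand-in for polling the device sensors; one value per feature."""
    return {feature: 0.0 for feature in features}

def classify_dominant_hand(input_values):
    """Hypothetical stand-in for the probabilistic model (e.g., a Bayesian network)."""
    return "right" if input_values.get("touch_region_bias", 0.0) >= 0.0 else "left"

def build_gui(dominant_hand):
    """Return a GUI description arranged in a dominant-hand visual configuration."""
    return {"layout": f"{dominant_hand}-handed", "elements": ["compose", "send", "archive"]}

if __name__ == "__main__":
    values = read_sensor_values(FEATURES)
    print(build_gui(classify_dominant_hand(values)))
```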
- FIG. 1A is a conceptual diagram illustrating an example computing device that may be used to determine a dominant hand of a user and generate a graphical user interface based at least in part on the determined dominant hand, in accordance with one or more aspects of this disclosure.
- FIG. 1B is a conceptual diagram illustrating an example of the computing device of FIG. 1A that may be used to determine a dominant hand of a user and generate a graphical user interface based at least in part on the determined dominant hand, in accordance with one or more aspects of this disclosure.
- FIG. 2 is a block diagram illustrating further details of one example of a computing device shown in FIGS. 1A and 1B , in accordance with one or more aspects of this disclosure.
- FIGS. 3A and 3B are conceptual diagrams illustrating an example computing device that may be used to determine a dominant hand of a user and generate a graphical user interface based at least in part on the determined dominant hand, in accordance with one or more aspects of this disclosure.
- FIG. 4 is a flow diagram illustrating example operations of a computing device to determine a dominant hand of a user and output a graphical user interface based at least in part on the determined dominant hand, in accordance with one or more aspects of this disclosure.
- FIG. 5 is a flow diagram illustrating example operations of a computing device to determine a dominant hand of a user and output a graphical user interface based at least in part on the determined dominant hand, in accordance with one or more aspects of this disclosure.
- a computing device may output a graphical user interface (GUI) at a presence-sensitive display.
- the presence-sensitive display (e.g., a touch-sensitive screen) may enable a user to interact with graphical elements of the GUI by detecting user inputs in the form of gestures performed at or near the presence-sensitive display. For instance, a user may provide a touch gesture to select a graphical button control of the GUI.
- Advancements in computing devices have enabled such devices to provide increasingly complex GUIs.
- presence-sensitive displays, such as those associated with mobile computing devices, may provide relatively small interaction surfaces with which to display a GUI and receive user input gestures.
- the combination of increasingly complex GUIs and the limited space provided by many presence-sensitive displays may increase the difficulty for a user to provide user input gestures to interact with the computing device.
- users may typically be more accurate and quicker when providing such gestures using a dominant hand of the user than when using a non-dominant hand of the user.
- a computing device may determine a dominant hand of the user. For instance, the computing device may receive a plurality of input values (e.g., acceleration information from an accelerometer of the computing device, physical orientation information from a gyroscope of the computing device, visual information from an image sensor of the computing device, etc.), each input from the plurality of inputs corresponding to a respective feature from a plurality of features that are usable to determine the dominant hand of the user.
- Such features may include, but are not limited to, acceleration information of a computing device, physical orientation of the computing device, visual information associated with the computing device, one or more user inputs detected at a presence-sensitive and/or touch-sensitive display device operatively coupled to the computing device, and the like.
- the computing device may use a probabilistic model, such as a Bayesian network, to determine the dominant hand of the user based at least in part on the plurality of input values. For instance, the computing device may compare the received input values to corresponding baseline values determined with respect to known right-handed and/or left-handed users.
- the computing device may generate, based at least in part on the determined dominant hand of the user, a GUI for display in a dominant hand visual configuration.
- the computing device may determine that a left hand of a user is the dominant hand of the user.
- the computing device may generate a GUI in a dominant hand visual configuration that includes, in one example, graphical elements (e.g., one or more graphical button controls) positioned along a radius that follows a typical arc of a left thumb of a user holding a mobile computing device in the left hand of the user (e.g., a left-handed visual configuration).
- the computing device may determine, based at least in part on at least one input value from the plurality of received input values, that the user is currently holding the computing device with a non-dominant hand of the user.
- the computing device may generate a GUI in a non-dominant hand visual configuration. For instance, the computing device may determine that a left hand of the user is a dominant hand of the user, and that the user is currently holding the computing device in a right hand of the user (i.e., a non-dominant hand of the user in this example). In such an example, the computing device may generate a GUI in a non-dominant hand visual configuration.
- the non-dominant hand visual configuration includes graphical elements (e.g., one or more graphical button controls) positioned along a radius that follows a typical arc of a right thumb of a user holding a computing device in a right hand of the user (e.g., a right-handed visual configuration).
- the non-dominant hand visual configuration may be different than the dominant hand visual configuration with respect to one or more of a size, shape, location, number of graphical elements generated for display, or other properties of the visual configuration.
- a non-dominant hand visual configuration may include fewer, but larger graphical elements to compensate for a tendency of users to be less accurate when providing user input gestures with a non-dominant hand of the user.
- the computing device may promote improved usability by facilitating user selection of graphical elements with the non-dominant hand of the user.
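- One way to realize the size/number trade-off described above is sketched below; the element-count ratio and enlargement factor are illustrative assumptions rather than values taken from this disclosure.

```python
def visual_configuration(controls, using_dominant_hand):
    """Return a simplified layout: fewer, larger targets for the non-dominant hand.

    The 0.6 element-count ratio and 1.5x size factor are assumed for illustration.
    """
    if using_dominant_hand:
        return {"elements": list(controls), "scale": 1.0}
    kept = controls[: max(1, int(len(controls) * 0.6))]  # keep only the leading controls
    return {"elements": kept, "scale": 1.5}              # enlarge the remaining targets

print(visual_configuration(["compose", "send", "archive", "delete", "search"], False))
```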
- FIG. 1A is a conceptual diagram illustrating an example computing device that may be used to determine a dominant hand of a user and generate a graphical user interface based at least in part on the determined dominant hand, in accordance with one or more aspects of this disclosure.
- computing device 2 may include display 4 , one or more sensors 6 , handedness module 8 , and graphical user interface (GUI) module 10 .
- Examples of computing device 2 may include, but are not limited to, portable or mobile devices such as mobile phones (including smartphones), tablet computers, smart television platforms, personal digital assistants (PDAs), and the like.
- computing device 2 may be a mobile phone, such as a smartphone.
- Display 4 may be a liquid crystal display (LCD), e-ink, organic light emitting diode (OLED), or other display.
- Display 4 may present the content of computing device 2 to a user.
- display 4 may display the output of applications executed on one or more processors of computing device 2 , confirmation messages, indications, or other functions that may need to be presented to a user.
- display 4 may provide some or all of the functionality of a user interface of computing device 2 .
- display 4 may be a touch-sensitive and/or presence-sensitive display that can display a GUI and detect input from a user in the form of user input gestures (e.g., touch gestures, swipe gestures, pinch gestures, and the like) using capacitive or inductive detection at or near the presence-sensitive display.
- computing device 2 may include handedness module 8 and GUI module 10 .
- GUI module 10 may perform one or more functions to receive input, such as one or more user input gestures detected at display 4 .
- GUI module 10 may send such input to other components associated with computing device 2 , such as handedness module 8 or other application(s) executing on one or more processors of computing device 2 .
- GUI module 10 may also receive data from components associated with computing device 2 , such as handedness module 8 . Using the data, GUI module 10 may cause components associated with computing device 2 , such as display 4 , to provide output based on the data.
- GUI module 10 may receive data from handedness module 8 that causes GUI module 10 to display a GUI at display 4 to enable a user to interact with computing device 2 .
- GUI module 10 may generate a GUI for display at display 4 that includes one or more graphical elements, such as graphical elements 12 .
- Graphical elements 12 may include any one or more graphical elements to enable a user to provide user input gestures to interact with computing device 2 .
- graphical elements 12 may be graphical button controls, checkbox controls, slider controls, or other types of graphical control elements.
- graphical elements 12 may include one or more graphical button controls to enable a user to provide user input gestures to interact with an email application, at least portions of which execute on one or more processors of computing device 2 .
- graphical elements 12 may include a “compose” graphical button to enable a user to create a new email message, a “send” graphical button to enable a user to send an email message, an “archive” graphical button to enable a user to archive one or more email messages, and the like.
- Other examples of graphical elements 12 are possible, and the non-limiting example above is provided only for purposes of discussion.
- Graphical elements 12 may be the same or different types of graphical elements. For instance, in some examples, at least one of graphical elements 12 may be a graphical button control and at least one of graphical elements 12 may be a graphical checkbox control. In certain examples, each of graphical elements 12 may be the same type of graphical elements, such as when each of graphical elements 12 is a graphical button control.
- GUI module 10 may generate a GUI for display at display 4 in various visual configurations. For instance, GUI module 10 may generate a GUI for display in a right-handed visual configuration. In certain examples, GUI module 10 may generate a GUI for display in a left-handed visual configuration that is different from the right-handed visual configuration. In some examples, GUI module 10 may generate a GUI for display in a dominant hand visual configuration that is different from a non-dominant hand visual configuration.
- the dominant hand visual configuration may be either a right-handed visual configuration or a left-handed visual configuration.
- the non-dominant hand visual configuration may be either a right-handed visual configuration or a left-handed visual configuration.
- GUI module 10 may, in some examples, generate a GUI for display at a display device operatively coupled to computing device 2 (e.g., display 4 ) based at least in part on a determination by computing device 2 of a dominant hand of a user interacting with computing device 2 .
- GUI module 10 may receive data from handedness module 8 indicating a dominant hand of a user.
- GUI module 10 may generate the GUI for display at display 4 based at least in part on the data received from handedness module 8 indicating the dominant hand of the user.
- Handedness module 8 may determine a plurality of features, each of which is usable to determine a dominant hand of a user. Examples of such features include, but are not limited to, a physical orientation of computing device 2 , acceleration information of computing device 2 , indications of one or more user inputs detected at display 4 (e.g., a presence-sensitive and/or touch-sensitive display), visual information of an image sensor (e.g., a camera device) of computing device 2 , and the like.
- Physical orientation information of computing device 2 may be usable to determine a dominant hand of a user.
- computing device 2 may be a mobile computing device such as a mobile phone or tablet computer.
- a user, such as user 3, may hold computing device 2 in one hand; in this example, user 3 holds computing device 2 in a right hand of user 3.
- user 3 may hold computing device 2 against the side of his or her head while using computing device 2 for telephonic communications.
- a right-handed user (i.e., a user whose right hand is dominant over a non-dominant left hand) may typically hold computing device 2 against a right side of his or her head, while a left-handed user (i.e., a user whose left hand is dominant over a non-dominant right hand) may typically hold computing device 2 against a left side of his or her head.
- physical orientation information of computing device 2 while computing device 2 is being used for telephonic communications may be usable to determine a dominant hand of a user.
- physical orientation information indicating that computing device 2 is held against a right side of a head of a user may indicate that a right hand of the user is a dominant hand of the user.
- Physical orientation information indicating that computing device 2 is held against a left side of a head of a user may indicate that a left hand of the user is a dominant hand of the user.
- user 3 may hold computing device 2 against right ear 14 (i.e., a right ear of user 3 ) and right cheek 16 (i.e., a right cheek of user 3 ) while using computing device 2 for telephonic communications.
- user 3 may hold computing device 2 against left ear 18 (i.e., a left ear of user 3 ) and left cheek 20 (i.e., a left cheek of user 3 ) while using computing device 2 for telephonic communications.
- a physical orientation of computing device 2 while the user is holding computing device 2 against the side of his or her head may typically differ depending upon whether computing device 2 is being held against right ear 14 and right cheek 16 or whether computing device 2 is being held against left ear 18 and left cheek 20 . That is, due in part to typical anatomical features of the human head, an angle of a physical orientation of computing device 2 with respect to the ground while computing device 2 is held against right ear 14 and right cheek 16 may be substantially opposite an angle of a physical orientation of computing device 2 with respect to the ground while computing device 2 is held against left ear 18 and left cheek 20 .
- physical orientation information of computing device 2 when computing device 2 detects one or more user input gestures at or near display 4 may be usable to determine a dominant hand of user 3 .
- a user may hold a mobile computing device in a dominant hand of the user while providing user input gestures with a thumb of the dominant hand of the user.
- a user may hold a mobile computing device in a non-dominant hand of the user while providing user input gestures with a dominant hand of the user (e.g., with a finger of the dominant hand, or other input unit, such as a pen, stylus, etc. held in the dominant hand of the user).
- a user While holding the mobile computing device in one hand, a user may typically hold the mobile computing device at a slight angle toward the opposite side of the user. For instance, a user holding a mobile computing device in a left hand of the user and providing user input gestures with a right hand of the user or a left thumb of the user may typically hold the mobile computing device such that a presence-sensitive display of the mobile computing device is angled toward the right side of the user. Similarly, a user holding a mobile computing device in a right hand of the user and providing user input gestures with a left hand of the user or a right thumb of the user may typically hold the mobile computing device such that a presence-sensitive display of the mobile computing device is angled toward the left side of the user. As such, physical orientation information of computing device 2 while computing device 2 detects one or more user input gestures (e.g., touch gestures, swipe gestures, pinch gestures, etc.) may be usable to determine a dominant hand of the user.
- visual information from an image sensor of computing device 2 may be usable to determine a dominant hand of user 3 .
- a right-handed user may typically hold a mobile computing device in a right hand of the user against the right side of his or her head while using the mobile computing device for telephonic communications.
- a left-handed user may typically hold a mobile computing device in a left hand of the user against the left side of his or her head while using the mobile computing device for telephonic communications.
- visual information indicating that computing device 2 is held against a right side of a user's head may indicate that a right hand of the user is a dominant hand of the user.
- Visual information indicating that computing device 2 is held against a left side of a user's head may indicate that a left hand of the user is a dominant hand of the user.
- Such visual information may represent an anatomical feature of the user's head.
- the anatomical feature may include at least a portion of the side of the user's head.
- the anatomical feature may include at least a portion of an ear of the user.
- the visual information may include at least a portion of right ear 14 or left ear 18 . Because at least the outer edge of right ear 14 curves in an opposite direction to that of left ear 18 , visual information representing a portion of right ear 14 or left ear 18 may be usable to determine whether computing device 2 is held against a right side or a left side of the head of user 3 .
- the visual information may be usable to determine a dominant hand of the user.
- Acceleration information of computing device 2 may be usable to determine a dominant hand of user 3 .
- a user may typically hold the mobile computing device in a dominant hand of the user against a dominant side of the user's head.
- an acceleration profile resulting from the motion of the mobile computing device as the user moves the mobile computing device to the side of the user's head may differ depending upon whether the user moves the mobile computing device to the right side of the user's head or whether the user moves the mobile computing device to the left side of the user's head.
- user 3 may move computing device 2 to the right side of the head of user 3 in motion 22 .
- the acceleration profile of computing device 2 defined by motion 22 may typically differ from an acceleration profile defined by a similar motion (not illustrated) in which user 3 moves computing device 2 to the left side of the head of user 3 .
- the user when moving computing device 2 from a user's pocket to the right side of the head of the user (e.g., to right ear 14 and right cheek 16 of user 3 ), the user may move computing device 2 along a path that arcs first toward the middle of the user's body then toward the right side of the user's body.
- the user may move computing device 2 along a path that arcs first toward the middle of the user's body then toward the left side of the user's body.
- acceleration information of computing device 2 may be usable to determine a dominant hand of the user.
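- A crude way to distinguish the two arcs described above is to double-integrate the lateral (left/right) component of the acceleration over the lift-to-ear motion and inspect the sign of the net displacement. The axis convention (positive x toward the user's right), the sample rate, and the toy acceleration profile are assumptions for illustration.

```python
def net_lateral_displacement(lateral_accel_samples, dt=0.02):
    """Double-integrate lateral acceleration (m/s^2) to estimate net left/right motion."""
    velocity = 0.0
    displacement = 0.0
    for ax in lateral_accel_samples:
        velocity += ax * dt
        displacement += velocity * dt
    return displacement

# Toy profile: the device first swings toward the body's midline, then out to the right.
motion = [-0.5] * 20 + [1.5] * 40
side = "right" if net_lateral_displacement(motion) > 0 else "left"
print(f"device appears to have moved toward the user's {side} ear")
```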
- One or more user inputs detected at or near display 4 may be usable to determine a dominant hand of user 3 .
- handedness module 8 may determine that a user input detected at or near display 4 indicates that display 4 is in contact with a cheek of user 3 .
- handedness module 8 may compare an area of display 4 that detects the presence of an input unit to a threshold value.
- Handedness module 8 may determine that the detected user input indicates that display 4 is in contact with a cheek of user 3 when the area of display 4 that detects the presence of an input unit is greater than the threshold value.
- the threshold value may be a percentage of the total area of display 4 , such as twenty-five percent, thirty-five percent, fifty percent, or other percentages of the total area of display 4 . In certain examples, the threshold value may be user configurable.
- the user input detected at or near display 4 indicating that display 4 is in contact with a cheek of user 3 may be usable to determine a dominant hand of user 3 .
- user 3 may hold computing device 2 to right ear 14 and right cheek 16 when using computing device 2 for telephonic communications.
- display 4 may detect a user input indicating that right cheek 16 is in contact with display 4 .
- user 3 may hold computing device 2 to left ear 18 and left cheek 20 when using computing device 2 for telephonic communications.
- display 4 may detect a user input indicating that left cheek 20 is in contact with display 4 .
- the profile of an area of display 4 that is in contact with right cheek 16 may typically differ from a profile of an area of display 4 that is in contact with left cheek 20 .
- a profile of an area of display 4 that is in contact with right cheek 16 may include an upper-left region of display 4 but not a lower-right region of display 4 .
- the upper-left region and lower-right regions of display 4 may be considered upper-left and lower-right regions from the perspective of a user viewing display 4 . That is, when user 3 holds computing device 2 to right ear 14 and right cheek 16 (e.g., when using computing device 2 for telephonic communications), display 4 may detect a user input at an upper-left region of display 4 .
- display 4 may typically not detect a user input at a lower-right region of display 4 .
- display 4 may detect a user input at an upper-right region of display 4 , but may not detect a user input at a lower-left region of display 4 .
- Handedness module 8 may analyze the touch region of the received user input at display 4 , and may determine that user 3 may be holding computing device 2 to right ear 14 and right cheek 16 when an area of display 4 that is in contact with an input unit is greater than a threshold value (e.g., indicating a cheek-press user input) and when a region of display 4 that detects the user input includes upper-left region of display 4 but does not include a lower-right region of display 4 . Handedness module 8 may determine that such a detected user input at display 4 indicates that a right hand of the user may be a dominant hand of the user.
- handedness module 8 may determine that user 3 may be holding computing device 2 to left ear 18 and left cheek 20 when an area of display 4 that is in contact with an input unit is greater than a threshold value (e.g., indicating a cheek-press user input) and when a region of display 4 that detects the user input includes an upper-right region of display 4 but does not include a lower-left region of display 4 . Handedness module 8 may determine that such a detected user input indicates that a left hand of the user may be a dominant hand of the user.
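- The cheek-press heuristic described above reduces to two checks: the contact area must exceed a threshold fraction of the display, and the contact must include one upper corner region while excluding the diagonally opposite lower corner. The 35% threshold, the grid representation, and the simple half-display regions are illustrative assumptions.

```python
def classify_cheek_press(contact_cells, width, height, area_threshold=0.35):
    """Guess which cheek is pressing the display; returns "right", "left", or None.

    contact_cells: set of (x, y) display cells currently detecting contact.
    """
    if len(contact_cells) / float(width * height) <= area_threshold:
        return None  # contact area too small to look like a cheek press

    def touches(horizontal, vertical):
        for x, y in contact_cells:
            left, top = x < width / 2, y < height / 2
            if (left if horizontal == "left" else not left) and \
               (top if vertical == "top" else not top):
                return True
        return False

    if touches("left", "top") and not touches("right", "bottom"):
        return "right"  # consistent with the display held against the right cheek
    if touches("right", "top") and not touches("left", "bottom"):
        return "left"   # consistent with the display held against the left cheek
    return None

# Toy example on a 4x8 grid: the entire upper half of the display is in contact.
print(classify_cheek_press({(x, y) for x in range(4) for y in range(4)}, width=4, height=8))
```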
- a frequency at which user inputs are detected at a portion of display 4 may be usable to determine a dominant hand of a user.
- a user may typically hold a mobile computing device in a dominant hand of a user and provide user input gestures (e.g., touch gestures, swipe gestures, etc.) with a thumb of the dominant hand of the user.
- a frequency at which user input gestures are detected at a portion of the presence-sensitive display corresponding to the dominant hand of the user may be greater than a frequency at which user input gestures are detected at a portion of the presence-sensitive display corresponding to the non-dominant hand of the user. That is, because of the limited reach of the user's thumb, a user may provide a greater proportion of user input gestures at locations of the presence-sensitive display that are closest to the thumb of the hand holding the mobile computing device.
- user 3 may be a right-handed user. As such, user 3 may typically hold computing device 2 in a right hand and provide user input gestures with a thumb of the right hand. Because of the limited reach of the thumb, user 3 may provide user input gestures more frequently at a right portion of display 4 than at a left portion of display 4 . Similarly, in examples when user 3 is a left-handed user, user 3 may typically hold computing device 2 in a left hand and provide user input gestures with a thumb of the left hand. As such, user 3 may provide user input gestures more frequently at a left portion of display 4 than at a right portion of display 4 .
- Handedness module 8 may determine a frequency at which user input gestures are detected at portions of display 4 over time, such as over a time period of one hour, three hours, one day, or other time periods. Handedness module 8 may, in certain examples, determine a histogram (e.g., a “heat map”) representing the frequency at which user input gestures are detected with respect to portions of display 4 . Handedness module 8 may determine that a histogram indicating a greater frequency of received user input gestures at a right portion of display 4 than at a left portion of display 4 may indicate that a right hand of user 3 is a dominant hand of user 3 .
- handedness module 8 may determine that a histogram indicating a greater frequency of received user input gestures at a left portion of display 4 than at a right portion of display 4 may indicate that a left hand of user 3 is a dominant hand of user 3 .
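- A coarse version of the histogram/heat-map idea simply counts gestures landing on the left and right halves of the display over an observation window; the decision margin used below is an assumed value.

```python
from collections import Counter

def infer_hand_from_touch_history(touch_x_positions, display_width, margin=1.2):
    """Compare how often gestures land on the right vs. left half of the display.

    margin: how much more frequent one side must be before a guess is made (assumed).
    """
    counts = Counter("right" if x >= display_width / 2 else "left"
                     for x in touch_x_positions)
    right, left = counts["right"], counts["left"]
    if right > margin * max(left, 1):
        return "right"
    if left > margin * max(right, 1):
        return "left"
    return None  # no clear tendency observed yet

print(infer_hand_from_touch_history([700, 680, 650, 120, 710, 690], display_width=1080))
```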
- a location of display 4 at which user input gestures are detected to select a displayed selectable object may be usable to determine a dominant hand of a user.
- GUI module 10 may cause display 4 to output one or more selectable objects, such as a graphical button, graphical slider control, and the like.
- a user providing a gesture (e.g., a touch gesture) at display 4 with a right hand of the user to select one of the selectable objects may typically provide the gesture at a location of display 4 that is slightly left of the selectable object from the perspective of a user viewing display 4 .
- a user providing such a gesture with a right hand may typically provide the gesture at a location of display 4 that is biased toward the left side of the selectable object.
- a user providing such a gesture with a left hand may typically provide the gesture at a location of display 4 that is slightly right of, or biased toward the right side of the selectable object.
- Handedness module 8 may determine a frequency at which user input gestures to select a selectable object displayed at display 4 are biased toward the left of the selectable object and biased toward the right of the selectable object. Handedness module 8 may determine that a higher frequency of user input gestures biased toward the left of selectable objects indicates that a right hand of the user is a dominant hand of the user. Similarly, handedness module 8 may determine that a higher frequency of user input gestures biased toward the right of selectable objects indicates that a left hand of the user is a dominant hand of the user.
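- The bias heuristic above can be sketched by accumulating the signed horizontal offset between each selection gesture and the center of the target it selected; the minimum sample count before deciding is an illustrative assumption.

```python
def infer_hand_from_tap_bias(taps, min_samples=20):
    """Infer handedness from where taps land relative to selected targets.

    taps: list of (tap_x, target_center_x) pairs. A predominance of taps left of the
    target centers suggests a right-handed user, and vice versa.
    """
    if len(taps) < min_samples:
        return None
    left_biased = sum(1 for tap_x, center_x in taps if tap_x < center_x)
    right_biased = len(taps) - left_biased
    if left_biased > right_biased:
        return "right"
    if right_biased > left_biased:
        return "left"
    return None

print(infer_hand_from_tap_bias([(100, 110)] * 25))  # consistently left-biased taps
```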
- the speed and accuracy with which user input gestures are detected to select a plurality of selectable objects may be usable to determine a dominant hand of the user. For instance, a user may typically be quicker and more accurate when providing user input gestures with a dominant hand of the user than with a non-dominant hand of the user.
- handedness module 8 may cause GUI module 10 to output a GUI to determine the speed and accuracy with which a user provides user input gestures to select a plurality of selectable objects using a left hand of the user and using a right hand of a user.
- handedness module 8 may cause GUI module 10 to output a GUI that includes a series of selectable objects sequentially output at display 4 over a period of time.
- the series of selectable objects may be displayed in succession at varying (e.g., random) locations of display 4 , each of the selectable objects output for a threshold amount of time.
- the selectable objects may be displayed as “bubbles,” each of the bubbles output at display 4 for a threshold amount of time, such as one second, five hundred milliseconds, two hundred and fifty milliseconds, or other threshold amounts of time.
- the GUI may first request the user to select the objects with a right hand of the user, then to select the objects with a left hand of the user.
- Handedness module 8 may determine characteristics of received user input gestures to select the selectable objects with respect to each of the right and left hands of the user. For instance, handedness module 8 may determine the number of objects successfully selected with each hand, an average time between when a selectable object is displayed and a user input gesture is received to select the object with respect to each hand of the user, or other characteristics.
- the determined characteristics of the detected user input gestures with respect to each of the left and right hands of the user may be usable to determine the dominant hand of the user. For instance, handedness module 8 may determine that a greater number of successfully selected objects with a right hand than with a left hand may indicate that a right hand of the user is a dominant hand of the user. Conversely, handedness module 8 may determine that a greater number of successfully selected objects with a left hand than with a right hand may indicate that a left hand of the user is a dominant hand of the user. Similarly, handedness module 8 may determine that a lower average time to select the selectable objects with a right hand than with a left hand may indicate that a right hand is a dominant hand of the user. Handedness module 8 may determine that a lower average time to select the selectable objects with a left hand than with a right hand may indicate that a left hand is a dominant hand of the user.
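- The per-hand characteristics from such a test can be reduced to a comparison of hit counts and mean selection latency; breaking ties on accuracy first and speed second is an assumed ordering, and the trial data are illustrative.

```python
def dominant_hand_from_test(results):
    """Pick the hand with more successful selections, breaking ties by lower mean latency.

    results: {"left": [(hit, latency_s), ...], "right": [(hit, latency_s), ...]}
    """
    def summarize(trials):
        latencies = [latency for hit, latency in trials if hit]
        mean_latency = sum(latencies) / len(latencies) if latencies else float("inf")
        return len(latencies), mean_latency

    right_hits, right_latency = summarize(results["right"])
    left_hits, left_latency = summarize(results["left"])
    if right_hits != left_hits:
        return "right" if right_hits > left_hits else "left"
    return "right" if right_latency < left_latency else "left"

print(dominant_hand_from_test({
    "right": [(True, 0.41), (True, 0.38), (True, 0.45), (False, 1.00)],
    "left":  [(True, 0.62), (False, 1.00), (True, 0.71), (False, 1.00)],
}))
```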
- computing device 2 may determine a plurality of features, each of which is usable to determine a dominant hand of the user.
- Computing device 2 may receive a plurality of input values, each input value from the plurality of input values corresponding to a respective feature from the plurality of features.
- computing device 2 may include one or more sensors 6 .
- sensors 6 include, but are not limited to, accelerometers, gyroscopes, magnetometers, audio input devices (e.g., a microphone), image sensors (e.g., an image sensor associated with a camera device of computing device 2 ), and proximity sensors.
- Computing device 2 may receive a plurality of inputs from one or more sensors 6 , such as acceleration information from one or more accelerometers of computing device 2 , physical orientation information from one or more gyroscopes of computing device 2 , visual information from one or more image sensors of computing device 2 , audio information from one or more audio input devices of computing device 2 , physical orientation information from one or more magnetometers of computing device 2 , and information from one or more proximity sensors of computing device 2 indicating physical proximity of computing device 2 to another object.
- Handedness module 8 may determine, using a probabilistic model and based at least in part on at least one input value from the plurality of input values corresponding to the respective feature from the plurality of features, a hand of the user as a dominant hand of the user.
- Examples of a probabilistic model include machine learning models such as Bayesian networks, artificial neural networks, and support vector machines, as well as other probabilistic models.
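- As one concrete stand-in for such a model, a Gaussian naive Bayes classifier over per-feature baseline statistics could combine several input values into a single left/right decision. The feature names, means, standard deviations, and prior probabilities below are invented for illustration and are not taken from this disclosure.

```python
import math

# Illustrative per-hand baseline statistics: feature -> (mean, standard deviation).
BASELINES = {
    "right": {"orientation_deg": (-30.0, 10.0), "touch_bias": (0.6, 0.2)},
    "left":  {"orientation_deg": (30.0, 10.0),  "touch_bias": (-0.6, 0.2)},
}

def log_gaussian(x, mean, std):
    return -0.5 * math.log(2 * math.pi * std ** 2) - ((x - mean) ** 2) / (2 * std ** 2)

def classify(observation, priors=None):
    """Naive Bayes: sum per-feature log-likelihoods plus a log prior for each hand."""
    priors = priors or {"right": 0.9, "left": 0.1}  # assumed population prior
    scores = {}
    for hand, stats in BASELINES.items():
        score = math.log(priors[hand])
        for feature, value in observation.items():
            mean, std = stats[feature]
            score += log_gaussian(value, mean, std)
        scores[hand] = score
    return max(scores, key=scores.get)

print(classify({"orientation_deg": -25.0, "touch_bias": 0.4}))
```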
- handedness module 8 may compare input values determined from one or more sensors 6 to corresponding baseline values determined with respect to known right-handed and/or left-handed users.
- input values corresponding to one or more features from the plurality of features may be determined with respect to known right-handed and/or left-handed users.
- the input values determined with respect to the known right-handed and/or left-handed users may be used to determine one or more baseline values, each baseline value corresponding to a respective feature from the plurality of features.
- the baseline values may serve as a basis for comparison against which handedness module 8 may compare received inputs from one or more sensors 6 using the probabilistic model.
- handedness module 8 may determine a feature vector including the plurality of features, each of which is usable to determine a dominant hand of the user. Handedness module 8 may compare an input vector including a plurality of inputs determined from one or more sensors 6 to the feature vector including the baseline values. Handedness module 8 may determine a dominant hand of the user based at least in part on the comparison.
- known right-handed and/or known left-handed users may be asked to use a computing device, such as computing device 2 , in various ways. For instance, such users may be asked to interact with a GUI output at a presence-sensitive display operatively coupled to the computing device (e.g., display 4 ), such as by providing user input gestures to enter textual strings, select various selectable objects (e.g., button controls), and the like. Similarly, such users may be asked to use the computing device for telephonic communications, such as by moving the computing device to either the left or right side of the user's head. For each of the known right-handed and left-handed users, input values corresponding to a plurality of features of the feature vector may be recorded.
- physical orientation information received from a gyroscope of the computing device may be recorded.
- acceleration information received from one or more accelerometers of the computing device may be recorded.
- visual information received from one or more image sensors of the computing device (e.g., visual information including a visual representation of an anatomical feature of the user's head, such as an ear of the user) may be recorded.
- information associated with one or more user inputs detected at a presence-sensitive and/or touch-sensitive display device operatively coupled to the computing device may be determined, such as a frequency at which user input gestures are received at one or more portions of the display, a portion of the display that detects a user input indicating that the display is in contact with a cheek of the user, and the like.
- a baseline feature vector (e.g., a stereotype) may be established using the recorded inputs.
- the baseline feature vector may be determined using an average of the recorded inputs associated with each feature, a weighted average of the recorded inputs, or other central tendency techniques to establish a baseline feature vector for at least one of a stereotypical right-handed and left-handed user.
- handedness module 8 may compare a plurality of received inputs corresponding to the plurality of features of the baseline feature vector to determine a dominant hand of the user. For instance, in certain examples, handedness module 8 may determine an angle in n-dimensional space between the n-dimensional baseline feature vector and n-dimensional input feature vector, where “n” represents the number of distinct features of each of the two feature vectors. Therefore, in some examples, rather than determine the dominant hand of the user based upon only one type of input (e.g., only one of physical orientation information, acceleration information, or visual information associated with the computing device), computing device 2 may determine the dominant hand of the user based at least in part on the plurality of input values corresponding to the plurality of features.
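- A minimal sketch of that comparison, assuming baseline feature vectors are built by averaging ground-truth recordings for known right-handed and left-handed users and the observation is assigned to the stereotype at the smaller n-dimensional angle; all numeric values are illustrative.

```python
import math

def average_vector(recordings):
    """Element-wise mean of recorded feature vectors (one stereotype per handedness)."""
    return [sum(values) / len(recordings) for values in zip(*recordings)]

def angle_between(u, v):
    """Angle in radians between two n-dimensional feature vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norms = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
    return math.acos(max(-1.0, min(1.0, dot / norms)))

# Illustrative ground-truth recordings: [orientation_deg, touch_bias, cheek_side].
right_baseline = average_vector([[-32.0, 0.7, 1.0], [-28.0, 0.5, 1.0]])
left_baseline = average_vector([[31.0, -0.6, -1.0], [29.0, -0.7, -1.0]])

observation = [-20.0, 0.3, 1.0]
hand = ("right" if angle_between(observation, right_baseline)
        < angle_between(observation, left_baseline) else "left")
print(hand)
```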
- techniques of this disclosure may increase the certainty with which a computing device may determine the dominant hand of a user based upon input values determined from one or more sensors of the computing device. Moreover, because the baseline values may be adjusted based upon further observations with respect to the user or other known right-handed or left-handed users, the techniques may enable the computing device to further increase the certainty with which the computing device may determine the dominant hand of any particular user.
- GUI module 10 may generate, based at least in part on the determined dominant hand of the user, a GUI for display at a display device operatively coupled to computing device 2 (e.g., display 4 ).
- handedness module 8 may determine that a right hand is a dominant hand of user 3 .
- GUI module 10 may generate a right-handed GUI for display at display 4 .
- GUI module 10 may generate a GUI that includes one or more graphical elements 12 arranged in a right-handed visual configuration.
- the right-handed visual configuration may include a visual layout of graphical elements 12 such that graphical elements 12 are positioned at locations of display 4 along an arc that follows a radius reachable by a right thumb of user 3 . That is, as illustrated in FIG. 1A , GUI module 10 may generate the right-handed GUI such that graphical elements 12 are positioned at a right portion of display 4 along an arc that follows a radius of a typical motion of a right thumb of user 3 as user 3 moves his or her right thumb between a top portion of display 4 and a bottom portion of display 4 .
- handedness module 8 may determine that a left hand is a dominant hand of user 3 .
- GUI module 10 may generate a left-handed GUI for display at display 4 including a left-handed visual configuration that is different from a right-handed visual configuration.
- the left-handed visual configuration may include a visual layout of graphical elements 12 such that graphical elements 12 are positioned at locations of display 4 along an arc that follows a radius reachable by a left thumb of user 3 .
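- The arc-based layouts for the right- and left-handed configurations can be sketched as points on a circle centered near the thumb's pivot at the lower display corner on the holding-hand side. The radius ratio and angular span below are illustrative assumptions about a typical thumb's reach.

```python
import math

def thumb_arc_positions(count, width, height, handedness,
                        radius_ratio=0.75, start_deg=20.0, end_deg=80.0):
    """Place `count` graphical elements along an arc reachable by the holding thumb."""
    radius = radius_ratio * width
    pivot_x = width if handedness == "right" else 0.0
    pivot_y = float(height)
    positions = []
    for i in range(count):
        alpha = math.radians(start_deg + i * (end_deg - start_deg) / max(count - 1, 1))
        dx = radius * math.cos(alpha)
        dy = radius * math.sin(alpha)
        x = pivot_x - dx if handedness == "right" else pivot_x + dx
        y = pivot_y - dy  # screen y grows downward, so subtract to move up the display
        positions.append((round(x), round(y)))
    return positions

print(thumb_arc_positions(4, width=1080, height=1920, handedness="right"))
```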
- techniques of this disclosure may promote usability of computing device 2 by facilitating user selection of graphical elements with a dominant hand of a user.
- FIG. 1B is a conceptual diagram illustrating an example of the computing device of FIG. 1A that may be used to determine a dominant hand of a user and generate a graphical user interface based at least in part on the determined dominant hand, in accordance with one or more aspects of this disclosure.
- user 3 may use computing device 2 for telephonic communications, such as by holding computing device 2 to right ear 14 and right cheek 16 or to left ear 18 and left cheek 20 .
- a physical orientation of computing device 2 may result in angle 24 of the physical orientation of computing device 2 with respect to the ground.
- a physical orientation of computing device 2 may result in angle 26 with respect to the ground.
- angle 24 may be different from angle 26 . That is, angle 24 may be substantially opposite angle 26 .
- Physical orientation information of computing device 2 such as information received from one or more gyroscopes or accelerometers of computing device 2 , may be usable to determine the dominant hand of user 3 . For instance, because user 3 may typically hold computing device 2 in a dominant hand to a dominant side of his or her head when using computing device 2 for telephonic communications, physical orientation information indicating a physical orientation of computing device 2 that is within a threshold value of angle 24 (e.g., a threshold value of one degree, five degrees, ten degrees, or other threshold values) may indicate that a right hand is a dominant hand of user 3 . Similarly, physical orientation information indicating a physical orientation of computing device 2 that is within a threshold value of angle 26 may indicate that a left hand is a dominant hand of user 3 .
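- The angle-threshold test can be sketched directly: compare the orientation measured while the device is held to the user's head against the stereotypical right-side and left-side angles, accepting a match within a configurable tolerance. The angle values of roughly ±40 degrees (standing in for angle 24 and angle 26) and the 10-degree tolerance are illustrative assumptions.

```python
RIGHT_SIDE_ANGLE_DEG = -40.0  # illustrative stand-in for angle 24
LEFT_SIDE_ANGLE_DEG = 40.0    # illustrative stand-in for angle 26

def hand_from_call_orientation(measured_deg, tolerance_deg=10.0):
    """Guess handedness from device orientation while held against the user's head."""
    if abs(measured_deg - RIGHT_SIDE_ANGLE_DEG) <= tolerance_deg:
        return "right"
    if abs(measured_deg - LEFT_SIDE_ANGLE_DEG) <= tolerance_deg:
        return "left"
    return None  # orientation not close enough to either stereotype

print(hand_from_call_orientation(-35.0))
```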
- FIG. 2 is a block diagram illustrating further details of one example of a computing device shown in FIGS. 1A and 1B , in accordance with one or more aspects of this disclosure.
- FIG. 2 illustrates only one particular example of computing device 2 , and many other examples of computing device 2 may be used in other instances.
- computing device 2 includes display 4, user interface 30, one or more processors 32, one or more communication units 34, one or more sensors 6, and one or more storage devices 38. As illustrated, computing device 2 further includes handedness module 8, GUI module 10, and operating system 39 that are executable by computing device 2. Each of components 4, 6, 30, 32, 34, and 38 may be interconnected (physically, communicatively, and/or operatively) for inter-component communications. In some examples, communication channels 36 may include a system bus, network connection, inter-process communication data structure, or any other channel for communicating data. As one example in FIG. 2, components 4, 6, 30, 32, 34, and 38 may be coupled by one or more communication channels 36.
- Handedness module 8, GUI module 10, and operating system 39 may also communicate information with one another as well as with other components of computing device 2.
- One or more processors 32 are configured to implement functionality and/or process instructions for execution within computing device 2 .
- one or more processors 32 may be capable of processing instructions stored at one or more storage devices 38 .
- Examples of one or more processors 32 may include any one or more of a microprocessor, a controller, a digital signal processor (DSP), an application specific integrated circuit (ASIC), a field-programmable gate array (FPGA), or equivalent discrete or integrated logic circuitry.
- One or more storage devices 38 may be configured to store information within computing device 2 during operation.
- One or more storage devices 38 may be described as a computer-readable storage medium.
- one or more storage devices 38 may be a temporary memory, meaning that a primary purpose of one or more storage devices 38 is not long-term storage.
- One or more storage devices 38 may, in some examples, be described as a volatile memory, meaning that one or more storage devices 38 do not maintain stored contents when the computer is turned off. Examples of volatile memories include random access memories (RAM), dynamic random access memories (DRAM), static random access memories (SRAM), and other forms of volatile memories known in the art.
- one or more storage devices 38 may be used to store program instructions for execution by one or more processors 32 .
- One or more storage devices 38 may be used by software or applications running on computing device 2 (e.g., handedness module 8 and/or GUI module 10 ) to temporarily store information during program execution.
- One or more storage devices 38 also include one or more computer-readable storage media.
- One or more storage devices 38 may be configured to store larger amounts of information than volatile memory.
- One or more storage devices 38 may further be configured for long-term storage of information.
- one or more storage devices 38 include non-volatile storage elements. Examples of such non-volatile storage elements include magnetic hard discs, optical discs, floppy discs, flash memories, or forms of electrically programmable memories (EPROM) or electrically erasable and programmable (EEPROM) memories.
- User interface 30 may allow a user of computing device 2 to interact with computing device 2 .
- Examples of user interface 30 may include, but are not limited to, a keypad embedded on computing device 2 , a keyboard, a mouse, a roller ball, buttons, or other devices that allow a user to interact with computing device 2 .
- computing device 2 may not include user interface 30 , and the user may interact with computing device 2 with display 4 (e.g., by providing various user gestures).
- the user may interact with computing device 2 with user interface 30 or display 4 .
- Computing device 2 also includes one or more communication units 34 .
- Computing device 2 utilizes one or more communication units 34 to communicate with external devices via one or more networks, such as one or more wireless networks, one or more cellular networks, or other types of networks.
- One or more communication units 34 may be a network interface card, such as an Ethernet card, an optical transceiver, a radio frequency transceiver, or any other type of device that can send and receive information.
- Other examples of such network interfaces may include Bluetooth, 3G and WiFi radio computing devices as well as Universal Serial Bus (USB).
- computing device 2 utilizes one or more communication units 34 for telephonic communications with an external device.
- Computing device 2 may also include one or more sensors 6 .
- sensors may include, but are not limited to, accelerometers, gyroscopes, magnetometers, audio input devices (e.g., a microphone), image sensors (e.g., an image sensor associated with a camera device of computing device 2 ), and proximity sensors.
- Computing device 2 may receive a plurality of input values from one or more sensors 6 .
- For example, computing device 2 may receive acceleration information from one or more accelerometers, physical orientation information from one or more gyroscopes, physical orientation information from one or more magnetometers (e.g., physical orientation information with respect to the magnetic field of the earth), audio information from one or more audio input devices, visual information from one or more image sensors (e.g., visual information representing an anatomical feature of a user, such as an ear of the user), and proximity information from one or more proximity sensors (e.g., information indicating physical proximity of computing device 2 to another object).
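- For illustration only, a minimal Kotlin sketch of how such sensor readings might be gathered into a single input feature vector; the field names, feature ordering, and units are hypothetical assumptions, not taken from the disclosure.

```kotlin
// Hypothetical input feature vector assembled from sensor readings; the fields
// and their ordering are illustrative, not prescribed by the disclosure.
data class SensorInputs(
    val accelX: Double, val accelY: Double, val accelZ: Double, // accelerometer
    val tiltDegrees: Double,                                    // gyroscope/magnetometer orientation
    val audioLevel: Double,                                     // audio input device
    val earLikelihood: Double,                                  // image sensor (e.g., an ear detection score)
    val proximity: Double                                       // proximity sensor
)

// Flatten the readings into the n-dimensional input feature vector used later.
fun toFeatureVector(s: SensorInputs): DoubleArray = doubleArrayOf(
    s.accelX, s.accelY, s.accelZ, s.tiltDegrees, s.audioLevel, s.earLikelihood, s.proximity
)
```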
- Computing device 2 may include operating system 39 .
- Operating system 39 controls the operation of components of computing device 2 .
- Operating system 39, in one example, facilitates the communication of handedness module 8 and GUI module 10 with one or more processors 32, display 4, user interface 30, one or more communication units 34, and one or more sensors 6, as described in FIGS. 1A and 1B .
- Handedness module 8 may determine a plurality of features, each of which is usable to determine a dominant hand of a user. Handedness module 8 may receive a plurality of input values from one or more sensors 6 corresponding to the respective plurality of features. For example, handedness module 8, executing on one or more processors 32, may receive a plurality of input values from one or more sensors 6 using communication channels 36. Handedness module 8 may determine a dominant hand of the user based at least in part on the plurality of input values corresponding to the plurality of features. In response, GUI module 10 may generate a GUI for display at a display device operatively coupled to computing device 2 (e.g., display 4) based at least in part on the determination of the dominant hand.
- Handedness module 8 may use a probabilistic model, such as a Bayesian network, to determine the dominant hand of the user. For example, for one or more of the plurality of input values determined from information received from one or more sensors 6 , handedness module 8 may determine a difference between the respective input value and a respective baseline value.
- The respective baseline values may be determined, in some examples, using input values received during a ground-truth data collection phase. For instance, information from sensors (e.g., one or more sensors 6) may be collected while known right-handed and left-handed users perform various tasks using computing device 2 or other similar computing devices.
- The baseline values may be used to determine a feature vector that represents stereotypical state information of computing device 2 (e.g., physical orientation information, acceleration information, etc.) during use by known right-handed and/or left-handed users.
- In some examples, the baseline values may be modified based on information received from one or more sensors 6.
- For example, a baseline value corresponding to a feature representing a stereotypical cheek-press input (e.g., a profile of an area of a presence-sensitive display that detects input indicating contact between the presence-sensitive display and a cheek of a user) may be modified based on user input information detected at display 4.
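- As a rough Kotlin sketch of how such ground-truth baselines might be formed, the averaging scheme and names below are assumptions; the disclosure does not prescribe a particular method of deriving baseline feature vectors.

```kotlin
// Hypothetical: average feature vectors recorded from known right-handed and
// left-handed users to obtain per-handedness baseline feature vectors.
data class GroundTruthSample(val features: DoubleArray, val rightHanded: Boolean)

fun baselineFor(samples: List<GroundTruthSample>, rightHanded: Boolean): DoubleArray {
    val selected = samples.filter { it.rightHanded == rightHanded }
    require(selected.isNotEmpty()) { "no ground-truth samples for this handedness" }
    val n = selected.first().features.size
    val mean = DoubleArray(n)
    for (sample in selected) {
        for (i in 0 until n) mean[i] += sample.features[i]
    }
    for (i in 0 until n) mean[i] /= selected.size
    return mean
}
```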
- Handedness module 8 may apply a weighted value associated with the respective feature to the determined difference to generate a weighted difference value.
- While each feature of the feature vector may be usable to determine a dominant hand of a user, certain of the features may provide a stronger indication of the dominant hand than others.
- For example, visual information including a representation of at least a portion of an ear of the user may provide a stronger indication of a dominant hand of a user than acceleration information indicating a motion of computing device 2 to a particular side of a user's head.
- Accordingly, handedness module 8 may apply a greater weighted value (e.g., a coefficient) to a determined difference between visual information input values and a visual information baseline feature than to a determined difference between acceleration information input values and an acceleration information baseline feature.
- In some examples, handedness module 8 may apply weighted values that range between zero and one.
- Handedness module 8 may aggregate the one or more weighted difference values to determine an aggregated weighted difference value. For example, handedness module 8 may determine a distance between an n-dimensional input feature vector and an n-dimensional baseline feature vector, where “n” represents the number of features in each of the input and baseline feature vectors. In certain examples, handedness module 8 may determine a representation of an angle between an n-dimensional input feature vector and an n-dimensional baseline feature vector, such as by determining the cosine of the angle between the two vectors.
- Handedness module 8 may determine the dominant hand of the user based at least in part on the aggregated weighted difference value. For instance, handedness module 8 may compare the aggregated weighted difference value to a threshold value, and may determine the dominant hand of the user based on the comparison. As one example, handedness module 8 may determine that an aggregated weighted difference value that is greater than or equal to a threshold value corresponds to a right-handed user and an aggregated weighted difference value that is less than the threshold value corresponds to a left-handed user.
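- A minimal sketch of this weighted-difference scoring in Kotlin; the function and variable names, the weight and baseline values, and the default threshold are illustrative assumptions. The decision rule mirrors the example mapping above (scores at or above the threshold read as right-handed), and the cosine variant mirrors the angle-based aggregation also mentioned above.

```kotlin
import kotlin.math.sqrt

// Hypothetical aggregation of per-feature weighted differences (inputs, baselines,
// and weights are parallel arrays, one entry per feature).
fun aggregatedWeightedDifference(
    inputs: DoubleArray,
    baselines: DoubleArray,
    weights: DoubleArray
): Double {
    var sum = 0.0
    for (i in inputs.indices) {
        val diff = inputs[i] - baselines[i]  // difference between input value and baseline value
        sum += weights[i] * diff             // apply the per-feature weight and aggregate
    }
    return sum
}

// Alternative aggregation: cosine of the angle between the n-dimensional input
// feature vector and the n-dimensional baseline feature vector.
fun cosineSimilarity(a: DoubleArray, b: DoubleArray): Double {
    var dot = 0.0; var normA = 0.0; var normB = 0.0
    for (i in a.indices) { dot += a[i] * b[i]; normA += a[i] * a[i]; normB += b[i] * b[i] }
    return dot / (sqrt(normA) * sqrt(normB))
}

// Example decision rule from the text: scores at or above the threshold are read
// as right-handed, scores below it as left-handed (the threshold value is assumed).
fun dominantHand(score: Double, threshold: Double = 0.0): String =
    if (score >= threshold) "right" else "left"
```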
- In some examples, handedness module 8 may determine the plurality of features that are usable to determine a dominant hand of a user in response to one or more received inputs corresponding to a particular feature. For instance, handedness module 8 may receive an input value corresponding to a first feature. In response, handedness module 8 may determine one of an active state and an inactive state of a sensor associated with a second feature based at least in part on a criterion that specifies a relationship between the first feature and the second feature. Handedness module 8 may activate the sensor associated with the second feature in response to determining an active state of the sensor. Similarly, handedness module 8 may, in certain examples, deactivate the sensor associated with the second feature in response to determining an inactive state of the sensor.
- As one example, the input value corresponding to the first feature may include an indication of a user input detected at a presence-sensitive display operatively coupled to computing device 2 (e.g., display 4).
- In such an example, the sensor associated with the second feature may include an image sensor of the mobile computing device (e.g., an image sensor associated with a camera device of computing device 2).
- Handedness module 8 may determine at least one of an active and inactive state of the image sensor based at least in part on a determination that a received indication of a user input detected at the presence-sensitive display is indicative of a contact between the presence-sensitive display and at least a portion of a head of a user (e.g., a cheek-press user input). Handedness module 8 may determine an active state of the image sensor when the received indication of the user input indicates a contact between the presence-sensitive display and at least the portion of the head of the user.
- Handedness module 8 may determine an inactive state of the image sensor when the received indication of the user input does not indicate a contact between the presence-sensitive display and at least a portion of the head of the user. As such, in examples where computing device 2 includes a battery to provide electrical power to components of computing device 2, handedness module 8 may help to decrease power consumption of components of computing device 2. That is, rather than require that each sensor of sensors 6 be active during use of computing device 2, handedness module 8 may activate and deactivate at least one of sensors 6 based on received inputs corresponding to a particular feature.
- For instance, handedness module 8 may activate the camera device in response to receiving a user input at display 4 indicating that display 4 is in contact with at least a portion of the user's head. As such, handedness module 8 may conserve battery power by activating the camera device in response to an input indicating that such visual information may likely be available.
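- The activate-on-demand behavior described here might look roughly like the following Kotlin sketch; the GatedSensor interface and the head-contact flag are hypothetical stand-ins rather than APIs from the disclosure.

```kotlin
// Hypothetical gating of a secondary sensor (e.g., an image sensor) based on an
// input value for a first feature (e.g., a cheek-press detected at the display).
interface GatedSensor {
    fun activate()
    fun deactivate()
}

class SensorGate(private val secondarySensor: GatedSensor) {
    fun onDisplayInput(indicatesHeadContact: Boolean) {
        if (indicatesHeadContact) {
            secondarySensor.activate()    // visual information (e.g., an ear) is likely available
        } else {
            secondarySensor.deactivate()  // conserve battery power when it is not
        }
    }
}
```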
- As another example, the input value corresponding to the first feature may include the indication of the user input detected at the presence-sensitive display (e.g., display 4), and the sensor associated with the second feature may include a gyroscope of computing device 2.
- Handedness module 8 may determine the active state of the gyroscope based at least in part on a determination that the received indication of the user input at the presence-sensitive display is indicative of a contact between the presence-sensitive display and at least a portion of the head of the user.
- As such, handedness module 8 may conserve battery power by activating the gyroscope in response to an input indicating that physical orientation information usable to determine the dominant hand of the user is likely available (e.g., a physical orientation of computing device 2 with respect to the ground when computing device 2 is used for telephonic communications).
- As another example, the input value corresponding to the first feature may include an audio input from an audio input device (e.g., a microphone) of computing device 2, and the sensor associated with the second feature may include an accelerometer of computing device 2.
- Handedness module 8 may determine the active state of the accelerometer based at least in part on a determination that a received audio input is indicative of wind noise. For example, a received audio input that is indicative of wind noise may indicate movement of computing device 2 . As such, handedness module 8 may decrease power consumption of accelerometers of computing device 2 by activating the accelerometers in response to determining that the received audio input indicates wind noise, and hence, possible motion of computing device 2 .
- FIGS. 3A and 3B are conceptual diagrams illustrating an example computing device that may be used to determine a dominant hand of a user and generate a graphical user interface based at least in part on the determined dominant hand, in accordance with one or more aspects of this disclosure.
- In the example of FIG. 3A, GUI module 10 generates a GUI for display at display 4 in a right-handed dominant hand visual configuration including graphical elements 40A, 40B, 40C, 40D, and 40E (collectively referred to herein as "graphical elements 40") in visual layout 42.
- In the example of FIG. 3B, GUI module 10 generates a GUI for display at display 4 in a left-handed non-dominant hand visual configuration including graphical elements 44A, 44B, and 44C (collectively referred to herein as "graphical elements 44") in visual layout 46.
- handedness module 8 may determine that a right hand of user 3 is a dominant hand of user 3 .
- GUI module 10 may generate a GUI including graphical elements 40 for display at display 4 based at least in part on the determination of the right hand as the dominant hand of the user.
- GUI module 10 may generate the GUI in a right-handed visual configuration.
- The right-handed visual configuration may include a visual layout of at least one graphical element (e.g., at least one of graphical elements 40).
- GUI module 10 may generate the GUI for display such that graphical elements 40 are positioned at locations of display 4 along an arc that follows a radius reachable by a right thumb of user 3 .
- For example, GUI module 10 may generate the right-handed visual configuration such that graphical elements 40 are positioned at a right portion of display 4 along an arc that follows a radius of a typical motion of a right thumb of user 3 as user 3 moves his or her right thumb between a bottom portion of display 4 and a top portion of display 4.
- In other examples, handedness module 8 may determine that a left hand of user 3 is a dominant hand of user 3.
- GUI module 10 may generate the GUI in a left-handed visual configuration.
- The left-handed visual configuration may include a visual layout of at least one graphical element (e.g., at least one of graphical elements 40).
- GUI module 10 may generate the left-handed GUI for display in a left-handed dominant hand visual configuration such that graphical elements 40 are positioned at locations of display 4 along an arc that follows a radius reachable by a left thumb of user 3 .
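- As a geometric illustration only, the following Kotlin sketch places elements along an arc anchored near the bottom corner on the dominant-hand side; the screen size, radius, and sweep angles are made-up values, not parameters from the disclosure.

```kotlin
import kotlin.math.cos
import kotlin.math.sin

data class Point(val x: Double, val y: Double)

// Hypothetical placement of `count` elements along an arc reachable by the thumb
// of the dominant hand, swept from the bottom portion of the display toward the top.
fun thumbArcPositions(
    count: Int,
    rightHanded: Boolean,
    screenWidth: Double = 1080.0,
    screenHeight: Double = 1920.0,
    radius: Double = 700.0
): List<Point> {
    val cornerX = if (rightHanded) screenWidth else 0.0  // bottom corner on the dominant side
    val cornerY = screenHeight
    return (0 until count).map { i ->
        val theta = Math.toRadians(20.0 + i * 60.0 / maxOf(count - 1, 1))
        val dx = radius * cos(theta)
        val dy = radius * sin(theta)
        Point(if (rightHanded) cornerX - dx else cornerX + dx, cornerY - dy)
    }
}
```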
- The dominant hand visual configuration may be different from a non-dominant hand visual configuration.
- The dominant hand visual configuration may include one or more dominant hand layout properties that specify, for at least one graphical element, a visual layout of the at least one graphical element.
- Similarly, the non-dominant hand visual configuration may include one or more non-dominant hand layout properties that specify, for at least one graphical element, a visual layout of the at least one graphical element.
- The dominant hand visual layout may be different from the non-dominant hand visual layout.
- For example, a right-handed visual configuration may include a right-handed visual layout of at least one of graphical elements 40.
- That is, the right-handed visual configuration may include one or more right-handed layout properties that specify the visual layout of graphical elements 40.
- The right-handed visual layout (e.g., a dominant hand visual layout in this example) may be different from a left-handed visual layout (e.g., a non-dominant hand visual layout in this example).
- For instance, the left-handed visual configuration may include one or more left-handed layout properties that specify a left-handed visual layout of graphical elements 40 that is different than the right-handed visual layout of graphical elements 40.
- For example, the left-handed layout properties may specify a left-handed visual layout of graphical elements 40 such that graphical elements 40 are positioned along an arc that follows a radius reachable by a left thumb of user 3.
- Examples of layout properties may include, but are not limited to, a size of at least one graphical element (e.g., a size of at least one of graphical elements 40), a shape of the at least one graphical element, a display location of the at least one graphical element at a display device (e.g., display 4), and information indicating whether the at least one graphical element is displayed at the display device.
- Each of the dominant hand and non-dominant hand visual configurations may include such visual layout properties.
- However, one or more of the respective visual layout properties associated with each of the dominant hand visual configuration and the non-dominant hand visual configuration may be different, such that a dominant hand visual layout of the dominant hand visual configuration is different than the non-dominant hand visual layout of the non-dominant hand visual configuration.
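- A data-structure sketch in Kotlin of the layout properties just listed; the field names and the string-keyed map are hypothetical choices made only for illustration.

```kotlin
// Hypothetical per-element layout properties: size, shape, display location,
// and whether the element is displayed at all.
data class ElementLayout(
    val widthDp: Int,
    val heightDp: Int,
    val shape: String,        // e.g., "rounded-rect" or "circle"
    val x: Int,
    val y: Int,
    val displayed: Boolean
)

// A visual configuration maps each graphical element (keyed by name here) to its layout.
// The dominant hand and non-dominant hand configurations may differ in any of these properties.
data class VisualConfiguration(val layouts: Map<String, ElementLayout>)
```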
- In certain examples, handedness module 8 may determine that a user (e.g., user 3) is currently holding computing device 2 with a non-dominant hand of the user. For instance, using techniques of this disclosure, handedness module 8 may determine the dominant hand of the user based at least in part on a received plurality of input values corresponding to a respective plurality of features usable to determine a dominant hand of a user. In addition, handedness module 8 may determine, based at least in part on the plurality of input values, that a user is currently holding computing device 2 with a non-dominant hand of the user.
- As one example, handedness module 8 may determine that a right hand of user 3 is a dominant hand of user 3.
- In addition, handedness module 8 may determine that a plurality of input values corresponding to a respective plurality of features usable to determine the dominant hand of the user indicates that the user is currently holding computing device 2 with the non-dominant hand of the user. For instance, handedness module 8 may compare an input feature vector to a baseline feature vector determined with respect to known right-handed and/or left-handed users. Handedness module 8 may determine, in some examples, that user 3 is a right-handed user, and that the input feature vector correlates to a baseline feature vector associated with known left-handed users.
- In such an example, handedness module 8 may determine that user 3 may be currently holding computing device 2 with a non-dominant hand of user 3 (i.e., a left hand of user 3 in this example). Similarly, handedness module 8 may determine that user 3 is a left-handed user, and that the input feature vector correlates to a baseline feature vector associated with known right-handed users. In such examples, handedness module 8 may determine that user 3 may be currently holding computing device 2 with a non-dominant hand of user 3 (i.e., a right hand of user 3 in the current example). Responsive to determining that the user is currently holding computing device 2 with the non-dominant hand of the user, GUI module 10 may generate the GUI for display in a non-dominant hand visual configuration.
- In the example of FIG. 3B, GUI module 10 generates a GUI in a left-handed non-dominant hand visual configuration including graphical elements 44 in visual layout 46.
- Visual layout 46 (i.e., a non-dominant hand visual layout in this example) may be different from visual layout 42 (i.e., a dominant hand visual layout in the example of FIG. 3A).
- For example, at least one of graphical elements 44 may correspond to at least one of graphical elements 40.
- Visual layout 46 may differ from visual layout 42 with respect to at least one of a shape, a size, and a display location of the at least one corresponding graphical element.
- In addition, visual layout 46 may differ from visual layout 42 with respect to whether the at least one corresponding graphical element is displayed at display 4.
- For instance, graphical elements 44 may correspond to graphical elements 40A, 40B, and 40C of FIG. 3A.
- That is, graphical elements 40A and 44A may each be a "compose" graphical button to enable a user to create a new email message.
- Similarly, graphical elements 40B and 44B may each be "send" graphical buttons to enable a user to send an email message, and graphical elements 40C and 44C may each be "archive" graphical buttons to enable a user to archive one or more email messages.
- Additionally, visual layout 46 (i.e., a non-dominant hand visual layout in this example) may be different from visual layout 42 (i.e., a dominant hand visual layout in this example) in that certain graphical elements displayed in visual layout 42 may not be displayed in visual layout 46 (i.e., graphical elements 40D and 40E in this example).
- GUI module 10 may promote usability of computing device 2 by facilitating user selection of graphical elements with the non-dominant hand of the user. For example, to help compensate for a tendency of users to be less accurate when providing user input gestures with a non-dominant hand of the user, GUI module 10 may display fewer graphical elements in a non-dominant hand visual configuration than in a dominant hand visual configuration, each of the graphical elements of the non-dominant hand visual configuration being larger than the corresponding graphical elements of the dominant hand visual configuration.
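- Continuing the hypothetical ElementLayout and VisualConfiguration types from the earlier sketch, one way the fewer-but-larger adjustment described here might be expressed:

```kotlin
// Hypothetical: build a non-dominant hand configuration by keeping only a subset
// of the elements and enlarging each one to compensate for reduced accuracy.
fun toNonDominantConfiguration(
    dominant: VisualConfiguration,
    elementsToKeep: Set<String>,
    scale: Double = 1.5
): VisualConfiguration = VisualConfiguration(
    dominant.layouts
        .filterKeys { it in elementsToKeep }
        .mapValues { (_, layout) ->
            layout.copy(
                widthDp = (layout.widthDp * scale).toInt(),
                heightDp = (layout.heightDp * scale).toInt()
            )
        }
)
```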
- FIG. 4 is a flow diagram illustrating example operations of a computing device to determine a dominant hand of a user and output a graphical user interface based at least in part on the determined dominant hand, in accordance with one or more aspects of this disclosure.
- The example illustrated in FIG. 4 is only one example operation, and other implementations may include more or fewer aspects than those depicted in FIG. 4 .
- The example operations are described below within the context of computing device 2 .
- Handedness module 8 may determine a plurality of features ( 50 ). Each feature from the plurality of features may be usable to determine a dominant hand of a user of computing device 2 . Handedness module 8 may receive a plurality of input values, each input value from the plurality of input values corresponding to a respective feature from the plurality of features ( 52 ). For example, handedness module 8 may receive a plurality of input values from one or more sensors 6 , each input value corresponding to a respective feature from the plurality of features. Handedness module 8 may select an input value from the plurality of input values ( 54 ). Handedness module 8 may determine a difference between the respective input value and a respective baseline value ( 56 ). Handedness module 8 may apply a weighted value associated with the respective feature to the determined difference to generate a weighted difference value ( 58 ).
- Handedness module 8 may determine whether each input value of the plurality of input values has been evaluated ( 60 ). For example, handedness module 8 may determine, for each input value of the plurality of input values, whether a difference has been determined between the input value and a respective baseline value. When handedness module 8 determines that at least one of the input values of the plurality of input values has not been evaluated (“NO” branch of 60 ), handedness module 8 may select a next input value. When handedness module 8 determines that each input value of the plurality of input values has been evaluated (“YES” branch of 60 ), handedness module 8 may aggregate the weighted difference values to determine an aggregated weighted difference value ( 62 ).
- Handedness module 8 may determine whether the aggregated weighted difference value corresponds to a left-handed user ( 64 ). When handedness module 8 determines that the aggregated value corresponds to a left-handed user (“YES” branch of 64 ), GUI module 10 may output for display at display 4 a GUI in a left-handed visual configuration ( 66 ). In some examples, when handedness module 8 determines that the aggregated value does not correspond to a left-handed user (“NO” branch of 64 ), handedness module 8 may determine whether the aggregated value corresponds to a right-handed user ( 68 ).
- In other examples, GUI module 10 may output for display at display 4 a GUI in a right-handed visual configuration. That is, rather than perform operation 68 to determine whether the aggregated value corresponds to a right-handed user, GUI module 10 may output for display at display 4 a GUI in a right-handed visual configuration as a default visual configuration, and may output a GUI in a left-handed visual configuration in response to handedness module 8 determining that the aggregated value corresponds to a left-handed user.
- GUI module 10 may output for display at display 4 a GUI in a right-handed visual configuration ( 70 ).
- In some examples, GUI module 10 may output for display at display 4 a GUI in a hand-neutral visual configuration. For instance, when handedness module 8 determines that the aggregated value does not correspond to a right-handed user ("NO" branch of 68 ), GUI module 10 may output for display at display 4 a GUI in a hand-neutral visual configuration. In certain examples, GUI module 10 may output for display at display 4 the GUI in the hand-neutral visual configuration and handedness module 8 may determine a plurality of features, each of which may be usable to determine a dominant hand of a user (e.g., operation 50 ).
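- One plausible reading of this decision flow in Kotlin, including the hand-neutral fallback; the two threshold values are illustrative assumptions, since the disclosure describes the comparisons rather than specific numbers.

```kotlin
enum class LayoutChoice { LEFT_HANDED, RIGHT_HANDED, HAND_NEUTRAL }

// Hypothetical three-way decision over the aggregated weighted difference value:
// fall back to a hand-neutral layout when neither handedness is indicated.
fun chooseLayout(
    aggregatedScore: Double,
    leftThreshold: Double = -0.5,
    rightThreshold: Double = 0.5
): LayoutChoice = when {
    aggregatedScore <= leftThreshold -> LayoutChoice.LEFT_HANDED
    aggregatedScore >= rightThreshold -> LayoutChoice.RIGHT_HANDED
    else -> LayoutChoice.HAND_NEUTRAL
}
```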
- A hand-neutral visual configuration may include, for example, a visual configuration that favors neither a left hand nor a right hand of a user.
- For instance, a hand-neutral visual configuration of a GUI may include one or more graphical elements (e.g., one or more graphical button controls) output at locations of display 4 equidistant between a typical arc of a left thumb of a user holding a mobile computing device in the left hand of the user and a typical arc of a right thumb of a user holding the mobile computing device in the right hand of the user.
- Similarly, one or more of a size and shape of at least one graphical element included in a hand-neutral visual configuration may be configured to favor neither a left hand nor a right hand of a user.
- For example, one or more visual layout properties associated with each of a dominant hand visual configuration and a non-dominant hand visual configuration may define one or more of a size and a shape of at least one graphical element included in the respective visual configuration.
- The one or more visual layout properties may specify a particular size of a graphical element for display in the non-dominant hand visual configuration, and may specify a smaller size of the graphical element for display in the dominant hand visual configuration.
- GUI module 10 may output for display at display 4 a GUI in a hand-neutral visual configuration, such as by outputting one or more graphical elements with a size that is an average of the size of the one or more graphical elements specified by the visual layout properties associated with each of a dominant hand visual configuration and a non-dominant hand visual configuration.
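- For instance, the averaging could be as simple as the following Kotlin sketch (the sizes and the function name are hypothetical):

```kotlin
// Hypothetical: a hand-neutral element size taken as the average of the sizes
// specified for the dominant hand and non-dominant hand configurations.
fun handNeutralSizeDp(dominantSizeDp: Int, nonDominantSizeDp: Int): Int =
    (dominantSizeDp + nonDominantSizeDp) / 2
```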
- GUI module 10 may output for display at display 4 a GUI in a visual configuration specified by a user of computing device 2 .
- For example, a user may specify one of a right-handed, left-handed, or hand-neutral visual configuration, such as by using user interface 30 (e.g., selecting a visual configuration preference).
- GUI module 10 may output for display at display 4 a GUI in a visual configuration corresponding to the user-selected visual configuration. That is, in such examples, GUI module 10 may output for display at display 4 a GUI in a visual configuration corresponding to the user-selected visual configuration regardless of determinations made by handedness module 8 based on the aggregated weighted difference values.
- FIG. 5 is a flow diagram illustrating example operations of a computing device to determine a dominant hand of a user and output a graphical user interface based at least in part on the determined dominant hand, in accordance with one or more aspects of this disclosure.
- The example illustrated in FIG. 5 is only one example operation, and other implementations may include more or fewer aspects than those depicted in FIG. 5 .
- The example operations are described below within the context of computing device 2 .
- Computing device 2 may determine a plurality of features, wherein each feature from the plurality of features is usable to determine a dominant hand of a user of computing device 2 ( 72 ).
- Computing device 2 may receive a plurality of input values, each input value from the plurality of input values corresponding to a respective feature from the plurality of features ( 74 ).
- Computing device 2 may determine, using a probabilistic model and based at least in part on at least one input value from the plurality of input values corresponding to the respective feature from the plurality of features, a hand of the user as a dominant hand of the user ( 76 ).
- Computing device 2 may generate, based at least in part on the determined dominant hand of the user, a graphical user interface for display at a display device operatively coupled to computing device 2 (e.g., a presence-sensitive display) ( 78 ).
- In some examples, generating the graphical user interface based at least in part on the determined dominant hand includes generating for display the graphical user interface in a dominant hand visual configuration.
- The dominant hand visual configuration may be different from a non-dominant hand visual configuration.
- In some examples, the dominant hand visual configuration includes a first visual layout of the at least one graphical element at the display device, the non-dominant hand visual configuration includes a second visual layout of the at least one graphical element at the display device, and the first visual layout is different from the second visual layout.
- In certain examples, the dominant hand visual configuration includes one or more dominant hand layout properties that specify, for the at least one graphical element, the first visual layout of the at least one graphical element, and the non-dominant hand visual configuration includes one or more non-dominant hand layout properties that specify, for the at least one graphical element, the second visual layout of the at least one graphical element.
- In some examples, the one or more dominant hand layout properties include one or more of a size of the at least one graphical element, a shape of the at least one graphical element, a display location of the at least one graphical element at the display device, and information indicating whether the at least one graphical element is displayed at the display device.
- Similarly, the one or more non-dominant hand layout properties include one or more of a size of the at least one graphical element, a shape of the at least one graphical element, a display location of the at least one graphical element at the display device, and information indicating whether the at least one graphical element is displayed at the display device.
- In some examples, the example operations further include determining, based at least in part on the received plurality of input values corresponding to the respective plurality of features, that the user is currently holding computing device 2 with a non-dominant hand of the user; and responsive to determining that the user is currently holding computing device 2 with the non-dominant hand of the user, generating for display the graphical user interface in the non-dominant hand visual configuration.
- In some examples, at least one input value from the plurality of input values includes acceleration information from an accelerometer of computing device 2 .
- In some examples, at least one input value from the plurality of input values includes physical orientation information from a gyroscope of computing device 2 .
- In some examples, the display device includes a presence-sensitive display, and at least one input value from the plurality of input values includes an indication of a user input detected at the presence-sensitive display.
- At least one input value from the plurality of input values includes visual information from an image sensor of computing device 2 .
- In some examples, the visual information includes a visual representation of an anatomical feature of a head of a user.
- In some examples, the anatomical feature of the head of the user includes at least a portion of an ear of the user.
- In some examples, determining the dominant hand of the user includes, for one or more input values from the plurality of input values, determining a difference between the respective input value and a respective baseline value, and applying a weighted value associated with the respective feature to the determined difference to generate a weighted difference value.
- In such examples, determining the dominant hand of the user may include aggregating the one or more weighted difference values to determine an aggregated weighted difference value, and determining the dominant hand of the user based at least in part on the aggregated weighted difference value.
- In some examples, determining the plurality of features includes, in response to receiving an input value corresponding to a first feature, determining, based at least in part on a criterion that specifies a relationship between the first feature and a second feature, one of an active state and an inactive state of a sensor associated with the second feature.
- In such examples, determining the plurality of features may further include activating the sensor associated with the second feature in response to determining the active state of the sensor, and deactivating the sensor associated with the second feature in response to determining the inactive state of the sensor.
- In some examples, receiving the input value corresponding to the first feature includes receiving an audio input from an audio input device of computing device 2, the sensor associated with the second feature includes an accelerometer of computing device 2, and determining the active state of the accelerometer includes determining the active state based at least in part on a determination that the received audio input is indicative of wind noise.
- In some examples, the display device includes a presence-sensitive display, receiving the input value corresponding to the first feature includes receiving an indication of a user input detected at the presence-sensitive display, the sensor associated with the second feature includes an image sensor of computing device 2, and determining the active state of the image sensor includes determining the active state based at least in part on a determination that the received indication of the user input detected at the presence-sensitive display is indicative of a contact between the presence-sensitive display and at least a portion of a head of the user.
- In other examples, the display device includes a presence-sensitive display, receiving the input value corresponding to the first feature includes receiving an indication of a user input detected at the presence-sensitive display, the sensor associated with the second feature comprises a gyroscope of computing device 2, and determining the active state of the gyroscope includes determining the active state based at least in part on a determination that the received indication of the user input detected at the presence-sensitive display is indicative of a contact between the presence-sensitive display and at least a portion of a head of the user.
- In some examples, the portion of the head of the user may include a cheek of the user.
- The techniques described in this disclosure may be implemented, at least in part, within one or more processors, including one or more microprocessors, digital signal processors (DSPs), application specific integrated circuits (ASICs), field programmable gate arrays (FPGAs), or any other equivalent integrated or discrete logic circuitry, as well as any combinations of such components.
- The term "processors" may generally refer to any of the foregoing logic circuitry, alone or in combination with other logic circuitry, or any other equivalent circuitry.
- a control unit including hardware may also perform one or more of the techniques of this disclosure.
- Such hardware, software, and firmware may be implemented within the same device or within separate devices to support the various techniques described in this disclosure.
- Any of the described units, modules, or components may be implemented together or separately as discrete but interoperable logic devices. Depiction of different features as modules or units is intended to highlight different functional aspects and does not necessarily imply that such modules or units must be realized by separate hardware, firmware, or software components. Rather, functionality associated with one or more modules or units may be performed by separate hardware, firmware, or software components, or integrated within common or separate hardware, firmware, or software components.
- The techniques described in this disclosure may also be embodied or encoded in an article of manufacture including a computer-readable storage medium encoded with instructions. Instructions embedded or encoded in an article of manufacture including an encoded computer-readable storage medium may cause one or more programmable processors, or other processors, to implement one or more of the techniques described herein, such as when the instructions included or encoded in the computer-readable storage medium are executed by the one or more processors.
- Computer readable storage media may include random access memory (RAM), read only memory (ROM), programmable read only memory (PROM), erasable programmable read only memory (EPROM), electronically erasable programmable read only memory (EEPROM), flash memory, a hard disk, a compact disc ROM (CD-ROM), a floppy disk, a cassette, magnetic media, optical media, or other computer readable media.
- In some examples, an article of manufacture may include one or more computer-readable storage media.
- In some examples, a computer-readable storage medium may include a non-transitory medium.
- The term "non-transitory" may indicate that the storage medium is not embodied in a carrier wave or a propagated signal.
- In certain examples, a non-transitory storage medium may store data that can, over time, change (e.g., in RAM or cache).
Description
- Computing devices provide users with the ability to interact with processes and data using input and output devices. For example, a user may provide a user input to a computing device using a presence-sensitive display that displays a graphical user interface (GUI). The user input may cause the computing device to modify the execution of a process and/or data. Such processes may provide a user with the ability to access the Internet, play games, and play videos and music, as well as providing other various types of functionality.
- In certain examples, the computing device may be a mobile computing device, such as a mobile phone (e.g., a smartphone) or tablet computer that the user may hold in his or her hand. As an example, a user may hold a mobile computing device in the user's right hand, and may provide user input gestures at a presence-sensitive display of the mobile computing device using the left hand of the user. Advancements in computing devices have enabled such devices to provide users with richer user experiences that include increasingly complex graphical user interfaces.
- In one example, a method includes determining, by a computing device, a plurality of features. Each feature from the plurality of features may be usable to determine a dominant hand of a user of the computing device. The method also includes receiving, by the computing device, a plurality of input values, each input value from the plurality of input values corresponding to a respective feature from the plurality of features, and determining, using a probabilistic model and based at least in part on at least one input value from the plurality of input values corresponding to the respective feature from the plurality of features, a hand of the user as a dominant hand of the user. The method also includes generating, based at least in part on the determined dominant hand of the user, a graphical user interface for display at a presence-sensitive display operatively coupled to the computing device.
- In one example, a computer-readable storage medium is encoded with instructions that, when executed, cause one or more processors of a computing device to perform operations including determining a plurality of features. Each feature from the plurality of features may be usable to determine a dominant hand of a user of the computing device. The computer-readable storage medium may be further encoded with instructions that, when executed, cause the one or more processors to perform operations including receiving a plurality of input values, each input value from the plurality of input values corresponding to a respective feature from the plurality of features, determining, using a probabilistic model and based at least in part on at least one input value from the plurality of input values corresponding to the respective feature from the plurality of features, a hand of the user as a dominant hand of the user, and generating, based at least in part on the determined dominant hand of the user, a graphical user interface for display at a presence-sensitive display operatively coupled to the computing device.
- In one example, a computing device includes one or more processors, a presence-sensitive display that is operatively coupled to the computing device, and one or more sensors. The one or more processors may be configured to determine a plurality of features. Each feature from the plurality of features may be usable to determine a dominant hand of a user of the computing device. The one or more processors may be further configured to receive, from the one or more sensors, a plurality of input values, each input value from the plurality of input values corresponding to a respective feature from the plurality of features, and determine, using a probabilistic model and based at least in part on at least one input value from the plurality of input values corresponding to the respective feature from the plurality of features, a hand of the user as a dominant hand of the user. The one or more processors may be further configured to generate, based at least in part on the determined dominant hand of the user, a graphical user interface for display at the presence-sensitive display.
- The details of one or more examples of the disclosure are set forth in the accompanying drawings and the description below. Other features, objects, and advantages will be apparent from the description and drawings, and from the claims.
- FIG. 1A is a conceptual diagram illustrating an example computing device that may be used to determine a dominant hand of a user and generate a graphical user interface based at least in part on the determined dominant hand, in accordance with one or more aspects of this disclosure.
- FIG. 1B is a conceptual diagram illustrating an example of the computing device of FIG. 1A that may be used to determine a dominant hand of a user and generate a graphical user interface based at least in part on the determined dominant hand, in accordance with one or more aspects of this disclosure.
- FIG. 2 is a block diagram illustrating further details of one example of a computing device shown in FIGS. 1A and 1B, in accordance with one or more aspects of this disclosure.
- FIGS. 3A and 3B are conceptual diagrams illustrating an example computing device that may be used to determine a dominant hand of a user and generate a graphical user interface based at least in part on the determined dominant hand, in accordance with one or more aspects of this disclosure.
- FIG. 4 is a flow diagram illustrating example operations of a computing device to determine a dominant hand of a user and output a graphical user interface based at least in part on the determined dominant hand, in accordance with one or more aspects of this disclosure.
- FIG. 5 is a flow diagram illustrating example operations of a computing device to determine a dominant hand of a user and output a graphical user interface based at least in part on the determined dominant hand, in accordance with one or more aspects of this disclosure.
- In general, this disclosure is directed to techniques for determining a dominant hand of a user of a computing device to improve user interactions with a presence-sensitive display operatively coupled to the computing device. A computing device may output a graphical user interface (GUI) at a presence-sensitive display. The presence-sensitive display (e.g., a touch-sensitive screen) may enable a user to interact with graphical elements of the GUI by detecting user inputs in the form of gestures performed at or near the presence-sensitive display. For instance, a user may provide a touch gesture to select a graphical button control of the GUI. Advancements in computing devices have enabled such devices to provide increasingly complex GUIs. However, some presence-sensitive displays, such as those associated with mobile computing devices, may provide relatively small interaction surfaces with which to display a GUI and receive user input gestures. The combination of increasingly complex GUIs and the limited space provided by many presence-sensitive displays may increase the difficulty for a user to provide user input gestures to interact with the computing device. Moreover, users may typically be more accurate and quicker when providing such gestures using a dominant hand of the user than when using a non-dominant hand of the user.
- Techniques of this disclosure may improve the ease with which a user can provide user input gestures (e.g., touch gestures) to interact with a GUI output at a presence-sensitive display of a computing device. According to various techniques of this disclosure, a computing device (e.g., a mobile computing device such as a mobile phone or tablet computer) may determine a dominant hand of the user. For instance, the computing device may receive a plurality of input values (e.g., acceleration information from an accelerometer of the computing device, physical orientation information from a gyroscope of the computing device, visual information from an image sensor of the computing device, etc.), each input from the plurality of inputs corresponding to a respective feature from a plurality of features that are usable to determine the dominant hand of the user. Such features may include, but are not limited to, acceleration information of a computing device, physical orientation of the computing device, visual information associated with the computing device, one or more user inputs detected at a presence-sensitive and/or touch-sensitive display device operatively coupled to the computing device, and the like.
- The computing device may use a probabilistic model, such as a Bayesian network, to determine the dominant hand of the user based at least in part on the plurality of input values. For instance, the computing device may compare the received input values to corresponding baseline values determined with respect to known right-handed and/or left-handed users.
- The computing device may generate, based at least in part on the determined dominant hand of the user, a GUI for display in a dominant hand visual configuration. As an example, the computing device may determine that a left hand of a user is the dominant hand of the user. In response, the computing device may generate a GUI in a dominant hand visual configuration that includes, in one example, graphical elements (e.g., one or more graphical button controls) positioned along a radius that follows a typical arc of a left thumb of a user holding a mobile computing device in the left hand of the user (e.g., a left-handed visual configuration). As such, the computing device may promote usability by facilitating user selection of graphical elements with the dominant thumb of the user.
- In some examples, the computing device may determine, based at least in part on at least one input value from the plurality of received input values, that the user is currently holding the computing device with a non-dominant hand of the user. In response, the computing device may generate a GUI in a non-dominant hand visual configuration. For instance, the computing device may determine that a left hand of the user is a dominant hand of the user, and that the user is currently holding the computing device in a right hand of the user (i.e., a non-dominant hand of the user in this example). In such an example, the computing device may generate a GUI in a non-dominant hand visual configuration. In some examples, the non-dominant hand visual configuration includes graphical elements (e.g., one or more graphical button controls) positioned along a radius that follows a typical arc of a right thumb of a user holding a computing device in a right hand of the user (e.g., a right-handed visual configuration).
- The non-dominant hand visual configuration may be different than the dominant hand visual configuration with respect to one or more of a size, shape, location, number of graphical elements generated for display, or other properties of the visual configuration. For instance, a non-dominant hand visual configuration may include fewer, but larger graphical elements to compensate for a tendency of users to be less accurate when providing user input gestures with a non-dominant hand of the user. As such, the computing device may promote improved usability by facilitating user selection of graphical elements with the non-dominant hand of the user.
- FIG. 1A is a conceptual diagram illustrating an example computing device that may be used to determine a dominant hand of a user and generate a graphical user interface based at least in part on the determined dominant hand, in accordance with one or more aspects of this disclosure. As illustrated in FIG. 1A, computing device 2 may include display 4, one or more sensors 6, handedness module 8, and graphical user interface (GUI) module 10. Examples of computing device 2 may include, but are not limited to, portable or mobile devices such as mobile phones (including smartphones), tablet computers, smart television platforms, personal digital assistants (PDAs), and the like. As shown in the example of FIG. 1A, computing device 2 may be a mobile phone, such as a smartphone.
- Display 4 may be a liquid crystal display (LCD), e-ink, organic light emitting diode (OLED), or other display. Display 4 may present the content of computing device 2 to a user. For example, display 4 may display the output of applications executed on one or more processors of computing device 2, confirmation messages, indications, or other functions that may need to be presented to a user.
- In some examples, display 4 may provide some or all of the functionality of a user interface of computing device 2. For instance, as in the example of FIG. 1A, display 4 may be a touch-sensitive and/or presence-sensitive display that can display a GUI and detect input from a user in the form of user input gestures (e.g., touch gestures, swipe gestures, pinch gestures, and the like) using capacitive or inductive detection at or near the presence-sensitive display.
- As illustrated in FIG. 1A, computing device 2 may include handedness module 8 and GUI module 10. GUI module 10 may perform one or more functions to receive input, such as one or more user input gestures detected at display 4. GUI module 10 may send such input to other components associated with computing device 2, such as handedness module 8 or other application(s) executing on one or more processors of computing device 2. GUI module 10 may also receive data from components associated with computing device 2, such as handedness module 8. Using the data, GUI module 10 may cause components associated with computing device 2, such as display 4, to provide output based on the data. For instance, GUI module 10 may receive data from handedness module 8 that causes GUI module 10 to display a GUI at display 4 to enable a user to interact with computing device 2.
- As shown in FIG. 1A, GUI module 10 may generate a GUI for display at display 4 that includes one or more graphical elements, such as graphical elements 12. Graphical elements 12 may include any one or more graphical elements to enable a user to provide user input gestures to interact with computing device 2. For instance, graphical elements 12 may be graphical button controls, checkbox controls, slider controls, or other types of graphical control elements. As one example, graphical elements 12 may include one or more graphical button controls to enable a user to provide user input gestures to interact with an email application, at least portions of which execute on one or more processors of computing device 2. For instance, in such an example, graphical elements 12 may include a "compose" graphical button to enable a user to create a new email message, a "send" graphical button to enable a user to send an email message, an "archive" graphical button to enable a user to archive one or more email messages, and the like. Other examples of graphical elements 12 are possible, and the non-limiting example above is provided only for purposes of discussion.
- Graphical elements 12 may be the same or different types of graphical elements. For instance, in some examples, at least one of graphical elements 12 may be a graphical button control and at least one of graphical elements 12 may be a graphical checkbox control. In certain examples, each of graphical elements 12 may be the same type of graphical element, such as when each of graphical elements 12 is a graphical button control.
- GUI module 10 may generate a GUI for display at display 4 in various visual configurations. For instance, GUI module 10 may generate a GUI for display in a right-handed visual configuration. In certain examples, GUI module 10 may generate a GUI for display in a left-handed visual configuration that is different from the right-handed visual configuration. In some examples, GUI module 10 may generate a GUI for display in a dominant hand visual configuration that is different from a non-dominant hand visual configuration. The dominant hand visual configuration may be either a right-handed visual configuration or a left-handed visual configuration. Similarly, the non-dominant hand visual configuration may be either a right-handed visual configuration or a left-handed visual configuration. GUI module 10 may, in some examples, generate a GUI for display at a display device operatively coupled to computing device 2 (e.g., display 4) based at least in part on a determination by computing device 2 of a dominant hand of a user interacting with computing device 2. For example, GUI module 10 may receive data from handedness module 8 indicating a dominant hand of a user. GUI module 10 may generate the GUI for display at display 4 based at least in part on the data received from handedness module 8 indicating the dominant hand of the user.
Handedness module 8 may determine a plurality of features, each of which is usable to determine a dominant hand of a user. Examples of such features include, but are not limited to, a physical orientation of computing device 2, acceleration information of computing device 2, indications of one or more user inputs detected at display 4 (e.g., a presence-sensitive and/or touch-sensitive display), visual information of an image sensor (e.g., a camera device) of computing device 2, and the like.

Physical orientation information of computing device 2 may be usable to determine a dominant hand of a user. For instance, computing device 2 may be a mobile computing device such as a mobile phone or tablet computer. In such examples, a user, such as user 3, may hold computing device 2 in the user's hand. In the illustrated example of FIG. 1A, user 3 holds computing device 2 in a right hand of user 3. In some examples, such as when computing device 2 includes a mobile phone, user 3 may hold computing device 2 against the side of his or her head while using computing device 2 for telephonic communications. A right-handed user (i.e., a user whose right hand is dominant over a non-dominant left hand) may typically hold a mobile computing device in a right hand of the user against the right side of his or her head while using the mobile computing device for telephonic communications. Similarly, a left-handed user (i.e., a user whose left hand is dominant over a non-dominant right hand) may typically hold a mobile computing device in a left hand of the user against the left side of his or her head while using the mobile computing device for telephonic communications. As such, physical orientation information of computing device 2 while computing device 2 is being used for telephonic communications may be usable to determine a dominant hand of a user. That is, physical orientation information indicating that computing device 2 is held against a right side of a head of a user may indicate that a right hand of the user is a dominant hand of the user. Physical orientation information indicating that computing device 2 is held against a left side of a head of a user may indicate that a left hand of the user is a dominant hand of the user.

In the example of FIG. 1A, user 3 may hold computing device 2 against right ear 14 (i.e., a right ear of user 3) and right cheek 16 (i.e., a right cheek of user 3) while using computing device 2 for telephonic communications. Similarly, user 3 may hold computing device 2 against left ear 18 (i.e., a left ear of user 3) and left cheek 20 (i.e., a left cheek of user 3) while using computing device 2 for telephonic communications. As discussed in further detail with respect to the illustrated example of FIG. 1B, a physical orientation of computing device 2 while the user is holding computing device 2 against the side of his or her head may typically differ depending upon whether computing device 2 is being held against right ear 14 and right cheek 16 or against left ear 18 and left cheek 20. That is, due in part to typical anatomical features of the human head, an angle of a physical orientation of computing device 2 with respect to the ground while computing device 2 is held against right ear 14 and right cheek 16 may be substantially opposite an angle of a physical orientation of computing device 2 with respect to the ground while computing device 2 is held against left ear 18 and left cheek 20.
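As a non-limiting illustration of this orientation heuristic, the following Python sketch classifies which side of the head a device may be held against from a single roll angle relative to the ground. The function name, sign convention, and dead-band threshold are assumptions made for this example only; they are not specified by the disclosure.

```python
# Illustrative sketch: infer which side of the head the device is held
# against from its roll angle relative to the ground. The sign convention
# (positive roll for a right-side hold) and the 15-degree dead band are
# assumptions for this example.

def classify_head_side(roll_degrees, dead_band=15.0):
    """Return 'right', 'left', or None if the orientation is ambiguous.

    roll_degrees: physical orientation of the device with respect to the
    ground, e.g., derived from gyroscope and/or accelerometer readings.
    """
    if roll_degrees > dead_band:
        return "right"   # consistent with a right-side hold
    if roll_degrees < -dead_band:
        return "left"    # consistent with a left-side hold
    return None          # too close to level to decide

# Example: a roll of +32 degrees during a call suggests a right-side hold.
print(classify_head_side(32.0))  # -> 'right'
```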
In addition, physical orientation information of computing device 2 when computing device 2 detects one or more user input gestures at or near display 4 (e.g., a presence-sensitive display) may be usable to determine a dominant hand of user 3. For instance, a user may hold a mobile computing device in a dominant hand of the user while providing user input gestures with a thumb of the dominant hand of the user. In some examples, a user may hold a mobile computing device in a non-dominant hand of the user while providing user input gestures with a dominant hand of the user (e.g., with a finger of the dominant hand, or with another input unit, such as a pen or stylus, held in the dominant hand of the user).

While holding the mobile computing device in one hand, a user may typically hold the mobile computing device at a slight angle toward the opposite side of the user. For instance, a user holding a mobile computing device in a left hand of the user and providing user input gestures with a right hand of the user or a left thumb of the user may typically hold the mobile computing device such that a presence-sensitive display of the mobile computing device is angled toward the right side of the user. Similarly, a user holding a mobile computing device in a right hand of the user and providing user input gestures with a left hand of the user or a right thumb of the user may typically hold the mobile computing device such that a presence-sensitive display of the mobile computing device is angled toward the left side of the user. As such, physical orientation information of computing device 2 while computing device 2 detects one or more user input gestures (e.g., touch gestures, swipe gestures, pinch gestures, etc.) may be usable to determine a dominant hand of the user.

In certain examples, visual information from an image sensor of computing device 2 may be usable to determine a dominant hand of user 3. As discussed above, a right-handed user may typically hold a mobile computing device in a right hand of the user against the right side of his or her head while using the mobile computing device for telephonic communications. Similarly, a left-handed user may typically hold a mobile computing device in a left hand of the user against the left side of his or her head while using the mobile computing device for telephonic communications. As such, visual information indicating that computing device 2 is held against a right side of a user's head may indicate that a right hand of the user is a dominant hand of the user. Visual information indicating that computing device 2 is held against a left side of a user's head may indicate that a left hand of the user is a dominant hand of the user.

Such visual information may represent an anatomical feature of the user's head. For instance, the anatomical feature may include at least a portion of the side of the user's head. In some examples, the anatomical feature may include at least a portion of an ear of the user. For instance, in the example of FIG. 1A, the visual information may include at least a portion of right ear 14 or left ear 18. Because at least the outer edge of right ear 14 curves in an opposite direction to that of left ear 18, visual information representing a portion of right ear 14 or left ear 18 may be usable to determine whether computing device 2 is held against a right side or a left side of the head of user 3. Hence, because a right-handed user may typically hold computing device 2 against a right side of the user's head, and a left-handed user may typically hold computing device 2 against a left side of the user's head, the visual information may be usable to determine a dominant hand of the user.

Acceleration information of computing device 2 may be usable to determine a dominant hand of user 3. For instance, as discussed above, while using a mobile computing device for telephonic communications, a user may typically hold the mobile computing device in a dominant hand of the user against a dominant side of the user's head. As such, an acceleration profile resulting from the motion of the mobile computing device as the user moves the mobile computing device to the side of the user's head may differ depending upon whether the user moves the mobile computing device to the right side of the user's head or to the left side of the user's head.

For example, as in the conceptual illustration of FIG. 1A, user 3, holding computing device 2 in a right hand, may move computing device 2 to the right side of the head of user 3 in motion 22. The acceleration profile of computing device 2 defined by motion 22 may typically differ from an acceleration profile defined by a similar motion (not illustrated) in which user 3 moves computing device 2 to the left side of the head of user 3. For instance, as one example, when moving computing device 2 from a user's pocket to the right side of the head of the user (e.g., to right ear 14 and right cheek 16 of user 3), the user may move computing device 2 along a path that arcs first toward the middle of the user's body, then toward the right side of the user's body. In contrast, when moving computing device 2 from a user's pocket to the left side of the head of the user (e.g., to left ear 18 and left cheek 20 of user 3), the user may move computing device 2 along a path that arcs first toward the middle of the user's body, then toward the left side of the user's body. As such, because the acceleration profile resulting from each path may differ, acceleration information of computing device 2 may be usable to determine a dominant hand of the user.
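A minimal sketch of how the two stereotypical arcs might be distinguished, assuming the device can report lateral acceleration samples in the user's body frame; the double integration, sampling period, and sample data are illustrative assumptions, not part of the disclosure.

```python
# Illustrative sketch: reduce an acceleration profile to the direction of
# net lateral displacement, distinguishing an arc that ends toward the
# user's right from one that ends toward the left. A real implementation
# would use a richer model of the full profile. Sample data are invented.

def arc_direction(lateral_accel, dt=0.01):
    """Integrate lateral acceleration twice to estimate net sideways
    displacement; positive means the device ended toward the user's right."""
    velocity = 0.0
    displacement = 0.0
    for a in lateral_accel:
        velocity += a * dt
        displacement += velocity * dt
    return "right" if displacement > 0 else "left"

# A motion arcing first toward the middle of the body and then to the
# right (as in motion 22) might produce samples like these:
samples = [-0.2] * 30 + [0.6] * 70   # m/s^2, hypothetical
print(arc_direction(samples))        # -> 'right'
```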
One or more user inputs detected at or near display 4 (e.g., a presence-sensitive and/or touch-sensitive display operatively coupled to computing device 2) may be usable to determine a dominant hand of user 3. For example, handedness module 8 may determine that a user input detected at or near display 4 indicates that display 4 is in contact with a cheek of user 3. For instance, handedness module 8 may compare an area of display 4 that detects the presence of an input unit to a threshold value. Handedness module 8 may determine that the detected user input indicates that display 4 is in contact with a cheek of user 3 when the area of display 4 that detects the presence of an input unit is greater than the threshold value. The threshold value may be a percentage of the total area of display 4, such as twenty-five percent, thirty-five percent, fifty percent, or other percentages of the total area of display 4. In certain examples, the threshold value may be user configurable.

The user input detected at or near display 4 indicating that display 4 is in contact with a cheek of user 3 may be usable to determine a dominant hand of user 3. For example, user 3 may hold computing device 2 to right ear 14 and right cheek 16 when using computing device 2 for telephonic communications. In such an example, display 4 may detect a user input indicating that right cheek 16 is in contact with display 4. Similarly, user 3 may hold computing device 2 to left ear 18 and left cheek 20 when using computing device 2 for telephonic communications. In such an example, display 4 may detect a user input indicating that left cheek 20 is in contact with display 4.

The profile of an area of display 4 that is in contact with right cheek 16 may typically differ from a profile of an area of display 4 that is in contact with left cheek 20. For example, a profile of an area of display 4 that is in contact with right cheek 16 may include an upper-left region of display 4 but not a lower-right region of display 4. The upper-left and lower-right regions of display 4 may be considered upper-left and lower-right regions from the perspective of a user viewing display 4. That is, when user 3 holds computing device 2 to right ear 14 and right cheek 16 (e.g., when using computing device 2 for telephonic communications), display 4 may detect a user input at an upper-left region of display 4. However, in such an example, display 4 may typically not detect a user input at a lower-right region of display 4. In contrast, in examples where user 3 holds computing device 2 to left ear 18 and left cheek 20, display 4 may detect a user input at an upper-right region of display 4, but may not detect a user input at a lower-left region of display 4.

Handedness module 8 may analyze the touch region of the received user input at display 4, and may determine that user 3 may be holding computing device 2 to right ear 14 and right cheek 16 when an area of display 4 that is in contact with an input unit is greater than a threshold value (e.g., indicating a cheek-press user input) and when a region of display 4 that detects the user input includes an upper-left region of display 4 but does not include a lower-right region of display 4. Handedness module 8 may determine that such a detected user input at display 4 indicates that a right hand of the user may be a dominant hand of the user. Similarly, handedness module 8 may determine that user 3 may be holding computing device 2 to left ear 18 and left cheek 20 when an area of display 4 that is in contact with an input unit is greater than a threshold value (e.g., indicating a cheek-press user input) and when a region of display 4 that detects the user input includes an upper-right region of display 4 but does not include a lower-left region of display 4. Handedness module 8 may determine that such a detected user input indicates that a left hand of the user may be a dominant hand of the user.
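The cheek-press heuristic above can be summarized in a short sketch. The TouchRegion structure, the 35-percent area threshold, and the four corner flags are hypothetical simplifications of the contact profile described in this example.

```python
# Illustrative sketch of the cheek-press heuristic: a contact covering more
# than a threshold fraction of the display, touching the upper-left but not
# the lower-right region, suggests a right-side hold (and vice versa).

from dataclasses import dataclass

@dataclass
class TouchRegion:
    area_fraction: float       # fraction of total display area in contact
    touches_upper_left: bool
    touches_lower_right: bool
    touches_upper_right: bool
    touches_lower_left: bool

def classify_cheek_press(region, area_threshold=0.35):
    """Return 'right', 'left', or None given a detected contact region."""
    if region.area_fraction <= area_threshold:
        return None  # contact too small to be a cheek press
    if region.touches_upper_left and not region.touches_lower_right:
        return "right"   # consistent with right ear 14 / right cheek 16
    if region.touches_upper_right and not region.touches_lower_left:
        return "left"    # consistent with left ear 18 / left cheek 20
    return None

press = TouchRegion(0.5, True, False, False, True)
print(classify_cheek_press(press))  # -> 'right'
```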
In certain examples, a frequency at which user inputs are detected at a portion of display 4 (e.g., a presence-sensitive and/or touch-sensitive display) may be usable to determine a dominant hand of a user. For example, a user may typically hold a mobile computing device in a dominant hand of the user and provide user input gestures (e.g., touch gestures, swipe gestures, etc.) with a thumb of the dominant hand of the user. As such, a frequency at which user input gestures are detected at a portion of the presence-sensitive display corresponding to the dominant hand of the user (e.g., a right portion of display 4 corresponding to a dominant right hand of a user, or a left portion of display 4 corresponding to a dominant left hand of a user) may be greater than a frequency at which user input gestures are detected at a portion of the presence-sensitive display corresponding to the non-dominant hand of the user. That is, because of the limited reach of the user's thumb, a user may provide a greater proportion of user input gestures at locations of the presence-sensitive display that are closest to the thumb of the hand holding the mobile computing device.

As an example, user 3 may be a right-handed user. As such, user 3 may typically hold computing device 2 in a right hand and provide user input gestures with a thumb of the right hand. Because of the limited reach of the thumb, user 3 may provide user input gestures more frequently at a right portion of display 4 than at a left portion of display 4. Similarly, in examples when user 3 is a left-handed user, user 3 may typically hold computing device 2 in a left hand and provide user input gestures with a thumb of the left hand. As such, user 3 may provide user input gestures more frequently at a left portion of display 4 than at a right portion of display 4.

Handedness module 8 may determine a frequency at which user input gestures are detected at portions of display 4 over time, such as over a time period of one hour, three hours, one day, or other time periods. Handedness module 8 may, in certain examples, determine a histogram (e.g., a "heat map") representing the frequency at which user input gestures are detected with respect to portions of display 4. Handedness module 8 may determine that a histogram indicating a greater frequency of received user input gestures at a right portion of display 4 than at a left portion of display 4 may indicate that a right hand of user 3 is a dominant hand of user 3. Similarly, handedness module 8 may determine that a histogram indicating a greater frequency of received user input gestures at a left portion of display 4 than at a right portion of display 4 may indicate that a left hand of user 3 is a dominant hand of user 3.
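A minimal sketch of such a histogram, assuming touch locations are reported as pixel coordinates; the strip-based binning and the sample touches are invented for illustration.

```python
# Illustrative sketch of the "heat map" idea: bucket touch locations into
# coarse vertical strips over the display and compare the total frequency
# of touches on the left half versus the right half.

from collections import Counter

def touch_histogram(touches, display_width, bins=4):
    """Count touches per vertical strip of the display."""
    hist = Counter()
    for x, _y in touches:
        strip = min(int(x / display_width * bins), bins - 1)
        hist[strip] += 1
    return hist

def handedness_hint(hist, bins=4):
    left = sum(hist[b] for b in range(bins // 2))
    right = sum(hist[b] for b in range(bins // 2, bins))
    if right > left:
        return "right"   # more activity within reach of a right thumb
    if left > right:
        return "left"
    return None

touches = [(700, 300), (680, 900), (120, 500), (710, 1100)]  # hypothetical pixels
hist = touch_histogram(touches, display_width=720)
print(handedness_hint(hist))  # -> 'right'
```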
In some examples, a location of display 4 at which user input gestures (e.g., touch gestures) are detected, as compared to a region of a selectable object displayed at display 4, may be usable to determine a dominant hand of a user. For example, GUI module 10 may cause display 4 to output one or more selectable objects, such as a graphical button, a graphical slider control, and the like. A user providing a gesture (e.g., a touch gesture) at display 4 with a right hand of the user to select one of the selectable objects may typically provide the gesture at a location of display 4 that is slightly left of the selectable object from the perspective of a user viewing display 4. That is, when providing a gesture to select an object displayed at display 4, a user providing such a gesture with a right hand may typically provide the gesture at a location of display 4 that is biased toward the left side of the selectable object. Similarly, a user providing such a gesture with a left hand may typically provide the gesture at a location of display 4 that is slightly right of, or biased toward the right side of, the selectable object.

Handedness module 8 may determine a frequency at which user input gestures to select a selectable object displayed at display 4 are biased toward the left of the selectable object, and a frequency at which such gestures are biased toward the right of the selectable object. Handedness module 8 may determine that a higher frequency of user input gestures biased toward the left of selectable objects indicates that a right hand of the user is a dominant hand of the user. Similarly, handedness module 8 may determine that a higher frequency of user input gestures biased toward the right of selectable objects indicates that a left hand of the user is a dominant hand of the user.
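The bias heuristic might be tallied as in the following sketch, where each selection is reduced to the horizontal offset between the touch point and the target's center; the pair representation and the sample values are assumptions for this example.

```python
# Illustrative sketch of the selection-bias heuristic: touches landing left
# of a target's center more often than right of it hint at a right hand.

def bias_hint(selections):
    """selections: iterable of (touch_x, target_center_x) pairs."""
    left_biased = sum(1 for touch_x, center_x in selections if touch_x < center_x)
    right_biased = sum(1 for touch_x, center_x in selections if touch_x > center_x)
    if left_biased > right_biased:
        return "right"   # right-hand touches tend to land left of the target
    if right_biased > left_biased:
        return "left"    # left-hand touches tend to land right of the target
    return None

taps = [(300, 310), (305, 312), (318, 315), (290, 302)]  # hypothetical pixels
print(bias_hint(taps))  # -> 'right' (3 of 4 taps biased left of center)
```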
In certain examples, the speed and accuracy with which user input gestures are detected to select a plurality of selectable objects may be usable to determine a dominant hand of the user. For instance, a user may typically be quicker and more accurate when providing user input gestures with a dominant hand of the user than with a non-dominant hand of the user. As one example, handedness module 8 may cause GUI module 10 to output a GUI to determine the speed and accuracy with which a user provides user input gestures to select a plurality of selectable objects using a left hand of the user and using a right hand of the user.

For instance, handedness module 8 may cause GUI module 10 to output a GUI that includes a series of selectable objects sequentially output at display 4 over a period of time. The series of selectable objects may be displayed in succession at varying (e.g., random) locations of display 4, each of the selectable objects output for a threshold amount of time. For instance, the selectable objects may be displayed as "bubbles," each of the bubbles output at display 4 for a threshold amount of time, such as one second, five hundred milliseconds, two hundred and fifty milliseconds, or other threshold amounts of time. As an example, the GUI may first request the user to select the objects with a right hand of the user, then to select the objects with a left hand of the user. Handedness module 8 may determine characteristics of received user input gestures to select the selectable objects with respect to each of the right and left hands of the user. For instance, handedness module 8 may determine the number of objects successfully selected with each hand, an average time between when a selectable object is displayed and when a user input gesture is received to select the object with respect to each hand of the user, or other characteristics.

The determined characteristics of the detected user input gestures with respect to each of the left and right hands of the user may be usable to determine the dominant hand of the user. For instance, handedness module 8 may determine that a greater number of successfully selected objects with a right hand than with a left hand may indicate that a right hand of the user is a dominant hand of the user. Conversely, handedness module 8 may determine that a greater number of successfully selected objects with a left hand than with a right hand may indicate that a left hand of the user is a dominant hand of the user. Similarly, handedness module 8 may determine that a lower average time to select the selectable objects with a right hand than with a left hand may indicate that a right hand is a dominant hand of the user. Handedness module 8 may determine that a lower average time to select the selectable objects with a left hand than with a right hand may indicate that a left hand is a dominant hand of the user.
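A sketch of how such per-hand trial characteristics might be compared, assuming each trial yields a hit count and per-hit latencies; the TrialResult structure and the tie-breaking order (accuracy first, then speed) are illustrative choices, not specified by the disclosure.

```python
# Illustrative sketch scoring the "bubble" trial described above: for each
# hand, count successful selections and average selection latency, then
# favor the more accurate and, on a tie, the faster hand.

from dataclasses import dataclass
from statistics import mean

@dataclass
class TrialResult:
    hits: int            # bubbles successfully selected
    latencies_ms: list   # time from display to selection for each hit

def dominant_hand_from_trials(right, left):
    """Compare per-hand trial results; return 'right', 'left', or None."""
    if right.hits != left.hits:
        return "right" if right.hits > left.hits else "left"
    right_avg = mean(right.latencies_ms) if right.latencies_ms else float("inf")
    left_avg = mean(left.latencies_ms) if left.latencies_ms else float("inf")
    if right_avg != left_avg:
        return "right" if right_avg < left_avg else "left"
    return None

right = TrialResult(hits=18, latencies_ms=[420, 390, 450])
left = TrialResult(hits=12, latencies_ms=[610, 580, 650])
print(dominant_hand_from_trials(right, left))  # -> 'right'
```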
As such, computing device 2 may determine a plurality of features, each of which is usable to determine a dominant hand of the user. Computing device 2 may receive a plurality of input values, each input value from the plurality of input values corresponding to a respective feature from the plurality of features. For example, as illustrated in FIG. 1A, computing device 2 may include one or more sensors 6. Examples of one or more sensors 6 include, but are not limited to, accelerometers, gyroscopes, magnetometers, audio input devices (e.g., a microphone), image sensors (e.g., an image sensor associated with a camera device of computing device 2), and proximity sensors. Computing device 2 may receive a plurality of inputs from one or more sensors 6, such as acceleration information from one or more accelerometers of computing device 2, physical orientation information from one or more gyroscopes of computing device 2, visual information from one or more image sensors of computing device 2, audio information from one or more audio input devices of computing device 2, physical orientation information from one or more magnetometers of computing device 2, and information from one or more proximity sensors of computing device 2 indicating physical proximity of computing device 2 to another object.

Handedness module 8 may determine, using a probabilistic model and based at least in part on at least one input value from the plurality of input values corresponding to the respective feature from the plurality of features, a hand of the user as a dominant hand of the user. Non-limiting examples of such a probabilistic model include machine learning models such as Bayesian networks, artificial neural networks, and support vector machines, as well as other probabilistic models. For example, using the probabilistic model, handedness module 8 may compare input values determined from one or more sensors 6 to corresponding baseline values determined with respect to known right-handed and/or left-handed users.

For instance, during a ground-truth collection phase, input values corresponding to one or more features from the plurality of features (e.g., physical orientation information, acceleration information, visual information, user input information detected at a presence-sensitive and/or touch-sensitive display device operatively coupled to the computing device, etc.) may be determined with respect to known right-handed and/or left-handed users. The input values determined with respect to the known right-handed and/or left-handed users may be used to determine one or more baseline values, each baseline value corresponding to a respective feature from the plurality of features. The baseline values may serve as a basis for comparison against which handedness module 8 may compare received inputs from one or more sensors 6 using the probabilistic model. For instance, handedness module 8 may determine a feature vector including the plurality of features, each of which is usable to determine a dominant hand of the user. Handedness module 8 may compare an input vector including a plurality of inputs determined from one or more sensors 6 to the feature vector including the baseline values. Handedness module 8 may determine a dominant hand of the user based at least in part on the comparison.

As an example, during the ground-truth collection phase, known right-handed and/or known left-handed users may be asked to use a computing device, such as computing device 2, in various ways. For instance, such users may be asked to interact with a GUI output at a presence-sensitive display operatively coupled to the computing device (e.g., display 4), such as by providing user input gestures to enter textual strings, select various selectable objects (e.g., button controls), and the like. Similarly, such users may be asked to use the computing device for telephonic communications, such as by moving the computing device to either the left or right side of the user's head. For each of the known right-handed and left-handed users, input values corresponding to a plurality of features of the feature vector may be recorded. For instance, physical orientation information received from a gyroscope of the computing device, acceleration information received from one or more accelerometers of the computing device, and visual information received from one or more image sensors of the computing device (e.g., visual information including a visual representation of an anatomical feature of the user's head, such as an ear of the user) may be recorded. Similarly, information associated with one or more user inputs detected at a presence-sensitive and/or touch-sensitive display device operatively coupled to the computing device may be determined, such as a frequency at which user input gestures are received at one or more portions of the display, a portion of the display that detects a user input indicating that the display is in contact with a cheek of the user, and the like.

In certain examples, a baseline feature vector (e.g., a stereotype) may be established using the recorded inputs. For instance, the baseline feature vector may be determined using an average of the recorded inputs associated with each feature, a weighted average of the recorded inputs, or other central tendency techniques to establish a baseline feature vector for at least one of a stereotypical right-handed and left-handed user.

Using the probabilistic model, handedness module 8 may compare a plurality of received inputs corresponding to the plurality of features to the baseline feature vector to determine a dominant hand of the user. For instance, in certain examples, handedness module 8 may determine an angle in n-dimensional space between the n-dimensional baseline feature vector and the n-dimensional input feature vector, where "n" represents the number of distinct features of each of the two feature vectors. Therefore, in some examples, rather than determine the dominant hand of the user based upon only one type of input (e.g., only one of physical orientation information, acceleration information, or visual information associated with the computing device), computing device 2 may determine the dominant hand of the user based at least in part on the plurality of input values corresponding to the plurality of features. As such, techniques of this disclosure may increase the certainty with which a computing device may determine the dominant hand of a user based upon input values determined from one or more sensors of the computing device. Moreover, because the baseline values may be adjusted based upon further observations with respect to the user or other known right-handed or left-handed users, the techniques may enable the computing device to further increase the certainty with which the computing device may determine the dominant hand of any particular user.
GUI module 10 may generate, based at least in part on the determined dominant hand of the user, a GUI for display at a display device operatively coupled to computing device 2 (e.g., display 4). As one example, handedness module 8 may determine that a right hand is a dominant hand of user 3. In such an example, GUI module 10 may generate a right-handed GUI for display at display 4. For instance, as illustrated in FIG. 1A, GUI module 10 may generate a GUI that includes one or more graphical elements 12 arranged in a right-handed visual configuration. In some examples, the right-handed visual configuration may include a visual layout of graphical elements 12 such that graphical elements 12 are positioned at locations of display 4 along an arc that follows a radius reachable by a right thumb of user 3. That is, as illustrated in FIG. 1A, GUI module 10 may generate the right-handed GUI such that graphical elements 12 are positioned at a right portion of display 4 along an arc that follows a radius of a typical motion of a right thumb of user 3 as user 3 moves his or her right thumb between a top portion of display 4 and a bottom portion of display 4.

As another example, handedness module 8 may determine that a left hand is a dominant hand of user 3. In such an example, GUI module 10 may generate a left-handed GUI for display at display 4 including a left-handed visual configuration that is different from a right-handed visual configuration. For instance, the left-handed visual configuration may include a visual layout of graphical elements 12 such that graphical elements 12 are positioned at locations of display 4 along an arc that follows a radius reachable by a left thumb of user 3. As such, techniques of this disclosure may promote usability of computing device 2 by facilitating user selection of graphical elements with a dominant hand of a user.
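A geometric sketch of such a thumb-arc layout, assuming the arc is centered near the bottom corner on the dominant-hand side of the display; the pivot location, radius, and sweep angle are assumptions rather than values taken from this disclosure.

```python
# Illustrative sketch: place n elements along an arc reachable by the thumb
# of the hand holding the device, pivoting near a bottom corner.

import math

def thumb_arc_positions(n, width, height, radius, right_handed=True):
    """Return n (x, y) positions along an arc centered near the bottom
    corner on the dominant-hand side of a width x height display."""
    cx = width if right_handed else 0   # pivot roughly at the thumb base
    cy = height
    positions = []
    for i in range(n):
        # sweep from pointing straight up (90 deg) toward the opposite side
        theta = math.radians(90 + i * (70 / max(n - 1, 1)))
        x = cx + radius * math.cos(theta) * (1 if right_handed else -1)
        y = cy - radius * math.sin(theta)
        positions.append((round(x), round(y)))
    return positions

# Five elements arced within right-thumb reach on a 720x1280 display:
for pos in thumb_arc_positions(5, 720, 1280, radius=420):
    print(pos)
```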
FIG. 1B is a conceptual diagram illustrating an example of the computing device of FIG. 1A that may be used to determine a dominant hand of a user and generate a graphical user interface based at least in part on the determined dominant hand, in accordance with one or more aspects of this disclosure. As illustrated in the example of FIG. 1B, user 3 may use computing device 2 for telephonic communications, such as by holding computing device 2 to right ear 14 and right cheek 16 or to left ear 18 and left cheek 20. As shown in FIG. 1B, when computing device 2 is held against right ear 14 and right cheek 16, a physical orientation of computing device 2 may result in angle 24 of the physical orientation of computing device 2 with respect to the ground. In contrast, when computing device 2 is held against left ear 18 and left cheek 20, a physical orientation of computing device 2 may result in angle 26 with respect to the ground.

As illustrated, angle 24 may be different from angle 26. That is, angle 24 may be substantially opposite angle 26. Physical orientation information of computing device 2, such as information received from one or more gyroscopes or accelerometers of computing device 2, may be usable to determine the dominant hand of user 3. For instance, because user 3 may typically hold computing device 2 in a dominant hand to a dominant side of his or her head when using computing device 2 for telephonic communications, physical orientation information indicating a physical orientation of computing device 2 that is within a threshold value of angle 24 (e.g., a threshold value of one degree, five degrees, ten degrees, or other threshold values) may indicate that a right hand is a dominant hand of user 3. Similarly, physical orientation information indicating a physical orientation of computing device 2 that is within a threshold value of angle 26 may indicate that a left hand is a dominant hand of user 3.
FIG. 2 is a block diagram illustrating further details of one example of the computing device shown in FIGS. 1A and 1B, in accordance with one or more aspects of this disclosure. FIG. 2 illustrates only one particular example of computing device 2, and many other examples of computing device 2 may be used in other instances.

As shown in the specific example of FIG. 2, computing device 2 includes display 4, user interface 30, one or more processors 32, one or more communication units 34, one or more sensors 6, and one or more storage devices 38. As illustrated, computing device 2 further includes handedness module 8, GUI module 10, and operating system 39, which are executable by computing device 2. Each of these components may be interconnected (physically, communicatively, and/or operatively) for inter-component communication. Communication channels 36 may include a system bus, a network connection, an inter-process communication data structure, or any other channel for communicating data. As one example in FIG. 2, the components may be coupled by one or more communication channels 36. Handedness module 8, GUI module 10, and operating system 39 may also communicate information with one another, as well as with other components of computing device 2.

One or more processors 32, in one example, are configured to implement functionality and/or process instructions for execution within computing device 2. For example, one or more processors 32 may be capable of processing instructions stored at one or more storage devices 38. Examples of one or more processors 32 may include any one or more of a microprocessor, a controller, a digital signal processor (DSP), an application specific integrated circuit (ASIC), a field-programmable gate array (FPGA), or equivalent discrete or integrated logic circuitry.

One or more storage devices 38 may be configured to store information within computing device 2 during operation. One or more storage devices 38, in some examples, may be described as a computer-readable storage medium. In some examples, one or more storage devices 38 may be a temporary memory, meaning that a primary purpose of one or more storage devices 38 is not long-term storage. One or more storage devices 38 may, in some examples, be described as a volatile memory, meaning that one or more storage devices 38 do not maintain stored contents when the computer is turned off. Examples of volatile memories include random access memories (RAM), dynamic random access memories (DRAM), static random access memories (SRAM), and other forms of volatile memories known in the art. In some examples, one or more storage devices 38 may be used to store program instructions for execution by one or more processors 32. One or more storage devices 38, for example, may be used by software or applications running on computing device 2 (e.g., handedness module 8 and/or GUI module 10) to temporarily store information during program execution.

One or more storage devices 38, in some examples, also include one or more computer-readable storage media. One or more storage devices 38 may be configured to store larger amounts of information than volatile memory. One or more storage devices 38 may further be configured for long-term storage of information. In some examples, one or more storage devices 38 include non-volatile storage elements. Examples of such non-volatile storage elements include magnetic hard discs, optical discs, floppy discs, flash memories, or forms of electrically programmable (EPROM) or electrically erasable and programmable (EEPROM) memories.

User interface 30 may allow a user of computing device 2 to interact with computing device 2. Examples of user interface 30 may include, but are not limited to, a keypad embedded on computing device 2, a keyboard, a mouse, a roller ball, buttons, or other devices that allow a user to interact with computing device 2. In some examples, computing device 2 may not include user interface 30, and the user may interact with computing device 2 using display 4 (e.g., by providing various user gestures). In some examples, the user may interact with computing device 2 using user interface 30 or display 4.

Computing device 2, in some examples, also includes one or more communication units 34. Computing device 2, in one example, utilizes one or more communication units 34 to communicate with external devices via one or more networks, such as one or more wireless networks, one or more cellular networks, or other types of networks. One or more communication units 34 may be a network interface card, such as an Ethernet card, an optical transceiver, a radio frequency transceiver, or any other type of device that can send and receive information. Other examples of such network interfaces may include Bluetooth, 3G, and WiFi radios, as well as Universal Serial Bus (USB) interfaces. In some examples, computing device 2 utilizes one or more communication units 34 for telephonic communications with an external device.

Computing device 2 may also include one or more sensors 6. Examples of one or more sensors 6 may include, but are not limited to, accelerometers, gyroscopes, magnetometers, audio input devices (e.g., a microphone), image sensors (e.g., an image sensor associated with a camera device of computing device 2), and proximity sensors. Computing device 2 may receive a plurality of input values from one or more sensors 6. As an example, computing device 2 may receive acceleration information from one or more accelerometers, physical orientation information from one or more gyroscopes, physical orientation information from one or more magnetometers (e.g., physical orientation information with respect to the magnetic field of the earth), audio information from one or more audio input devices, visual information from one or more image sensors (e.g., visual information representing an anatomical feature of a user, such as an ear of the user), and proximity information from one or more proximity sensors (e.g., information indicating physical proximity of computing device 2 to another object).

Computing device 2 may include operating system 39. Operating system 39, in some examples, controls the operation of components of computing device 2. For example, operating system 39, in one example, facilitates the communication of handedness module 8 and GUI module 10 with one or more processors 32, display 4, user interface 30, one or more communication units 34, and one or more sensors 6, as described in FIGS. 1A and 1B.
In accordance with techniques of this disclosure, handedness module 8 may determine a plurality of features, each of which is usable to determine a dominant hand of a user. Handedness module 8 may receive a plurality of input values from one or more sensors 6 corresponding to the respective plurality of features. For example, handedness module 8, executing on one or more processors 32, may receive a plurality of input values from one or more sensors 6 using communication channels 36. Handedness module 8 may determine a dominant hand of the user based at least in part on the plurality of input values corresponding to the plurality of features. In response, GUI module 10 may generate a GUI for display at a display device operatively coupled to computing device 2 (e.g., display 4) based at least in part on the determination of the dominant hand.
Handedness module 8 may use a probabilistic model, such as a Bayesian network, to determine the dominant hand of the user. For example, for one or more of the plurality of input values determined from information received from one or more sensors 6, handedness module 8 may determine a difference between the respective input value and a respective baseline value. The respective baseline values may be determined, in some examples, using input values received during a ground-truth data collection phase. For instance, information from sensors (e.g., one or more sensors 6) may be collected while known right-handed and left-handed users perform various tasks using computing device 2 or another similar computing device. The baseline values may be used to determine a feature vector that represents stereotypical state information of computing device 2 (e.g., physical orientation information, acceleration information, etc.) during use by known right-handed and/or left-handed users. In certain examples, the baseline values may be modified based on information received from one or more sensors 6. For instance, a baseline value corresponding to a feature representing a stereotypical cheek-press input (e.g., a profile of an area of a presence-sensitive display that detects input indicating contact between the presence-sensitive display and a cheek of a user) may be modified based on user input information detected at display 4.
Handedness module 8 may apply a weighted value associated with the respective feature to the determined difference to generate a weighted difference value. For example, while each feature of the feature vector may be usable to determine a dominant hand of a user, certain of the features may provide a stronger indication of a dominant hand of the user. As one example, visual information including a representation of at least a portion of an ear of the user may provide a stronger indication of a dominant hand of a user than acceleration information indicating a motion of computing device 2 to a particular side of a user's head. In such an example, handedness module 8 may apply a greater weighted value (e.g., a larger coefficient) to a determined difference between visual information input values and a visual information baseline feature than to a determined difference between acceleration information input values and an acceleration information baseline feature. In some examples, handedness module 8 may apply weighted values that range between zero and one.
Handedness module 8 may aggregate the one or more weighted difference values to determine an aggregated weighted difference value. For example, handedness module 8 may determine a distance between an n-dimensional input feature vector and an n-dimensional baseline feature vector, where "n" represents the number of features in each of the input and baseline feature vectors. In certain examples, handedness module 8 may determine a representation of an angle between an n-dimensional input feature vector and an n-dimensional baseline feature vector, such as by determining the cosine of the angle between the two vectors.
Handedness module 8 may determine the dominant hand of the user based at least in part on the aggregated weighted difference value. For instance, handedness module 8 may compare the aggregated weighted difference value to a threshold value, and may determine the dominant hand of the user based on the comparison. As one example, handedness module 8 may determine that an aggregated weighted difference value that is greater than or equal to a threshold value corresponds to a right-handed user, and that an aggregated weighted difference value that is less than the threshold value corresponds to a left-handed user.
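The per-feature weighting, aggregation, and threshold comparison described in the preceding paragraphs might be sketched as follows; the feature names, weights, baseline values, and zero threshold are all invented for illustration.

```python
# Illustrative sketch: per-feature differences from a baseline are scaled
# by weights in [0, 1] and summed; the aggregate relative to a threshold
# picks the hand.

def aggregate_weighted_difference(inputs, baseline, weights):
    """Sum weighted (input - baseline) differences across features."""
    total = 0.0
    for feature, value in inputs.items():
        diff = value - baseline[feature]
        total += weights[feature] * diff
    return total

# Hypothetical features normalized so that positive values lean right-handed.
baseline = {"orientation": 0.0, "touch_bias": 0.0, "ear_visual": 0.0}
weights = {"orientation": 0.5, "touch_bias": 0.7, "ear_visual": 0.9}
inputs = {"orientation": 0.4, "touch_bias": 0.6, "ear_visual": 0.8}

score = aggregate_weighted_difference(inputs, baseline, weights)
threshold = 0.0
print("right-handed" if score >= threshold else "left-handed")
```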
In certain examples, handedness module 8 may determine the plurality of features that are usable to determine a dominant hand of a user in response to one or more received inputs corresponding to a particular feature. For instance, handedness module 8 may receive an input value corresponding to a first feature. In response, handedness module 8 may determine one of an active state and an inactive state of a sensor associated with a second feature based at least in part on a criterion that specifies a relationship between the first feature and the second feature. Handedness module 8 may activate the sensor associated with the second feature in response to determining an active state of the sensor. Similarly, handedness module 8 may, in certain examples, deactivate the sensor associated with the second feature in response to determining an inactive state of the sensor.

As an example, the input value corresponding to the first feature may include an indication of a user input detected at a presence-sensitive display operatively coupled to computing device 2 (e.g., display 4). The sensor associated with the second feature may include an image sensor of the mobile computing device (e.g., an image sensor associated with a camera device of computing device 2). Handedness module 8 may determine at least one of an active and inactive state of the image sensor based at least in part on a determination that a received indication of a user input detected at the presence-sensitive display is indicative of a contact between the presence-sensitive display and at least a portion of a head of a user (e.g., a cheek-press user input). Handedness module 8 may determine an active state of the image sensor when the received indication of the user input indicates a contact between the presence-sensitive display and at least the portion of the head of the user.
Handedness module 8 may determine an inactive state of the image sensor when the received indication of the user input does not indicate a contact between the presence-sensitive display and at least a portion of the head of the user. As such, in examples where computing device 2 includes a battery to provide electrical power to components of computing device 2, handedness module 8 may help to decrease power consumption of components of computing device 2. That is, rather than require that each sensor of sensors 6 be active during use of computing device 2, handedness module 8 may activate and deactivate at least one of sensors 6 based on received inputs corresponding to a particular feature. In the above example, rather than require that a camera device of computing device 2 be active at all times to detect information associated with a portion of a user's head (e.g., at least a portion of an ear of the user), handedness module 8 may activate the camera device in response to receiving a user input at display 4 indicating that display 4 is in contact with at least a portion of the user's head. As such, handedness module 8 may conserve battery power by activating the camera device in response to an input indicating that such visual information may likely be available.

Similarly, the input value corresponding to the first feature may include the indication of the user input detected at the presence-sensitive display (e.g., display 4), and the sensor associated with the second feature may include a gyroscope of computing device 2. Handedness module 8 may determine the active state of the gyroscope based at least in part on a determination that the received indication of the user input at the presence-sensitive display is indicative of a contact between the presence-sensitive display and at least a portion of the head of the user. As such, handedness module 8 may conserve battery power by activating the gyroscope in response to an input indicating that physical orientation information usable to determine the dominant hand of the user is likely available (e.g., a physical orientation of computing device 2 with respect to the ground when computing device 2 is used for telephonic communications).

As another example, the input value corresponding to the first feature may include an audio input from an audio input device (e.g., a microphone) of computing device 2. In such an example, the sensor associated with the second feature may include an accelerometer of computing device 2. Handedness module 8 may determine the active state of the accelerometer based at least in part on a determination that a received audio input is indicative of wind noise. For example, a received audio input that is indicative of wind noise may indicate movement of computing device 2. As such, handedness module 8 may decrease power consumption of accelerometers of computing device 2 by activating the accelerometers in response to determining that the received audio input indicates wind noise, and hence, possible motion of computing device 2.
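The sensor-gating examples above share one pattern: a cheap, always-available signal switches a costlier sensor between active and inactive states. A minimal sketch, with an assumed Sensor class and trigger conditions:

```python
# Illustrative sketch: expensive sensors stay off until a cheaper signal
# suggests their data would be useful for handedness inference.

class Sensor:
    def __init__(self, name):
        self.name = name
        self.active = False

    def set_active(self, active):
        self.active = active
        print(f"{self.name}: {'on' if active else 'off'}")

camera = Sensor("image sensor")
gyro = Sensor("gyroscope")
accel = Sensor("accelerometer")

def on_touch_event(is_cheek_press):
    # A cheek-press at the display implies the device is at the user's
    # head, so ear imagery and call-orientation data may be available.
    camera.set_active(is_cheek_press)
    gyro.set_active(is_cheek_press)

def on_audio_event(wind_noise_detected):
    # Wind noise hints that the device is moving, so acceleration data
    # usable for handedness inference may be available.
    if wind_noise_detected:
        accel.set_active(True)

on_touch_event(is_cheek_press=True)        # image sensor: on, gyroscope: on
on_audio_event(wind_noise_detected=True)   # accelerometer: on
```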
FIGS. 3A and 3B are conceptual diagrams illustrating an example computing device that may be used to determine a dominant hand of a user and generate a graphical user interface based at least in part on the determined dominant hand, in accordance with one or more aspects of this disclosure. In the example of FIG. 3A, GUI module 10 generates a GUI for display at display 4 in a right-handed dominant hand visual configuration including graphical elements 40 in visual layout 42. In the example of FIG. 3B, GUI module 10 generates a GUI for display at display 4 in a left-handed non-dominant hand visual configuration including graphical elements 44 in visual layout 46.
In the example of FIG. 3A, handedness module 8 may determine that a right hand of user 3 is a dominant hand of user 3. GUI module 10 may generate a GUI including graphical elements 40 for display at display 4 based at least in part on the determination of the right hand as the dominant hand of the user. For example, as illustrated, GUI module 10 may generate the GUI in a right-handed visual configuration. The right-handed visual configuration may include a visual layout of at least one graphical element (e.g., at least one of graphical elements 40). For instance, as illustrated in FIG. 3A, GUI module 10 may generate the GUI for display such that graphical elements 40 are positioned at locations of display 4 along an arc that follows a radius reachable by a right thumb of user 3. That is, as illustrated, GUI module 10 may generate the right-handed visual configuration such that graphical elements 40 are positioned at a right portion of display 4 along an arc that follows a radius of a typical motion of a right thumb of user 3 as user 3 moves his or her right thumb between a bottom portion of display 4 and a top portion of display 4.
Similarly, in certain examples, handedness module 8 may determine that a left hand of user 3 is a dominant hand of user 3. In such examples, GUI module 10 may generate the GUI in a left-handed visual configuration. The left-handed visual configuration may include a visual layout of at least one graphical element (e.g., at least one of graphical elements 40). For example, in response to determining that a left hand of user 3 is a dominant hand of user 3, GUI module 10 may generate the left-handed GUI for display in a left-handed dominant hand visual configuration such that graphical elements 40 are positioned at locations of display 4 along an arc that follows a radius reachable by a left thumb of user 3.

The dominant hand visual configuration may be different from a non-dominant hand visual configuration. For instance, the dominant hand visual configuration may include one or more dominant hand layout properties that specify, for at least one graphical element, a visual layout of the at least one graphical element. Similarly, the non-dominant hand visual configuration may include one or more non-dominant hand layout properties that specify, for at least one graphical element, a visual layout of the at least one graphical element. The dominant hand visual layout may be different from the non-dominant hand visual layout.
For example, as illustrated in FIG. 3A, a right-handed visual configuration (e.g., a dominant hand visual configuration in this example) may include a right-handed visual layout of at least one of graphical elements 40. In this example, the right-handed visual configuration may include one or more right-handed layout properties that specify the visual layout of graphical elements 40. The right-handed visual layout (e.g., a dominant hand visual layout in this example) may be different from a left-handed visual layout (e.g., a non-dominant hand visual layout in this example). For instance, the left-handed visual configuration may include one or more left-handed layout properties that specify a left-handed visual layout of graphical elements 40 that is different from the right-handed visual layout of graphical elements 40.
As an example, the left-handed layout properties may specify a left-handed visual layout of graphical elements 40 such that graphical elements 40 are positioned along an arc that follows a radius reachable by a left thumb of user 3. Examples of such layout properties may include, but are not limited to, a size of at least one graphical element (e.g., a size of at least one of graphical elements 40), a shape of the at least one graphical element, a display location of the at least one graphical element at a display device (e.g., display 4), and information indicating whether the at least one graphical element is displayed at the display device. Each of the dominant hand and non-dominant hand visual configurations may include such visual layout properties. In addition, one or more of the respective visual layout properties associated with each of the dominant hand visual configuration and the non-dominant hand visual configuration may be different, such that a dominant hand visual layout of the dominant hand visual configuration is different from the non-dominant hand visual layout of the non-dominant hand visual configuration.
In certain examples, handedness module 8 may determine that a user (e.g., user 3) is currently holding computing device 2 with a non-dominant hand of the user. For instance, using techniques of this disclosure, handedness module 8 may determine the dominant hand of the user based at least in part on a received plurality of input values corresponding to a respective plurality of features usable to determine a dominant hand of a user. In addition, handedness module 8 may determine, based at least in part on the plurality of input values, that a user is currently holding computing device 2 with a non-dominant hand of the user.
As an example, handedness module 8 may determine that a right hand of user 3 is a dominant hand of user 3. In addition, handedness module 8 may determine that a plurality of input values corresponding to a respective plurality of features usable to determine the dominant hand of the user indicates that the user is currently holding computing device 2 with the non-dominant hand of the user. For instance, handedness module 8 may compare an input feature vector to a baseline feature vector determined with respect to known right-handed and/or left-handed users. Handedness module 8 may determine, in some examples, that user 3 is a right-handed user, and that the input feature vector correlates to a baseline feature vector associated with known left-handed users. In such examples, handedness module 8 may determine that user 3 may be currently holding computing device 2 with a non-dominant hand of user 3 (i.e., a left hand of user 3 in this example). Similarly, handedness module 8 may determine that user 3 is a left-handed user, and that the input feature vector correlates to a baseline feature vector associated with known right-handed users. In such examples, handedness module 8 may determine that user 3 may be currently holding computing device 2 with a non-dominant hand of user 3 (i.e., a right hand of user 3 in the current example). Responsive to determining that the user is currently holding computing device 2 with the non-dominant hand of the user, GUI module 10 may generate the GUI for display in a non-dominant hand visual configuration.
As illustrated in FIG. 3B, GUI module 10 generates a GUI in a left-handed non-dominant hand visual configuration including graphical elements 44 in visual layout 46. As illustrated, visual layout 46 (i.e., a non-dominant hand visual layout in this example) may be different from visual layout 42 (i.e., a dominant hand visual layout in the example of FIG. 3A). For instance, at least one of graphical elements 44 may correspond to at least one of graphical elements 40. However, visual layout 46 may differ from visual layout 42 with respect to at least one of a shape, a size, and a display location of the at least one corresponding graphical element. In addition, visual layout 46 may differ from visual layout 42 with respect to whether the at least one corresponding graphical element is displayed at display 4.

As an example, graphical elements 44 may correspond to certain of graphical elements 40 of FIG. 3A. For instance, each of graphical elements 44 may correspond to a respective one of graphical elements 40, displayed with a different size or at a different location of display 4. In addition, visual layout 46 may be different from visual layout 42 in that certain graphical elements displayed in visual layout 42 may not be displayed in visual layout 46 (i.e., certain of graphical elements 40 in this example).
As such, GUI module 10 may promote usability of computing device 2 by facilitating user selection of graphical elements with the non-dominant hand of the user. For example, to help compensate for a tendency of users to be less accurate when providing user input gestures with a non-dominant hand of the user, GUI module 10 may display fewer graphical elements in a non-dominant hand visual configuration than in a dominant hand visual configuration, each of the graphical elements of the non-dominant hand visual configuration being larger than the corresponding graphical elements of the dominant hand visual configuration.
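A sketch of this non-dominant hand adaptation, assuming each graphical element carries a name and size and that elements are ordered by importance; the three-element limit and 1.5x scale factor are illustrative assumptions.

```python
# Illustrative sketch: show fewer, larger targets when the device is held
# in the non-dominant hand, to compensate for reduced pointing accuracy.

def adapt_layout(elements, dominant_hand_in_use, max_nondominant=3, scale=1.5):
    """elements: list of (name, width, height), ordered by importance.
    Returns the elements to display, resized for the non-dominant hand."""
    if dominant_hand_in_use:
        return elements
    kept = elements[:max_nondominant]  # drop lower-priority elements
    return [(name, int(w * scale), int(h * scale)) for name, w, h in kept]

elements = [("call", 96, 96), ("text", 96, 96), ("mail", 96, 96),
            ("maps", 96, 96), ("clock", 96, 96)]
print(adapt_layout(elements, dominant_hand_in_use=False))
# -> three elements, each enlarged to 144x144
```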
FIG. 4 is a flow diagram illustrating example operations of a computing device to determine a dominant hand of a user and output a graphical user interface based at least in part on the determined dominant hand, in accordance with one or more aspects of this disclosure. The example illustrated in FIG. 4 is only one example operation, and other implementations may include more or fewer aspects than those depicted in FIG. 4. For purposes of illustration only, the example operations are described below within the context of computing device 2.
Handedness module 8, executing on one or more processors 32, may determine a plurality of features (50). Each feature from the plurality of features may be usable to determine a dominant hand of a user of computing device 2. Handedness module 8 may receive a plurality of input values, each input value from the plurality of input values corresponding to a respective feature from the plurality of features (52). For example, handedness module 8 may receive a plurality of input values from one or more sensors 6, each input value corresponding to a respective feature from the plurality of features. Handedness module 8 may select an input value from the plurality of input values (54). Handedness module 8 may determine a difference between the respective input value and a respective baseline value (56). Handedness module 8 may apply a weighted value associated with the respective feature to the determined difference to generate a weighted difference value (58).
Handedness module 8 may determine whether each input value of the plurality of input values has been evaluated (60). For example, handedness module 8 may determine, for each input value of the plurality of input values, whether a difference has been determined between the input value and a respective baseline value. When handedness module 8 determines that at least one of the input values of the plurality of input values has not been evaluated ("NO" branch of 60), handedness module 8 may select a next input value. When handedness module 8 determines that each input value of the plurality of input values has been evaluated ("YES" branch of 60), handedness module 8 may aggregate the weighted difference values to determine an aggregated weighted difference value (62).
Handedness module 8 may determine whether the aggregated weighted difference value corresponds to a left-handed user (64). When handedness module 8 determines that the aggregated value corresponds to a left-handed user ("YES" branch of 64), GUI module 10 may output for display at display 4 a GUI in a left-handed visual configuration (66). In some examples, when handedness module 8 determines that the aggregated value does not correspond to a left-handed user ("NO" branch of 64), handedness module 8 may determine whether the aggregated value corresponds to a right-handed user (68). In certain examples, when handedness module 8 determines that the aggregated value does not correspond to a left-handed user, GUI module 10 may output for display at display 4 a GUI in a right-handed visual configuration. That is, rather than perform operation 68 to determine whether the aggregated value corresponds to a right-handed user, GUI module 10 may output for display at display 4 a GUI in a right-handed visual configuration as a default visual configuration, and may output a GUI in a left-handed visual configuration in response to handedness module 8 determining that the aggregated value corresponds to a left-handed user.
handedness module 8 determines that the aggregated value corresponds to a right-handed user (“YES” branch of 68), GUI module 10 may output for display at display 4 a GUI in a right-handed visual configuration (70). When handedness module 8 determines that the aggregated value does not correspond to a right-handed user (“NO” branch of 68), handedness module 8 may determine a plurality of features, each of which may be usable to determine a dominant hand of a user. - In some examples,
GUI module 10 may output for display at display 4 a GUI in a hand-neutral visual configuration. For instance, when handedness module 8 determines that the aggregated value does not correspond to a right-handed user (“NO” branch of 68), GUI module 10 may output for display at display 4 a GUI in a hand-neutral visual configuration. In certain examples, GUI module 10 may output for display at display 4 the GUI in the hand-neutral visual configuration and handedness module 8 may determine a plurality of features, each of which may be usable to determine a dominant hand of a user (e.g., operation 50). - A hand-neutral visual configuration may include, for example, a visual configuration that favors neither a left hand nor a right hand of a user. For instance, a hand-neutral visual configuration of a GUI may include one or more graphical elements (e.g., one or more graphical button controls) output at locations of
display 4 equidistant between a typical arc of a left thumb of a user holding a mobile computing device in the left hand of the user and a typical arc of a right thumb of a user holding the mobile computing device in the right hand of the user. In some examples, one or more of a size and shape of at least one graphical element included in a hand-neutral visual configuration may be configured to favor neither a left hand nor a right hand of a user. For instance, one or more visual layout properties associated with each of a dominant hand visual configuration (e.g., a left-handed visual configuration in one example) and a non-dominant hand visual configuration (e.g., a right-handed visual configuration in one example) may define one or more of a size and shape of at least one graphical element included in the dominant hand visual configuration and the non-dominant hand visual configuration. As an example, the one or more visual layout properties may specify a particular size of a graphical element for display in the non-dominant hand visual configuration, and may specify a smaller size of the graphical element for display in the dominant hand visual configuration. In certain examples, GUI module 10 may output for display at display 4 a GUI in a hand-neutral visual configuration, such as by outputting one or more graphical elements with a size that is an average of the size of the one or more graphical elements specified by the visual layout properties associated with each of a dominant hand visual configuration and a non-dominant hand visual configuration. - In some examples,
GUI module 10 may output for display at display 4 a GUI in a visual configuration specified by a user of computing device 2. For instance, a user may specify one of a right-handed, left-handed, or hand-neutral visual configuration, such as by using user interface 30 (e.g., selecting a visual configuration preference). In such examples, GUI module 10 may output for display at display 4 a GUI in a visual configuration corresponding to the user-selected visual configuration. That is, in such examples, GUI module 10 may output for display at display 4 a GUI in a visual configuration corresponding to the user-selected visual configuration regardless of determinations made by handedness module 8 based on the aggregated weighted difference values.
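Taken together, operations 64 through 70, the hand-neutral fallback, and the user preference override described above might be expressed as in the following Python sketch. The thresholds, the averaging rule for the hand-neutral size, and the string labels are assumptions introduced here for illustration.

```python
# Illustrative sketch of the decision logic: a user-selected preference wins;
# otherwise the aggregated weighted difference value selects a left-handed,
# right-handed, or hand-neutral visual configuration.
from typing import Optional

def choose_visual_configuration(aggregated_value: float,
                                user_preference: Optional[str] = None,
                                left_threshold: float = -0.5,
                                right_threshold: float = 0.5) -> str:
    if user_preference in ("left-handed", "right-handed", "hand-neutral"):
        return user_preference                      # preference overrides inference
    if aggregated_value <= left_threshold:          # "YES" branch of operation 64
        return "left-handed"
    if aggregated_value >= right_threshold:         # "YES" branch of operation 68
        return "right-handed"
    return "hand-neutral"                           # neither branch matched

def hand_neutral_size(dominant_size_px: int, non_dominant_size_px: int) -> int:
    # One possible hand-neutral rule: average the sizes specified by the
    # dominant and non-dominant hand layout properties.
    return (dominant_size_px + non_dominant_size_px) // 2

print(choose_visual_configuration(-0.7))                  # left-handed
print(choose_visual_configuration(0.1))                   # hand-neutral
print(choose_visual_configuration(0.1, "right-handed"))   # right-handed
print(hand_neutral_size(48, 72))                          # 60
```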
FIG. 5 is a flow diagram illustrating example operations of a computing device to determine a dominant hand of a user and output a graphical user interface based at least in part on the determined dominant hand, in accordance with one or more aspects of this disclosure. The example illustrated in FIG. 5 is only one example operation, and other implementations may include more or fewer aspects than those depicted in FIG. 5. For purposes of illustration only, the example operations are described below within the context of computing device 2.
Computing device 2 may determine a plurality of features, wherein each feature from the plurality of features is usable to determine a dominant hand of a user of computing device 2 (72). Computing device 2 may receive a plurality of input values, each input value from the plurality of input values corresponding to a respective feature from the plurality of features (74). Computing device 2 may determine, using a probabilistic model and based at least in part on at least one input value from the plurality of input values corresponding to the respective feature from the plurality of features, a hand of the user as a dominant hand of the user (76). Computing device 2 may generate, based at least in part on the determined dominant hand of the user, a graphical user interface for display at a display device operatively coupled to computing device 2 (e.g., a presence-sensitive display) (78). - In some examples, generating the graphical user interface based at least in part on the determined dominant hand includes generating for display the graphical user interface in a dominant hand visual configuration. The dominant hand visual configuration may be different from a non-dominant hand visual configuration. In certain examples, the dominant hand visual configuration includes a first visual layout of the at least one graphical element at the display device, the non-dominant hand visual configuration includes a second visual layout of the at least one graphical element at the display device, and the first visual layout is different from the second visual layout.
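As a rough end-to-end illustration of operations 72 through 78, the following Python sketch strings the earlier pieces together; every helper, feature name, and threshold here is a placeholder standing in for behavior described in the text, not the actual implementation.

```python
# Illustrative end-to-end sketch of operations 72-78.
def collect_input_values(features):
    # Operation 74: in practice these values would come from sensors; here
    # they are fixed toy readings keyed by feature name.
    toy_readings = {"touch_x_offset": -0.3, "roll_angle": -0.1}
    return {f: toy_readings.get(f, 0.0) for f in features}

def classify(score, left_threshold=-0.5, right_threshold=0.5):
    if score <= left_threshold:
        return "left-handed"
    if score >= right_threshold:
        return "right-handed"
    return "hand-neutral"

def determine_and_generate(features, baselines, weights):
    values = collect_input_values(features)                                  # operation 74
    score = sum(weights[f] * (values[f] - baselines[f]) for f in features)   # operation 76
    configuration = classify(score)
    return f"GUI generated in {configuration} visual configuration"         # operation 78

features = ["touch_x_offset", "roll_angle"]
print(determine_and_generate(features,
                             baselines={f: 0.0 for f in features},
                             weights={"touch_x_offset": 2.0, "roll_angle": 1.0}))
```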
- In some examples, the dominant hand visual configuration includes one or more dominant hand layout properties that specify, for the at least one graphical element, the first visual layout of the at least one graphical element, and the non-dominant hand visual configuration includes one or more non-dominant hand layout properties that specify, for the at least one graphical element, the second visual layout of the at least one graphical element. In certain examples, the one or more dominant hand layout properties include one or more of a size of the at least one graphical element, a shape of the at least one graphical element, a display location of the at least one graphical element at the display device, and information indicating whether the at least one graphical element is displayed at the display device. In some examples, the one or more non-dominant hand layout properties include one or more of a size of the at least one graphical element, a shape of the at least one graphical element, a display location of the at least one graphical element at the display device, and information indicating whether the at least one graphical element is displayed at the display device.
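The dominant and non-dominant hand layout properties enumerated above could be modeled as a simple record, as in the following sketch; the field names and values are assumptions introduced for illustration.

```python
# Illustrative sketch: per-element layout properties mirroring those listed
# in the text (size, shape, display location, and whether the element is
# displayed at the display device at all).
from dataclasses import dataclass
from typing import Tuple

@dataclass
class LayoutProperties:
    size_px: int
    shape: str                   # e.g., "rounded-rect" or "circle"
    location: Tuple[int, int]    # (x, y) display location
    displayed: bool = True

# A graphical element might carry one set of properties per configuration.
dominant_hand = LayoutProperties(size_px=48, shape="rounded-rect", location=(600, 1200))
non_dominant_hand = LayoutProperties(size_px=72, shape="rounded-rect", location=(120, 1200))

def properties_for(configuration: str) -> LayoutProperties:
    return dominant_hand if configuration == "dominant" else non_dominant_hand

print(properties_for("non-dominant"))
```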
- In certain examples, the example operations further include determining, based at least in part on the received plurality of input values corresponding to the respective plurality of features, that the user is currently holding
computing device 2 with a non-dominant hand of the user; and responsive to determining that the user is currently holding the computing device 2 with the non-dominant hand of the user, generating for display the graphical user interface in the non-dominant hand visual configuration. In some examples, at least one input value from the plurality of input values includes acceleration information from an accelerometer of computing device 2. In certain examples, at least one input value from the plurality of input values includes physical orientation information from a gyroscope of computing device 2. In some examples, the display device includes a presence-sensitive display, and at least one input value from the plurality of input values includes an indication of a user input detected at the presence-sensitive display. - In certain examples, at least one input value from the plurality of input values includes visual information from an image sensor of
computing device 2. In some examples, the visual information includes a visual representation of an anatomical feature of a head of a user. In certain examples, the anatomical feature of the head of the user includes at least a portion of an ear of the user. - In some examples, determining the dominant hand of the user, using the probabilistic model, includes, for one or more input values from the plurality of input values, determining a difference between the respective input value and a respective baseline value, and applying a weighted value associated with the respective feature to the determined difference to generate a weighted difference value. In such examples, determining the dominant hand of the user may include aggregating the one or more weighted difference values to determine an aggregated weighted difference value, and determining the dominant hand of the user based at least in part on the aggregated weighted difference value.
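The kinds of input values listed above (acceleration, physical orientation, touch indications, and visual information such as a detected ear) might be gathered into one feature vector as in the following sketch; the read_* helpers are placeholders, since no actual sensor API is given in the text.

```python
# Illustrative sketch: assemble input values from several sensors into a
# single feature vector. The read_* helpers return fixed toy values and
# stand in for real sensor APIs.
def read_accelerometer():
    return (-0.2, 0.0, 9.7)       # (x, y, z) acceleration, toy values

def read_gyroscope():
    return (-0.1, 0.05, 0.0)      # physical orientation, toy values

def read_touch_x_offset():
    return -0.3                   # horizontal offset of recent touches, toy value

def detect_ear_side():
    return "left"                 # which ear the image sensor appears to see

def build_feature_vector():
    ax, _ay, _az = read_accelerometer()
    roll, _pitch, _yaw = read_gyroscope()
    return {
        "acceleration_x": ax,
        "roll_angle": roll,
        "touch_x_offset": read_touch_x_offset(),
        "ear_is_left": 1.0 if detect_ear_side() == "left" else 0.0,
    }

print(build_feature_vector())
```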
- In certain examples, determining the plurality of features includes, in response to receiving an input value corresponding to a first feature, determining, based at least in part on a criterion that specifies a relationship between the first feature and a second feature, one of an active state and an inactive state of a sensor associated with the second feature. In such examples, determining the plurality of features may further include activating the sensor associated with the second feature in response to determining the active state of the sensor, and deactivating the sensor associated with the second feature in response to determining the inactive state of the sensor.
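The criterion-driven activation or deactivation of a sensor associated with a second feature, including the wind-noise and head-contact examples discussed in the paragraphs that follow, might be sketched like this; the predicate is an invented stand-in for real signal analysis.

```python
# Illustrative sketch: an input value for a first feature (here, audio) is
# tested against a criterion, and the sensor associated with a second
# feature (here, an accelerometer) is activated or deactivated accordingly.
def looks_like_wind_noise(audio_samples) -> bool:
    # Toy criterion only: treat a large mean sample magnitude as wind noise.
    return sum(abs(s) for s in audio_samples) / len(audio_samples) > 0.5

class Sensor:
    def __init__(self, name: str):
        self.name = name
        self.active = False

    def set_active(self, active: bool) -> None:
        self.active = active
        print(f"{self.name} {'activated' if active else 'deactivated'}")

def update_accelerometer_state(audio_samples, accelerometer: Sensor) -> None:
    # Active state when the audio input is indicative of wind noise;
    # inactive state otherwise.
    accelerometer.set_active(looks_like_wind_noise(audio_samples))

accelerometer = Sensor("accelerometer")
update_accelerometer_state([0.9, -0.8, 0.7, -0.9], accelerometer)    # activated
update_accelerometer_state([0.05, -0.02, 0.01, 0.0], accelerometer)  # deactivated
```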
- In some examples, receiving the input value corresponding to the first feature includes receiving an audio input from an audio input device of the
computing device 2, and the sensor associated with the second feature includes an accelerometer of the computing device 2. In such examples, determining the active state of the accelerometer includes determining the active state based at least in part on a determination that the received audio input is indicative of wind noise. In certain examples, the display device includes a presence-sensitive display, receiving the input value corresponding to the first feature includes receiving an indication of a user input detected at the presence-sensitive display, and the sensor associated with the second feature includes an image sensor of computing device 2. In such examples, determining the active state of the image sensor includes determining the active state based at least in part on a determination that the received indication of the user input detected at the presence-sensitive display is indicative of a contact between the presence-sensitive display and at least a portion of a head of the user. - In some examples, the display device includes a presence-sensitive display, receiving the input value corresponding to the first feature includes receiving an indication of a user input detected at the presence-sensitive display, and the sensor associated with the second feature comprises a gyroscope of the
computing device 2. In such examples, determining the active state of the gyroscope includes determining the active state based at least in part on a determination that the received indication of the user input detected at the presence-sensitive display is indicative of a contact between the presence-sensitive display and at least a portion of a head of the user. In certain examples, the portion of the head of the user may include a cheek of the user. - The techniques described in this disclosure may be implemented, at least in part, in hardware, software, firmware, or any combination thereof. For example, various aspects of the described techniques may be implemented within one or more processors, including one or more microprocessors, digital signal processors (DSPs), application specific integrated circuits (ASICs), field programmable gate arrays (FPGAs), or any other equivalent integrated or discrete logic circuitry, as well as any combinations of such components. The term “processor” or “processing circuitry” may generally refer to any of the foregoing logic circuitry, alone or in combination with other logic circuitry, or any other equivalent circuitry. A control unit including hardware may also perform one or more of the techniques of this disclosure.
- Such hardware, software, and firmware may be implemented within the same device or within separate devices to support the various techniques described in this disclosure. In addition, any of the described units, modules or components may be implemented together or separately as discrete but interoperable logic devices. Depiction of different features as modules or units is intended to highlight different functional aspects and does not necessarily imply that such modules or units must be realized by separate hardware, firmware, or software components. Rather, functionality associated with one or more modules or units may be performed by separate hardware, firmware, or software components, or integrated within common or separate hardware, firmware, or software components.
- The techniques described in this disclosure may also be embodied or encoded in an article of manufacture including a computer-readable storage medium encoded with instructions. Instructions embedded or encoded in an article of manufacture including an encoded computer-readable storage medium may cause one or more programmable processors, or other processors, to implement one or more of the techniques described herein, such as when the instructions included or encoded in the computer-readable storage medium are executed by the one or more processors. Computer-readable storage media may include random access memory (RAM), read only memory (ROM), programmable read only memory (PROM), erasable programmable read only memory (EPROM), electronically erasable programmable read only memory (EEPROM), flash memory, a hard disk, a compact disc ROM (CD-ROM), a floppy disk, a cassette, magnetic media, optical media, or other computer-readable media. In some examples, an article of manufacture may include one or more computer-readable storage media.
- In some examples, a computer-readable storage medium may include a non-transitory medium. The term “non-transitory” may indicate that the storage medium is not embodied in a carrier wave or a propagated signal. In certain examples, a non-transitory storage medium may store data that can, over time, change (e.g., in RAM or cache).
- Various examples have been described. These and other examples are within the scope of the following claims.
Claims (20)
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US13/658,632 US8665238B1 (en) | 2012-09-21 | 2012-10-23 | Determining a dominant hand of a user of a computing device |
PCT/US2013/060736 WO2014047361A2 (en) | 2012-09-21 | 2013-09-19 | Determining a dominant hand of a user of a computing device |
CN201380058341.8A CN104769526A (en) | 2012-09-21 | 2013-09-19 | Determining a dominant hand of a user of a computing device |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US201261704334P | 2012-09-21 | 2012-09-21 | |
US13/658,632 US8665238B1 (en) | 2012-09-21 | 2012-10-23 | Determining a dominant hand of a user of a computing device |
Publications (2)
Publication Number | Publication Date |
---|---|
US8665238B1 US8665238B1 (en) | 2014-03-04 |
US20140085220A1 true US20140085220A1 (en) | 2014-03-27 |
Family
ID=50158778
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US13/658,632 Expired - Fee Related US8665238B1 (en) | 2012-09-21 | 2012-10-23 | Determining a dominant hand of a user of a computing device |
Country Status (3)
Country | Link |
---|---|
US (1) | US8665238B1 (en) |
CN (1) | CN104769526A (en) |
WO (1) | WO2014047361A2 (en) |
Cited By (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20160018960A1 (en) * | 2014-07-16 | 2016-01-21 | Lenovo (Singapore) Pte. Ltd. | Handedness detection |
CN105335059A (en) * | 2014-07-31 | 2016-02-17 | 优视科技有限公司 | Identification method for user operation mode in handheld device and handheld device |
US20160054827A1 (en) * | 2014-08-21 | 2016-02-25 | Echostar Technologies L.L.C. | Determining handedness on multi-element capacitive devices |
US20160092055A1 (en) * | 2014-09-25 | 2016-03-31 | Alibaba Group Holding Limited | Method and apparatus for adaptively adjusting user interface |
USD775657S1 (en) * | 2015-04-30 | 2017-01-03 | Brillio LLC | Display screen with animated graphical user interface |
US20170017799A1 (en) * | 2014-03-31 | 2017-01-19 | Huawei Technologies Co., Ltd. | Method for Identifying User Operation Mode on Handheld Device and Handheld Device |
GB2541730A (en) * | 2015-08-28 | 2017-03-01 | Samsung Electronics Co Ltd | Displaying graphical user interface elements on a touch screen |
US20170357360A1 (en) * | 2016-06-13 | 2017-12-14 | Lenovo (Singapore) Pte. Ltd. | Microphone control via contact patch |
US10775845B2 (en) * | 2016-03-18 | 2020-09-15 | Intel Corporation | Technologies for context aware graphical user interfaces for mobile compute devices |
US11385784B2 (en) * | 2019-01-31 | 2022-07-12 | Citrix Systems, Inc. | Systems and methods for configuring the user interface of a mobile device |
Families Citing this family (34)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
GB2481191A (en) | 2010-02-25 | 2011-12-21 | Sita Information Networking Computing Ireland Ltd | Graphical development tool for software application development |
MY158867A (en) | 2010-12-21 | 2016-11-16 | Sita N V | Reservation system and method |
SG188579A1 (en) | 2011-08-03 | 2013-04-30 | Sita Inf Networking Computing Usa Inc | Item handling and tracking system and method therefor |
GB2499288A (en) | 2012-02-09 | 2013-08-14 | Sita Inf Networking Computing Usa Inc | Path determination |
CN103543842B (en) * | 2012-07-16 | 2017-05-24 | 联想(北京)有限公司 | Terminal device |
US20140137038A1 (en) * | 2012-11-10 | 2014-05-15 | Seungman KIM | Electronic apparatus and method of displaying a user input menu |
TW201426441A (en) * | 2012-12-26 | 2014-07-01 | Hon Hai Prec Ind Co Ltd | Display control system and display control method |
US20140184519A1 (en) * | 2012-12-28 | 2014-07-03 | Hayat Benchenaa | Adapting user interface based on handedness of use of mobile computing device |
JP5991538B2 (en) * | 2013-02-20 | 2016-09-14 | 富士ゼロックス株式会社 | Data processing apparatus, data processing system, and program |
US10320908B2 (en) | 2013-03-25 | 2019-06-11 | Sita Information Networking Computing Ireland Limited | In-flight computing device for aircraft cabin crew |
JP2014203288A (en) * | 2013-04-05 | 2014-10-27 | ソニー株式会社 | Information processing apparatus, information processing method, and program |
TWI742471B (en) * | 2013-10-11 | 2021-10-11 | 日商半導體能源研究所股份有限公司 | A driving method of a portable data-processing device |
KR102297287B1 (en) | 2013-11-15 | 2021-09-03 | 가부시키가이샤 한도오따이 에네루기 켄큐쇼 | Data processor |
US9594893B2 (en) * | 2014-01-15 | 2017-03-14 | Lenovo (Singapore) Pte. Ltd. | Multi-touch local device authentication |
US9509822B2 (en) | 2014-02-17 | 2016-11-29 | Seungman KIM | Electronic apparatus and method of selectively applying security in mobile device |
GB2523441A (en) | 2014-02-19 | 2015-08-26 | Sita Information Networking Computing Ireland Ltd | Reservation system and method therefor |
US9971490B2 (en) * | 2014-02-26 | 2018-05-15 | Microsoft Technology Licensing, Llc | Device control |
US20170235366A1 (en) * | 2014-08-07 | 2017-08-17 | Beijing Zhigu Tech Co., Ltd. | Dominant limb identification method and device |
US10082936B1 (en) * | 2014-10-29 | 2018-09-25 | Amazon Technologies, Inc. | Handedness determinations for electronic devices |
US10001546B2 (en) | 2014-12-02 | 2018-06-19 | Sita Information Networking Computing Uk Limited | Apparatus for monitoring aircraft position |
EP3304767B1 (en) | 2015-06-01 | 2019-12-04 | Sita Information Networking Computing UK Limited | Method and system for monitoring aircraft status |
WO2017039125A1 (en) * | 2015-08-28 | 2017-03-09 | Samsung Electronics Co., Ltd. | Electronic device and operating method of the same |
CN107037965A (en) * | 2016-02-04 | 2017-08-11 | 北京搜狗科技发展有限公司 | A kind of information displaying method based on input, device and mobile terminal |
US11073980B2 (en) | 2016-09-29 | 2021-07-27 | Microsoft Technology Licensing, Llc | User interfaces for bi-manual control |
USD836625S1 (en) | 2016-12-08 | 2018-12-25 | Sita Information Networking Computing Canada, Inc. | Self-service kiosk |
US10795450B2 (en) | 2017-01-12 | 2020-10-06 | Microsoft Technology Licensing, Llc | Hover interaction using orientation sensing |
US10620013B2 (en) | 2017-03-09 | 2020-04-14 | Sita Information Networking Computing Usa, Inc. | Testing apparatus and method for testing a location-based application on a mobile device |
US10360785B2 (en) | 2017-04-07 | 2019-07-23 | Sita Information Networking Computing Usa, Inc. | Article tracking system and method |
USD881961S1 (en) | 2017-05-19 | 2020-04-21 | Sita Information Networking Computing Usa, Inc. | Robot |
DE102017111236A1 (en) * | 2017-05-23 | 2017-11-09 | Lothar Rupff | Method of determining handedness |
US11410088B2 (en) | 2017-11-03 | 2022-08-09 | Sita Ypenburg B.V. | Systems and methods for interactions between ticket holders and self service functions |
CN112416238A (en) * | 2020-11-30 | 2021-02-26 | 联想(北京)有限公司 | Information processing method, information processing device, electronic equipment and storage medium |
CN114356119A (en) * | 2021-11-16 | 2022-04-15 | 北京乐我无限科技有限责任公司 | Control method and device of application operation interface, electronic equipment and storage medium |
US11537239B1 (en) | 2022-01-14 | 2022-12-27 | Microsoft Technology Licensing, Llc | Diffusion-based handedness classification for touch-based input |
Family Cites Families (16)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5402152A (en) * | 1993-12-30 | 1995-03-28 | Intel Corporation | Method and apparatus for tailoring scroll bar and cursor appearance to pen user hand orientation |
US6243074B1 (en) | 1997-08-29 | 2001-06-05 | Xerox Corporation | Handedness detection for a physical manipulatory grammar |
US7800592B2 (en) * | 2005-03-04 | 2010-09-21 | Apple Inc. | Hand held electronic device with multiple touch sensing devices |
GB2375278B (en) | 2001-05-04 | 2003-09-10 | Motorola Inc | Adapting data in a communication system |
US6888532B2 (en) | 2001-11-30 | 2005-05-03 | Palmone, Inc. | Automatic orientation-based user interface for an ambiguous handheld device |
US6948136B2 (en) * | 2002-09-30 | 2005-09-20 | International Business Machines Corporation | System and method for automatic control device personalization |
US7255503B2 (en) * | 2004-05-17 | 2007-08-14 | Charlene H. Grafton | Dual numerical keyboard based on dominance |
JP5045559B2 (en) * | 2008-06-02 | 2012-10-10 | 富士通モバイルコミュニケーションズ株式会社 | Mobile device |
EP3654141A1 (en) * | 2008-10-06 | 2020-05-20 | Samsung Electronics Co., Ltd. | Method and apparatus for displaying graphical user interface depending on a user's contact pattern |
CN101729636A (en) * | 2008-10-16 | 2010-06-09 | 鸿富锦精密工业(深圳)有限公司 | Mobile terminal |
US9477396B2 (en) * | 2008-11-25 | 2016-10-25 | Samsung Electronics Co., Ltd. | Device and method for providing a user interface |
US20100153313A1 (en) * | 2008-12-15 | 2010-06-17 | Symbol Technologies, Inc. | Interface adaptation system |
US9600070B2 (en) * | 2008-12-22 | 2017-03-21 | Apple Inc. | User interface having changeable topography |
US20100310136A1 (en) * | 2009-06-09 | 2010-12-09 | Sony Ericsson Mobile Communications Ab | Distinguishing right-hand input and left-hand input based on finger recognition |
KR101612283B1 (en) * | 2009-09-10 | 2016-04-15 | 삼성전자주식회사 | Apparatus and method for determinating user input pattern in portable terminal |
US8963845B2 (en) * | 2010-05-05 | 2015-02-24 | Google Technology Holdings LLC | Mobile device with temperature sensing capability and method of operating same |
-
2012
- 2012-10-23 US US13/658,632 patent/US8665238B1/en not_active Expired - Fee Related
-
2013
- 2013-09-19 WO PCT/US2013/060736 patent/WO2014047361A2/en active Application Filing
- 2013-09-19 CN CN201380058341.8A patent/CN104769526A/en active Pending
Cited By (19)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20170017799A1 (en) * | 2014-03-31 | 2017-01-19 | Huawei Technologies Co., Ltd. | Method for Identifying User Operation Mode on Handheld Device and Handheld Device |
US10444951B2 (en) * | 2014-03-31 | 2019-10-15 | Huawei Technologies Co., Ltd. | Method and device for identifying a left-hand or a right-hand mode of operation on a user handheld device |
US20160018960A1 (en) * | 2014-07-16 | 2016-01-21 | Lenovo (Singapore) Pte. Ltd. | Handedness detection |
US9710137B2 (en) * | 2014-07-16 | 2017-07-18 | Lenovo (Singapore) Pte. Ltd. | Handedness detection |
CN105335059A (en) * | 2014-07-31 | 2016-02-17 | 优视科技有限公司 | Identification method for user operation mode in handheld device and handheld device |
US20160054827A1 (en) * | 2014-08-21 | 2016-02-25 | Echostar Technologies L.L.C. | Determining handedness on multi-element capacitive devices |
US10678381B2 (en) * | 2014-08-21 | 2020-06-09 | DISH Technologies L.L.C. | Determining handedness on multi-element capacitive devices |
WO2016049205A1 (en) * | 2014-09-25 | 2016-03-31 | Alibaba Group Holding Limited | Method and apparatus for adaptively adjusting user interface |
TWI670638B (en) * | 2014-09-25 | 2019-09-01 | 香港商阿里巴巴集團服務有限公司 | Method and device for adaptively adjusting user interface of smart terminal |
CN105511748A (en) * | 2014-09-25 | 2016-04-20 | 阿里巴巴集团控股有限公司 | Method and device for adaptively adjusting user interface of intelligent terminal |
US10572110B2 (en) * | 2014-09-25 | 2020-02-25 | Alibaba Group Holding Limited | Method and apparatus for adaptively adjusting user interface |
US20160092055A1 (en) * | 2014-09-25 | 2016-03-31 | Alibaba Group Holding Limited | Method and apparatus for adaptively adjusting user interface |
USD775657S1 (en) * | 2015-04-30 | 2017-01-03 | Brillio LLC | Display screen with animated graphical user interface |
GB2541730A (en) * | 2015-08-28 | 2017-03-01 | Samsung Electronics Co Ltd | Displaying graphical user interface elements on a touch screen |
GB2541730B (en) * | 2015-08-28 | 2020-05-13 | Samsung Electronics Co Ltd | Displaying graphical user interface elements on a touch screen |
US10775845B2 (en) * | 2016-03-18 | 2020-09-15 | Intel Corporation | Technologies for context aware graphical user interfaces for mobile compute devices |
US20170357360A1 (en) * | 2016-06-13 | 2017-12-14 | Lenovo (Singapore) Pte. Ltd. | Microphone control via contact patch |
US10540085B2 (en) * | 2016-06-13 | 2020-01-21 | Lenovo (Singapore) Pte. Ltd. | Microphone control via contact patch |
US11385784B2 (en) * | 2019-01-31 | 2022-07-12 | Citrix Systems, Inc. | Systems and methods for configuring the user interface of a mobile device |
Also Published As
Publication number | Publication date |
---|---|
WO2014047361A2 (en) | 2014-03-27 |
WO2014047361A3 (en) | 2014-05-08 |
US8665238B1 (en) | 2014-03-04 |
CN104769526A (en) | 2015-07-08 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US8665238B1 (en) | Determining a dominant hand of a user of a computing device | |
US9990086B2 (en) | Controlling input and output on multiple sides of a computing device | |
CN111665983B (en) | Electronic device and display method thereof | |
US11816325B2 (en) | Application shortcuts for carplay | |
CN103870535B (en) | Information search method and device | |
KR102113674B1 (en) | Apparatus, method and computer readable recording medium for selecting objects displayed on an electronic device using a multi touch | |
US9122332B2 (en) | Automatic detection for touch through glove | |
AU2014307237B2 (en) | Method and apparatus for recognizing grip state in electronic device | |
US9804679B2 (en) | Touchless user interface navigation using gestures | |
US9645651B2 (en) | Presentation of a control interface on a touch-enabled device based on a motion or absence thereof | |
US20140331146A1 (en) | User interface apparatus and associated methods | |
US10198118B2 (en) | Operating method and electronic device for processing method thereof | |
WO2012019350A1 (en) | Finger identification on a touchscreen | |
TW201413538A (en) | Input device with hand posture control | |
KR20140131061A (en) | Method of operating touch screen and electronic device thereof | |
US8810529B2 (en) | Electronic device and method of controlling same | |
US10108320B2 (en) | Multiple stage shy user interface | |
US20130194194A1 (en) | Electronic device and method of controlling a touch-sensitive display | |
US10012490B1 (en) | Determining location or position of a portable electronic device | |
CA2767707C (en) | Electronic device and method of controlling same | |
WO2016104431A1 (en) | Portable electronic instrument, control method, and control program | |
US20140139464A1 (en) | Pointer control method and electronic device thereof |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: GOOGLE INC., CALIFORNIA Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:GOSSWEILER, RICHARD CARL, III;ARADHYE, HRISHIKESH;SIGNING DATES FROM 20121005 TO 20121012;REEL/FRAME:029605/0440 |
|
STCF | Information on status: patent grant |
Free format text: PATENTED CASE |
|
MAFP | Maintenance fee payment |
Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551) Year of fee payment: 4 |
|
AS | Assignment |
Owner name: GOOGLE LLC, CALIFORNIA Free format text: CHANGE OF NAME;ASSIGNOR:GOOGLE INC.;REEL/FRAME:044101/0299 Effective date: 20170929 |
|
FEPP | Fee payment procedure |
Free format text: MAINTENANCE FEE REMINDER MAILED (ORIGINAL EVENT CODE: REM.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY |
|
LAPS | Lapse for failure to pay maintenance fees |
Free format text: PATENT EXPIRED FOR FAILURE TO PAY MAINTENANCE FEES (ORIGINAL EVENT CODE: EXP.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY |
|
STCH | Information on status: patent discontinuation |
Free format text: PATENT EXPIRED DUE TO NONPAYMENT OF MAINTENANCE FEES UNDER 37 CFR 1.362 |
|
FP | Lapsed due to failure to pay maintenance fee |
Effective date: 20220304 |