US20110034248A1 - Apparatus for associating physical characteristics with commands - Google Patents

Apparatus for associating physical characteristics with commands

Info

Publication number
US20110034248A1
US20110034248A1 (application US12/537,823)
Authority
US
United States
Prior art keywords
computer
accessory
storage medium
readable storage
input function
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US12/537,823
Inventor
Arnie Grever
Bruce Hawver
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
SteelSeries ApS
SteelSeries HQ
Original Assignee
SteelSeries HQ
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by SteelSeries HQ filed Critical SteelSeries HQ
Priority to US12/537,823 priority Critical patent/US20110034248A1/en
Assigned to STEELSERIES HQ reassignment STEELSERIES HQ ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: GREVER, ARNIE, HAWVER, BRUCE
Publication of US20110034248A1 publication Critical patent/US20110034248A1/en
Assigned to STEELSERIES APS reassignment STEELSERIES APS CORRECTIVE ASSIGNMENT TO CORRECT THE NAME OF ASSIGNEE PREVIOUSLY RECORDED ON REEL 023354 FRAME 0064. ASSIGNOR(S) HEREBY CONFIRMS THE CORRECT NAME OF THE ASSIGNEE IS STEELSERIES APS.. Assignors: GREVER, ARNIE, HAWVER, BRUCE
Abandoned legal-status Critical Current

Classifications

    • A - HUMAN NECESSITIES
    • A63 - SPORTS; GAMES; AMUSEMENTS
    • A63F - CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 - Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/20 - Input arrangements for video game devices
    • A63F13/21 - Input arrangements for video game devices characterised by their sensors, purposes or types
    • A63F13/212 - Input arrangements using sensors worn by the player, e.g. for measuring heart beat or leg activity
    • A63F13/215 - Input arrangements comprising means for detecting acoustic signals, e.g. using a microphone
    • A63F13/22 - Setup operations, e.g. calibration, key configuration or button assignment
    • A63F13/40 - Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment
    • A63F13/42 - Processing input control signals by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle
    • A63F13/70 - Game security or game management aspects
    • A63F13/73 - Authorising game programs or game devices, e.g. checking authenticity
    • A63F2300/00 - Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/10 - Features characterized by input arrangements for converting player-generated signals into game device control signals
    • A63F2300/1012 - Input arrangements involving biosensors worn by the player, e.g. for measuring heart beat, limb activity
    • A63F2300/1018 - Calibration; Key and button assignment
    • A63F2300/40 - Features characterised by details of platform network
    • A63F2300/401 - Secure communication, e.g. using encryption or authentication

Definitions

  • the present disclosure relates generally to accessory management applications, and more specifically to an apparatus for associating physical characteristics with commands.
  • Gamers can have at their disposal accessories such as a keyboard, a general purpose gaming pad, a mouse, a gaming console controller, a headset with a built-in microphone to communicate with other players, a joystick, a computer display, or other common gaming accessories.
  • a gamer can utilize biometric sensing devices to serve as another option to control and/or manage gaming and other software applications.
  • a gamer can frequently use a combination of these accessories during a game (e.g., biometric sensing devices, headset, a keyboard, and mouse) or even use one accessory to replace the function of another accessory. Efficient management and utilization of these accessories can frequently impact the gamer's experience during a game.
  • FIGS. 1-3 depict illustrative embodiments of a Graphical User Interface (GUI) generated by an Accessory Management Software (AMS) application according to the present disclosure
  • FIGS. 4-6 depict illustrative methods describing the operation of the AMS application
  • FIGS. 7-9 depict a biometric sensing device featuring various detectable finger configurations, which can be associated with various actions.
  • FIG. 10 depicts an illustrative diagrammatic representation of a machine in the form of a computer system within which a set of instructions, when executed, may cause the machine to perform any one or more of the methodologies disclosed herein.
  • One embodiment of the present disclosure entails a computer-readable storage medium having computer instructions to present in a graphical user interface a plurality of associable actions and a biometric sensing accessory, wherein a fingerprint is detectable by the biometric sensing accessory and the fingerprint is correlated to an input function associated with the biometric sensing accessory, associate one of the plurality of associable actions with the input function, detect a stimulation of the input function by monitoring the biometric sensing accessory, wherein the stimulation of the input function occurs based on a detection of the fingerprint, retrieve the action associated with the input function, and transmit the action to an operating system.
  • One embodiment of the present disclosure entails a biometric accessory having a controller to detect at least one of navigation information and biometric information associated with a user of the accessory, and transmit at least one of the navigation information and biometric information to a software application, wherein an input function of the accessory which is correlated with at least one of the navigation information and the biometric information is assigned to an action of a plurality of associable actions by the software application, wherein a stimulation of the input function is detected by the software application, wherein the action is retrieved by the software application based on the stimulation being detected, and wherein the retrieved associable action is transmitted by the software application to an operating system.
  • One embodiment of the present disclosure entails a computer-readable storage medium having computer instructions to receive from a software application operably coupled to a biometric sensing accessory an action associated with an input function of the biometric sensing accessory, wherein the input function is correlated to at least one of navigation information and biometric information detected by the biometric sensing accessory, wherein a stimulation of the input function is detected by the software application, and wherein the action is retrieved by the software application when the stimulation is detected, and perform the received action.
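
To make the flow described in these embodiments concrete, the following is a minimal sketch (not taken from the patent) of an association table plus the detect-retrieve-transmit loop. Every name in it (ActionRegistry, on_stimulation, the fingerprint identifier strings) is a hypothetical stand-in.

```python
# Minimal sketch of the claimed flow: associate an action with an input
# function, detect a stimulation, retrieve the action, and hand it to the
# OS layer. All names are hypothetical.

from typing import Callable, Dict, Optional

class ActionRegistry:
    """Maps input-function identifiers (e.g., a fingerprint ID) to actions."""

    def __init__(self) -> None:
        self._associations: Dict[str, str] = {}

    def associate(self, input_function: str, action: str) -> None:
        self._associations[input_function] = action

    def retrieve(self, input_function: str) -> Optional[str]:
        return self._associations.get(input_function)

def on_stimulation(registry: ActionRegistry,
                   input_function: str,
                   send_to_os: Callable[[str], None]) -> None:
    """Called when the accessory reports a stimulation of an input function."""
    action = registry.retrieve(input_function)
    if action is not None:
        send_to_os(action)   # e.g., inject the action's key sequence

# Usage: the fingerprint of the left ring finger triggers "Night Vision".
registry = ActionRegistry()
registry.associate("fingerprint:left_ring", "Night Vision")
on_stimulation(registry, "fingerprint:left_ring", print)
```
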
  • FIGS. 1-3 depict illustrative embodiments of a Graphical User Interface (GUI) generated by an Accessory Management Software (AMS) application according to the present disclosure.
  • the AMS application can operate in a computing device such as a desktop computer, a laptop computer, a server, a mainframe computer, or a gaming console.
  • a gaming console can represent a gaming device such as a Playstation 3™, a Wii™, or an Xbox360™. Other present and next generation gaming consoles are contemplated.
  • the AMS application can also operate in other computing devices with less computing resources such as a cellular phone, a personal digital assistant, or a media player (such as an iPOD™). From these illustrations it would be apparent to an artisan with ordinary skill in the art that the AMS application can operate in any device with computing resources.
  • FIGS. 4-6 depict illustrative methods 400 - 600 describing the operation of the AMS application as shown in FIGS. 1-3 .
  • Method 400 can begin with step 402 in which the AMS application is invoked in a computing device. The invocation step can result from a user selection of the AMS application from a menu or iconic symbol presented on a desktop of the computing device by an operating system (OS) managing operations thereof.
  • the AMS application can detect by way of drivers in the OS a plurality of operationally distinct accessories communicatively coupled to the computing device. However, the accessories do not necessarily have to be operationally distinct, and can have similar features and/or operational capabilities.
  • the accessories can be coupled to the computing device by a tethered interface (e.g., USB cable), a wireless interface (e.g., Bluetooth or Wireless Fidelity—WiFi), or combinations thereof.
  • an accessory can represent any type of device which can be communicatively coupled to the computing device and which can control aspects of the OS and/or a software application operating in the computing device.
  • An accessory can represent for example a keyboard, a biometric sensing device, a gaming pad, a mouse, a gaming console controller, a joystick, a microphone, or a headset with a microphone—just to mention a few.
  • the keyboard and gaming pad represent accessories of a similar category since their operational parameters are alike.
  • a mouse represents an accessory which can have disparate operational parameters from the keyboard or gaming pad.
  • the operational parameters of a keyboard generally consist of alphanumeric keys, control keys (e.g., Shift, Alt, Ctrl), and function keys
  • the operational parameters of a mouse consist of navigation data generated by a tracking device such as a laser sensor, buttons to invoke GUI selections, and settings thereof (e.g., counts or dots per inch, acceleration, scroll speed, jitter control, line straightening control, and so on).
  • a biometric sensing device or other device capable of recognizing and/or sensing the physical characteristics of or detecting actions performed by a person can have different operational parameters as well.
  • the operational parameters can include, but are not limited to including, navigation or input data generated by sensing a finger or fingers dragged across the surface of the pad and input or other data generated by sensing a finger pressing against the pad.
  • the navigation and input data can also be generated by sensing other actions performed with respect to the pad.
  • the pad for example, can include a touchscreen, which can detect a user's touch through the use of capacitive, resistive, surface acoustic wave, projected capacitance, infrared, strain gauge, optical, and/or acoustic pulse touchscreen technologies.
  • the biometric sensing device can be configured to detect each fingerprint of a user or other users and each fingerprint can be correlated with a particular input function of the biometric sensing device.
  • the fingerprints can serve as inputs.
  • combinations of fingers can be correlated with input functions of the biometric sensing device as well. For example, a combination of a user's thumb and index finger can be correlated to an input function.
  • the biometric sensing device can also be configured to detect and correlate eye movements, blood pressure, heart rates, body movements, and other physical characteristics of a person to various other input functions of the sensing device.
  • the sensing device can be configured to detect a user's act of touching the surface of the pad or touchscreen, the user's dragging of a finger or fingers on the surface of the screen, and other actions which are independent of the physical characteristics of the user and utilize these actions as inputs.
  • the biometric sensing device can exist as a single device, multiple devices, a pad integrated with a display, and in other configurations.
  • the sensing device can be further configured to have a keypad or a touchscreen keypad and can have touch zones, where each zone can be tailored to perform a particular function.
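
As an illustration of how detected fingerprints and finger combinations might be correlated with input functions as described above, here is a hedged sketch that resolves a detected set of fingerprints (a "chord") to an input-function identifier; the finger names and function labels are invented for the example.

```python
# Hypothetical correlation of fingerprints (and combinations of
# fingerprints) to input functions of the sensing device.

from typing import Dict, FrozenSet, Optional

# A detected "chord" is the set of fingerprints currently pressed.
FingerChord = FrozenSet[str]

input_functions: Dict[FingerChord, str] = {
    frozenset({"right_thumb"}): "input_fn_1",
    frozenset({"left_ring"}): "input_fn_2",
    frozenset({"right_thumb", "right_index"}): "input_fn_3",  # combination
}

def resolve_input_function(detected: FingerChord) -> Optional[str]:
    """Return the input function correlated with the detected fingerprints."""
    return input_functions.get(detected)

print(resolve_input_function(frozenset({"right_thumb", "right_index"})))
# -> input_fn_3
```
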
  • the joysticks, game controllers or any other input devices represent additional categories of accessories supported by the AMS.
  • the AMS application presents a GUI 101 such as depicted in FIG. 1 with operationally distinct accessories such as the keyboard 108 , mouse 110 , headset 114 , game controller 115 , and biometric sensing device (or other sensing device) 116 .
  • the GUI 101 can be displayed on the biometric sensing device 116 if the device has a display.
  • the GUI 101 presents the accessories 108 - 116 in a scrollable section 117 .
  • One or more accessories can be selected by a user with a common mouse pointer or by tapping or dragging a finger on the touch screen/pad of the sensing device 116 .
  • the keyboard 108 and the biometric sensing device 116 were selected with a pointer for customization.
  • the AMS application presents the keyboard 108 and biometric sensing device 116 in split windows 118 , 120 , respectively, to help the user during the customization process.
  • the AMS application can be programmed to detect a user-selection of a particular software application such as a game. This step can be the result of the user entering in a Quick Search field 160 the name of a gaming application (e.g., World of Warcraft™).
  • the AMS application can retrieve in step 410 from a remote or local database gaming application actions which can be presented in a scrollable section 139 of the GUI represented as “Actions” 130 .
  • the actions can be tactical actions 132 , communication actions 134 , menu actions 136 , and movement actions 138 , or any other types of actions, which can be used to invoke and manage features of the gaming application.
  • the actions presented descriptively in section 130 of the GUI can represent a sequence of accessory input functions which a user can stimulate by button depressions, navigation, performing actions with the sensing device 116 , or speech.
  • depressing the left index finger on the biometric sensing device 116 can represent the tactical action “Reload”, while the simultaneous keyboard depressions “Ctrl A” can represent the tactical action “Melee Attack”.
  • the “Actions” 130 section of the GUI is presented descriptively rather than by a description of the input function(s) of a particular accessory.
  • Any one of the Actions 130 can be associated with one or more input functions of the accessories by way of a simple drag and drop action. For instance, a user can select a “Melee Attack” by placing a pointer 133 over an iconic symbol associated with this action by utilizing the mouse 110 or by dragging the pointer 133 using a pad/touchscreen of the biometric sensing device 116 . Upon doing so, the symbol can be highlighted to indicate to the user that the icon is selectable.
  • the user can select the icon by holding the left mouse button and dragging the symbol (or by utilizing the touch screen of the sensing device 116 ) to any of the input functions (e.g., buttons) of the keyboard 108 , mouse 110 , or biometric sensing device 116 to make an association with an input function of one of these accessories.
  • the user can drag the Melee Attack symbol to a particular region of a touchscreen of the sensing device 116 thereby causing an association between the associated region and the gaming action of a Melee Attack.
  • the AMS application can detect the selection as a “trigger” to generate the key sequence “Ctrl A” which is understood by the gaming application as a request for a Melee Attack.
  • the gaming application receives from the AMS application by way of an operating system the “Ctrl A” sequence as if it had been generated by a Qwerty keyboard.
  • the sensing device 116 and/or the AMS application can be configured to record the fingerprints, a combination of fingerprints, eye movements, body movements, heart rates, blood pressure, and other physical characteristics of the user and correlate the characteristics to input functions of the sensing device 116 .
  • the user can then associate any one of the actions 130 to an input function correlated with a particular physical characteristic.
  • the biometric sensing device 116 is illustratively shown in split window 120 of FIG. 1 with a view from underneath the surface of the sensing device 116 .
  • a user's hands are placed on the top surface of the sensing device 116 so that the user's right hand is illustratively shown on the left and the user's left hand is illustratively shown on the right.
  • Each fingerprint 116 a - e of the right hand and each fingerprint 116 f - j of the left hand can be detected by the sensing device 116 and can be transmitted to the AMS application.
  • the AMS application can then be utilized to associate actions 130 to each fingerprint 116 f - j or to a combination of the fingerprints 116 f - j.
  • the fingerprint corresponding to the left ring finger 116 i of the user can be associated with the “Night Vision” action under the Tactics 132 menu and the right thumb can be assigned to “Melee Attack.”
  • a night vision mode can be triggered during game play. By tapping the screen again with the ring finger 116 i, the night vision mode can be toggled off, or the tap can even lead to another action.
  • each finger or combination of fingers can be associated with a particular action.
  • a user's eye movement to the left can be associated with the “Move Left” action and eye movement to the right can be associated with the “Move Right” action under the Movement 138 menu.
  • a profile can be a device profile or master profile invoked by selecting GUI button 156 or 158 , each of which can identify the association of actions with input functions of one or more accessories.
  • the AMS application can retrieve macro(s) and/or prior associations of actions with the accessories as defined by the profile. For example, if a certain set of fingers or finger combinations are associated with certain actions for a particular video game in a selected profile, the associations of actions stored in the profile can be retrieved in anticipation of playing the video game.
  • the actions and/or macros defined in the profile can also be presented in step 416 by the AMS application in the actions column 130 of the GUI 101 to modify or create new associations.
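
One plausible shape for such a profile, storing per-game associations and macros so they can be retrieved in anticipation of play, is sketched below. The field names are assumptions, not the patent's data model.

```python
# Hypothetical structure for a device profile holding action associations
# and macros for a particular game.

import json

profile = {
    "name": "WoW raid setup",
    "software": "World of Warcraft",
    "accessory": "biometric_sensing_device_116",
    "associations": {
        "fingerprint:left_ring": "Night Vision",
        "fingerprint:right_thumb": "Melee Attack",
        "chord:left_index+left_middle": "Reload",
    },
    "macros": {
        "open_im": ["drag:back_fast", "launch:im_session"],
    },
}

# Retrieving the stored associations in anticipation of playing the game
# could then be a simple load:
def load_profile(path: str) -> dict:
    with open(path) as f:
        return json.load(f)

with open("wow_profile.json", "w") as f:
    json.dump(profile, f, indent=2)

print(load_profile("wow_profile.json")["associations"]["fingerprint:left_ring"])
```
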
  • the AMS application can also respond to a user selection to create a macro.
  • a macro in the present context can represent a subset of actions that can be presented in the Actions column 130 . Any command which can be recorded by the AMS application can be used to define a macro.
  • a command can represent a sequence of input functions of an accessory, identification of a software application to be initiated by an operating system (OS), or any other recordable stimulus to initiate, control or manipulate software applications.
  • a macro can represent a user entering the identity of a software application (e.g., instant messaging tool) to be initiated by an OS.
  • a macro can also represent recordable speech delivered by a microphone singly or in combination with a headset for detection by another software application through speech recognition or for delivery of the recorded speech to other parties.
  • the macro can represent recordable taps, finger drags, or other actions performed on a touchscreen/pad of the sensing device 116
  • a macro can represent recordable navigation of an accessory such as a mouse or joystick, recordable selections of buttons on a keyboard, a mouse, or a mouse pad, and so on.
  • Macros can also be combinations of the above illustrations. Macros can be created from the GUI 101 by selecting a “Record Macro” button 148 . The macro can be given a name and category in user-defined fields 140 and 142 .
  • a macro can be generated by selection of input functions on an accessory (e.g., Ctrl A, speech, touch screen region, etc.) and/or by manual entry in field 144 (e.g., typing the name and location of a software application to be initiated by an OS).
  • the clone button 152 can be selected to replicate the macro sequence if desired. Fields 152 can also present timing characteristics of the stimulation sequence in the macro with the ability to customize such timing.
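
Since the text notes that the stimulation sequence in a macro carries customizable timing characteristics, a macro can be modeled as a list of (delay, event) pairs. The sketch below is illustrative only; the event strings and function names are hypothetical.

```python
# Sketch of recording a macro as a timed sequence of stimulations, so the
# timing characteristics can be replayed or customized.

import time
from typing import List, Tuple

# Each entry: (delay before the event in seconds, event description)
Macro = List[Tuple[float, str]]

def record_macro(events: List[str], start: float,
                 timestamps: List[float]) -> Macro:
    """Convert timestamped events into (delay, event) pairs."""
    macro: Macro = []
    prev = start
    for ts, ev in zip(timestamps, events):
        macro.append((ts - prev, ev))
        prev = ts
    return macro

def play_macro(macro: Macro, emit=print) -> None:
    """Replay the macro, honoring the (possibly user-edited) delays."""
    for delay, ev in macro:
        time.sleep(delay)
        emit(ev)

m = record_macro(["touch:zone3", "key:Ctrl A"], 0.0, [0.10, 0.35])
play_macro(m)   # emits the two events 0.10 s and 0.25 s apart
```
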
  • the AMS application can respond to drag and drop associations between actions and input functions of the keyboard 108 and the biometric sensing device 116 . If an association is detected, the AMS application can proceed to step 424 where it can determine if a profile has been identified in step 412 to record the association(s) detected. If a profile has been identified, the associations are recorded in said profile in step 426 . If a profile was not identified in step 412 , the AMS application can create a profile in step 428 for recording the detected associations. In the same step, the user can name the newly created profile as desired. The newly created profile can also be associated with one or more software applications in step 430 for future reference.
  • the GUI 101 presented by the AMS application can have other functions.
  • the GUI 101 can provide options for layout of the accessory selected (button 122 ), how the keyboard is illuminated when associations between input functions and actions are made (button 134 ), and configuration options for the accessory (button 126 ).
  • Configuration options can include operational settings of the mouse 110 such as Dots Per Inch or Counts Per Inch, and so on.
  • the AMS application can adapt the GUI 101 to present more than one functional perspective. For instance, by selecting button 102 , the AMS application can adapt the GUI 101 to present a means to create macros and associate actions to accessory input functions as depicted in FIG. 1 .
  • Selecting button 104 can cause the AMS application to adapt the GUI 101 to present statistics in relation to the usage of accessories as depicted in FIGS. 2-3 .
  • Selecting button 106 can cause the AMS application to adapt the GUI 101 to present promotional offers and software updates.
  • FIG. 5 depicts a method 500 in which the AMS application can be programmed to recognize unknown accessories so that method 400 can be applied to them as well.
  • Method 500 can begin with step 502 in which the AMS application detects an unknown accessory such as a new keyboard from an unknown vendor by way of a communicative coupling to a computing device from which the AMS application operates.
  • the AMS application in this instance can receive an identity from the keyboard or the operating system which is not known to the AMS application.
  • the AMS application in step 504 can present a depiction of an accessory of similar or same category in response to a user providing direction as to the type of accessory (by selecting for example a drop-down menu).
  • the AMS application can determine from the information received from the unknown accessory an accessory type.
  • the AMS application can receive instructions describing all or a portion of the input functions of the unknown accessory. These instructions can come from a user who defines each input function individually or responds to inquiries provided by the AMS application. The AMS application can for example make an assumption as to the keyboard layout and highlight each key with a proposed function which the user can verify or modify.
  • the AMS application can create an accessory identity in step 508 which can be defined by the user.
  • the AMS application can associate and record the accessory instructions with the identity for future recognition of the accessory.
  • the AMS application can present a depiction of the new accessory with its identity along with the other selectable accessories in section 117 .
  • Method 500 can provide a means for universal detection and identification of any accessory which can be used to control or manage software applications operating in a computing device.
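
A hedged sketch of method 500's learn-and-persist idea: build an identity for an unknown accessory from a user-chosen category and user-described input functions, then record it for future recognition. All identifiers are invented.

```python
# Hypothetical sketch of learning and persisting an unknown accessory.

import json
from typing import Dict

def learn_accessory(raw_id: str, category: str,
                    answers: Dict[str, str]) -> dict:
    """answers maps a physical control (as the user names it) to a function."""
    return {
        "raw_id": raw_id,          # identity reported by the device/OS
        "category": category,      # e.g., "keyboard", chosen from a drop-down
        "input_functions": answers,
    }

def save_identity(identity: dict, db_path: str = "accessories.json") -> None:
    try:
        with open(db_path) as f:
            db = json.load(f)
    except FileNotFoundError:
        db = {}
    db[identity["raw_id"]] = identity   # recognized on next connection
    with open(db_path, "w") as f:
        json.dump(db, f, indent=2)

identity = learn_accessory("vid:1234 pid:abcd", "keyboard",
                           {"top-left key": "Esc", "large bottom key": "Space"})
save_identity(identity)
```
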
  • FIG. 6 depicts a method 600 for illustrating the AMS application responding to input function stimuli (triggers) of accessories.
  • Method 600 can begin with step 602 in which the AMS application monitors the use of accessories, such as the biometric sensing device 116 .
  • This step can represent monitoring the stimulation of input functions of one or more accessories communicatively coupled to a computing device from which the AMS application operates.
  • the input functions can correspond to button depressions on a keyboard, gaming pad, or navigation device such as a mouse, or can correspond to fingerprints, eye movements, heart rate increases, or blood pressure changes detected by the sensing device 116 .
  • the input functions can also represent navigation instructions such as eye, finger, mouse, or joystick movements.
  • the input functions can further represent speech supplied by a microphone singly or in combination with a headset.
  • Other existing or future input functions of an accessory detectable by the AMS application are contemplated by the present disclosure.
  • the AMS application can monitor input functions by for example processing human interface device (HID) reports supplied by the accessories to the computing device.
  • the AMS application can proceed to step 606 to determine if action(s) have been associated with the detected stimulation(s). If, for example, the stimulations detected correspond to key depressions on the keyboard 108 and/or taps/drags on the touchscreen of the sensing device 116 , the AMS application can determine if actions have been associated and recorded for such stimulations. If these stimulations “trigger” one or more actions, the AMS application can proceed to step 608 where it retrieves the stimulation definition of these actions for each accessory reporting a stimulation. In step 610 , the AMS application can substitute the detected stimulations with the stimulations defined by the action.
  • Suppose, for example, that the stimulation detected in step 604 is the key sequence “Ctrl A”, and an action associated with this stimulus consists of a macro that combines finger dragging with a navigation of the sensing device 116 (e.g., moving one's finger quickly in a backward motion for a given distance), and a request to invoke an instant messaging (IM) session with a particular individual using Skype™ or some other common IM tool.
  • In this case, the AMS application would substitute “Ctrl A” with stimulations consisting of the finger drags, navigation and a request for an IM application.
  • the substitute stimulations would then be reported in step 612 to an operating system (OS).
  • the OS can determine whether to pass the substitute stimulations to an active software application in operation (e.g., a gaming application) and/or to invoke another software application.
  • the active software application can be operating from the same computer system from which the OS and the AMS application operate or can be operating at a remote system such as an on-line server or family of servers (e.g., World of Warcraft) awaiting stimulation data from the computer system.
  • the macro comprises both stimulation feedback for the active software application and a request to initiate an IM session.
  • the OS conveys in step 618 the sensing device's 116 stimulation signals to the active software application (e.g., gaming application), and in a near simultaneous fashion invokes the IM session in step 620 with a specific individual (or organization).
  • the OS can transmit the actions to a gaming application which is currently executing and an in-game entity (such as a video game character) can perform the received actions.
  • FIGS. 7-9 are illustrations depicting the use of various fingerprint combinations, which can be associated with various actions.
  • FIG. 7 illustrates a user pressing his/her index finger 702 a, middle finger 702 b, and ring finger 702 c onto the surface of a sensing device.
  • the combination of the three fingers 702 a - c can be assigned to the “Throw Special” action from FIG. 1 and when the user presses the three fingers 702 a - c on the sensing device, the sensing device can transmit a signal to the AMS application that three fingers 702 a - c were pressed.
  • the AMS application can query a database and find that the three fingers 702 a - c were associated with the action “Throw Special.”
  • the AMS application can then transmit the action to a software application which can utilize the action.
  • the action may cause, in the case of the software application being a video game, an in-game entity, such as a video game character, to throw some special weapon or special object.
  • FIGS. 8 and 9 feature fingerprint combinations 802 a - b and 902 a - d, which can also be associated with actions.
  • the combination 802 a - b can be associated with the “Menu” action from FIG. 1 , and when the user depresses fingers 802 a - b on the sensing device, the AMS application can transmit the action to a software application, which can then activate the “Menu” action and open a menu.
  • Finger combination 902 a - d can be associated with multiple actions or a macro and the actions or macro can be similarly utilized by the software application.
  • the illustrations above cover a scenario in which the AMS application has detected an association of actions to accessory stimuli. If however the AMS application does not detect such an association, then the detected stimulus (or stimuli) supplied by one or more accessories is transmitted to the OS in step 614 . For example, it may be that a stimulation based on the depressions of “Ctrl A” has no particular association to an action. In this case, the AMS application passes this stimulation to the OS with no substitutes. In step 616 the OS can determine if this stimulation invokes a new software application in step 620 or is conveyed to the previously initiated software application.
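
Steps 604-614, as described above, amount to a simple dispatch rule: if the detected stimulation has an associated action, substitute it with the action's defined stimulations; otherwise pass it to the OS unchanged. A minimal sketch, with hypothetical event strings echoing the "Ctrl A" example:

```python
# Minimal sketch of steps 604-614: substitute associated stimulations,
# pass everything else through to the OS unchanged.

from typing import Callable, Dict, List

def process_stimulation(stimulus: str,
                        associations: Dict[str, List[str]],
                        report_to_os: Callable[[str], None]) -> None:
    if stimulus in associations:                   # step 606: associated?
        for substitute in associations[stimulus]:  # steps 608-610: substitute
            report_to_os(substitute)               # step 612
    else:
        report_to_os(stimulus)                     # step 614: pass-through

assoc = {"Ctrl A": ["drag:back_fast", "nav:region2", "launch:im_session"]}
process_stimulation("Ctrl A", assoc, print)   # substituted with the macro
process_stimulation("Ctrl B", assoc, print)   # no association: passed as-is
```
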
  • the AMS application can also record in step 622 statistics relating to the detected accessory stimulations.
  • a portion of the AMS application can operate as a background process which performs statistical analysis on the stimulations detected.
  • the AMS application can provide an updated GUI which illustrates the usage of input functions of one or more accessories for which stimulations were detected in step 604 .
  • In FIG. 2, a keyboard accessory is shown. In this illustration, certain keys (references 204 , 206 , 208 , 210 ) on the keyboard are color-coded to illustrate the frequency of usage of these keys.
  • a color scale 203 defines the frequency of usage of the input functions of the keyboard.
  • One end of the scale represents a single detected depression, while the opposite end of the scale (bright red) represents 500 detected depressions.
  • the AMS application maps by color in step 624 stimulations of the keyboard.
  • the key grouping 208 depicts a color coding with the highest detectable usage, while the F7 key (reference 210 ) indicates the fewest depressions. Keys having zero depressions are not color coded to readily identify the color mapping of keys which were used at least once.
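
The count-to-color mapping could be implemented as a simple interpolation between the two ends of the scale, leaving never-pressed keys unmapped as described. The blue-to-red gradient below is an assumption suggested by the colors mentioned elsewhere in the disclosure:

```python
# Illustrative mapping from detected key depressions to a color scale,
# with one depression at one end and 500 at the other.

def usage_color(count: int, max_count: int = 500):
    """Return an (R, G, B) tuple; None means 'leave the key uncolored'."""
    if count <= 0:
        return None                      # keys never pressed stay unmapped
    t = min(count, max_count) / max_count
    return (int(255 * t), 0, int(255 * (1 - t)))   # blue -> red

print(usage_color(0))     # None (not color coded)
print(usage_color(1))     # nearly pure blue
print(usage_color(500))   # bright red
```
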
  • the AMS application provides additional functions in a playback panel of the GUI which can help a user understand how the color coded keys were used during an active software application such as a video game.
  • the AMS application can present the user with a playback control function 202 which the user can select to replay, pause, forward or rewind the usage of these keys.
  • When usage playback is selected, the user can for instance see the color coded keys highlighted in real-time with a temporary white border to visualize how the keys were selected.
  • a time clock 204 provides the user the elapsed time of the playback sequence.
  • Button 212 allows the user to retrieve statistics from other sessions, while button 214 provides the user a means to save statistics from a given session.
  • the GUI of FIG. 2 could have been shown as a split screen with all accessories which generated one or more detected stimulations (e.g., keyboard, biometric sensing device, mouse, and microphone), each providing statistical symbolic results as described above for the keyboard.
  • the sensing device 116 can be shown and color coding can illustrate where the user tapped the touchscreen of the sensing device 116 with the highest frequency. Red regions could represent heavily tapped areas of the touch screen, while blue areas can represent rarely tapped areas.
  • the GUI can display which fingerprints are used the most, which combination of fingerprints are used the most, etc.
  • split screen embodiments are contemplated by the present disclosure for the GUI of FIG. 2 .
  • the AMS application can provide the user a means to visualize raw statistics in a table format such as shown in FIG. 3 by selecting button 212 .
  • the table format shows raw data in section 302 and possible suggestions in section 304 for improving user performance which can be generated by the AMS application in step 626 .
  • Section 302 can be presented in a table format with a column identifying the key being analyzed, its usage, and number of key presses. The user can ascertain from this table the most and least frequently used keys as well as other identifiable patterns.
  • the table can include statistics on how many times a particular finger or finger combination was used.
  • a user's first left finger, FP1, may have had a usage duration of 03:05:23 and may have been detected 295 times by the AMS application.
  • the table can include which heart rate was the most frequent, which eye direction was most frequently used, and other similar statistics for any type and number of detectable physical characteristics.
  • the AMS application can utilize an understanding of the layout of the accessory (in this case, the keyboard or sensing device) to determine from the statistics ways that the user can improve response time or ergonomic use. For example, the AMS application can determine from a layout analysis that the key combination <Alt .> can be reassigned to a macro based on the trigger <Ctrl F> which could provide the user a faster response time and free up the user's right hand for other tasks.
  • the AMS application can also provide alternative suggestions. For example, the AMS application can also suggest creating single button macros for each of the key combinations <Alt .> and <Ctrl A> which can be assigned to keys on the keyboard or left and right buttons of a mouse. The latter suggestion of assigning macros to the mouse can help the user free up his/her left hand.
  • the AMS application can utilize information about the sensing device 116 to recommend to the user better or optimal associations. For example, the AMS application may determine that a particular region of the sensing device's 116 touchscreen is over-utilized and that distributing finger touches or drags over a larger portion of the screen would be more advantageous. Additionally, the AMS application may determine that particular combinations of fingers associated with certain actions would be more advantageous than current associations. For example, as shown in FIG. 3 , the AMS application can determine that the user should replace the combination of finger two and finger three of the user's left hand with finger one of the user's right hand.
  • the AMS application may suggest such a change to increase response time, reduce the total number of combinations/fingerprints associated with the user's left hand, take advantage of the user's right hand, or for other reasons. Furthermore, the application can determine that the user should be using fingers in conjunction with eye movements or other body movements.
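
A toy version of this suggestion logic might scan the stimulation log for the most frequent multi-key combination and propose a single-button macro for it. The heuristic below is an assumption for illustration, not the patent's algorithm:

```python
# Toy suggestion heuristic: propose a single-button macro for the most
# frequently used multi-key combination in a stimulation log.

from collections import Counter
from typing import Iterable, Optional, Tuple

def suggest_macro(stimulations: Iterable[Tuple[str, ...]]) -> Optional[str]:
    combos = Counter(s for s in stimulations if len(s) > 1)
    if not combos:
        return None
    combo, n = combos.most_common(1)[0]
    return (f"'{' + '.join(combo)}' was used {n} times; "
            f"consider assigning it to a single button or mouse key.")

log = [("Ctrl", "A"), ("Alt", "."), ("Ctrl", "A"), ("F7",), ("Ctrl", "A")]
print(suggest_macro(log))
```
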
  • the AMS application can utilize present and next generation algorithms to determine how to improve response times and ergonomic usage of accessory devices.
  • the AMS application can for example have at its disposal an understanding of the layout of each accessory, the type of software being controlled by the accessory (e.g., World of Warcraft), type of operations commonly used to control the software (e.g., known actions as shown in the actions column 130 of FIG. 1 ), an understanding of the associations made by other users (e.g., gamers) to improve their performance when controlling the software, and so on.
  • the AMS application can also be adapted to communicate with the active software application by way of an Application Programming Interface (API) to receive additional usage statistics from the software which it can in turn use to improve the user's performance.
  • the AMS application can also utilize common statistical and behavior modeling techniques to predict the behavior of the user and responses from the software application to identify possible ways to improve the user's performance.
  • method 400 can be adapted to define more than one programmable layer for an accessory.
  • Such a feature can extend the functionality of an accessory into multi-layer paradigms of input functions.
  • the GUI of FIG. 1 can be adapted so that a user can specify more than one programmable layer for a specific accessory.
  • the user can also specify which layer to present in FIG. 1 while associating actions. If for instance layer 1 is shown, the GUI of FIG. 1 can present the actions associated in this layer by presenting descriptors superimposed on the input functions (e.g., buttons or keys or regions of a touchscreen).
  • the accessory can be shown in the GUI with a different set of associated actions.
  • the user can define a macro or identify a key sequence to switch between layers when the accessory is in use.
  • the trigger for switching between layers can be a toggle function (e.g., selecting the tab key on a Qwerty keyboard or tapping a certain region of a touchscreen of the sensing device) to switch between layers in a round robin fashion (layer 1→layer 2→layer 3→layer 1, and so on).
  • the user can define a hold and release trigger to switch between layers.
  • the user moves to another layer while pressing a button (e.g., a “Shift” key) or portion of a touchscreen of the sensing device 116 and returns to the preceding layer upon its release.
  • the trigger to switch layers can be defined differently per layer.
  • the user can for example select the letter “A” in layer 1 to proceed to layer 2 , and select the letter “B” in layer 2 to return to layer 1 or proceed to yet another layer 3 .
  • layers and triggers can be defined to substantially expand the capability of a single accessory. Additionally, triggers can be of any kind (tactile, speech, etc.).
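
Both layer-switching triggers described above (the round-robin toggle and hold-and-release) reduce to a small state machine. A sketch under those assumptions, with invented names:

```python
# Sketch of the two layer-switching triggers: round-robin toggle and
# hold-and-release override.

class LayerState:
    def __init__(self, num_layers: int) -> None:
        self.num_layers = num_layers
        self.current = 0
        self._held_from = None

    def toggle(self) -> int:
        """Round-robin: layer 1 -> 2 -> 3 -> 1 -> ..."""
        self.current = (self.current + 1) % self.num_layers
        return self.current

    def hold(self, layer: int) -> int:
        """Switch while a button (e.g., a Shift key) is held."""
        self._held_from = self.current
        self.current = layer
        return self.current

    def release(self) -> int:
        """Return to the preceding layer upon release."""
        if self._held_from is not None:
            self.current, self._held_from = self._held_from, None
        return self.current

layers = LayerState(3)
print(layers.toggle())    # 1
print(layers.hold(2))     # 2 while held
print(layers.release())   # back to 1
```
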
  • method 400 can be adapted so that a user can define super macros and/or super profiles.
  • a super macro can represent nested macros (combinations of macros).
  • Method 400 can be adapted so that the user can customize the timing for executing nested macros.
  • a super profile can represent nested profiles (combinations of profiles).
  • a super profile can for example comprise sub-profiles, each sub-profile defining associations of actions to input functions of a particular accessory.
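
A super profile nesting per-accessory sub-profiles might look like the following; the structure and field names are hypothetical:

```python
# Hypothetical shape of a super profile nesting per-accessory sub-profiles.

super_profile = {
    "name": "Tournament setup",
    "sub_profiles": [
        {"accessory": "keyboard_108",
         "associations": {"Ctrl A": "Melee Attack"}},
        {"accessory": "biometric_sensing_device_116",
         "associations": {"fingerprint:left_ring": "Night Vision"}},
    ],
}

def all_associations(profile: dict) -> dict:
    """Flatten every sub-profile's associations for quick lookup."""
    merged = {}
    for sub in profile["sub_profiles"]:
        merged.update(sub["associations"])
    return merged

print(all_associations(super_profile))
```
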
  • method 400 can be adapted to establish audio profiles for headset accessories.
  • GUI 101 can be adapted to provide the user options to establish a sound output (equalizer) setting to optimize performance for a particular gaming application. For instance GUI 101 can present an equalizer so that the user can raise the volume of high frequencies to hear an enemy's footsteps from a longer distance in a gaming application.
  • the method 400 can be adapted to allow a user to authenticate into the AMS application and/or a software application accessible by the AMS application based on a detection of an authorized fingerprint or other physical characteristic detected by the sensing device 116 .
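
A hedged sketch of fingerprint-gated authentication into the AMS application: compare the detected fingerprint template against an enrolled set. Real biometric matching is fuzzy template scoring; the exact-hash comparison here is only a placeholder for the control flow:

```python
# Placeholder sketch of fingerprint-gated access to the AMS application.
# Exact-hash comparison stands in for real (fuzzy) biometric matching.

import hashlib

enrolled = {hashlib.sha256(b"template:alice_right_thumb").hexdigest()}

def authenticate(detected_template: bytes) -> bool:
    digest = hashlib.sha256(detected_template).hexdigest()
    return digest in enrolled

if authenticate(b"template:alice_right_thumb"):
    print("AMS application unlocked")
else:
    print("access denied")
```
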
  • FIG. 10 depicts an exemplary diagrammatic representation of a machine in the form of a computer system 1000 within which a set of instructions, when executed, may cause the machine to perform any one or more of the methodologies discussed above.
  • the machine operates as a standalone device.
  • the machine may be connected (e.g., using a network) to other machines.
  • the machine may operate in the capacity of a server or a client user machine in a server-client user network environment, or as a peer machine in a peer-to-peer (or distributed) network environment.
  • the machine may comprise a server computer, a client user computer, a personal computer (PC), a tablet PC, a laptop computer, a desktop computer, a control system, a network router, switch or bridge, or any machine capable of executing a set of instructions (sequential or otherwise) that specify actions to be taken by that machine.
  • a device of the present disclosure includes broadly any electronic device that provides voice, video or data communication.
  • the term “machine” shall also be taken to include any collection of machines that individually or jointly execute a set (or multiple sets) of instructions to perform any one or more of the methodologies discussed herein.
  • the computer system 1000 may include a processor 1002 (e.g., a central processing unit (CPU), a graphics processing unit (GPU), or both), a main memory 1004 and a static memory 1006 , which communicate with each other via a bus 1008 .
  • the computer system 1000 may further include a video display unit 1010 (e.g., a liquid crystal display (LCD), a flat panel, a solid state display, or a cathode ray tube (CRT)).
  • the computer system 1000 may include an input device 1012 (e.g., a keyboard), a cursor control device 1014 (e.g., a mouse), a disk drive unit 1016 , a signal generation device 1018 (e.g., a speaker or remote control) and a network interface device 1020 .
  • the disk drive unit 1016 may include a machine-readable medium 1022 on which is stored one or more sets of instructions (e.g., software 1024 ) embodying any one or more of the methodologies or functions described herein, including those methods illustrated above.
  • the instructions 1024 may also reside, completely or at least partially, within the main memory 1004 , the static memory 1006 , and/or within the processor 1002 during execution thereof by the computer system 1000 .
  • the main memory 1004 and the processor 1002 also may constitute machine-readable media.
  • Dedicated hardware implementations including, but not limited to, application specific integrated circuits, programmable logic arrays and other hardware devices can likewise be constructed to implement the methods described herein.
  • Applications that may include the apparatus and systems of various embodiments broadly include a variety of electronic and computer systems. Some embodiments implement functions in two or more specific interconnected hardware modules or devices with related control and data signals communicated between and through the modules, or as portions of an application-specific integrated circuit.
  • the example system is applicable to software, firmware, and hardware implementations.
  • the methods described herein are intended for operation as software programs running on a computer processor.
  • software implementations can include, but are not limited to, distributed processing or component/object distributed processing, parallel processing, or virtual machine processing, and can also be constructed to implement the methods described herein.
  • the present disclosure contemplates a machine readable medium containing instructions 1024 , or that which receives and executes instructions 1024 from a propagated signal so that a device connected to a network environment 1026 can send or receive voice, video or data, and to communicate over the network 1026 using the instructions 1024 .
  • the instructions 1024 may further be transmitted or received over a network 1026 via the network interface device 1020 .
  • While the machine-readable medium 1022 is shown in an example embodiment to be a single medium, the term “machine-readable medium” should be taken to include a single medium or multiple media (e.g., a centralized or distributed database, and/or associated caches and servers) that store the one or more sets of instructions.
  • the term “machine-readable medium” shall also be taken to include any medium that is capable of storing, encoding or carrying a set of instructions for execution by the machine and that cause the machine to perform any one or more of the methodologies of the present disclosure.
  • machine-readable medium shall accordingly be taken to include, but not be limited to: solid-state memories such as a memory card or other package that houses one or more read-only (non-volatile) memories, random access memories, or other re-writable (volatile) memories; magneto-optical or optical medium such as a disk or tape; and carrier wave signals such as a signal embodying computer instructions in a transmission medium; and/or a digital file attachment to e-mail or other self-contained information archive or set of archives is considered a distribution medium equivalent to a tangible storage medium. Accordingly, the disclosure is considered to include any one or more of a machine-readable medium or a distribution medium, as listed herein and including art-recognized equivalents and successor media, in which the software implementations herein are stored.
  • inventive subject matter may be referred to herein, individually and/or collectively, by the term “invention” merely for convenience and without intending to voluntarily limit the scope of this application to any single invention or inventive concept if more than one is in fact disclosed.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Human Computer Interaction (AREA)
  • Computer Security & Cryptography (AREA)
  • Software Systems (AREA)
  • Business, Economics & Management (AREA)
  • General Business, Economics & Management (AREA)
  • General Health & Medical Sciences (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Cardiology (AREA)
  • Biophysics (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Health & Medical Sciences (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

A system that incorporates teachings of the present disclosure may include, for example, a biometric accessory having a controller to detect at least one of navigation information and biometric information associated with a user of the accessory, and transmit at least one of the navigation information and biometric information to a software application, wherein an input function of the accessory which is correlated with at least one of the navigation information and the biometric information is assigned to an action of a plurality of associable actions by the software application, wherein a stimulation of the input function is detected by the software application, wherein the action is retrieved by the software application based on the stimulation being detected, and wherein the retrieved associable action is transmitted by the software application to an operating system. Additional embodiments are disclosed.

Description

    FIELD OF THE DISCLOSURE
  • The present disclosure relates generally to accessory management applications, and more specifically to an apparatus for associating physical characteristics with commands.
  • BACKGROUND
  • It is common today for gamers to utilize more than one gaming accessory while utilizing a gaming or other software application. This is especially true of gamers who play Massively Multiplayer On-line (MMO) games in a team or individual configuration. Gamers can have at their disposal accessories such as a keyboard, a general purpose gaming pad, a mouse, a gaming console controller, a headset with a built-in microphone to communicate with other players, a joystick, a computer display, or other common gaming accessories.
  • In addition, a gamer can utilize biometric sensing devices to serve as another option to control and/or manage gaming and other software applications. A gamer can frequently use a combination of these accessories during a game (e.g., biometric sensing devices, headset, a keyboard, and mouse) or even use one accessory to replace the function of another accessory. Efficient management and utilization of these accessories can frequently impact the gamer's experience during a game.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • FIGS. 1-3 depict illustrative embodiments of a Graphical User Interface (GUI) generated by an Accessory Management Software (AMS) application according to the present disclosure;
  • FIGS. 4-6 depict illustrative methods describing the operation of the AMS application;
  • FIGS. 7-9 depict a biometric sensing device featuring various detectable finger configurations, which can be associated with various actions; and
  • FIG. 10 depicts an illustrative diagrammatic representation of a machine in the form of a computer system within which a set of instructions, when executed, may cause the machine to perform any one or more of the methodologies disclosed herein.
  • DETAILED DESCRIPTION
  • One embodiment of the present disclosure entails a computer-readable storage medium having computer instructions to present in a graphical user interface a plurality of associable actions and a biometric sensing accessory, wherein a fingerprint is detectable by the biometric sensing accessory and the fingerprint is correlated to an input function associated with the biometric sensing accessory, associate one of the plurality of associable actions with the input function, detect a stimulation of the input function by monitoring the biometric sensing accessory, wherein the stimulation of the input function occurs based on a detection of the fingerprint, retrieve the action associated with the input function, and transmit the action to an operating system.
  • One embodiment of the present disclosure entails a biometric accessory having a controller to detect at least one of navigation information and biometric information associated with a user of the accessory, and transmit at least one of the navigation information and biometric information to a software application, wherein an input function of the accessory which is correlated with at least one of the navigation information and the biometric information is assigned to an action of a plurality of associable actions by the software application, wherein a stimulation of the input function is detected by the software application, wherein the action is retrieved by the software application based on the stimulation being detected, and wherein the retrieved associable action is transmitted by the software application to an operating system.
  • One embodiment of the present disclosure entails a computer-readable storage medium having computer instructions to receive from a software application operably coupled to a biometric sensing accessory an action associated with an input function of the biometric sensing accessory, wherein the input function is correlated to at least one of navigation information and biometric information detected by the biometric sensing accessory, wherein a stimulation of the input function is detected by the software application, and wherein the action is retrieved by the software application when the stimulation is detected, and perform the received action.
  • FIGS. 1-3 depict illustrative embodiments of a Graphical User Interface (GUI) generated by an Accessory Management Software (AMS) application according to the present disclosure. The AMS application can operate in a computing device such as a desktop computer, a laptop computer, a server, a mainframe computer, or a gaming console. A gaming console can represent a gaming device such as a Playstation 3™, a Wii™, or an Xbox360™. Other present and next generation gaming consoles are contemplated. The AMS application can also operate in other computing devices with less computing resources such as a cellular phone, a personal digital assistant, or a media player (such as an iPOD™). From these illustrations it would be apparent to an artisan with ordinary skill in the art that the AMS application can operate in any device with computing resources.
  • FIGS. 4-6 depict illustrative methods 400-600 describing the operation of the AMS application as shown in FIGS. 1-3. Method 400 can begin with step 402 in which the AMS application is invoked in a computing device. The invocation step can result from a user selection of the AMS application from a menu or iconic symbol presented on a desktop of the computing device by an operating system (OS) managing operations thereof. In step 404, the AMS application can detect by way of drivers in the OS a plurality of operationally distinct accessories communicatively coupled to the computing device. However, the accessories do not necessarily have to be operationally distinct, and can have similar features and/or operational capabilities. The accessories can be coupled to the computing device by a tethered interface (e.g., USB cable), a wireless interface (e.g., Bluetooth or Wireless Fidelity—WiFi), or combinations thereof.
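  • As a rough illustration of the detection in step 404, the sketch below enumerates HID-class accessories visible through the OS driver layer. This is a minimal sketch, assuming the third-party hidapi Python bindings (the `hid` package); the AMS application is not published as code, and the field names come from that library, not from this disclosure.

```python
# Minimal sketch of step 404: enumerate accessories exposed by OS drivers.
# Assumes the third-party hidapi bindings ("pip install hid"); illustrative
# only, not the actual AMS implementation.
import hid

def detect_accessories():
    """Return one descriptor per HID accessory coupled to this computer."""
    accessories = []
    for info in hid.enumerate():  # queries the OS driver layer
        accessories.append({
            "vendor_id": info["vendor_id"],
            "product_id": info["product_id"],
            "name": info.get("product_string") or "unknown accessory",
        })
    return accessories

if __name__ == "__main__":
    for accessory in detect_accessories():
        print(accessory)
```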
  • In the present context, an accessory can represent any type of device which can be communicatively coupled to the computing device and which can control aspects of the OS and/or a software application operating in the computing device. An accessory can represent for example a keyboard, a biometric sensing device, a gaming pad, a mouse, a gaming console controller, a joystick, a microphone, or a headset with a microphone—just to mention a few. The keyboard and gaming pad represent accessories of a similar category since their operational parameters are alike.
  • A mouse, on the other hand, represents an accessory which can have disparate operational parameters from the keyboard or gaming pad. For instance, the operational parameters of a keyboard generally consist of alphanumeric keys, control keys (e.g., Shift, Alt, Ctrl), and function keys while the operational parameters of a mouse consist of navigation data generated by a tracking device such as a laser sensor, buttons to invoke GUI selections, and settings thereof (e.g., counts or dots per inch, acceleration, scroll speed, jitter control, line straightening control, and so on). Such distinctions can be used to identify disparate categories of accessories.
  • Additionally, a biometric sensing device or other device capable of recognizing and/or sensing the physical characteristics of or detecting actions performed by a person can have different operational parameters as well. In the case of a biometric sensing device, such as a fingerprint detection pad, the operational parameters can include, but are not limited to including, navigation or input data generated by sensing a finger or fingers dragged across the surface of the pad and input or other data generated by sensing a finger pressing against the pad. The navigation and input data can also be generated by sensing other actions performed with respect to the pad. The pad, for example, can include a touchscreen, which can detect a user's touch through the use of capacitive, resistive, surface acoustic wave, projected capacitance, infrared, strain gauge, optical, and/or acoustic pulse touchscreen technologies.
  • Notably, the biometric sensing device can be configured to detect each fingerprint of a user or other users and each fingerprint can be correlated with a particular input function of the biometric sensing device. In other words, the fingerprints can serve as inputs. Additionally, combinations of fingers can be correlated with input functions of the biometric sensing device as well. For example, a combination of a user's thumb and index finger can be correlated to an input function. The biometric sensing device can also be configured to detect and correlate eye movements, blood pressure, heart rates, body movements, and other physical characteristics of a person to various other input functions of the sensing device.
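  • To make the correlation concrete, the sketch below models fingerprints and finger combinations as keys of a lookup table whose values are input-function identifiers. Labels such as `FP_L_INDEX` are hypothetical; a real sensing device would report opaque template identifiers from its matcher.

```python
# Hypothetical correlation of fingerprints (and combinations of them) to
# input functions of the sensing device; all labels are illustrative.
CORRELATIONS = {
    frozenset({"FP_L_INDEX"}): "INPUT_FN_1",                # single finger
    frozenset({"FP_R_THUMB"}): "INPUT_FN_2",
    frozenset({"FP_R_THUMB", "FP_L_INDEX"}): "INPUT_FN_3",  # combination
}

def input_function_for(detected):
    """Map a set of detected fingerprints to its correlated input function."""
    return CORRELATIONS.get(frozenset(detected))

assert input_function_for({"FP_L_INDEX", "FP_R_THUMB"}) == "INPUT_FN_3"
```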
  • Furthermore, the sensing device can be configured to detect a user's act of touching the surface of the pad or touchscreen, the user's dragging of a finger or fingers on the surface of the screen, and other actions which are independent of the physical characteristics of the user, and to utilize these actions as inputs. The biometric sensing device can exist as a single device, multiple devices, a pad integrated with a display, and in other configurations. The sensing device can be further configured to have a keypad or a touchscreen keypad and can have touch zones, where each zone can be tailored to perform a particular function. Joysticks, game controllers, and other input devices represent additional categories of accessories supported by the AMS application.
  • In step 406, the AMS application presents a GUI 101 such as depicted in FIG. 1 with operationally distinct accessories such as the keyboard 108, mouse 110, headset 114, game controller 115, and biometric sensing device (or other sensing device) 116. In an embodiment, the GUI 101 can be displayed on the biometric sensing device 116 if the device has a display. The GUI 101 presents the accessories 108-116 in a scrollable section 117. One or more accessories can be selected by a user with a common mouse pointer or by tapping or dragging a finger on the touchscreen/pad of the sensing device 116. In this illustration, the keyboard 108 and the biometric sensing device 116 were selected with a pointer for customization. Upon selecting the keyboard 108 and biometric sensing device 116 in section 117, the AMS application presents the keyboard 108 and biometric sensing device 116 in split windows 118, 120, respectively, to help the user during the customization process.
  • In step 408, the AMS application can be programmed to detect a user-selection of a particular software application such as a game. This step can be the result of the user entering in a Quick Search field 160 the name of a gaming application (e.g., World of Warcraft™). Upon identifying a gaming application, the AMS application can retrieve in step 410 from a remote or local database gaming application actions which can be presented in a scrollable section 139 of the GUI represented as “Actions” 130. The actions can be tactical actions 132, communication actions 134, menu actions 136, and movement actions 138, or any other types of actions, which can be used to invoke and manage features of the gaming application.
  • The actions presented descriptively in section 130 of the GUI can represent a sequence of accessory input functions which a user can stimulate by button depressions, navigation, performing actions with the sensing device 116, or speech. For example, depressing the left index finger on the biometric sensing device 116 can represent the tactical action “Reload”, while the simultaneous keyboard depressions “Ctrl A” can represent the tactical action “Melee Attack”. For ease of use, the “Actions” 130 section of the GUI is presented descriptively rather than by a description of the input function(s) of a particular accessory.
  • Any one of the Actions 130 can be associated with one or more input functions of the accessories by way of a simple drag and drop action. For instance, a user can select a “Melee Attack” by placing a pointer 133 over an iconic symbol associated with this action by utilizing the mouse 110 or by dragging the pointer 133 using a pad/touchscreen of the biometric sensing device 116. Upon doing so, the symbol can be highlighted to indicate to the user that the icon is selectable. At this point, the user can select the icon by holding the left mouse button and dragging the symbol (or by utilizing the touchscreen of the sensing device 116) to any of the input functions (e.g., buttons) of the keyboard 108, mouse 110, or biometric sensing device 116 to make an association with an input function of one of these accessories.
  • For example, the user can drag the Melee Attack symbol to a particular region of a touchscreen of the sensing device 116, thereby causing an association between the associated region and the gaming action of a Melee Attack. When the region of the sensing device 116 is tapped or otherwise touched during normal operation of a game, the AMS application can detect the selection as a “trigger” to generate the key sequence “Ctrl A”, which is understood by the gaming application as a request for a Melee Attack. The gaming application receives from the AMS application by way of an operating system the “Ctrl A” sequence as if it had been generated by a Qwerty keyboard.
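  • The Melee Attack example reduces to a small association table keyed by the stimulated input function, as in the sketch below. The region name, action label, and key sequence mirror the example above; the data shape itself is an assumption, not the patent's actual model.

```python
# Illustrative association built by drag and drop: a touchscreen region of
# the sensing device maps to a gaming action, which defines the key
# sequence the gaming application expects from a Qwerty keyboard.
ASSOCIATIONS = {
    "region_upper_left": {"action": "Melee Attack", "keys": ["Ctrl", "A"]},
}

def on_region_tapped(region):
    """Return the key sequence to report to the OS, as if typed."""
    entry = ASSOCIATIONS.get(region)
    return entry["keys"] if entry else None

assert on_region_tapped("region_upper_left") == ["Ctrl", "A"]
```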
  • Additionally, the sensing device 116 and/or the AMS application can be configured to record the fingerprints, a combination of fingerprints, eye movements, body movements, heart rates, blood pressure, and other physical characteristics of the user and correlate the characteristics to input functions of the sensing device 116. The user can then associate any one of the actions 130 to an input function correlated with a particular physical characteristic. As an example, the biometric sensing device 116 is illustratively shown in split window 120 of FIG. 1 with a view from underneath the surface of the sensing device 116. A user's hands are placed on the top surface of the sensing device 116 so that the user's right hand is illustratively shown on the left and the user's left hand is illustratively shown on the right. Each fingerprint 116 a-e of the right hand and each fingerprint 116 f-j of the left hand can be detected by the sensing device 116 and can be transmitted to the AMS application. The AMS application can then be utilized to associate actions 130 to each fingerprint 116 f-j or to a combination of the fingerprints 116 f-j.
  • For example, the fingerprint corresponding to the left ring finger 116 i of the user can be associated with the “Night Vision” action under the Tactics 132 menu and the right thumb can be assigned to “Melee Attack.” Upon tapping a touch screen of the sensing device 116 with the user's left ring finger 116 i during a game, a night vision mode can be triggered during game play. By tapping the screen again with the ring finger 116 i, the night vision mode can be toggled off or even lead to another action. Of course, each finger or combination of fingers can be associated with a particular action. Similarly, a user's eye movement to the left can be associated with the “Move Left” action and eye movement to the right can be associated with the “Move Right” action under the Movement 138 menu.
  • With this in mind, attention is directed to step 412 where the AMS application can respond to a user selection of a profile. A profile can be a device profile or master profile invoked by selecting GUI button 156 or 158, each of which can identify the association of actions with input functions of one or more accessories. If a profile selection is detected in step 412, the AMS application can retrieve macro(s) and/or prior associations of actions with the accessories as defined by the profile. For example, if a certain set of fingers or finger combinations are associated with certain actions for a particular video game in a selected profile, the associations of actions stored in the profile can be retrieved in anticipation of playing the video game. The actions and/or macros defined in the profile can also be presented in step 416 by the AMS application in the actions column 130 of the GUI 101 to modify or create new associations.
  • In step 418, the AMS application can also respond to a user selection to create a macro. A macro in the present context can represent a subset of actions that can be presented in the Actions column 130. Any command which can be recorded by the AMS application can be used to define a macro. A command can represent a sequence of input functions of an accessory, identification of a software application to be initiated by an operating system (OS), or any other recordable stimulus to initiate, control or manipulate software applications. For instance, a macro can represent a user entering the identity of a software application (e.g., instant messaging tool) to be initiated by an OS. A macro can also represent recordable speech delivered by a microphone singly or in combination with a headset for detection by another software application through speech recognition or for delivery of the recorded speech to other parties. In an embodiment, the macro can represent recordable taps, finger drags, or other actions performed on a touchscreen/pad of the sensing device 116. In yet another embodiment, a macro can represent recordable navigation of an accessory such as a mouse or joystick, recordable selections of buttons on a keyboard, a mouse, or a mouse pad, and so on. Macros can also be combinations of the above illustrations. Macros can be created from the GUI 101 by selecting a “Record Macro” button 148. The macro can be given a name and category in user-defined fields 140 and 142.
  • Upon selecting the Record Macro button 148, a macro can be generated by selection of input functions on an accessory (e.g., Ctrl A, speech, touch screen region, etc.) and/or by manual entry in field 144 (e.g., typing the name and location of a software application to be initiated by an OS). Once the macro is created, it can be tested by selecting button 150 which can repeat the sequence specified in field 144. The clone button 152 can be selected to replicate the macro sequence if desired. Fields 152 can also present timing characteristics of the stimulation sequence in the macro with the ability to customize such timing. Once the macro has been fully defined, selection of button 154 records the macro in step 420. The recording step can be combined with a step for adding the macro to the associable items Actions column 130, thereby providing the user the means to associate the macro with input functions of the accessories.
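  • A recorded macro of this kind can be modeled as a named, categorized sequence of timed stimuli, so that the timing characteristics mentioned above can be inspected and customized. The sketch below is one plausible shape for such a record, assuming monotonic timestamps; none of these names come from the disclosure.

```python
import time
from dataclasses import dataclass, field

@dataclass
class Macro:
    """A named, categorized sequence of timed stimuli (hypothetical shape)."""
    name: str
    category: str
    steps: list = field(default_factory=list)  # (seconds, stimulus) pairs

    def record(self, stimulus):
        self.steps.append((time.monotonic(), stimulus))

    def timings(self):
        """Inter-step delays: the values a user could customize before saving."""
        times = [t for t, _ in self.steps]
        return [b - a for a, b in zip(times, times[1:])]

macro = Macro(name="IM ally", category="communication")
macro.record("finger_drag_back")
macro.record("launch:instant_messenger")
print(macro.timings())  # one small delay between the two recorded steps
```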
  • In step 422, the AMS application can respond to drag and drop associations between actions and input functions of the keyboard 108 and the biometric sensing device 116. If an association is detected, the AMS application can proceed to step 424 where it can determine if a profile has been identified in step 412 to record the association(s) detected. If a profile has been identified, the associations are recorded in said profile in step 426. If a profile was not identified in step 412, the AMS application can create a profile in step 428 for recording the detected associations. In the same step, the user can name the newly created profile as desired. The newly created profile can also be associated with one or more software applications in step 430 for future reference.
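  • Profiles lend themselves to a simple serialized record mapping input functions to actions, tagged with the software applications they apply to (steps 426-430). The JSON layout below is a guess at such a record for illustration, not a documented AMS format.

```python
import json

def save_profile(path, profile):
    """Record detected associations under a named profile."""
    with open(path, "w") as f:
        json.dump(profile, f, indent=2)

def load_profile(path):
    with open(path) as f:
        return json.load(f)

profile = {
    "name": "my-mmo-profile",               # user-chosen name (step 428)
    "applications": ["World of Warcraft"],  # association of step 430
    "associations": {"FP_L_RING": "Night Vision",
                     "FP_R_THUMB": "Melee Attack"},
}
save_profile("profile.json", profile)
assert load_profile("profile.json")["name"] == "my-mmo-profile"
```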
  • The GUI 101 presented by the AMS application can have other functions. For example, the GUI 101 can provide options for layout of the accessory selected (button 122), how the keyboard is illuminated when associations between input functions and actions are made (button 124), and configuration options for the accessory (button 126). Configuration options can include operational settings of the mouse 110 such as Dots Per Inch or Counts Per Inch, and so on. The AMS application can adapt the GUI 101 to present more than one functional perspective. For instance, by selecting button 102, the AMS application can adapt the GUI 101 to present a means to create macros and associate actions to accessory input functions as depicted in FIG. 1. Selecting button 104 can cause the AMS application to adapt the GUI 101 to present statistics in relation to the usage of accessories as depicted in FIGS. 2-3. Selecting button 106 can cause the AMS application to adapt the GUI 101 to present promotional offers and software updates.
  • It should be noted that the steps of method 400 in whole or in part can be repeated until a desirable pattern of associations of actions to input functions of the selected accessories has been accomplished. It would be apparent to an artisan with ordinary skill in the art that there can be numerous other approaches to accomplish similar results. These undisclosed approaches are contemplated by the present disclosure.
  • FIG. 5 depicts a method 500 in which the AMS application can be programmed to recognize unknown accessories so that method 400 can be applied to them as well. Method 500 can begin with step 502 in which the AMS application detects an unknown accessory such as a new keyboard from an unknown vendor by way of a communicative coupling to a computing device from which the AMS application operates. The AMS application in this instance can receive an identity from the keyboard or the operating system which is not known to the AMS application. Upon detecting an unknown accessory, the AMS application in step 504 can present a depiction of an accessory of similar or same category in response to a user providing direction as to the type of accessory (by selecting, for example, a drop-down menu). Alternatively, or in combination with the user instructions, the AMS application can determine an accessory type from the information received from the unknown accessory.
  • In step 506 the AMS application can receive instructions describing all or a portion of the input functions of the unknown accessory. These instructions can come from a user who defines each input function individually or responds to inquiries provided by the AMS application. The AMS application can for example make an assumption as to the keyboard layout and highlight each key with a proposed function which the user can verify or modify. Once the AMS application has been provided instructions in step 506, the AMS application can create an accessory identity in step 508 which can be defined by the user. In steps 510 and 512, the AMS application can associate and record the accessory instructions with the identity for future recognition of the accessory. In step 514, the AMS application can present a depiction of the new accessory with its identity along with the other selectable accessories in section 117.
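  • Method 500 amounts to building a registry entry from user-supplied or inferred metadata so the accessory is recognized on future couplings. A minimal sketch of such a registry follows; the field names are assumptions.

```python
# Hypothetical registry for method 500: once an unknown accessory has been
# described (steps 504-512), it can be recognized the next time it couples.
registry = {}

def register_accessory(identity, category, input_functions):
    """Associate and record accessory instructions with an identity."""
    registry[identity] = {
        "category": category,                # e.g. "keyboard-like" (step 504)
        "input_functions": input_functions,  # user-verified layout (step 506)
    }

def recognize(identity):
    return registry.get(identity)            # None -> still unknown

register_accessory("vendor-x-keyboard", "keyboard-like", ["Esc", "F1", "F2"])
assert recognize("vendor-x-keyboard")["category"] == "keyboard-like"
```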
  • Method 500 can provide a means for universal detection and identification of any accessory which can be used to control or manage software applications operating in a computing device.
  • FIG. 6 depicts a method 600 for illustrating the AMS application responding to input function stimuli (triggers) of accessories. Method 600 can begin with step 602 in which the AMS application monitors the use of accessories, such as the biometric sensing device 116. This step can represent monitoring the stimulation of input functions of one or more accessories communicatively coupled to a computing device from which the AMS application operates. The input functions can correspond to button depressions on a keyboard, gaming pad, or navigation device such as a mouse, or can correspond to fingerprints, eye movements, heart rate increases, or blood pressure changes detected by the sensing device 116. The input functions can also represent navigation instructions such as eye, finger, mouse, or joystick movements. The input functions can further represent speech supplied by a microphone singly or in combination with a headset. Other existing or future input functions of an accessory detectable by the AMS application are contemplated by the present disclosure. The AMS application can monitor input functions by, for example, processing human interface device (HID) reports supplied by the accessories to the computing device.
  • Once one or more stimulations have been detected in step 604, the AMS application can proceed to step 606 to determine if action(s) have been associated with the detected stimulation(s). If, for example, the stimulations detected correspond to keyboard 108 and/or taps/drags on the touchscreen of the sensing device 116, the AMS application can determine if actions have been associated and recorded for such stimulations. If these stimulations “trigger” one or more actions, the AMS application can proceed to step 608 where it retrieves the stimulation definition of these actions for each accessory reporting a stimulation. In step 610, the AMS application can substitute the detected stimulations with the stimulations defined by the action.
  • To illustrate this substitution, suppose for example that the detected stimulation was “Ctrl A” simultaneously depressed on a keyboard. Suppose further that an action associated with this stimulus consists of a macro that combines finger dragging with a navigation of the sensing device 116 (e.g., moving one's finger quickly in a backward motion for a given distance), and a request to invoke an instant messaging (IM) session with a particular individual using Skype™ or some other common IM tool. In step 610, the AMS application would substitute “Ctrl A” with stimulations consisting of the finger drags, navigation, and a request for an IM application. The substitute stimulations would then be reported in step 612 to an operating system (OS).
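  • This substitution reduces to a loop that looks each detected stimulation up in the association table and either substitutes the stimulations defined by the recorded action or passes the raw stimulation through unchanged (the no-association case revisited below). The sketch shows that control flow under assumed data shapes.

```python
def handle_stimulation(stimulus, associations, report_to_os):
    """Substitute an associated action's stimulations, else pass through."""
    action = associations.get(stimulus)
    if action is None:
        report_to_os([stimulus])               # no association: unchanged
    else:
        report_to_os(action["stimulations"])   # steps 610-612: substituted

associations = {
    ("Ctrl", "A"): {"stimulations": ["finger_drag_back", "launch:im"]},
}
handle_stimulation(("Ctrl", "A"), associations, print)  # substituted
handle_stimulation(("Ctrl", "B"), associations, print)  # passed through
```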
  • In step 616, the OS can determine whether to pass the substitute stimulations to an active software application in operation (e.g., a gaming application) and/or to invoke another software application. The active software application can be operating from the same computer system from which the OS and the AMS application operate or can be operating at a remote system such as an on-line server or family of servers (e.g., World of Warcraft) awaiting stimulation data from the computer system. In this illustration, the macro comprises both stimulation feedback for the active software application and a request to initiate an IM session.
  • Accordingly, the OS conveys in step 618 the sensing device's 116 stimulation signals to the active software application (e.g., gaming application), and in a near simultaneous fashion invokes the IM session in step 620 with a specific individual (or organization). In another example, if a user's right thumb is assigned to the “Melee Attack” action and the user's left thumb is assigned to “Throw Frag,” the OS can transmit the actions to a gaming application which is currently executing and an in-game entity (such as a video game character) can perform the received actions.
  • To provide a further example, FIGS. 7-9 are illustrations depicting the use of various fingerprint combinations, which can be associated with various actions. FIG. 7 illustrates a user pressing his/her index finger 702 a, middle finger 702 b, and ring finger 702 c onto the surface of a sensing device. The combination of the three fingers 702 a-c can be assigned to the “Throw Special” action from FIG. 1, and when the user presses the three fingers 702 a-c on the sensing device, the sensing device can transmit a signal to the AMS application that three fingers 702 a-c were pressed. Once the AMS application receives the signal, the AMS application can query a database and find that the three fingers 702 a-c were associated with the action “Throw Special.” The AMS application can then transmit the action to a software application which can utilize the action. The action may cause, in the case of the software application being a video game, an in-game entity, such as a video game character, to throw some special weapon or special object.
  • Similarly, FIGS. 8 and 9 feature fingerprint combinations 802 a-b and 902 a-d, which can also be associated with actions as well. For example, the combination 802 a-b can be associated with the “Menu” action from FIG. 1, and when depressing fingers 802 a-b on the sensing device, the AMS application can transmit the action to a software application, which can then activate the “Menu” action and open a menu. Finger combination 902 a-d can be associated with multiple actions or a macro and the actions or macro can be similarly utilized by the software application.
  • Referring back to step 606, the illustrations above cover a scenario in which the AMS application has detected an association of actions to accessory stimuli. If however the AMS application does not detect such an association, then the detected stimulus (or stimuli) supplied by one or more accessories is transmitted to the OS in step 614. For example, it may be that a stimulation based on the depressions of “Ctrl A” has no particular association to an action. In this case, the AMS application passes this stimulation to the OS with no substitutes. In step 616 the OS can determine if this stimulation invokes a new software application in step 620 or is conveyed to the previously initiated software application.
  • Contemporaneously with the embodiments described above, the AMS application can also record in step 622 statistics relating to the detected accessory stimulations. A portion of the AMS application can operate as a background process which performs statistical analysis on the stimulations detected. By selecting button 104 in FIG. 1, the AMS application can provide an updated GUI which illustrates the usage of input functions of one or more accessories for which stimulations were detected in step 604. For ease of illustration, only a keyboard accessory is shown. In this illustration, certain keys (references 204, 206, 208, 210) on the keyboard are color-coded to illustrate the frequency of usage of these keys.
  • A color scale 203 defines the frequency of usage of the input functions of the keyboard. The first end of the scale (navy blue) represents a single detected depression, while the opposite end of the scale (bright red) represents 500 detected depressions. Based on this scale, the AMS application maps by color in step 624 stimulations of the keyboard. For example, the key grouping 208 depicts a color coding with the highest detectable usage, while the F7 key (reference 210) indicates the fewest depressions. Keys having zero depressions are not color-coded, to readily identify the color mapping of keys which were used at least once.
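  • The color mapping of step 624 can be approximated by interpolating between the two ends of the scale, as in the sketch below: navy blue at a single depression blending toward bright red at 500 or more, with unused keys left uncolored. The linear blend is an assumption; the disclosure only fixes the endpoints.

```python
def usage_color(count, max_count=500):
    """Map a depression count onto the navy-blue-to-bright-red scale."""
    if count == 0:
        return None                        # unused keys stay uncolored
    t = min(count, max_count) / max_count  # 1 press -> ~0.0, 500+ -> 1.0
    navy, red = (0, 0, 128), (255, 0, 0)
    return tuple(round(a + (b - a) * t) for a, b in zip(navy, red))

assert usage_color(0) is None
assert usage_color(500) == (255, 0, 0)  # bright red at the top of the scale
```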
  • The AMS application provides additional functions in a playback panel of the GUI which can help a user understand how the color coded keys were used during an active software application such as a video game. In this section of the GUI, the AMS application can present the user with a playback control function 202 which the user can select to replay, pause, forward or rewind the usage of these keys. When usage playback is selected, the user can for instance see the color coded keys highlighted in real-time with a temporary white border to visualize how the keys were selected. A time clock 204 provides the user the elapsed time of the playback sequence. Button 212 allows the user to retrieve statistics from other sessions, while button 214 provides the user a means to save statistics from a given session.
  • The GUI of FIG. 2 could have been shown as a split screen with all accessories which generated one or more detected stimulations (e.g., keyboard, biometric sensing device, mouse, and microphone), each providing statistical symbolic results as described above for the keyboard. For example, the sensing device 116 can be shown and color coding can illustrate where the user tapped the touchscreen of the sensing device 116 with the highest frequency. Red regions could represent heavily tapped areas of the touchscreen, while blue areas could represent rarely tapped areas. Additionally, if fingerprints or other physical characteristics of the user are recorded, the GUI can display which fingerprints are used the most, which combination of fingerprints is used the most, and so on. Although not shown, split screen embodiments are contemplated by the present disclosure for the GUI of FIG. 2.
  • In addition to a symbolic representation as shown in FIG. 2, the AMS application can provide the user a means to visualize raw statistics in a table format such as shown in FIG. 3 by selecting button 212. The table format shows raw data in section 302 and possible suggestions in section 304 for improving user performance which can be generated by the AMS application in step 626. Section 302 can be presented in a table format with a column identifying the key being analyzed, its usage, and number of key presses. The user can ascertain from this table the most and least frequently used keys as well as other identifiable patterns. Similarly, the table can include statistics on how many times a particular finger or finger combination was used. For example, a user's first left finger, FP1, may have had a usage duration of 03:05:23 and may have been detected 295 times by the AMS application. Additionally, the table can include which heart rate was the most frequent, which eye direction is most frequently used, and other similar statistics for any type and number of detectable physical characteristics.
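  • The raw table of section 302 is essentially a per-input-function tally of detections and hold durations. One plausible accumulator is sketched below; the FP1-style label and the duration field are modeled on the example figures quoted above and are otherwise assumptions.

```python
from collections import defaultdict

class UsageStats:
    """Tally detections and hold durations per input function (illustrative)."""
    def __init__(self):
        self.presses = defaultdict(int)
        self.duration = defaultdict(float)  # seconds held, per input function

    def record(self, key, held_for):
        self.presses[key] += 1
        self.duration[key] += held_for

    def ranked(self):
        """Most- to least-used input functions, as in section 302."""
        return sorted(self.presses.items(), key=lambda kv: -kv[1])

stats = UsageStats()
stats.record("FP1", 0.4)
stats.record("FP1", 0.3)
print(stats.ranked())  # [('FP1', 2)]
```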
  • The AMS application can utilize an understanding of the layout of the accessory (in this case, the keyboard or sensing device) to determine from the statistics ways that the user can improve response time or ergonomic use. For example, the AMS application can determine from a layout analysis that the key combination <Alt .> can be reassigned to a macro based on the trigger <Ctrl F> which could provide the user a faster response time and free up the user's right hand for other tasks. The AMS application can also provide alternative suggestions. For example, the AMS application can also suggest creating single button macros for each of the key combinations <Alt .> and <Ctrl A> which can be assigned to keys on the keyboard or left and right buttons of a mouse. The latter suggestion of assigning macros to the mouse can help the user free up his/her left hand.
  • Similarly, with regard to the sensing device 116, the AMS application can utilize information about the sensing device 116 to recommend to the user better or optimal associations. For example, the AMS application may determine that a particular region of the sensing device's 116 touchscreen is over-utilized and that distributing finger touches or drags over a larger portion of the screen would be more advantageous. Additionally, the AMS application may determine that particular combinations of fingers associated with certain actions would be more advantageous than current associations. For example, as shown in FIG. 3, the AMS application can determine that the user should replace the combination of finger two and finger three of the user's left hand with finger one of the user's right hand. The AMS application may suggest such a change to increase response time, reduce the total number of combinations/fingerprints associated with the user's left hand, take advantage of the user's right hand, or for other reasons. Furthermore, the application can determine that the user should be using fingers in conjunction with eye movements or other body movements.
  • The AMS application can utilize present and next generation algorithms to determine how to improve response times and ergonomic usage of accessory devices. The AMS application can for example have at its disposal an understanding of the layout of each accessory, the type of software being controlled by the accessory (e.g., World of Warcraft), type of operations commonly used to control the software (e.g., known actions as shown in the actions column 130 of FIG. 1), an understanding of the associations made by other users (e.g., gamers) to improve their performance when controlling the software, and so on. The AMS application can also be adapted to communicate with the active software application by way of an Application Programming Interface (API) to receive additional usage statistics from the software which it can in turn use to improve the user's performance. The AMS application can also utilize common statistical and behavior modeling techniques to predict the behavior of the user and responses from the software application to identify possible ways to improve the user's performance.
  • From these illustrations, it would be apparent to an artisan of ordinary skill in the art that innumerable algorithms can be developed to analyze accessory usage and thereby suggest improvements. These undisclosed embodiments are contemplated by the present disclosure.
  • From the foregoing descriptions, it would be evident to an artisan with ordinary skill in the art that the aforementioned embodiments can be modified, reduced, or enhanced without departing from the scope and spirit of the claims described below. For example, method 400 can be adapted to define more than one programmable layer for an accessory. Such a feature can extend the functionality of an accessory into multi-layer paradigms of input functions. The GUI of FIG. 1 can be adapted so that a user can specify more than one programmable layer for a specific accessory.
  • The user can also specify which layer to present in FIG. 1 while associating actions. If for instance layer 1 is shown, the GUI of FIG. 1 can present the actions associated in this layer by presenting descriptors superimposed on the input functions (e.g., buttons or keys or regions of a touchscreen). When the user switches to layer 2 (e.g., by selecting from a drop-down menu the layer of interest) the accessory can be shown in the GUI with a different set of associated actions. The user can define a macro or identify a key sequence to switch between layers when the accessory is in use.
  • The trigger for switching between layers can be a toggle function (e.g., selecting the tab key on a Qwerty keyboard or tapping a certain region of a touchscreen of the sensing device) to switch between layers in a round robin fashion (layer 1 → layer 2 → layer 3 → layer 1, and so on). Alternatively, the user can define a hold and release trigger to switch between layers. In this embodiment, the user moves to another layer while pressing a button (e.g., a “Shift” key) or portion of a touchscreen of the sensing device 116 and returns to the preceding layer upon its release. In yet another embodiment, the trigger to switch layers can be defined differently per layer. The user can for example select the letter “A” in layer 1 to proceed to layer 2, and select the letter “B” in layer 2 to return to layer 1 or proceed to yet another layer 3. There can be numerous combinations of layers and triggers which can be defined to substantially expand the capability of a single accessory. Additionally, triggers can be of any kind (e.g., tactile, speech, and so on).
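  • The round-robin toggle and the hold-and-release trigger described above can be captured by a small state machine, sketched below under assumed names; per-layer triggers would simply consult a per-layer table before calling these methods.

```python
class LayerState:
    """Tracks the active programmable layer of an accessory (illustrative)."""
    def __init__(self, n_layers):
        self.n = n_layers
        self.current = 0
        self._held_from = None

    def toggle(self):
        """Round-robin trigger: layer 1 -> layer 2 -> layer 3 -> layer 1."""
        self.current = (self.current + 1) % self.n

    def hold(self, layer):
        """Hold-and-release trigger: jump to a layer while a key is pressed."""
        self._held_from = self.current
        self.current = layer

    def release(self):
        """Return to the preceding layer when the held key is released."""
        if self._held_from is not None:
            self.current, self._held_from = self._held_from, None

layers = LayerState(3)
layers.toggle(); layers.toggle(); layers.toggle()
assert layers.current == 0  # three toggles cycle back to the first layer
```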
  • In another embodiment, method 400 can be adapted so that a user can define super macros and/or super profiles. A super macro can represent nested macros (combinations of macros). Method 400 can be adapted so that the user can customize the timing for executing nested macros. Similarly, a super profile can represent nested profiles (combinations of profiles). A super profile can for example comprise sub-profiles, each sub-profile defining associations of actions to input functions of a particular accessory.
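  • A super macro can be flattened recursively: any step that names another macro expands in place, in recording order. A minimal sketch under that assumption (and assuming no macro cycles):

```python
def expand(steps, macros):
    """Flatten a super macro (nested macros) into primitive stimulations."""
    flat = []
    for step in steps:
        if step in macros:        # the step names another (nested) macro
            flat.extend(expand(macros[step], macros))
        else:
            flat.append(step)     # primitive stimulation, kept as-is
    return flat

macros = {"reload_and_duck": ["Reload", "Crouch"]}
assert expand(["reload_and_duck", "Melee Attack"], macros) == [
    "Reload", "Crouch", "Melee Attack"]
```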
  • In yet another embodiment, method 400 can be adapted to establish audio profiles for headset accessories. When a user selects a headset accessory such as 114, GUI 101 can be adapted to provide the user options to establish a sound output (equalizer) setting to optimize performance for a particular gaming application. For instance, GUI 101 can present an equalizer so that the user can raise the volume of high frequencies to hear an enemy's footsteps from a longer distance in a gaming application.
  • In still another embodiment, the method 400 can be adapted to allow a user to authenticate into the AMS application and/or a software application accessible by the AMS application based on a detection of an authorized fingerprint or other physical characteristic detected by the sensing device 116.
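  • In its simplest form, this authentication step is a membership test against enrolled characteristics. The sketch below assumes the sensing device reports an opaque fingerprint template identifier; real matchers score similarity rather than compare identifiers, so this is illustrative only.

```python
ENROLLED = {"fp-template-1a2b"}  # hypothetical enrolled fingerprint templates

def authenticate(reported_template):
    """Gate entry to the AMS application on a recognized characteristic."""
    return reported_template in ENROLLED

assert authenticate("fp-template-1a2b")
assert not authenticate("fp-template-ffff")
```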
  • The foregoing embodiments are a subset of possible embodiments contemplated by the present disclosure. Other suitable modifications can be applied to the present disclosure. Accordingly, the reader is directed to the claims for a fuller understanding of the breadth and scope of the present disclosure.
  • FIG. 10 depicts an exemplary diagrammatic representation of a machine in the form of a computer system 1000 within which a set of instructions, when executed, may cause the machine to perform any one or more of the methodologies discussed above. In some embodiments, the machine operates as a standalone device. In some embodiments, the machine may be connected (e.g., using a network) to other machines. In a networked deployment, the machine may operate in the capacity of a server or a client user machine in server-client user network environment, or as a peer machine in a peer-to-peer (or distributed) network environment.
  • The machine may comprise a server computer, a client user computer, a personal computer (PC), a tablet PC, a laptop computer, a desktop computer, a control system, a network router, switch or bridge, or any machine capable of executing a set of instructions (sequential or otherwise) that specify actions to be taken by that machine. It will be understood that a device of the present disclosure includes broadly any electronic device that provides voice, video or data communication. Further, while a single machine is illustrated, the term “machine” shall also be taken to include any collection of machines that individually or jointly execute a set (or multiple sets) of instructions to perform any one or more of the methodologies discussed herein.
  • The computer system 1000 may include a processor 1002 (e.g., a central processing unit (CPU), a graphics processing unit (GPU), or both), a main memory 1004 and a static memory 1006, which communicate with each other via a bus 1008. The computer system 1000 may further include a video display unit 1010 (e.g., a liquid crystal display (LCD), a flat panel, a solid state display, or a cathode ray tube (CRT)). The computer system 1000 may include an input device 1012 (e.g., a keyboard), a cursor control device 1014 (e.g., a mouse), a disk drive unit 1016, a signal generation device 1018 (e.g., a speaker or remote control) and a network interface device 1020.
  • The disk drive unit 1016 may include a machine-readable medium 1022 on which is stored one or more sets of instructions (e.g., software 1024) embodying any one or more of the methodologies or functions described herein, including those methods illustrated above. The instructions 1024 may also reside, completely or at least partially, within the main memory 1004, the static memory 1006, and/or within the processor 1002 during execution thereof by the computer system 1000. The main memory 1004 and the processor 1002 also may constitute machine-readable media.
  • Dedicated hardware implementations including, but not limited to, application specific integrated circuits, programmable logic arrays and other hardware devices can likewise be constructed to implement the methods described herein. Applications that may include the apparatus and systems of various embodiments broadly include a variety of electronic and computer systems. Some embodiments implement functions in two or more specific interconnected hardware modules or devices with related control and data signals communicated between and through the modules, or as portions of an application-specific integrated circuit. Thus, the example system is applicable to software, firmware, and hardware implementations.
  • In accordance with various embodiments of the present disclosure, the methods described herein are intended for operation as software programs running on a computer processor. Furthermore, software implementations can include, but are not limited to, distributed processing or component/object distributed processing, parallel processing, or virtual machine processing, which can also be constructed to implement the methods described herein.
  • The present disclosure contemplates a machine readable medium containing instructions 1024, or that which receives and executes instructions 1024 from a propagated signal so that a device connected to a network environment 1026 can send or receive voice, video or data, and to communicate over the network 1026 using the instructions 1024. The instructions 1024 may further be transmitted or received over a network 1026 via the network interface device 1020.
  • While the machine-readable medium 1022 is shown in an example embodiment to be a single medium, the term “machine-readable medium” should be taken to include a single medium or multiple media (e.g., a centralized or distributed database, and/or associated caches and servers) that store the one or more sets of instructions. The term “machine-readable medium” shall also be taken to include any medium that is capable of storing, encoding or carrying a set of instructions for execution by the machine and that cause the machine to perform any one or more of the methodologies of the present disclosure.
  • The term “machine-readable medium” shall accordingly be taken to include, but not be limited to: solid-state memories such as a memory card or other package that houses one or more read-only (non-volatile) memories, random access memories, or other re-writable (volatile) memories; magneto-optical or optical medium such as a disk or tape; and carrier wave signals such as a signal embodying computer instructions in a transmission medium; and/or a digital file attachment to e-mail or other self-contained information archive or set of archives is considered a distribution medium equivalent to a tangible storage medium. Accordingly, the disclosure is considered to include any one or more of a machine-readable medium or a distribution medium, as listed herein and including art-recognized equivalents and successor media, in which the software implementations herein are stored.
  • Although the present specification describes components and functions implemented in the embodiments with reference to particular standards and protocols, the disclosure is not limited to such standards and protocols. Each of the standards for Internet and other packet switched network transmission (e.g., TCP/IP, UDP/IP, HTML, HTTP) represent examples of the state of the art. Such standards are periodically superseded by faster or more efficient equivalents having essentially the same functions. Accordingly, replacement standards and protocols having the same functions are considered equivalents.
  • The illustrations of embodiments described herein are intended to provide a general understanding of the structure of various embodiments, and they are not intended to serve as a complete description of all the elements and features of apparatus and systems that might make use of the structures described herein. Many other embodiments will be apparent to those of skill in the art upon reviewing the above description. Other embodiments may be utilized and derived therefrom, such that structural and logical substitutions and changes may be made without departing from the scope of this disclosure. Figures are also merely representational and may not be drawn to scale. Certain proportions thereof may be exaggerated, while others may be minimized. Accordingly, the specification and drawings are to be regarded in an illustrative rather than a restrictive sense.
  • Such embodiments of the inventive subject matter may be referred to herein, individually and/or collectively, by the term “invention” merely for convenience and without intending to voluntarily limit the scope of this application to any single invention or inventive concept if more than one is in fact disclosed. Thus, although specific embodiments have been illustrated and described herein, it should be appreciated that any arrangement calculated to achieve the same purpose may be substituted for the specific embodiments shown. This disclosure is intended to cover any and all adaptations or variations of various embodiments. Combinations of the above embodiments, and other embodiments not specifically described herein, will be apparent to those of skill in the art upon reviewing the above description.
  • The Abstract of the Disclosure is provided to comply with 37 C.F.R. §1.72(b), requiring an abstract that will allow the reader to quickly ascertain the nature of the technical disclosure. It is submitted with the understanding that it will not be used to interpret or limit the scope or meaning of the claims. In addition, in the foregoing Detailed Description, it can be seen that various features are grouped together in a single embodiment for the purpose of streamlining the disclosure. This method of disclosure is not to be interpreted as reflecting an intention that the claimed embodiments require more features than are expressly recited in each claim. Rather, as the following claims reflect, inventive subject matter lies in less than all features of a single disclosed embodiment. Thus the following claims are hereby incorporated into the Detailed Description, with each claim standing on its own as a separately claimed subject matter.

Claims (32)

1. A computer-readable storage medium, comprising computer instructions to:
present in a graphical user interface a plurality of associable actions and a biometric sensing accessory, wherein a fingerprint is detectable by the biometric sensing accessory and the fingerprint is correlated to an input function associated with the biometric sensing accessory;
associate one of the plurality of associable actions with the input function;
detect a stimulation of the input function by monitoring the biometric sensing accessory, wherein the stimulation of the input function occurs based on a detection of the fingerprint;
retrieve the action associated with the input function; and
transmit the action to an operating system.
2. The computer-readable storage medium of claim 1, wherein the fingerprint is a plurality of fingerprints, and wherein each fingerprint or a combination of fingerprints of the plurality of fingerprints is correlated to a different input function.
3. The computer-readable storage medium of claim 2, comprising computer instructions to associate other associable actions of the plurality of associable actions with the different input functions.
4. The computer-readable storage medium of claim 1, wherein the biometric sensing accessory is further configured to detect at least one of an eye movement, a heart rate, a blood pressure, and a body movement.
5. The computer-readable storage medium of claim 4, comprising computer instructions to associate an associable action of the plurality of associable actions to an input function correlated with at least one of the detected eye movement, heart rate, blood pressure, and body movement.
6. The computer-readable storage medium of claim 5, comprising computer instructions to detect a stimulation of the input function correlated with at least one of the eye movement, heart rate, blood pressure, and body movement, wherein the stimulation of the input function occurs based on a detection of at least one of the eye movement, heart rate, blood pressure, and body movement.
7. The computer-readable storage medium of claim 1, comprising computer instructions to authenticate into a software application based on the detection of the fingerprint.
8. The computer-readable storage medium of claim 7, wherein the operating system provides a signal representative of the action to the software application.
9. The computer-readable storage medium of claim 7, wherein the software application is a gaming application which invokes a gaming feature based on the signal received from the operating system.
10. The computer-readable storage medium of claim 9, wherein at least a portion of the plurality of associable actions is associated with one or more actions that control the gaming application.
11. The computer-readable storage medium of claim 1, wherein the operating system launches a software application based on the action.
12. The computer-readable storage medium of claim 1, comprising computer instructions to associate associable actions of the plurality of associable actions to input functions of a plurality of other accessories, wherein the plurality of other accessories comprise at least one of a keyboard, a gaming pad, a mouse, a gaming console controller, a joystick, a microphone, and a headset with a microphone.
13. The computer-readable storage medium of claim 1, comprising computer instructions to store the association of the action with the input function.
14. The computer-readable storage medium of claim 1, comprising computer instructions to store the input function correlated with the fingerprint and the associated action in a profile.
15. The computer-readable storage medium of claim 14, comprising computer instructions to associate the profile to at least one software application.
16. The computer-readable storage medium of claim 1, comprising computer instructions to associate a sequence of stimulations of the input function to a macro.
17. The computer-readable storage medium of claim 16, wherein the macro corresponds to at least one of the plurality of associable actions.
18. The computer-readable storage medium of claim 1, comprising computer instructions to calculate and present a statistical frequency of stimulation of the input function correlated with the fingerprint.
19. A biometric accessory, comprising a controller to:
detect at least one of navigation information and biometric information associated with a user of the accessory; and
transmit at least one of the navigation information and biometric information to a software application, wherein an input function of the accessory which is correlated with at least one of the navigation information and the biometric information is assigned to an action of a plurality of associable actions by the software application, wherein a stimulation of the input function is detected by the software application, wherein the action is retrieved by the software application based on the stimulation being detected, and wherein the retrieved associable action is transmitted by the software application to an operating system.
20. The accessory of claim 19, wherein the biometric information comprises at least one among a fingerprint, an eye movement, a blood pressure, a body movement, and a heart rate.
21. The accessory of claim 19, comprising an integrated display configured to present a graphical user interface which displays the plurality of associable actions and a plurality of accessories of distinct operational types, wherein the plurality of accessories are for interacting with a software application operable in a computer system.
22. The accessory of claim 19, wherein the biometric information is a plurality of physical characteristics, and wherein each physical characteristic or a combination of physical characteristics of the plurality of physical characteristics is correlated to a different input function of the accessory.
23. The accessory of claim 22, wherein the software application associates actions of the plurality of associable actions with the different input functions.
24. The accessory of claim 22, wherein the operating system authenticates into a gaming application based on the detection of the physical characteristic.
25. The accessory of claim 24, wherein the operating system transmits the assigned action or an aspect thereof to the gaming application.
26. The accessory of claim 24, wherein the stimulation of the input function correlated with the physical characteristic is utilized to manipulate in-game entities of the gaming application.
27. A computer-readable storage medium, comprising computer instructions to:
receive from a software application operably coupled to a biometric sensing accessory an action associated with an input function of the biometric sensing accessory, wherein the input function is correlated to at least one of navigation information and biometric information detected by the biometric sensing accessory, wherein a stimulation of the input function is detected by the software application, and wherein the action is retrieved by the software application when the stimulation is detected; and
perform the received action.
28. The computer-readable storage medium of claim 27, wherein the computer instructions represent a gaming application.
29. The computer-readable storage medium of claim 27, wherein the physical characteristic comprises at least one of a fingerprint, an eye movement, a body movement, a heart rate, and a blood pressure.
30. The computer-readable storage medium of claim 27, comprising authenticating a user into the gaming application based on the physical characteristic.
31. The computer-readable storage medium of claim 27, wherein the action is associated with one or more actions that control the gaming application.
32. The computer-readable storage medium of claim 27, wherein the software application is an operating system.
US12/537,823 2009-08-07 2009-08-07 Apparatus for associating physical characteristics with commands Abandoned US20110034248A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US12/537,823 US20110034248A1 (en) 2009-08-07 2009-08-07 Apparatus for associating physical characteristics with commands

Publications (1)

Publication Number Publication Date
US20110034248A1 true US20110034248A1 (en) 2011-02-10

Family

ID=43535233

Family Applications (1)

Application Number Title Priority Date Filing Date
US12/537,823 Abandoned US20110034248A1 (en) 2009-08-07 2009-08-07 Apparatus for associating physical characteristics with commands

Country Status (1)

Country Link
US (1) US20110034248A1 (en)

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5252951A (en) * 1989-04-28 1993-10-12 International Business Machines Corporation Graphical user interface with gesture recognition in a multiapplication environment
US8358200B2 (en) * 2007-10-23 2013-01-22 Hewlett-Packard Development Company Method and system for controlling computer applications
US20100234094A1 (en) * 2007-11-09 2010-09-16 Wms Gaming Inc. Interaction with 3d space in a gaming system
US8525802B2 (en) * 2008-03-31 2013-09-03 Lg Electronics Inc. Portable terminal capable of sensing proximity touch and method for providing graphic user interface using the same
US20100225595A1 (en) * 2009-03-03 2010-09-09 Microsoft Corporation Touch discrimination
US20110021269A1 (en) * 2009-07-27 2011-01-27 Steelseries Hq. Device for managing operations of accessories

Cited By (52)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9547421B2 (en) 2009-07-08 2017-01-17 Steelseries Aps Apparatus and method for managing operations of accessories
US10318117B2 (en) 2009-07-08 2019-06-11 Steelseries Aps Apparatus and method for managing operations of accessories
US10525338B2 (en) 2009-07-08 2020-01-07 Steelseries Aps Apparatus and method for managing operations of accessories in multi-dimensions
US10891025B2 (en) 2009-07-08 2021-01-12 Steelseries Aps Apparatus and method for managing operations of accessories
US11154771B2 (en) 2009-07-08 2021-10-26 Steelseries Aps Apparatus and method for managing operations of accessories in multi-dimensions
US11416120B2 (en) 2009-07-08 2022-08-16 Steelseries Aps Apparatus and method for managing operations of accessories
US11709582B2 (en) 2009-07-08 2023-07-25 Steelseries Aps Apparatus and method for managing operations of accessories
US8449392B2 (en) * 2010-01-22 2013-05-28 Nintendo Co., Ltd. Storage medium having game program stored therein, game apparatus, control method, and game system using a heartbeat for performing a game process in a virtual game world
US20110183757A1 (en) * 2010-01-22 2011-07-28 Nintendo Co., Ltd. Storage medium having game program stored therein, game apparatus, control method, and game system
US10850189B2 (en) * 2011-08-16 2020-12-01 Steelseries Aps Method and apparatus for adapting to gaming venue states
US11806611B2 (en) * 2011-08-16 2023-11-07 Steelseries Aps Method and apparatus for adapting to gaming venue states
US20190105560A1 (en) * 2011-08-16 2019-04-11 Steelseries Aps Method and apparatus for adapting to gaming venue states
US11266905B2 (en) * 2011-08-16 2022-03-08 Steelseries Aps Method and apparatus for adapting to gaming venue states
US20220152480A1 (en) * 2011-08-16 2022-05-19 Steelseries Aps Method and apparatus for adapting to gaming venue states
US10653949B2 (en) 2011-12-29 2020-05-19 Steelseries Aps Method and apparatus for determining performance of a gamer
US10124248B2 (en) 2011-12-29 2018-11-13 Steelseries Aps Method and apparatus for determining performance of a gamer
US9914049B2 (en) 2011-12-29 2018-03-13 Steelseries Aps Method and apparatus for determining performance of a gamer
US9474969B2 (en) * 2011-12-29 2016-10-25 Steelseries Aps Method and apparatus for determining performance of a gamer
US9522395B2 (en) * 2012-02-13 2016-12-20 Thermo Fisher Scientific Oy Pipette with a tracking system
CN108031501A (en) * 2012-02-13 2018-05-15 恩姆菲舍尔科技公司 Electronic pipette
US20150000429A1 (en) * 2012-02-13 2015-01-01 Thermo Fisher Scientific Oy Pipette With A Tracking System
US20190046970A1 (en) * 2012-02-13 2019-02-14 Thermo Fisher Scientific Oy Pipette With A Tracking System
US10105698B2 (en) 2012-02-13 2018-10-23 Thermo Fisher Scientific Oy Pipette with a tracking system
US10814234B2 (en) 2012-03-06 2020-10-27 Steelseries Aps Method and apparatus for presenting performances of gamers
US10195533B2 (en) 2012-03-06 2019-02-05 Steelseries Aps Method and apparatus for presenting performances of gamers
US8870652B2 (en) * 2012-03-06 2014-10-28 Steelseries Aps Method and apparatus for presenting performances of gamers
US9446318B2 (en) 2012-03-06 2016-09-20 Steelseries Aps Method and apparatus for presenting performances of gamers
US20130274014A1 (en) * 2012-03-06 2013-10-17 Steelseries Aps Method and apparatus for presenting performances of gamers
US9302182B2 (en) * 2012-05-23 2016-04-05 Side-Kick Ltd Method and apparatus for converting computer games between platforms using different modalities
US20130316828A1 (en) * 2012-05-23 2013-11-28 Shmuel Ur Method and apparatus for converting computer games between platforms using different modalities
US9604147B2 (en) 2013-03-15 2017-03-28 Steelseries Aps Method and apparatus for managing use of an accessory
US10076706B2 (en) 2013-03-15 2018-09-18 Steelseries Aps Gaming device with independent gesture-sensitive areas
US10350494B2 (en) 2013-03-15 2019-07-16 Steelseries Aps Gaming device with independent gesture-sensitive areas
US10173133B2 (en) 2013-03-15 2019-01-08 Steelseries Aps Gaming accessory with sensory feedback device
US10661167B2 (en) 2013-03-15 2020-05-26 Steelseries Aps Method and apparatus for managing use of an accessory
US20150038231A1 (en) * 2013-03-15 2015-02-05 Steelseries Aps Gaming device with independent gesture-sensitive areas
US10130881B2 (en) 2013-03-15 2018-11-20 Steelseries Aps Method and apparatus for managing use of an accessory
US10500489B2 (en) 2013-03-15 2019-12-10 Steelseries Aps Gaming accessory with sensory feedback device
US11224802B2 (en) 2013-03-15 2022-01-18 Steelseries Aps Gaming accessory with sensory feedback device
US11701585B2 (en) 2013-03-15 2023-07-18 Steelseries Aps Gaming device with independent gesture-sensitive areas
US10898799B2 (en) 2013-03-15 2021-01-26 Steelseries Aps Gaming accessory with sensory feedback device
US11590418B2 (en) 2013-03-15 2023-02-28 Steelseries Aps Gaming accessory with sensory feedback device
US11135510B2 (en) 2013-03-15 2021-10-05 Steelseries Aps Gaming device with independent gesture-sensitive areas
US9687730B2 (en) * 2013-03-15 2017-06-27 Steelseries Aps Gaming device with independent gesture-sensitive areas
CN105871749A (en) * 2015-11-16 2016-08-17 乐视致新电子科技(天津)有限公司 Network access control method and system based on router, and related device
US20170319954A1 (en) * 2016-05-03 2017-11-09 Disney Enterprises, Inc. System and method of configuring disparate physical objects to provide control signals for controlling a game
US10702770B2 (en) * 2016-05-03 2020-07-07 Disney Enterprises, Inc. System and method of configuring disparate physical objects to provide control signals for controlling a game
CN107526995A (en) * 2016-06-20 2017-12-29 比亚迪股份有限公司 Fingerprint recognition module, fingerprint identification method and mobile terminal
US20180200623A1 (en) * 2017-01-19 2018-07-19 Machine Zone, Inc. System and method for controlling game play using fingerprint recognition
WO2018136330A1 (en) * 2017-01-19 2018-07-26 Mz Ip Holdings, Llc System and method for controlling game play using fingerprint recognition
EP3721326A4 (en) * 2018-02-27 2020-12-16 Samsung Electronics Co., Ltd. Method of displaying graphic object differently according to body portion in contact with controller, and electronic device
US10918949B2 (en) 2019-07-01 2021-02-16 Disney Enterprises, Inc. Systems and methods to provide a sports-based interactive experience

Similar Documents

Publication Publication Date Title
US11709582B2 (en) Apparatus and method for managing operations of accessories
US20110034248A1 (en) Apparatus for associating physical characteristics with commands
US10888779B2 (en) Accessory for presenting information associated with an application
US11596868B2 (en) Apparatus and method for enhancing sound produced by a gaming application
US10960311B2 (en) Method and apparatus for adapting applications over multiple devices
US20110244961A1 (en) Apparatus and method for managing operations of accessories in multi-dimensions
US11660537B2 (en) Apparatus and method for enhancing a condition in a gaming application

Legal Events

Date Code Title Description
AS Assignment

Owner name: STEELSERIES HQ, DENMARK

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:GREVER, ARNIE;HAWVER, BRUCE;SIGNING DATES FROM 20090911 TO 20091001;REEL/FRAME:023354/0064

AS Assignment

Owner name: STEELSERIES APS, DENMARK

Free format text: CORRECTIVE ASSIGNMENT TO CORRECT THE NAME OF ASSIGNEE PREVIOUSLY RECORDED ON REEL 023354 FRAME 0064. ASSIGNOR(S) HEREBY CONFIRMS THE CORRECT NAME OF THE ASSIGNEE IS STEELSERIES APS.;ASSIGNORS:GREVER, ARNIE;HAWVER, BRUCE;SIGNING DATES FROM 20090911 TO 20091001;REEL/FRAME:028323/0631

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION