WO2014189868A1 - Attributing user action based on biometric identity - Google Patents

Attributing user action based on biometric identity Download PDF

Info

Publication number
WO2014189868A1
WO2014189868A1 (PCT/US2014/038683)
Authority
WO
WIPO (PCT)
Prior art keywords
action
application
user
context
game
Prior art date
2013-05-20
Application number
PCT/US2014/038683
Other languages
English (en)
French (fr)
Inventor
Robert Smith
Li-Chen Miller
Joseph Wyman
Jonathan Garcia
Pat Halvorsen
Jason Giles
Original Assignee
Microsoft Corporation
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
2013-05-20
Filing date
2014-05-20
Publication date
2014-11-27
Application filed by Microsoft Corporation filed Critical Microsoft Corporation
Priority to CA2910347A priority Critical patent/CA2910347A1/en
Priority to RU2015149776A priority patent/RU2668984C2/ru
Priority to CN201480029405.6A priority patent/CN105431813B/zh
Priority to MX2015016069A priority patent/MX352436B/es
Priority to KR1020157035957A priority patent/KR20160010608A/ko
Priority to BR112015028449A priority patent/BR112015028449A2/pt
Priority to EP14734598.7A priority patent/EP3000026B1/en
Priority to JP2016514994A priority patent/JP2016524751A/ja
Priority to AU2014268811A priority patent/AU2014268811A1/en
Publication of WO2014189868A1 publication Critical patent/WO2014189868A1/en

Classifications

    • G07F 17/3241 Security aspects of a gaming system, e.g. detecting cheating, device integrity, surveillance
    • A63F 13/212 Input arrangements for video game devices characterised by their sensors, purposes or types using sensors worn by the player, e.g. for measuring heart beat or leg activity
    • A63F 13/213 Input arrangements for video game devices characterised by their sensors, purposes or types comprising photodetecting means, e.g. cameras, photodiodes or infrared cells
    • A63F 13/215 Input arrangements for video game devices characterised by their sensors, purposes or types comprising means for detecting acoustic signals, e.g. using a microphone
    • A63F 13/424 Processing input control signals of video game devices by mapping the input signals into game commands, involving acoustic input signals, e.g. by using the results of pitch or rhythm extraction or voice recognition
    • A63F 13/428 Processing input control signals of video game devices by mapping the input signals into game commands, involving motion or position input signals, e.g. signals representing the rotation of an input controller or a player's arm motions sensed by accelerometers or gyroscopes
    • A63F 13/843 Special adaptations for executing a specific game genre or game mode involving concurrently two or more players on the same game device, e.g. requiring the use of a plurality of controllers or of a specific view of game data for each player
    • G06F 21/32 User authentication using biometric data, e.g. fingerprints, iris scans or voiceprints
    • G06F 3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F 3/012 Head tracking input arrangements
    • G06F 3/013 Eye tracking input arrangements
    • G06F 3/015 Input arrangements based on nervous system activity detection, e.g. brain waves [EEG] detection, electromyograms [EMG] detection, electrodermal response detection
    • G06F 3/017 Gesture based interaction, e.g. based on a set of recognized hand gestures
    • G06F 3/12 Digital output to print unit, e.g. line printer, chain printer
    • G06V 40/20 Recognition of biometric, human-related or animal-related patterns in image or video data; movements or behaviour, e.g. gesture recognition
    • G07F 17/3202 Hardware aspects of a gaming system, e.g. components, construction, architecture thereof
    • G07F 17/3204 Player-machine interfaces
    • G07F 17/3206 Player sensing means, e.g. presence detection, biometrics
    • H04N 21/4751 End-user interface for inputting end-user data for defining user accounts, e.g. accounts for children
    • H04N 21/4755 End-user interface for inputting end-user data for defining user preferences, e.g. favourite actors or genre
    • G06F 2203/0381 Multimodal input, i.e. interface arrangements enabling the user to issue commands by simultaneous use of input devices of different nature, e.g. voice plus gesture on digitizer
    • G06F 2221/2109 Game systems
    • G06F 3/167 Audio in a user interface, e.g. using voice commands for navigating, audio feedback

Definitions

  • Conventionally, game systems attribute user actions based on a one-to-one mapping between a signed-in user and a controller. This may limit the ability of a gamer to interact with a game console simultaneously in a variety of ways, to switch controllers, to move about in a game space, and in other ways.
  • This conventional attribution approach arose because game consoles typically had a concept of a controller and a concept of a user and the only way to connect the user to the controller was through a sign-in procedure.
  • the console may have established and maintained a fixed, one-to-one relationship between a controller and a player.
  • Computer systems, including game systems, may allow multiple users to access, sign in, or otherwise interact with the system at the same time.
  • a single game console may have dedicated controllers and may have the ability to allow additional transient controllers to interact with the game.
  • multiple players may simultaneously be able to sign into and interact with a multi-user game.
  • the game console may allow different players to have different games active at different times and even to come and go from a game.
  • the traditional, fixed, one-to-one relationship between a user and a controller may limit the game experience. This limiting experience may extend to other non-gaming systems.
  • a user context may describe, for example, user attributes and user state.
  • User attributes may include, for example, name, language preferences, login credentials for applications, login credentials for websites, saved content including documents or game saves, or other data.
  • User state may include, for example, location, or other data.
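  • As a concrete illustration of this attribute/state split, the sketch below models a user context as a small data structure. This is a minimal sketch, not the patent's implementation; the names (UserAttributes, UserState, UserContext) and fields are hypothetical.

```python
from dataclasses import dataclass, field

@dataclass
class UserAttributes:
    """Relatively static user data (hypothetical fields)."""
    name: str
    language: str = "en"                              # language preference
    credentials: dict = field(default_factory=dict)   # per-application logins

@dataclass
class UserState:
    """Frequently changing user data (hypothetical fields)."""
    location: tuple = (0.0, 0.0, 0.0)   # position in the game space
    experience_level: str = "novice"

@dataclass
class UserContext:
    """A user context bundles attributes and state."""
    attributes: UserAttributes
    state: UserState
```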
  • the context associated with the application may have controlled, at least in part, the operation of the application. For example, a first user context may have caused an application to produce an English language presentation for a male who is an intermediate level swordsman while a second user context may have caused an application to produce a French language presentation for a female who is an expert archer.
  • the context may have controlled in-game attributes (e.g., point of view, character) but may also have controlled attributes of other applications launched from the launch surface.
  • a launch surface refers to an interface with which a user may interact to launch an application.
  • a launch surface may be, for example, a desktop, a start menu, a dashboard, or other interactive item from which applications or processes can be initiated (e.g., launched).
  • Example apparatus and methods detect actions intended to control an application.
  • the application may have been launched from a shared launch surface.
  • the shared launch surface may be provided by a game console, a computer, an operating system, or other device or system.
  • Example apparatus and methods associate the action with a body located in a field of detection of a biometric sensor associated with the shared launch surface.
  • a game console may have an infrared (IR) system that produces a depth map for a game space, a visible light camera system that acquires images from the game space, a sound and voice recognition system for acquiring voice data, or haptic interfaces for capturing fine movements.
  • Example apparatus and methods determine, using data provided by the biometric sensor(s), a biometric identity for the body and then attribute (e.g., associate, couple, map) the action to a user as a function of the biometric identity.
  • the context may include, for example, information that describes the user in general (e.g., user attributes) and dynamic information that describes the user at a particular point in time (e.g., user state).
  • Example apparatus may be configured with hardware including a processor and a biometric sensor(s).
  • the example apparatus may include a memory configured to store information concerning ownership of a shared launch surface provided by the apparatus or an operating system running on the apparatus.
  • the apparatus may include logics that are configured to attribute a user action as a function of a biometric identity determined by data provided by the biometric sensor.
  • the apparatus may track bodies located in a field of detection (e.g., field of view) of the biometric sensor.
  • the apparatus may also identify actions performed in the field of detection. An action can be mapped to a body, and a body can be mapped to a user through a biometric identity. Once the action and the biometric identification of a user are identified, the apparatus may selectively control an operation of the apparatus as a function of the biometric identification.
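  • One way to picture this action-to-body-to-user-to-context chain is as a series of lookups performed at the moment of the action. The sketch below assumes hypothetical lookup tables maintained by the tracking and biometric logics; it illustrates the flow, not the patented implementation.

```python
# Hypothetical tables, refreshed in real time by tracking and biometric logics.
body_at_location = {}    # location key -> tracked body id
user_for_body = {}       # body id -> user id (via biometric identity)
context_for_user = {}    # user id -> context object

def attribute_action(action_location):
    """Resolve an action to (user, context) at the time of the action."""
    body = body_at_location.get(action_location)
    if body is None:
        return None          # no tracked body where the action occurred
    user = user_for_body.get(body)
    return user, context_for_user.get(user)
```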
  • Figure 1 illustrates an example game environment with example mappings.
  • Figure 2 illustrates an example method associated with attributing a user action based on a biometric identity.
  • Figure 3 illustrates a portion of an example method associated with attributing a user action based on a biometric identity.
  • Figure 4 illustrates an example game environment and different user actions.
  • Figure 5 illustrates an example apparatus configured to facilitate attributing a user action based on a biometric identity.
  • Figure 6 illustrates an example apparatus configured to facilitate attributing a user action based on a biometric identity.
  • Figure 7 illustrates an example cloud operating environment.
  • Figure 8 is a system diagram depicting an exemplary mobile communication device configured to facilitate attributing a user action based on a biometric identity.
  • Example apparatus and methods decouple the fixed 1:1 relationship between a launch surface, a context, a user, and a controller. Rather than fix the user/controller relationship and the controller/context relationship, example apparatus and methods detect actions, detect and identify bodies, flexibly and dynamically determine an action/user relationship and a user/context relationship, and then associate the action with the appropriate context. Since users may employ different controllers at different times during a game session, identifying a user currently associated with a controller provides new flexibility.
  • a controller may include, for example, a gamepad, a remote, a game guitar, a voice control apparatus, a gesture capture apparatus, haptic interfaces, natural user interfaces, or other apparatus.
  • Since users may interact (e.g., initiate operations) with an application, launch surface, or system using different actions (e.g., button press, virtual button press, user interface interaction, voice command, gesture) at different times, identifying an action, then identifying the user associated with the action, and then identifying a context to associate with the action also provides new flexibility.
  • the flexibility is extended even further by mapping the action to user to context at the time of the action.
  • Example apparatus and methods break the fixed one-to-one relationship by making biometric identifications of users.
  • a biometric identity is used to map the person associated with an action to a context.
  • a biometric identity may include a fingerprint, a retinal pattern, or other unique physical attribute.
  • a biometric identity may also include, for example, a user identified attribute (e.g., I am wearing the red shirt, I am wearing the white hat, I am the tallest person).
  • a biometric identity may also include, for example, a system identified attribute (e.g., the tallest player, the roundest player, the player with the deepest voice, the player with the darkest colored shirt/hat, the player with the lightest hat/hair).
  • biometric identity is intended to be used in a broad manner. Rather than performing a single biometric identification for security purposes, on-going real time or near real time biometric identifications may be made for bodies visible in the field of view of a game system for action attribution reasons.
  • a gamer may move around during a game session. As the person moves around they might use different controllers or might initiate actions using different approaches.
  • Example apparatus and methods may track the position of the user in real-time or near real-time and then match an action location to the user location to facilitate attributing the action.
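  • A simple way to match an action location to a tracked user, sketched below, is a nearest-neighbor search over the last known body positions. The distance cutoff and data layout are assumptions for illustration.

```python
import math

def nearest_body(action_pos, body_positions, max_distance=0.5):
    """Return the tracked body closest to an action, if close enough.

    action_pos     -- (x, y, z) where the action originated
    body_positions -- dict of body id -> (x, y, z) last tracked position
    max_distance   -- assumed cutoff, in meters
    """
    best, best_d = None, float("inf")
    for body, pos in body_positions.items():
        d = math.dist(action_pos, pos)
        if d < best_d:
            best, best_d = body, d
    return best if best_d <= max_distance else None
```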
  • For example, a gamer may want to check stock prices without leaving the game. Checking stock prices may require launching a separate application in, for example, a small popup that might be displayed just to the requester and that would not interrupt the game, at least for other players.
  • the console may first detect the action and then identify the user associated with the action through biometrics.
  • Example apparatus and methods may then consult, create, update or otherwise manipulate a current mapping to associate a context with the user that made the action. Once the context associated with the user has been identified, the action can be taken in light of the proper context. This can be done without navigating to the launch surface or reconfiguring the launch surface. This provides a seamless experience that facilitates an improved game experience.
  • Example apparatus and methods use the identity of the biometrically identified user and, in some cases, the location of the action, to map a context to an operation (e.g., launch application, manipulate application, use application), without disrupting the game in progress. For example, while the game is in progress the gamer might receive a text that a new video by their favorite artist is available. The gamer may want to download, acquire, or view the new video without having to exit the game. Conventionally, the user that initiated the game owns the launch surface and any application launched during the session will be associated with the launch surface owner. To switch the launch surface ownership to another person typically requires suspending/ending the game, which is unacceptable to many gamers.
  • the gamer might not be able to simply pause the game, or even keep the game in progress, and launch a video acquisition application at the same time.
  • the gamer may make an application initiating action and acquire or view the video without ever leaving the game. Additionally, the video would be acquired by the user who initiated the action, not by the user who "owns" the launch surface.
  • billing updates, data plan updates, or other user specific results would be associated with the gamer who actually accessed the video, not with the gamer who initially started the game.
  • other non-gaming applications may also use the flexible, dynamic, biometric identity based action attribution described herein.
  • Example apparatus and methods recognize that a multi-user system may have multiple users that may want to launch or otherwise interact with applications.
  • Example apparatus and methods also recognize that users may have contexts that they want associated with an application or an action taken in an application.
  • example apparatus and methods may map users to contexts and contexts to actions with a flexible decoupled approach that attributes user actions based on biometric identities made at the time an action is taken.
  • example apparatus and methods may track bodies in the game space in real time or near real time, may biometrically identify a tracked body as a known user, and may associate a context with an action taken by that body based on the biometric identity.
  • a biometric identity can be determined in different ways. Biometric identification may include, for example, facial recognition, voice recognition, fingerprint recognition, gesture recognition, haptic recognition, or other approaches.
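  • As one hedged illustration of such an approach, the sketch below matches a face embedding against enrolled templates with cosine similarity. The embedding source, threshold, and template store are hypothetical; any of the recognition methods named above could play the same role.

```python
def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = sum(x * x for x in a) ** 0.5
    nb = sum(y * y for y in b) ** 0.5
    return dot / (na * nb) if na and nb else 0.0

def identify(embedding, enrolled, threshold=0.8):
    """Return the enrolled user whose template best matches, or None.

    enrolled -- dict of user id -> template vector (hypothetical store)
    """
    best_user, best_score = None, 0.0
    for user, template in enrolled.items():
        score = cosine(embedding, template)
        if score > best_score:
            best_user, best_score = user, score
    return best_user if best_score >= threshold else None
```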
  • FIG. 1 illustrates an example game environment with example mappings.
  • a game console 120 may be able to run a number of applications (e.g., application1 130, application2 132, ... applicationN 138).
  • Console 120 may be able to support or interact with a number of controllers (e.g., controller1 140, controller2 142, ... controllerO 148). Since console 120 is a multi-user system, console 120 may be able to support a number of users (e.g., user1 150, user2 152, ... userP 158). Since the console 120 may support multiple users, console 120 may have access to multiple contexts (e.g., context1 110, context2 112, ... contextM 118).
  • a specific controller would have a fixed one-to-one relationship with a specific user.
  • the fixed one-to-one relationship would determine the context.
  • this fixed one-to-one relationship produced burdensome limitations for game play. For example, if user1 150 signed in with controller1 140, then user1 150 could only use controller1 140 for the duration of the game.
  • Example apparatus and methods break this fixed one-to-one relationship by tracking users during a game session and then mapping a user to an action at the time of the action.
  • user1 150 might start playing a game while seated using controller1 140, then after a while user1 150 might put down controller1 140, move from the comfortable chair to a standing position and start interacting with the game using gestures rather than the controller.
  • user1 150 might lie down and start controlling the game using voice commands. Near the end of the game, user1 150 might pick up controllerO 148 and control the game with button presses from controllerO 148. This scenario would be impossible in traditional games.
  • example apparatus and methods may provide this level of flexibility.
  • Mappings 160 show a four-way mapping that relates a context to a controller and a user and an application. Different mappings may involve a greater or lesser number of attributes. In one embodiment, mappings 160 may be updated in real time or near real time to reflect the current reality in the game space.
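  • A minimal way to hold such a four-way mapping is a record per user, refreshed whenever tracking or biometric identification observes a change. The record layout below is an assumed illustration of mappings 160, not the patent's data structure.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Mapping:
    """One row of a four-way context/controller/user/application mapping."""
    user: str
    controller: Optional[str]   # None while the user is using voice or gestures
    application: str
    context: str

mappings = {}  # user id -> Mapping, updated in real time or near real time

def update_mapping(user, controller, application, context):
    """Reflect the current reality of the game space for one user."""
    mappings[user] = Mapping(user, controller, application, context)

update_mapping("user1", "controller1", "game", "context1")
update_mapping("user1", None, "game", "context1")   # user1 switched to gestures
```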
  • Figure 2 illustrates an example method 200 associated with attributing a user action based on a biometric identity.
  • method 200 may be performed on a single device, may be performed partially or completely in the cloud, may be performed on distributed co-operating devices, or may be performed other ways.
  • method 200 may be performed on devices including, but not limited to, a game console, a computer, a laptop computer, a tablet computer, a phone, and a smart phone.
  • Method 200 includes, at 210, detecting an action intended to control an application associated with a shared launch surface.
  • the application may be a video game and the shared launch surface may be associated with a video game console. While a video game and a video game console are described, other multi-user systems that employ a shared launch surface may be employed.
  • the shared launch surface may be associated with a multi-user operating system running on laptop computer, a tablet computer, or a phone.
  • detecting the action may include detecting actions including, but not limited to, a button press, a virtual button press, an interaction with a user interface, an interaction with a game controller, a manipulation of a game controller, a voice command, or a gesture.
  • the button press may occur on a game controller, on the console, or on another device configured to provide button press signals to the system or application.
  • a virtual button press may be performed using a voice command, a gesture, or other action where a non-physical button is "pressed".
  • An interaction with a user interface may include a physical interaction through, for example, a capacitive touch interface, a gesture interaction, or other interaction.
  • a manipulation of a game controller may include, for example, pressing buttons on the controller, moving arrows on the controller, turning knobs or dials on the controller, accelerating the controller in a direction, holding the controller in an orientation, or other actions.
  • Method 200 also includes, at 220, associating the action with a body located in a field of view of a biometric sensor associated with a shared launch surface.
  • the associating may be location based. For example, sounds from a location that is occupied by a body can be associated with that body. Similarly, controller actions from a location occupied by a body can be associated with the controller. When a gesture is made, the gesture may already be associated with a body that made the gesture.
  • Method 200 also includes, at 230, determining, using data provided by the biometric sensor, a biometric identity for the body.
  • determining the biometric identity involves performing actions including, but not limited to, facial recognition, gesture recognition, voice recognition, fingerprint recognition, haptic recognition, or retinal recognition.
  • the recognitions may be based on information received from a biometric sensor(s). For example, facial recognition may be performed from contours present in a depth map or visible features present in a visible light picture.
  • Method 200 also includes, at 240, attributing the action to a user as a function of the biometric identity. Attributing the action may include updating a mapping, updating a table that stores information relating actions to users, or otherwise associating an action and a user.
  • Method 200 also includes, at 250, identifying a context associated with the user.
  • the context may include an attribute and a state.
  • the attribute may describe data that is relatively unchanging for a user.
  • the attribute may be, for example, a user name, a screen name, a data plan identifier, a billing identifier, an account identifier, a parental control setting, a display preference, or social media data.
  • Different contexts may include a greater or smaller number of attributes.
  • the state may describe data that changes more frequently for a user.
  • the state may be, for example, a location, a data plan balance, a billing balance, an account balance, an experience level, an access time, an engaged time, or a connectivity level.
  • Different contexts may include a greater or smaller number of state variables.
  • Method 200 also includes, at 260, selectively controlling the application or the shared launch surface as a function of the action and the context.
  • the action may be intended to launch an application from the shared launch surface.
  • selectively controlling the application or the shared launch surface may include launching the application from the shared launch surface using the context.
  • the action may be intended to cause an operation in an application available through the shared launch surface.
  • selectively controlling the application or the shared launch surface includes associating the context with the application and then causing the operation to occur in the application.
  • selectively controlling the application or the shared launch surface may include launching a second application using the context while preserving a first context associated with a first application running on the shared launch surface.
  • different applications may be associated with different contexts.
  • the game context may be associated with the person who launched the game while a stock checking popup may be associated with the person who wanted to check their stocks and a video application could be associated with the gamer who got the text and wanted to view the video immediately.
  • selectively controlling the application or the shared launch surface may include selectively denying control of the application in response to the action based on the context or selectively denying launching an application based on the context.
  • the denial may be based, for example, on a parental control, on a data plan limitation, or other information about the user.
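  • A sketch of this context-based gating follows. The fields consulted (a parental rating limit, a data plan balance) are hypothetical stand-ins for the controls named above.

```python
def allow_action(action, context):
    """Decide whether to perform or deny an action for a given context."""
    if action.get("rating", 0) > context.get("parental_rating_limit", 18):
        return False    # denied by a parental control
    if action.get("data_mb", 0) > context.get("data_plan_mb_left", 0):
        return False    # denied by a data plan limitation
    return True

# Usage: a child profile is denied a mature-rated, data-heavy video launch.
child_context = {"parental_rating_limit": 10, "data_plan_mb_left": 500}
print(allow_action({"rating": 16, "data_mb": 50}, child_context))  # False
```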
  • Figure 3 illustrates a portion of an example method associated with attributing a user action based on a biometric identity.
  • One part of attributing the user action may include identifying the action as a voice command, identifying the action as a gesture, or identifying a controller associated with the action. If the action is identified as a voice command at 270, then processing may include, at 272, identifying a location from which the voice command originated. The location may be determined by a directional microphone(s), by triangulating the sound, or by other location methods. Processing may include, at 274, identifying a body corresponding to the location. The body may have been tracked by biometric sensors, IR sensors, visible light camera systems, or other apparatus in, for example, a game system.
  • the biometric identity is not being used to authenticate a user for security purposes (although it could), but rather is being used to identify who is currently at a location from which a voice command originated.
  • processing may include, at 276, mapping the body corresponding to the location to the voice command.
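  • For example, with a two-microphone array a rough direction of arrival can be estimated from the time difference between channels, and the tracked body nearest that bearing can then be credited with the command. The far-field geometry below is a textbook approximation, not the patent's method.

```python
import math

SPEED_OF_SOUND = 343.0  # meters per second at room temperature

def direction_of_arrival(delta_t, mic_spacing):
    """Estimate arrival angle (radians from broadside) for a two-mic array.

    delta_t     -- inter-microphone time difference, in seconds
    mic_spacing -- distance between the microphones, in meters
    """
    s = SPEED_OF_SOUND * delta_t / mic_spacing
    s = max(-1.0, min(1.0, s))    # clamp against measurement noise
    return math.asin(s)

# Usage: a 0.2 ms lag across a 0.2 m array puts the speaker about 20 degrees
# off-center; the body tracked nearest that bearing gets the voice command.
print(math.degrees(direction_of_arrival(0.0002, 0.2)))
```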
  • Attributing the user action may include identifying the action as a gesture at 280. If the action is identified as a gesture, then there will likely already be a body associated with the location at which the gesture was made, and processing may include, at 284, identifying the body that made the gesture.
  • Attributing the user action may include, at 290, identifying a game controller on which the action was performed.
  • the game controller may be a wired controller, a wireless controller, a controller provided with the console, a controller provided by the gamer, or other controller.
  • processing may include, at 292, identifying a location at which the game controller was located. Once the location has been identified, processing may include identifying a body corresponding to the location. The body may be identified using biometric identification. The biometrics are not being used to authenticate a user for security purposes, but rather are used to identify who is currently using a controller. Once the identification is made, processing may include, at 286, mapping the body corresponding to the location to the action.
  • While Figures 2 and 3 illustrate various actions occurring in serial, it is to be appreciated that various actions illustrated in Figures 2 and 3 could occur substantially in parallel. By way of illustration, a first process could detect actions, a second process could associate actions with bodies, a third process could produce a biometric identification of a body, and a fourth process could selectively control an operation as a function of the action and a context associated with the biometric identification. While four processes are described, it is to be appreciated that a greater or lesser number of processes could be employed and that lightweight processes, regular processes, threads, and other approaches could be employed.
  • a method may be implemented as computer executable instructions.
  • a computer-readable storage medium may store computer executable instructions that if executed by a machine (e.g., computer) cause the machine to perform methods described or claimed herein including method 200.
  • executable instructions associated with the above method are described as being stored on a computer-readable storage medium, it is to be appreciated that executable instructions associated with other example methods described or claimed herein may also be stored on a computer-readable storage medium.
  • the example methods described herein may be triggered in different ways. In one embodiment, a method may be triggered manually by a user. In another example, a method may be triggered automatically.
  • Computer-readable storage medium refers to a medium that stores instructions or data. “Computer-readable storage medium” does not refer to propagated signals, per se.
  • a computer-readable storage medium may take forms, including, but not limited to, non-volatile media, and volatile media.
  • Non-volatile media may include, for example, optical disks, magnetic disks, tapes, flash memory, ROM, and other media.
  • Volatile media may include, for example, semiconductor memories, dynamic memory (e.g., dynamic random access memory (DRAM), synchronous dynamic random access memory (SDRAM), double data rate synchronous dynamic random-access memory (DDR SDRAM), etc.), and other media.
  • Common forms of a computer-readable storage medium may include, but are not limited to, a floppy disk, a flexible disk, a hard disk, a magnetic tape, other magnetic medium, an application specific integrated circuit (ASIC), a compact disk (CD), other optical medium, a random access memory (RAM), a read only memory (ROM), a memory chip or card, a memory stick, and other media from which a computer, a processor or other electronic device can read.
  • FIG. 4 illustrates an example game environment and different user actions.
  • a console 400 may detect actions including a controller action 410, a voice command 420, or a gesture 430.
  • Detecting a controller action 410 may include receiving an electrical signal through a wire that connects a controller to console 400.
  • Detecting the controller action 410 may also include receiving a data signal from a wireless controller through a wireless communication medium (e.g., radio frequency signal). Detecting the controller action 410 may not involve any biometric sensor.
  • Detecting a voice command 420 may include receiving sounds at a microphone(s).
  • the microphone(s) may be directional. Voices may be distinguishable between users. Therefore, detecting a voice command 420 may involve a sensor(s) that is also used for a biometric identity. Like the controller action 410, detecting the voice command 420 may be a passive or unidirectional operation where signals flow from the device or person to the console 400.
  • Detecting a gesture 430 may involve an active operation that includes sending signals out into a game space and receiving reflected signals. Detecting a gesture may include identifying where the gesture occurred, what the gesture was (e.g., flick, pinch, spread, wave), the speed at which the gesture was performed, and other attributes of a gesture.
  • Figure 5 illustrates an apparatus 500 that includes a processor 510, a memory 520, a set 530 of logics, a biometric sensor 599, and an interface 540 that connects the processor 510, the memory 520, the biometric sensor 599, and the set 530 of logics.
  • the memory 520 may be configured to store information concerning ownership of a shared launch surface.
  • the set 530 of logics may be configured to attribute a user action as a function of a biometric identity performed using data provided by the biometric sensor 599.
  • Apparatus 500 may be, for example, a game console, a device acting as a game console, a computer, a laptop computer, a tablet computer, a personal electronic device, a smart phone, system-on-a-chip (SoC), or other device that can access and process data.
  • the apparatus 500 may be a general purpose computer that has been transformed into a special purpose computer through the inclusion of the set 530 of logics. Apparatus 500 may interact with other apparatus, processes, and services through, for example, a computer network.
  • the set 530 of logics may include a first logic 532 that is configured to track one or more bodies located in a field of detection of the biometric sensor.
  • the first logic 532 may be configured to track the one or more bodies in real time using data provided by the biometric sensor 599.
  • the third logic 536 may be configured to maintain, in real time, a mapping between a member of the one or more bodies and a biometric identity.
  • the set 530 of logics may also include a second logic 534 that is configured to identify an action performed by a member of the one or more bodies.
  • the second logic 534 may be configured to identify a controller action, a voice command, a gesture, or other operation initiating action.
  • the set 530 of logics may also include a third logic 536 that is configured to produce a biometric identification of the member using data provided by the biometric sensor 599.
  • the biometric sensor 599 may be, for example, an infrared (IR) sensor, a vision system, a haptic system, a sound system, or a voice system.
  • the third logic 536 may be configured to acquire a set of user specific data associated with the biometric identification.
  • the user specific data may include attribute data and state data.
  • the set 530 of logics may also include a fourth logic 538 that is configured to selectively control an operation of the apparatus as a function of the biometric identification.
  • the operation may be intended to launch an application from the shared launch surface using the user specific data. For example, while a game is in progress, a gamer may want to view a recently released video. The gamer may start a video watching app in a small window on their display without disrupting other gamers. The video watching app might cost the gamer some money or might count against the gamer's data plan. Thus, launching the video watching app using the gamer's context may allow this direct billing.
  • the operation may be intended to be performed in an application launched from the shared launch surface. In this case, the operation may be controlled, at least in part, by the user specific data.
  • apparatus 500 may also include a communication circuit that is configured to communicate with an external source to facilitate accessing or processing action data, user data, biometric identity data, or other data associated with attributing a user action.
  • the set 530 of logics may interact with a presentation service 560 to facilitate displaying data using different presentations for different devices.
  • Figure 6 illustrates an apparatus 600 that is similar to apparatus 500 (Figure 5).
  • apparatus 600 includes a processor 610, a memory 620, a set of logics 630 (e.g., 632, 634, 636, 638) that correspond to the set of logics 530 (Figure 5), a biometric sensor 699, and an interface 640.
  • apparatus 600 includes an additional fifth logic 639.
  • the fifth logic 639 may be configured to control ownership of the shared launch surface. The ownership may be controlled as a function of the action and the set of user specific data. For example, a first action may not require surface ownership to be changed while a second action may require surface ownership to change.
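  • A sketch of such an ownership policy follows. The specific rule, that only application launches transfer ownership, is an assumed example rather than a rule stated here.

```python
class LaunchSurface:
    """Tracks which biometrically identified user owns the shared surface."""

    def __init__(self, owner):
        self.owner = owner

    def handle(self, action_kind, acting_user):
        # Assumed policy: in-game actions leave ownership alone, while
        # launching a new application transfers the surface to the actor.
        if action_kind == "launch_application":
            self.owner = acting_user
        return self.owner

surface = LaunchSurface(owner="user1")
surface.handle("voice_command", "user2")              # owner stays user1
print(surface.handle("launch_application", "user2"))  # now user2
```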
  • FIG. 7 illustrates an example cloud operating environment 700.
  • a cloud operating environment 700 supports delivering computing, processing, storage, data management, applications, and other functionality as an abstract service rather than as a standalone product. Services may be provided by virtual servers that may be implemented as one or more processes on one or more computing devices. In some embodiments, processes may migrate between servers without disrupting the cloud service.
  • In the cloud, shared resources (e.g., computing, storage) are provided over a network to computers including servers, clients, and mobile devices. Different networks (e.g., Ethernet, Wi-Fi, 802.x, cellular) may be used to access cloud services.
  • Users interacting with the cloud may not need to know the particulars (e.g., location, name, server, database) of a device that is actually providing the service (e.g., computing, storage).
  • Users may access cloud services via, for example, a web browser, a thin client, a mobile application, or in other ways.
  • Figure 7 illustrates an example action attribution service 760 residing in the cloud.
  • the action attribution service 760 may rely on a server 702 or service 704 to perform processing and may rely on a data store 706 or database 708 to store data. While a single server 702, a single service 704, a single data store 706, and a single database 708 are illustrated, multiple instances of servers, services, data stores, and databases may reside in the cloud and may, therefore, be used by the action attribution service 760.
  • Figure 7 illustrates various devices accessing the action attribution service 760 in the cloud.
  • the devices include a computer 710, a tablet 720, a laptop computer 730, a personal digital assistant 740, and a mobile device (e.g., cellular phone, satellite phone, wearable computing device) 750.
  • the action attribution service 760 may store, access, or process action data, user data, biometric data, mapping data, or other data associated with connecting a user action to a context and controlling an operation (e.g., application launch, application operation) based on the action, context, and mapping.
  • FIG. 8 is a system diagram depicting an exemplary mobile device 800 that includes a variety of optional hardware and software components, shown generally at 802.
  • Components 802 in the mobile device 800 can communicate with other components, although not all connections are shown for ease of illustration.
  • the mobile device 800 may be a variety of computing devices (e.g., cell phone, smartphone, handheld computer, Personal Digital Assistant (PDA), wearable computing device, etc.) and may allow wireless two-way communications with one or more mobile communications networks 804, such as cellular or satellite networks.
  • Mobile device 800 can include a controller or processor 810 (e.g., signal processor, microprocessor, ASIC, or other control and processing logic circuitry) for performing tasks including signal coding, data processing, input/output processing, power control, or other functions.
  • An operating system 812 can control the allocation and usage of the components 802 and support application programs 814.
  • the application programs 814 can include gaming applications, mobile computing applications (e.g., email applications, calendars, contact managers, web browsers, messaging applications), or other computing applications.
  • mobile device 800 may function as a game console or game controller.
  • Mobile device 800 can include memory 820.
  • Memory 820 can include nonremovable memory 822 or removable memory 824.
  • the non-removable memory 822 can include random access memory (RAM), read only memory (ROM), flash memory, a hard disk, or other memory storage technologies.
  • the removable memory 824 can include flash memory or a Subscriber Identity Module (SIM) card, which is well known in GSM communication systems, or other memory storage technologies, such as "smart cards.”
  • the memory 820 can be used for storing data or code for running the operating system 812 and the applications 814.
  • Example data can include web pages, text, images, sound files, video data, or other data sets to be sent to or received from one or more network servers or other devices via one or more wired or wireless networks.
  • the memory 820 can be used to store a subscriber identifier, such as an International Mobile Subscriber Identity (IMSI), and an equipment identifier, such as an International Mobile Equipment Identifier (IMEI).
  • the identifiers can be transmitted to a network server to identify users or equipment.
  • the mobile device 800 can support one or more input devices 830 including, but not limited to, a touchscreen 832, a microphone 834, a camera 836, a physical keyboard 838, or a trackball 840.
  • the mobile device 800 may also support output devices 850 including, but not limited to, a speaker 852 and a display 854.
  • Other possible output devices can include piezoelectric or other haptic output devices. Some devices can serve more than one input/output function.
  • touchscreen 832 and display 854 can be combined in a single input/output device.
  • the input devices 830 can include a Natural User Interface (NUI).
  • NUI is an interface technology that enables a user to interact with a device in a "natural" manner, free from artificial constraints imposed by input devices such as mice, keyboards, remote controls, and others.
  • NUI methods include those relying on speech recognition, touch and stylus recognition, gesture recognition (both on screen and adjacent to the screen), air gestures, head and eye tracking, voice and speech, vision, touch, gestures, and machine intelligence.
  • Other examples of a NUI include motion gesture detection using accelerometers/gyroscopes, facial recognition, three dimensional (3D) displays, head, eye, and gaze tracking, immersive augmented reality and virtual reality systems, all of which provide a more natural interface, as well as technologies for sensing brain activity using electric field sensing electrodes (EEG and related methods).
  • the operating system 812 or applications 814 can comprise speech-recognition software as part of a voice user interface that allows a user to operate the device 800 via voice commands.
  • the device 800 can include input devices and software that allow for user interaction via a user's spatial gestures, such as detecting and interpreting gestures to provide input to a gaming application.
  • a wireless modem 860 can be coupled to an antenna 891.
  • radio frequency (RF) filters are used and the processor 810 need not select an antenna configuration for a selected frequency band.
  • the wireless modem 860 can support two- way communications between the processor 810 and external devices.
  • the modem 860 is shown generically and can include a cellular modem for communicating with the mobile communication network 804 and/or other radio-based modems (e.g., Bluetooth 864 or Wi-Fi 862).
  • the wireless modem 860 may be configured for communication with one or more cellular networks, such as a Global System for Mobile Communications (GSM) network for data and voice communications within a single cellular network, between cellular networks, or between the mobile device and a public switched telephone network (PSTN).
  • A near field communication (NFC) element 892 facilitates near field communications.
  • the mobile device 800 may include at least one input/output port 880, a power supply 882, a satellite navigation system receiver 884, such as a Global Positioning System (GPS) receiver, an accelerometer 886, or a physical connector 890, which can be a Universal Serial Bus (USB) port, IEEE 1394 (FireWire) port, RS-232 port, or other port.
  • the illustrated components 802 are not required or all-inclusive, as other components can be deleted or added.
  • Mobile device 800 may include an attribution logic 899 that is configured to provide a functionality for the mobile device 800.
  • attribution logic 899 may provide a client for interacting with a service (e.g., service 760, figure 7). Portions of the example methods described herein may be performed by attribution logic 899. Similarly, attribution logic 899 may implement portions of apparatus described herein.
  • references to "one embodiment”, “an embodiment”, “one example”, and “an example” indicate that the embodiment(s) or example(s) so described may include a particular feature, structure, characteristic, property, element, or limitation, but that not every embodiment or example necessarily includes that particular feature, structure, characteristic, property, element or limitation. Furthermore, repeated use of the phrase “in one embodiment” does not necessarily refer to the same embodiment, though it may.
  • Data store refers to a physical or logical entity that can store data.
  • a data store may be, for example, a database, a table, a file, a list, a queue, a heap, a memory, a register, and other physical repository.
  • a data store may reside in one logical or physical entity or may be distributed between two or more logical or physical entities.
  • “Logic” includes but is not limited to hardware, firmware, software in execution on a machine, or combinations of each to perform a function(s) or an action(s), or to cause a function or action from another logic, method, or system.
  • Logic may include a software-controlled microprocessor, discrete logic (e.g., an ASIC), an analog circuit, a digital circuit, a programmed logic device, a memory device containing instructions, or other physical devices.
  • Logic may include one or more gates, combinations of gates, or other circuit components. Where multiple logical logics are described, it may be possible to incorporate them into one physical logic.
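The specification describes the voice user interface only at the level of capability, so the following Python sketch is purely illustrative: it shows one plausible shape for mapping phrases emitted by a speech recognizer to device commands. The class and method names (VoiceCommandDispatcher, register, dispatch) are invented here and are not taken from the patent.

```python
# Hypothetical sketch only -- the specification does not describe how
# recognized speech is mapped to device commands; names are invented.

from typing import Callable, Dict


class VoiceCommandDispatcher:
    """Maps phrases produced by a speech recognizer to device actions."""

    def __init__(self) -> None:
        self._handlers: Dict[str, Callable[[], None]] = {}

    def register(self, phrase: str, handler: Callable[[], None]) -> None:
        # Store handlers under a normalized form of the phrase.
        self._handlers[phrase.lower()] = handler

    def dispatch(self, recognized_phrase: str) -> bool:
        # Returns True if the phrase matched a registered command.
        handler = self._handlers.get(recognized_phrase.lower())
        if handler is None:
            return False
        handler()
        return True


# Example usage (hypothetical):
# dispatcher = VoiceCommandDispatcher()
# dispatcher.register("pause game", lambda: print("pausing"))
# dispatcher.dispatch("Pause Game")  # -> True
```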
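Likewise, attribution logic 899 is described only functionally. As a minimal sketch, assuming a JSON-over-HTTP service and inventing every name (AttributionLogic, AttributedAction, report, and the endpoint URL), client-side attribution might pair an identity resolved by biometric matching with an action before forwarding it:

```python
# Hypothetical sketch only -- the patent names an attribution logic 899 and a
# service (e.g., service 760, figure 7) but defines no API, so every name
# below is invented for illustration.

import json
import urllib.request
from dataclasses import asdict, dataclass


@dataclass
class AttributedAction:
    """An action paired with the identity resolved from biometric data."""
    user_id: str   # e.g., the account matched to a face or voice print
    action: str    # e.g., "achievement_unlocked"
    context: dict  # application context in which the action occurred


class AttributionLogic:
    """Client-side logic that attributes actions and reports them to a service."""

    def __init__(self, service_url: str):
        self.service_url = service_url  # hypothetical endpoint

    def attribute(self, user_id: str, action: str, context: dict) -> AttributedAction:
        # Pair the action with the biometrically resolved identity.
        return AttributedAction(user_id=user_id, action=action, context=context)

    def report(self, attributed: AttributedAction) -> None:
        # Forward the attributed action to the remote service as JSON.
        body = json.dumps(asdict(attributed)).encode("utf-8")
        request = urllib.request.Request(
            self.service_url,
            data=body,
            headers={"Content-Type": "application/json"},
        )
        urllib.request.urlopen(request)


# Example usage (hypothetical values):
# logic = AttributionLogic("https://example.invalid/attribution")
# event = logic.attribute("player-42", "level_completed", {"app": "game-xyz"})
# logic.report(event)
```

The only point carried over from the description is the separation of concerns: deciding who performed an action is distinct from reporting that attributed action to a service.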
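Finally, to make the “data store” definition above concrete, here is a hedged Python sketch, again with invented names, of one logical store interface backed by two different physical repositories:

```python
# Hypothetical illustration of the "data store" definition above: one logical
# store interface backed by two different physical repositories (a
# memory-resident mapping and a file). Class and method names are invented.

import json
from typing import Dict, Optional, Protocol


class DataStore(Protocol):
    def put(self, key: str, value: str) -> None: ...
    def get(self, key: str) -> Optional[str]: ...


class MemoryStore:
    """Data store residing in a single in-memory entity."""

    def __init__(self) -> None:
        self._data: Dict[str, str] = {}

    def put(self, key: str, value: str) -> None:
        self._data[key] = value

    def get(self, key: str) -> Optional[str]:
        return self._data.get(key)


class FileStore:
    """Data store backed by a JSON file on disk."""

    def __init__(self, path: str) -> None:
        self._path = path

    def put(self, key: str, value: str) -> None:
        data = self._load()
        data[key] = value
        with open(self._path, "w") as f:
            json.dump(data, f)

    def get(self, key: str) -> Optional[str]:
        return self._load().get(key)

    def _load(self) -> Dict[str, str]:
        try:
            with open(self._path) as f:
                return json.load(f)
        except FileNotFoundError:
            return {}
```

Either backing satisfies the same DataStore protocol, which is the substance of the definition: the logical entity is independent of where the data physically resides, and could equally be distributed across several entities.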

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Human Computer Interaction (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Computer Security & Cryptography (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Cardiology (AREA)
  • Biophysics (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Signal Processing (AREA)
  • Software Systems (AREA)
  • Computer Hardware Design (AREA)
  • Neurosurgery (AREA)
  • Neurology (AREA)
  • Acoustics & Sound (AREA)
  • Dermatology (AREA)
  • Biomedical Technology (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Psychiatry (AREA)
  • Social Psychology (AREA)
  • Child & Adolescent Psychology (AREA)
  • User Interface Of Digital Computer (AREA)
  • Collating Specific Patterns (AREA)
PCT/US2014/038683 2013-05-20 2014-05-20 Attributing user action based on biometric identity WO2014189868A1 (en)

Priority Applications (9)

Application Number Priority Date Filing Date Title
CA2910347A CA2910347A1 (en) 2013-05-20 2014-05-20 Attributing user action based on biometric identity
RU2015149776A RU2668984C2 (ru) 2013-05-20 2014-05-20 Attribution of user action based on biometric identification
CN201480029405.6A CN105431813B (zh) 2013-05-20 2014-05-20 Attributing user actions based on biometric identity
MX2015016069A MX352436B (es) 2013-05-20 2014-05-20 Attribution of user action based on biometric identity.
KR1020157035957A KR20160010608A (ko) 2013-05-20 2014-05-20 Attribution of user action based on biometric identity
BR112015028449A BR112015028449A2 (pt) 2013-05-20 2014-05-20 Attributing user action based on biometric identity
EP14734598.7A EP3000026B1 (en) 2013-05-20 2014-05-20 Attributing user action based on biometric identity
JP2016514994A JP2016524751A (ja) 2013-05-20 2014-05-20 Determining attribution of user actions based on biometric identity
AU2014268811A AU2014268811A1 (en) 2013-05-20 2014-05-20 Attributing user action based on biometric identity

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US13/897,466 2013-05-20
US13/897,466 US9129478B2 (en) 2013-05-20 2013-05-20 Attributing user action based on biometric identity

Publications (1)

Publication Number Publication Date
WO2014189868A1 true WO2014189868A1 (en) 2014-11-27

Family

ID=51059548

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2014/038683 WO2014189868A1 (en) 2013-05-20 2014-05-20 Attributing user action based on biometric identity

Country Status (11)

Country Link
US (1) US9129478B2 (ru)
EP (1) EP3000026B1 (ru)
JP (1) JP2016524751A (ru)
KR (1) KR20160010608A (ru)
CN (1) CN105431813B (ru)
AU (1) AU2014268811A1 (ru)
BR (1) BR112015028449A2 (ru)
CA (1) CA2910347A1 (ru)
MX (1) MX352436B (ru)
RU (1) RU2668984C2 (ru)
WO (1) WO2014189868A1 (ru)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2017054369A (ja) * 2015-09-10 2017-03-16 Fujitsu Limited System, information processing apparatus, alarm control program, and alarm control method

Families Citing this family (23)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9384013B2 (en) 2013-06-03 2016-07-05 Microsoft Technology Licensing, Llc Launch surface control
TWI584145B (zh) * 2013-12-06 2017-05-21 Egis Technology Inc. Biometric data recognition device, system, method, and computer-readable medium
US9937415B1 (en) 2013-12-17 2018-04-10 Amazon Technologies, Inc. Virtual controller for touchscreen
US10967268B2 (en) * 2015-02-27 2021-04-06 Sony Interactive Entertainment Inc. Information processing apparatus
CN106302330B (zh) * 2015-05-21 2021-01-05 Tencent Technology (Shenzhen) Co., Ltd. Identity verification method, apparatus, and system
CN105159569A (zh) * 2015-10-15 2015-12-16 Guangdong Oppo Mobile Telecommunications Corp., Ltd. Method and apparatus for setting a theme background
US9901818B1 (en) 2016-02-19 2018-02-27 Aftershock Services, Inc. Systems and methods for regulating access to game content of an online game
US10576379B1 (en) 2016-02-19 2020-03-03 Electronic Arts Inc. Systems and methods for adjusting online game content and access for multiple platforms
US10134227B1 (en) * 2016-02-19 2018-11-20 Electronic Arts Inc. Systems and methods for making game content from a single online game accessible to users via multiple platforms
US9919218B1 (en) 2016-02-19 2018-03-20 Aftershock Services, Inc. Systems and methods for providing virtual reality content in an online game
US9864431B2 (en) 2016-05-11 2018-01-09 Microsoft Technology Licensing, Llc Changing an application state using neurological data
US10203751B2 (en) 2016-05-11 2019-02-12 Microsoft Technology Licensing, Llc Continuous motion controls operable using neurological data
US10192399B2 (en) * 2016-05-13 2019-01-29 Universal Entertainment Corporation Operation device and dealer-alternate device
US9811992B1 (en) 2016-06-06 2017-11-07 Microsoft Technology Licensing, Llc. Caregiver monitoring system
CN107038361B (zh) * 2016-10-13 2020-05-12 Advanced New Technologies Co., Ltd. Service implementation method and apparatus based on a virtual reality scene
CN108536314A (zh) * 2017-03-06 2018-09-14 Huawei Technologies Co., Ltd. User identity recognition method and apparatus
CN107391983B (zh) * 2017-03-31 2020-10-16 Advanced New Technologies Co., Ltd. Information processing method and apparatus based on the Internet of Things
US10438584B2 (en) 2017-04-07 2019-10-08 Google Llc Multi-user virtual assistant for verbal device control
US10449440B2 (en) * 2017-06-30 2019-10-22 Electronic Arts Inc. Interactive voice-controlled companion application for a video game
CN108499110B (zh) * 2018-03-13 2019-11-01 Shenzhen Daren Gene Technology Co., Ltd. Method, apparatus, storage medium, and computer device for generating a virtual character in a game
RU2715300C1 (ru) * 2019-03-12 2020-02-26 Aleksei Fedorovich Khoroshev Method of creating data on the correspondence of an object and information about it
JP6673513B1 (ja) * 2019-03-26 2020-03-25 Sega Games Co., Ltd. Game system
US11786694B2 (en) 2019-05-24 2023-10-17 NeuroLight, Inc. Device, method, and app for facilitating sleep

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090089055A1 (en) * 2007-09-27 2009-04-02 Rami Caspi Method and apparatus for identification of conference call participants
WO2013048421A1 (en) * 2011-09-29 2013-04-04 Hewlett-Packard Development Company, L.P. Personalization data of an active application

Family Cites Families (18)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6993659B2 (en) 2002-04-23 2006-01-31 Info Data, Inc. Independent biometric identification system
US7867083B2 (en) * 2003-03-25 2011-01-11 Igt Methods and apparatus for limiting access to games using biometric data
US8190907B2 (en) * 2004-08-11 2012-05-29 Sony Computer Entertainment Inc. Process and apparatus for automatically identifying user of consumer electronics
JP4849829B2 (ja) * 2005-05-15 2012-01-11 Sony Computer Entertainment Inc. Center device
US8147332B2 (en) * 2005-10-21 2012-04-03 Broadcom Corporation Method of indicating the ordinal number of a player in a wireless gaming system
JP2007158654A (ja) * 2005-12-05 2007-06-21 Hitachi Ltd Broadcast receiving apparatus
US20080032787A1 (en) * 2006-07-21 2008-02-07 Igt Customizable and personal game offerings for use with a gaming machine
JP5042651B2 (ja) * 2007-01-31 2012-10-03 Bandai Namco Games Inc. Program, information storage medium, and game device
US8269822B2 (en) * 2007-04-03 2012-09-18 Sony Computer Entertainment America, LLC Display viewing system and methods for optimizing display view based on active tracking
CN101884188A (zh) 2007-07-12 2010-11-10 Innovation Investments, LLC Identity authentication and protected access systems, components, and methods
US8539357B2 (en) 2007-11-21 2013-09-17 Qualcomm Incorporated Media preferences
CN102016877B (zh) * 2008-02-27 2014-12-10 Sony Computer Entertainment America LLC Method for capturing depth data of a scene and applying computer actions
US8368753B2 (en) * 2008-03-17 2013-02-05 Sony Computer Entertainment America Llc Controller with an integrated depth camera
US8430750B2 (en) * 2008-05-22 2013-04-30 Broadcom Corporation Video gaming device with image identification
US8487938B2 (en) * 2009-01-30 2013-07-16 Microsoft Corporation Standard Gestures
US20110279368A1 (en) * 2010-05-12 2011-11-17 Microsoft Corporation Inferring user intent to engage a motion capture system
US8640021B2 (en) * 2010-11-12 2014-01-28 Microsoft Corporation Audience-based presentation and customization of content
US10150028B2 (en) * 2012-06-04 2018-12-11 Sony Interactive Entertainment Inc. Managing controller pairing in a multiplayer game

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090089055A1 (en) * 2007-09-27 2009-04-02 Rami Caspi Method and apparatus for identification of conference call participants
WO2013048421A1 (en) * 2011-09-29 2013-04-04 Hewlett-Packard Development Company, L.P. Personalization data of an active application

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
BRYAN A. GARNER: "A Dictionary of Modern Legal Usage", vol. 624, 1995

Also Published As

Publication number Publication date
JP2016524751A (ja) 2016-08-18
KR20160010608A (ko) 2016-01-27
EP3000026A1 (en) 2016-03-30
US20140342818A1 (en) 2014-11-20
CN105431813A (zh) 2016-03-23
RU2668984C2 (ru) 2018-10-05
AU2014268811A1 (en) 2015-11-12
CN105431813B (zh) 2019-05-10
BR112015028449A2 (pt) 2017-07-25
CA2910347A1 (en) 2014-11-27
MX2015016069A (es) 2016-03-21
RU2015149776A (ru) 2017-05-24
MX352436B (es) 2017-11-24
US9129478B2 (en) 2015-09-08
EP3000026B1 (en) 2019-10-02

Similar Documents

Publication Publication Date Title
EP3000026B1 (en) Attributing user action based on biometric identity
US9579567B2 (en) Managing controller pairings
US10203838B2 (en) Avatar personalization in a virtual environment
US9928462B2 (en) Apparatus and method for determining user's mental state
US20170052760A1 (en) Voice-triggered macros
US20120315983A1 (en) Account management of computer system
WO2013028279A1 (en) Use of association of an object detected in an image to obtain information to display to a user
CN103890696A (zh) Authenticated gesture recognition
US20150234468A1 (en) Hover Interactions Across Interconnected Devices
US20150231491A1 (en) Advanced Game Mechanics On Hover-Sensitive Devices
EP3092553A1 (en) Hover-sensitive control of secondary display
US20200019242A1 (en) Digital personal expression via wearable device
CN108920202A (zh) Application preloading management method and apparatus, storage medium, and intelligent terminal
WO2019184679A1 (zh) Game implementation method and apparatus, storage medium, and electronic device
Guna et al. Intuitive gesture based user identification system
KR102082418B1 (ko) Electronic device and control method therefor
US11562271B2 (en) Control method, terminal, and system using environmental feature data and biological feature data to display a current movement picture
CN112774185A (zh) Virtual card control method, apparatus, and device in a card-type virtual scene
Guna et al. User identification approach based on simple gestures
TWI729323B (zh) Interactive game system
US9384013B2 (en) Launch surface control
JP6672073B2 (ja) Authentication device, authentication method, and program
CN104036133A (zh) Chess and card game system
US20230128658A1 (en) Personalized vr controls and communications

Legal Events

Date Code Title Description
WWE Wipo information: entry into national phase

Ref document number: 201480029405.6

Country of ref document: CN

121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 14734598

Country of ref document: EP

Kind code of ref document: A1

DPE2 Request for preliminary examination filed before expiration of 19th month from priority date (pct application filed from 20040101)
ENP Entry into the national phase

Ref document number: 2016514994

Country of ref document: JP

Kind code of ref document: A

ENP Entry into the national phase

Ref document number: 2910347

Country of ref document: CA

WWE Wipo information: entry into national phase

Ref document number: 2014734598

Country of ref document: EP

ENP Entry into the national phase

Ref document number: 2014268811

Country of ref document: AU

Date of ref document: 20140520

Kind code of ref document: A

ENP Entry into the national phase

Ref document number: 2015149776

Country of ref document: RU

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

WWE Wipo information: entry into national phase

Ref document number: MX/A/2015/016069

Country of ref document: MX

REG Reference to national code

Ref country code: BR

Ref legal event code: B01A

Ref document number: 112015028449

Country of ref document: BR

ENP Entry into the national phase

Ref document number: 20157035957

Country of ref document: KR

Kind code of ref document: A

ENP Entry into the national phase

Ref document number: 112015028449

Country of ref document: BR

Kind code of ref document: A2

Effective date: 20151112