JP5668011B2 - A system for tracking user actions in an environment

Info

Publication number
JP5668011B2
Authority
JP
Japan
Prior art keywords
input channel
information
system
controller
channel information
Legal status
Active
Application number
JP2012080340A
Other languages
Japanese (ja)
Other versions
JP2012164330A (en)
Inventor
Gary M. Zalewski
Richard L. Marks
Shadon Mao
Original Assignee
Sony Computer Entertainment America LLC
Priority applications: US 11/381,721; 11/381,724; 11/381,725; 11/381,727; 11/381,728; 11/381,729; 11/382,031 through 11/382,041; 11/382,043; 11/418,988; 11/418,989; 11/429,047; 11/429,133; 11/429,414; 29/259,348; 29/259,349; 29/259,350; and 60/798,031.
Application filed by Sony Computer Entertainment America LLC
Publication of JP2012164330A
Application granted
Publication of JP5668011B2
Application status: Active

Classifications

    • A63F13/428: Processing input control signals of video game devices by mapping the input signals into game commands, involving motion or position input signals, e.g. signals representing the rotation of an input controller or a player's arm motions sensed by accelerometers or gyroscopes
    • A63F13/06: Video game accessories using player-operated means for controlling the position of a specific area display
    • A63F13/211: Input arrangements for video game devices characterised by their sensors, using inertial sensors, e.g. accelerometers or gyroscopes
    • A63F13/213: Input arrangements for video game devices comprising photodetecting means, e.g. cameras, photodiodes or infrared cells
    • A63F13/215: Input arrangements for video game devices comprising means for detecting acoustic signals, e.g. using a microphone
    • A63F13/424: Processing input control signals of video game devices by mapping the input signals into game commands, involving acoustic input signals, e.g. by using the results of pitch or rhythm extraction or voice recognition
    • G02B27/0093: Optical systems or apparatus with means for monitoring data relating to the user, e.g. head-tracking, eye-tracking
    • G06F3/017: Gesture based interaction, e.g. based on a set of recognized hand gestures
    • G06F3/0346: Pointing devices with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
    • A63F2300/1006: Input arrangements for converting player-generated signals into game device control signals having additional degrees of freedom
    • A63F2300/105: Input arrangements for converting player-generated signals into game device control signals using inertial sensors, e.g. accelerometers, gyroscopes
    • A63F2300/1081: Input via voice recognition
    • A63F2300/1087: Input arrangements for converting player-generated signals into game device control signals comprising photodetecting means, e.g. a camera
    • A63F2300/1093: Input arrangements comprising photodetecting means using visible light

Description

  The present invention relates to interfaces between humans and computers, and more particularly to the processing of multi-channel input for tracking user manipulation of one or more controllers.

  Computer entertainment systems typically include a handheld controller, game controller, or other controller. A user or player uses the controller to send commands or other instructions to the entertainment system in order to control the video game or other simulation being played. The controller may, for example, be provided with a manipulator such as a joystick operated by the user. The amount of joystick deflection is converted from an analog value into a digital value and sent to the game machine's main unit. The controller may also be provided with buttons that can be operated by the user.

  It is within this context that embodiments of the present invention arise.

FIG. 1 is a diagram illustrating a video game system that operates in accordance with an embodiment of the present invention.
FIG. 2 is a perspective view of a controller according to an embodiment of the present invention.
FIG. 3 is a three-dimensional schematic diagram illustrating an accelerometer that can be used in a controller according to an embodiment of the present invention.
FIG. 4 is a block diagram of a system for mixing various control inputs in accordance with an embodiment of the present invention.
FIG. 5A is a block diagram of a portion of the video game system of FIG. 1.
FIG. 5B is a flow diagram of a method for tracking a controller of a video game system according to an embodiment of the present invention.
FIG. 5C is a flow diagram illustrating a method of utilizing position and/or orientation information during game play in a video game system according to an embodiment of the present invention.
FIG. 6 is a block diagram illustrating a video game system according to an embodiment of the present invention.
FIG. 7 is a block diagram of a cell-processor implementation of a video game system according to an embodiment of the present invention.

  The teachings of the present invention can readily be understood by considering the following detailed description in conjunction with the accompanying drawings.

Priority claim: This application claims the benefit of priority of US patent application Ser. No. 11/381,729 (inventor: Shadon Mao, title: "Micro Microphone Array", attorney docket number: SCEA05062US00, filed May 4, 2006); US patent application 11/381,728 (inventor: Shadon Mao, title: "Echo and Noise Cancellation", attorney docket number: SCEA05064US00, filed May 4, 2006); US patent application 11/381,725 (inventor: Shadon Mao, title: "Method and Apparatus for Target Speech Detection", attorney docket number: SCEA05072US00, filed May 4, 2006); US patent application 11/381,727 (inventor: Shadon Mao, title: "Noise Removal for Electronic Devices with a Far-Field Microphone on the Console", attorney docket number: SCEA05073US00, filed May 4, 2006); US patent application 11/381,724 (inventor: Shadon Mao, title: "Method and Apparatus for Target Speech Detection and Characterization", attorney docket number: SCEA05079US00, filed May 4, 2006); and US patent application 11/381,721 (inventor: Shadon Mao, title: "Selective Sound Source Listening in Conjunction with Computer Interaction Processing", attorney docket number: SCEA04005JUMBOUS, filed May 4, 2006), the entire disclosures of all of which are incorporated herein by reference.

  This application is also related to US patent application 11/382,031 (inventor: Gary Zalewski et al., title: "Multi-Input Game Control Mixer", attorney docket number: SCEA06MXR1, filed May 6, 2006) and to US patent application 11/382,032 (inventor: Gary Zalewski et al., title: "System for Tracking User Actions in an Environment", attorney docket number: SCEA06MXR2, filed May 6, 2006), both of which are incorporated herein by reference.

  This application is also related to co-pending US patent application 11/418,988 (inventor: Shadon Mao, title: "Method and Apparatus for Adjusting a Listening Area for Capturing Sound", attorney docket number: SCEA-00300, filed May 4, 2006), the entire disclosure of which is incorporated herein by reference. This application is also related to co-pending US patent application 11/418,989 (inventor: Shadon Mao, title: "Method and Apparatus for Acquiring Audio Signals Based on Visual Images", attorney docket number: SCEA-00400, filed May 4, 2006), the entire disclosure of which is incorporated herein by reference. This application is also related to co-pending US patent application 11/429,047 (inventor: Shadon Mao, title: "Method and Apparatus for Acquiring an Audio Signal Based on the Position of the Signal", attorney docket number: SCEA-50050, filed May 4, 2006), the entire disclosure of which is incorporated herein by reference. This application is also related to co-pending US patent application 11/429,133 (inventor: Richard L. Marks, title: "Selective Sound Source Listening in Conjunction with Computer Interaction Processing", attorney docket number: SCEA04005US01-SONYP045, filed May 4, 2006), the entire disclosure of which is incorporated herein by reference. This application is also related to co-pending US patent application 11/429,414 (inventor: Richard L. Marks, title: "Input Device for Computer Image and Sound Intensity Processing and for Interfacing with a Computer Program", attorney docket number: SONYP052, filed May 4, 2006), the entire disclosure of which is incorporated herein by reference.

  This application is also related to co-pending US patent application 11/382,033 (title: "3D Input Control System, Method and Apparatus", attorney docket number: SCEA06INRT1, filed May 6, 2006), the entire disclosure of which is incorporated herein by reference. This application is also related to co-pending US patent application 11/382,035 (title: "Inertially Trackable Handheld Controller", attorney docket number: SCEA06INRT2, filed May 6, 2006), the entire disclosure of which is incorporated herein by reference. This application is also related to co-pending US patent application 11/382,036 (title: "Method and Apparatus for Applying Gearing Effects to Visual Tracking", attorney docket number: SONYP058A, filed in 2006), the entire disclosure of which is incorporated herein by reference. This application is also related to co-pending US patent application 11/382,041 (title: "Method and Apparatus for Applying Gearing Effects to Inertial Tracking", attorney docket number: SONYP058B, filed in 2006), the entire disclosure of which is incorporated herein by reference. This application is also related to co-pending US patent application 11/382,038 (title: "Method and Apparatus for Applying Gearing Effects to Acoustic Tracking", attorney docket number: SONYP058C, filed in 2006), the entire disclosure of which is incorporated herein by reference. This application is also related to co-pending US patent application 11/382,040 (title: "Method and Apparatus for Applying Gearing Effects to Multi-Channel Mixed Input", attorney docket number: SONYP058D, filed May 6, 2006), the entire disclosure of which is incorporated herein by reference. This application is also related to co-pending US patent application 11/382,034 (title: "Mechanism for Detecting and Tracking User Operation of a Game Controller Body", attorney docket number: SCEA05082US00, filed in 2006), the entire disclosure of which is incorporated herein by reference. This application is also related to co-pending US patent application 11/382,037 (title: "Mechanism for Converting Handheld Controller Motion into Inputs for a System", attorney docket number: 86324, filed May 6, 2006), the entire disclosure of which is incorporated herein by reference. This application is also related to co-pending US patent application 11/382,043 (title: "Detectable and Trackable Handheld Controller", attorney docket number: 86325, filed May 6, 2006), the entire disclosure of which is incorporated herein by reference. This application is also related to co-pending US patent application 11/382,039 (title: "Method for Mapping Handheld Controller Movements to Game Commands", attorney docket number: 86326, filed May 6, 2006), the entire disclosure of which is incorporated herein by reference. This application is also related to co-pending US patent application 29/259,349 (title: "Controller with Infrared Port (TM)", attorney docket number: SCEA06007US00, filed May 6, 2006), the entire disclosure of which is incorporated herein by reference. This application is also related to co-pending US patent application 29/259,350 (title: "Controller with Tracking Sensor (TM)", attorney docket number: SCEA06008US00, filed May 6, 2006), the entire disclosure of which is incorporated herein by reference.
This application also claims the benefit of priority of co-pending US provisional patent application 60/798,031 (title: "Dynamic Object Interface", attorney docket number: SCEA06009US00, filed May 6, 2006), the entire disclosure of which is incorporated herein by reference. This application is also related to co-pending US patent application 29/259,348 (title: "Tracked Controller Device (TM)", attorney docket number: SCEA06010US00, filed May 6, 2006), the entire disclosure of which is incorporated herein by reference.

Cross-reference to related applications: This application is related to US patent application 10/207,677 (title: "Man-Machine Interface Using a Deformable Device", filed July 27, 2002); US patent application 10/650,409 (title: "Voice Input System", filed August 27, 2003); US patent application 10/663,236 (title: "Method and Apparatus for Adjusting the Field of View of a Displayed Scene According to Tracked Head Movements"); US patent application Ser. No. 10/759,782 (title: "Method and Device for Optical Input Device", filed January 16, 2004); US patent application 10/820,469 (title: "Method and Device for Detecting and Eliminating Sound Disturbances", filed April 7, 2004); US patent application 11/301,673 (title: "Method for Realizing a Pointing Interface via Camera Tracking Using the Relative Positions of the Head and Hand", filed December 12, 2005); and US patent application 60/718,145 (title: "Examples of Audio, Video, Simulation, and User Interface", filed September 15, 2005), the entire disclosures of which are incorporated herein by reference.

DESCRIPTION OF SPECIFIC EMBODIMENTS Although the following detailed description contains many specific details for purposes of illustration, those skilled in the art will appreciate that many variations and alterations to the following details are within the scope of the invention. Accordingly, the embodiments of the invention described below are set forth without any loss of generality to, and without imposing limitations upon, the claimed invention.

  Various embodiments of the methods, apparatus, mechanisms, and systems described herein provide for the detection, capture, and tracking of movements, motions, and/or manipulations of the entire controller body by the user. The detected movements, motions, and/or manipulations of the entire controller body by the user may be used as additional commands to control various aspects of the game or other simulation being played.

  Detection and tracking of the user's manipulation of the game controller body can be implemented in various ways. For example, an inertial sensor such as an accelerometer or gyroscope, or an image capture unit such as a digital camera, can be used with the computer entertainment system to detect movements of the handheld controller body and transform them into actions in the game. Examples of tracking a controller with an inertial sensor are described, for example, in US patent application 11/382,033 (title: "3D Input Control System, Method and Apparatus", attorney docket number: SCEA06INRT1), incorporated herein by reference. Examples of tracking a controller using image capture are described, for example, in US patent application 11/382,034 (title: "Mechanism for Detecting and Tracking User Operation of a Game Controller Body", attorney docket number: SCEA05082US00), incorporated herein by reference. In addition, the controller and/or the user may be tracked acoustically using a microphone array and appropriate signal processing. Examples of such acoustic tracking are described, for example, in US patent application Ser. No. 11/381,721, incorporated herein by reference.

  Acoustic sensing, inertial sensing, and image capture may be used, individually or in any combination, to detect many different types of controller movements, such as up-and-down movements, twisting movements, side-to-side movements, jerking movements, wand-like motions, plunging motions, and so on. These movements may be mapped to various commands so that they are translated into actions in the game. Detection and tracking of the user's manipulation of the game controller can be used to implement many different types of games and simulations, for example ones in which the user engages in a sword or lightsaber fight, uses a stick to trace the shape of an item, takes part in various sporting events, or fights in on-screen battles and other encounters. The game program may be configured to track the movement of the controller and to recognize pre-registered gestures from the tracked movement. Recognition of one or more of these gestures can trigger a change in the game state.
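
  As a purely hypothetical sketch (the template data, distance threshold, and function names are assumptions, not taken from the patent), the following Python fragment shows one simple way a game might compare a tracked controller path against pre-registered gesture templates and change the game state when one is recognized.

    import math

    # Pre-registered gesture templates: name -> list of (x, y, z) points (invented data).
    GESTURES = {
        "swing": [(0.0, 0.0, 0.0), (0.2, 0.5, 0.0), (0.4, 1.0, 0.0)],
        "thrust": [(0.0, 0.0, 0.0), (0.0, 0.0, 0.5), (0.0, 0.0, 1.0)],
    }

    def path_distance(path, template):
        """Mean point-to-point distance between a tracked path and a template."""
        n = min(len(path), len(template))
        return sum(math.dist(path[i], template[i]) for i in range(n)) / n

    def recognize(path, threshold=0.2):
        """Return the name of the closest matching gesture, or None if nothing is close enough."""
        best_name, best_dist = None, float("inf")
        for name, template in GESTURES.items():
            d = path_distance(path, template)
            if d < best_dist:
                best_name, best_dist = name, d
        return best_name if best_dist <= threshold else None

    def update_game_state(game_state, tracked_path):
        """Change the game state when a pre-registered gesture is recognized."""
        gesture = recognize(tracked_path)
        if gesture == "swing":
            game_state["avatar_action"] = "sword_swing"
        elif gesture == "thrust":
            game_state["avatar_action"] = "sword_thrust"
        return game_state

    if __name__ == "__main__":
        state = {"avatar_action": "idle"}
        path = [(0.0, 0.0, 0.0), (0.19, 0.52, 0.0), (0.41, 0.98, 0.0)]  # sampled controller path
        print(update_game_state(state, path))  # {'avatar_action': 'sword_swing'}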

  In embodiments of the present invention, the controller path information obtained from these different sources may be mixed prior to analysis for gesture recognition. Tracking data from the different sources (e.g., acoustic, inertial, and image capture) may be mixed in a way that improves the likelihood of recognizing a gesture.

  FIG. 1 illustrates a system 100 that operates in accordance with an embodiment of the present invention. As shown, a computer entertainment console 102 may be connected to a television or other video display 104 to display the images of a video game or other simulation. The game or other simulation may be stored on a DVD, CD, flash memory, USB memory, or other storage medium 106 that is inserted into the console 102. A user or player 108 manipulates a game controller 110 to control the video game or other simulation. As shown in FIG. 2, the game controller 110 includes an inertial sensor 112 that produces signals in response to the position, motion, orientation, or change in orientation of the game controller 110. In addition to the inertial sensor, the game controller 110 may include conventional control input devices such as a joystick 111, buttons 113, R1, and L1.

  In operation, the user 108 physically moves the controller 110. For example, the user 108 may move the controller 110 in any direction, e.g., up, down, to one side or the other, or may twist, rotate, shake, jerk, or plunge it. These movements of the controller 110 itself may be detected and captured through analysis of signals from the inertial sensor 112, in the manner described below, and/or by tracking with the camera 114. Referring again to FIG. 1, the system 100 may optionally include a camera or other video capture device 114. The camera may be positioned so that the controller 110 is within its field of view 116. Analysis of images from the video capture device 114 may be used in conjunction with analysis of data from the inertial sensor 112. As shown in FIG. 2, the controller 110 may optionally be equipped with light sources such as LEDs 202, 204, 206, 208 to facilitate tracking by video analysis. These may be mounted on the body of the controller 110. Here, the term "body" means the part of the game controller 110 that is held in the hand (or worn, in the case of a wearable game controller).

  Such video analysis for tracking the controller 110 is described, for example, in US patent application Ser. No. 11/382,034 (title: "Mechanism for Detecting and Tracking User Operation of a Game Controller Body", attorney docket number: SCEA05082US00), incorporated herein by reference. The console 102 may include an acoustic transducer, such as a microphone array 118. The controller 110 may also include an acoustic signal generator 210 (e.g., a speaker) to provide a sound source that facilitates acoustic tracking of the controller 110 with the microphone array 118 and appropriate acoustic signal processing. Such techniques are described, for example, in US patent application Ser. No. 11/381,724, incorporated herein by reference.

  In general, signals from the inertial sensor are used to generate position and orientation data for the controller 110. Such data may be used to calculate many physical aspects of the movement of the controller 110, e.g., its acceleration and velocity along any axis, its tilt, pitch, yaw, and roll, and any telemetry points of the controller 110. Here, telemetry generally refers to remote measurement and reporting of information of interest to a system or to the system's designer or operator.

  By detecting and tracking movements of the controller 110, it can be determined whether a predefined movement of the controller 110 has been performed. That is, particular movement patterns or gestures of the controller 110 can be predefined and used as input commands for the game or other simulation. For example, a gesture of plunging the controller 110 downward can be defined as one command, a gesture of twisting the controller 110 as another, and a gesture of shaking the controller 110 as yet another. In this way, the manner in which the user 108 physically moves the controller 110 is used as another input for controlling the game, which provides a more stimulating and entertaining experience for the user.

  The inertial sensor 112 may be, for example, an accelerometer, although it is not limited thereto. FIG. 3 shows an example of an accelerometer 300 in the form of a simple mass 302 elastically coupled to a frame 304 at four points, e.g., by springs 306, 308, 310, 312. The pitch axis and roll axis (indicated by X and Y, respectively) lie in a plane that intersects the frame. The yaw axis Z is oriented perpendicular to the plane containing the pitch axis X and the roll axis Y. The frame 304 may be mounted on the controller 110 in any suitable manner. As the frame 304 (and the joystick controller 110) accelerates and/or rotates, the mass 302 is displaced relative to the frame 304, and the springs 306, 308, 310, 312 stretch or compress depending on the magnitude and direction of the translational and/or rotational acceleration and/or the pitch and/or roll and/or yaw angle. The amount of displacement of the mass 302 and/or the amount of expansion or compression of the springs 306, 308, 310, 312 can be sensed, e.g., by suitable sensors 314, 316, 318, 320, and converted, in a known or determinable manner, into signals that depend on the pitch and/or roll acceleration.

  There are many different ways to track the position of the mass and/or the forces exerted on it, including resistive strain gauge material, optical sensors, magnetic sensors, Hall-effect devices, piezoelectric devices, capacitive sensors, and the like. Embodiments of the present invention may include any number and type, or combination of types, of sensors. By way of example, the sensors 314, 316, 318, 320 may be gap electrodes placed above the mass 302. The capacitance between the mass and each electrode changes as the position of the mass changes relative to that electrode. Each electrode may be connected to a circuit that produces a signal related to the capacitance (and therefore to the proximity) of the mass 302 relative to the electrode. In addition, the springs 306, 308, 310, 312 may include resistive strain gauge sensors that produce signals related to the expansion or compression of the springs.
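
  As a hypothetical illustration of the spring-mass accelerometer principle described above, the following fragment converts a sensed displacement of the mass along one axis into an acceleration using Hooke's law (F = k·x = m·a, so a = k·x/m); the spring constant, proof mass, and displacement values are made-up numbers, not parameters from the patent.

    SPRING_CONSTANT = 50.0   # N/m (assumed)
    PROOF_MASS = 0.005       # kg  (assumed)

    def displacement_to_acceleration(displacement_m):
        """Convert a sensed mass displacement (metres) into acceleration (m/s^2) via a = k*x/m."""
        return SPRING_CONSTANT * displacement_m / PROOF_MASS

    # Example: displacements sensed along the pitch (X) and roll (Y) axes.
    for axis, x in (("X", 0.0010), ("Y", -0.0004)):
        print(axis, displacement_to_acceleration(x), "m/s^2")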

  In some embodiments, the frame 304 may be gimbal-mounted on the controller 110 so that the accelerometer 300 maintains a fixed orientation with respect to the pitch and/or roll and/or yaw axes. In this way, the X, Y, and Z axes of the controller can be mapped directly onto corresponding axes in the real world without having to take into account the tilt of the controller axes relative to the real-world coordinate axes.

  As described above, data from the inertial sensor, the image capture unit, and the sound source may be analyzed to generate a path that tracks the position and/or orientation of the controller 110. As shown in the block diagram of FIG. 4, a system 400 according to an embodiment of the present invention may include an inertial analyzer 402, an image analyzer 404, and an acoustic analyzer 406. Each of these analyzers receives signals from the sensed environment 401. The analyzers 402, 404, 406 may be implemented in hardware, in software (or firmware), or by a combination of two or more of these. Each of the analyzers produces tracking information related to the position and/or orientation of an object of interest. By way of example, the object of interest may be the controller 110 described above. The image analyzer 404 may operate in connection with, with respect to, and according to the methods described in US patent application 11/382,034 (attorney docket number: SCEA05082US00). The inertial analyzer 402 may operate in connection with, with respect to, and according to the methods described in US patent application 11/382,033 (title: "3D Input Control System, Method and Apparatus", attorney docket number: SCEA06INRT1). The acoustic analyzer 406 may operate in connection with, with respect to, and according to the methods described in US patent application Ser. No. 11/381,724.

  The analyzers 402, 404, and 406 may be regarded as being associated with different channels of position and/or orientation information input. The mixer 408 may accept multiple input channels, and each channel may contain sample data characterizing the sensed environment 401 from the perspective of that channel. The position and/or orientation information generated by the inertial analyzer 402, the image analyzer 404, and the acoustic analyzer 406 may be coupled into the input of the mixer 408. The mixer 408 and the analyzers 402, 404, 406 may be queried by the game software program 410, and may be configured to interrupt the game software in response to events. Events may include gesture recognition events, gearing changes, setting changes, noise level settings, sampling rate settings, mapping chain changes, and the like; examples of these are described below. The mixer 408 may operate in connection with, with respect to, and according to the methods described below.
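
  A minimal structural sketch, under assumed class and method names that are not taken from the patent, of the analyzer/mixer arrangement described above: each analyzer turns raw channel samples into tracking information, and the mixer accepts the multiple channels and combines their results.

    from dataclasses import dataclass
    from typing import List, Tuple

    Vec3 = Tuple[float, float, float]

    @dataclass
    class TrackingInfo:
        position: Vec3          # (x, y, z)
        orientation: Vec3       # (pitch, yaw, roll)

    class Analyzer:
        """One input channel (inertial, image, or acoustic) of position/orientation data."""
        def analyze(self, samples: dict) -> TrackingInfo:
            # A real analyzer would process raw channel samples (sensor signals, camera
            # frames, microphone data); here the samples are assumed to be pre-digested.
            return TrackingInfo(tuple(samples["pos"]), tuple(samples["ori"]))

    class Mixer:
        """Accepts multiple input channels and combines their tracking information."""
        def __init__(self, analyzers: List[Analyzer]):
            self.analyzers = analyzers

        def mix(self, per_channel_samples: List[dict]) -> TrackingInfo:
            infos = [a.analyze(s) for a, s in zip(self.analyzers, per_channel_samples)]
            n = len(infos)
            # Plain average across channels; per-channel weighting is discussed later.
            pos = tuple(sum(i.position[k] for i in infos) / n for k in range(3))
            ori = tuple(sum(i.orientation[k] for i in infos) / n for k in range(3))
            return TrackingInfo(pos, ori)

    # Usage: mixer = Mixer([inertial_analyzer, image_analyzer, acoustic_analyzer])
    #        refined = mixer.mix([inertial_samples, image_samples, acoustic_samples])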

  As mentioned above, signals from different input channels, e.g., the inertial sensor, video capture, and/or acoustic sensors, may be analyzed by the inertial analyzer 402, the image analyzer 404, and the acoustic analyzer 406, respectively, to determine the motion and/or orientation of the controller 110 during play of a video game according to an inventive method. Such a method may be implemented as a series of processor-executable program code instructions stored in a processor-readable medium and executed on a digital processor. For example, as shown in FIG. 5A, the video game system 100 may include, on the console 102, the inertial analyzer 402, the image analyzer 404, and the acoustic analyzer 406 implemented either in hardware or in software. By way of example, the analyzers 402, 404, 406 may be implemented as software instructions running on a suitable processor unit 502. The processor unit 502 may be a digital processor, e.g., a microprocessor of a type commonly used in video game consoles. A portion of the instructions may be stored in a memory 506. Alternatively, the inertial analyzer 402, image analyzer 404, and acoustic analyzer 406 may be implemented in hardware, e.g., as an ASIC. Such analyzer hardware may be located on the controller 110 or on the console 102, or may be located remotely elsewhere. In hardware implementations, the analyzers 402, 404, 406 may be programmable in response to external signals, e.g., from the processor 502 or from some other remotely located signal source connected by a USB cable, wirelessly, or over a network.

  The inertial analyzer 402 may include or implement instructions that analyze the signals generated by the inertial sensor 112 and utilize information regarding the position and/or orientation of the controller 110. Similarly, the image analyzer 404 may implement instructions that analyze the images captured by the image capture unit 114. In addition, the acoustic analyzer may implement instructions that analyze the acoustic signals captured by the microphone array 118. These signals and/or images may be received by the analyzers 402, 404, 406, as shown at block 512 of the flow diagram 510 of FIG. 5B. As shown at block 514, the signals and/or images may be analyzed by the analyzers 402, 404, 406 to determine inertial tracking information 403, image tracking information 405, and acoustic tracking information 407 regarding the position and/or orientation of the controller 110. The tracking information 403, 405, 407 may relate to one or more degrees of freedom. Six degrees of freedom are preferably tracked to characterize the manipulation of the controller 110 or other tracked object. The degrees of freedom may relate to the controller's tilt, yaw, and roll, and to its position, velocity, or acceleration along the X, Y, and Z axes.

  As shown at block 516, the mixer 408 mixes the inertial information 403, the image information 405, and the acoustic information 407 to generate refined position and/or orientation information 409. For example, the mixer 408 may apply different weights to the inertial, image, and acoustic tracking information 403, 405, 407 based on game or environmental conditions and take a weighted average. In addition, the mixer 408 may include its own mixer analyzer 412 that analyzes the combined position/orientation information and generates its own resulting "mixer" information, involving combinations of the information generated by the other analyzers.

  In embodiments of the present invention, the mixer 408 may assign distribution values to the tracking information 403, 405, 407 from the analyzers 402, 404, 406. As noted above, certain sets of input control data may be averaged. In the present embodiment, however, the input control data is weighted before it is averaged, so that input control data from some analyzers carries more analytical importance than that from others.

  The mixer 408 may take on a number of functions in the context of the present system, including observation, correction, stabilization, derivation, combination, routing, mixing, reporting, buffering, interrupting other processes, and analysis. These may be performed with respect to the tracking information 403, 405, 407 received from one or more of the analyzers 402, 404, 406. While each of the analyzers 402, 404, 406 may receive and/or derive certain tracking information, the mixer 408 may be implemented to optimize the use of the received tracking information 403, 405, 407 and to generate the refined tracking information 409.

  The analyzers 402, 404, 406 and the mixer 408 are preferably configured to provide tracking information in a similar output format. Tracking information parameters from any of the analyzer elements 402, 404, 406 may be mapped to a single parameter in an analyzer. Alternatively, the mixer 408 may generate tracking information for any of the analyzers 402, 404, 406 by processing one or more tracking information parameters from one or more of the analyzers 402, 404, 406. The mixer may combine two or more elements of tracking information of the same parameter type obtained from the analyzers 402, 404, 406, and/or perform functions across multiple parameters of the tracking information generated by the analyzers, to produce a combined set of outputs having the benefit of being generated from multiple channels of input.

  The refined tracking information 409 may be used by the system 100 during play of a video game, as shown at block 518. In certain embodiments, the position and/or orientation information may be used in relation to gestures made by the user 108 during game play. The mixer 408 may operate in conjunction with a gesture recognizer 505 to associate at least one action in the game environment with one or more user actions from the user (e.g., manipulation of the controller in space).

  As shown in the flow diagram 520 of FIG. 5C, the path of the controller 110 may be tracked using the position and/or orientation information, as shown at block 522. By way of non-limiting example, the path may include a set of points representing the position of the controller's center of mass with respect to some coordinate system. Each position point may be represented by one or more coordinates, e.g., X, Y, and Z coordinates in a Cartesian coordinate system. A time may be associated with each point on the path so that both the shape of the path and the progress of the controller along the path can be monitored. In addition, each point on the path may be associated with data representing the orientation of the controller, e.g., one or more rotation angles about its center of mass. Furthermore, each point on the path may be associated with the velocity and acceleration of the controller's center of mass and with its angular velocity and angular acceleration about the center of mass.
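
  By way of a hypothetical sketch (the field names are assumptions, not the patent's data format), the following Python fragment shows the kind of record a tracked path as described above might hold: a timestamped position and orientation, together with the associated linear and angular rates for each point on the path.

    from dataclasses import dataclass, field
    from typing import List, Tuple

    Vec3 = Tuple[float, float, float]

    @dataclass
    class PathPoint:
        t: float                     # time associated with this point (seconds)
        position: Vec3               # center-of-mass position (X, Y, Z)
        orientation: Vec3            # rotation angles about the center of mass (pitch, yaw, roll)
        velocity: Vec3               # linear velocity of the center of mass
        acceleration: Vec3           # linear acceleration of the center of mass
        angular_velocity: Vec3       # angular velocity about the center of mass
        angular_acceleration: Vec3   # angular acceleration about the center of mass

    @dataclass
    class ControllerPath:
        points: List[PathPoint] = field(default_factory=list)

        def add(self, point: PathPoint) -> None:
            """Append the next tracked sample to the path."""
            self.points.append(point)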

  As shown at block 524, the tracked path may be compared with one or more stored paths corresponding to known and/or pre-recorded gestures 508 that are relevant to the context of the video game being played. The recognizer 505 may be configured to recognize a user, to process audio-authenticated gestures, and so on. For example, a user may be identified by the recognizer 505 through a gesture, and a gesture may be specific to a user. Such a specific gesture may be recorded and included among the gestures 508 pre-recorded in the memory 506. The recording process may optionally also store audio generated during the recording of the gesture. The sensed environment is sampled and processed by a multi-channel analyzer. The processor may refer to gesture models to determine, authenticate, and/or identify a user or object based on voice or acoustic patterns with high accuracy and performance.

  As shown in FIG. 5A, data 508 representing the gestures may be stored in the memory 506. Non-limiting examples of gestures include throwing an object such as a ball; swinging an object such as a bat or golf club; pumping a hand pump; opening or closing a door or window; turning a steering wheel or other vehicle control; martial arts moves such as punches; sanding, waxing, or painting movements; shaking; rattling; rolling; throwing a football; turning a knob; 3D mouse movements; scrolling movements; movements with a known contour; any recordable movement; back-and-forth movement along any vector, e.g., pumping up a tire; movement along a path; movement with any rotation in space; any movement that can be tracked and repeated within the noise level, with precise stop and start times, based on user actions; and movements such as turning a key. Each of these gestures may be pre-recorded from path data and stored as a time-based model. Comparison of the path against the stored gestures may begin with an assumption of a steady state; if the path deviates from the steady state, the stored gestures may be compared with the path by a process of elimination. If there is no match at block 526, the analyzer may continue tracking the path of the controller 110 at block 522. If there is a sufficient match between the path (or part of it) and a stored gesture, the game state may be changed, as shown at 528. Changes of game state may include, but are not limited to, interrupts, the sending of control signals, the changing of variables, and the like.

  Here is an example of how this may occur. Having determined that the controller 110 has left a steady state, the analyzer 402, 404, 406, or 412 tracks the movement of the controller 110. As long as the path of the controller 110 conforms to a path defined in the stored gesture models 508, those gestures remain possible "hits". If the path of the controller 110 deviates from any gesture model 508 (within a set noise tolerance), that gesture model is removed from the hit list. Each gesture reference model includes the time base over which the gesture was recorded. The analyzer 402, 404, 406, or 412 compares the controller path data against the stored gestures 508 at the appropriate time index. The occurrence of a steady-state condition resets the clock. When the controller leaves the steady state (i.e., when movement is tracked above the noise threshold), the hit list is populated with all potential gesture models. The clock is started, and movements of the controller are compared against the hit list. Again, the comparison is a walk through time. If any gesture in the hit list reaches the end of its gesture, then it is a hit.
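
  As an interpretation offered for illustration only (the class names, tolerance value, and model data are assumptions, not the patent's implementation), the following Python sketch captures the hit-list idea described above: the clock resets on a steady-state condition, every gesture model starts as a candidate, candidates are pruned when the tracked path deviates beyond a noise tolerance, and a surviving candidate that reaches the end of its model is reported as a hit.

    import math

    class GestureModel:
        def __init__(self, name, samples):
            self.name = name
            self.samples = samples        # list of (x, y, z) points, one per time index

    class HitListMatcher:
        def __init__(self, models, tolerance=0.25):
            self.models = models
            self.tolerance = tolerance
            self.reset()

        def reset(self):
            """Called when a steady-state condition occurs: clock to zero, all models back in."""
            self.time_index = 0
            self.hit_list = list(self.models)

        def step(self, point):
            """Feed one tracked path point; return the name of a completed gesture, or None."""
            survivors = []
            for model in self.hit_list:
                if self.time_index < len(model.samples):
                    # Keep the model only if the path stays within the noise tolerance of it.
                    if math.dist(point, model.samples[self.time_index]) <= self.tolerance:
                        survivors.append(model)
            self.hit_list = survivors
            self.time_index += 1
            for model in self.hit_list:
                if self.time_index == len(model.samples):   # reached the end of the gesture
                    return model.name
            return None

    # Usage: call matcher.reset() on a steady-state condition, then matcher.step(p)
    # for each subsequently tracked path point.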

  In some embodiments, the mixer 408 and/or the individual analyzers 402, 404, 406, 412 may inform the game program when particular events occur. Examples of such events include the following:

- Interrupt on reaching a zero-acceleration point (X, Y, and/or Z axes). In certain game situations, the analyzer may notify or interrupt a routine in the game program when the acceleration of the controller passes through an inflection point. For example, the user 108 may use the controller 110 to control a game avatar representing a quarterback in a football simulation game. The analyzer may track the controller (representing the football) via a path generated from signals from the inertial sensor 112. A particular change in the acceleration of the controller 110 may signal the release of the football. At that point, the analyzer may trigger another routine within the program, such as a physics simulation package, that simulates the trajectory of the football based on the position, velocity, and/or orientation of the controller at the point of release.

- Interrupt upon recognition of a new gesture.

  In addition, the analyzers may be configured by one or more inputs. Examples of such inputs include the following:

- Noise level setting (X, Y, or Z axis). The noise level may be a reference tolerance used when analyzing jitter of the user's hands in the game.

- Sampling rate setting. Here, the sampling rate refers to how often the analyzer samples signals from the inertial sensor. The sampling rate may be set so as to oversample the signal or to average it.

- Gearing setting. Here, gearing generally refers to the ratio between the movement of the controller and the movement that occurs within the game. An example of such "gearing" in the context of video game control is described in US patent application 11/382,040 (filed May 7, 2006, attorney docket number: SONYP058D), which is incorporated herein by reference; a simple numeric sketch of gearing appears after this list.

- Mapping chain setting. Here, a mapping chain refers to a map of gesture models. A gesture model map may be adapted to a specific input channel (e.g., path data generated only from inertial sensor signals) or to a mixed channel generated in the mixer. Three input channels may be served by two or more different analyzers that are similar to the inertial analyzer 402. These include, in particular, the inertial analyzer 402 described herein, an image analyzer as described, e.g., in US patent application Ser. No. 11/382,034 (title: "Mechanism for Detecting and Tracking User Operation of a Game Controller Body", attorney docket number: SCEA05082US00, incorporated herein by reference), and an acoustic analyzer as described, e.g., in US patent application 11/381,721 (incorporated herein by reference). Mapping chains may be incorporated into the analyzers. The mapping chains may be swapped out by the game during game play, and may be set for the analyzer and for the mixer.
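
  A toy illustration of the gearing setting mentioned above: the in-game movement is the controller movement scaled by a gearing ratio that the game may change at any time. The ratio values and the function name are assumptions made for this example.

    def apply_gearing(controller_delta, gearing_ratio):
        """Scale a controller movement (e.g., degrees of twist) into in-game movement."""
        return controller_delta * gearing_ratio

    # 10 degrees of controller twist under different gearing settings:
    print(apply_gearing(10.0, 0.5))   # 5.0  -> fine, precise in-game response
    print(apply_gearing(10.0, 4.0))   # 40.0 -> coarse, exaggerated in-game response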

  Referring again to FIG. 5B, those skilled in the art will recognize that there are many ways to generate signals from the inertial sensor 112 at block 512; a few examples have been described here. Referring to block 514, there are likewise many ways to analyze the sensor signals generated at block 512 to obtain tracking information related to the position and/or orientation of the controller 110. By way of example, the tracking information may include, individually or in any combination, the following parameters:

• Controller orientation. The orientation of the controller 110 may be expressed in terms of pitch, roll, or yaw angles with respect to a reference orientation, e.g., in radians. Rates of change of the controller's orientation (e.g., angular velocity or angular acceleration) may also be included in the position and/or orientation information. If the inertial sensor 112 includes a gyroscope sensor, for example, controller orientation information may be obtained directly in the form of one or more output values proportional to the pitch, roll, or yaw angle.

• Controller position (e.g., the Cartesian coordinates X, Y, Z of the controller 110 in some frame of reference).

• Controller X-axis speed.

• Y-axis speed of the controller.

・ Z-axis speed of the controller.

• X-axis acceleration of the controller.

• Y-axis acceleration of the controller.

• Z-axis acceleration of the controller.

  With regard to position, velocity, and acceleration, the position and/or orientation information may be expressed in coordinate systems other than Cartesian; for example, cylindrical or spherical coordinates may be used for position, velocity, and acceleration. Acceleration information with respect to the X, Y, and Z axes may be obtained directly from an accelerometer-type sensor, as described herein. The X, Y, and Z accelerations may be integrated over time from some initial instant to determine changes in the X, Y, and Z velocities. The velocities may then be computed by adding these changes to known values of the X, Y, and Z velocities at the initial instant. The X, Y, and Z velocities may in turn be integrated over time to determine the X, Y, and Z displacements of the controller. The X, Y, and Z positions may then be determined by adding the displacements to known X, Y, and Z positions at the initial instant. A numerical sketch of this double integration is given after this list.

-Stable state (Y / N). This special information indicates whether the controller is in a stable state. This information may be defined at any location and is subject to change. In a preferred embodiment, the steady state position may be a state where the controller is held in a higher or lower level direction, approximately at the same height as the user's waist.

• Time since last steady state. This generally refers to data related to the time elapsed since a steady state (as referenced above) was last detected. This determination of time may, as noted earlier, be calculated in real time, in processor clock cycles, or in sampling periods. The time-since-last-steady-state data can be important for resetting the tracking of the controller with respect to an initial position, so as to ensure the accuracy of the mapping of a character or object in the game environment. This data can also be important for determining which actions or gestures may subsequently be executed (exclusively or otherwise) in the game environment.

• Last recognized gesture. This generally refers to the gesture most recently recognized by the gesture recognizer 505, which may be implemented in hardware or in software. The identification of the last recognized gesture can be important because a previous gesture may be related to gestures that may subsequently be recognized, or to other actions taken in the game environment.

• Time at which the last gesture was recognized.

  The above outputs may be sampled at any time by the game program or software.
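
  As a purely illustrative sketch of the double integration described above (accelerations integrated into velocity changes, velocities integrated into displacements), the following Python fragment shows the idea for a single axis; the time step and the sample values are invented for the example.

    DT = 0.01  # sampling period in seconds (assumed)

    def integrate_axis(accels, v0=0.0, p0=0.0, dt=DT):
        """Integrate per-axis acceleration samples into velocity and position estimates."""
        v, p = v0, p0
        velocities, positions = [], []
        for a in accels:
            v += a * dt          # change in velocity = acceleration * dt, added to known velocity
            p += v * dt          # change in position = velocity * dt, added to known position
            velocities.append(v)
            positions.append(p)
        return velocities, positions

    # Example: a brief push along the X axis (m/s^2), starting from rest at the origin.
    x_accel = [0.0, 2.0, 2.0, 2.0, 0.0, -2.0, -2.0, -2.0, 0.0]
    vx, px = integrate_axis(x_accel)
    print(vx[-1], px[-1])  # velocity returns to approximately zero; position shows the net displacement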

  In embodiments of the present invention, the mixer 408 may assign weights to the tracking information 403, 405, 407 from the analyzers 402, 404, 406. As noted above, certain sets of input control data may be averaged. In the present embodiment, however, the input control data is weighted before averaging, so that input control data from one analyzer carries more analytical importance than data from the others.

  For example, the mixer 408 may require tracking information related to acceleration and to the steady state. The mixer 408 then receives the tracking information 403, 405, 407 described above. The tracking information may include parameters related to acceleration and the steady state, e.g., as described above. Before averaging the data represented by this information, the mixer 408 may assign weights to the tracking information sets 403, 405, 407. For example, the X- and Y-axis acceleration parameters from the inertial analyzer 402 may be weighted at 90%, while the X- and Y-axis acceleration data from the image analyzer 404 may be weighted at only 10%. The acoustic analyzer tracking information 407 may, as far as acceleration parameters are concerned, be weighted at 0%, i.e., given no value at all.

  Similarly, the tracking information in the Z direction from the inertial analysis unit 402 may be weighted at 10%, while the tracking information in the Z direction from the image analysis unit may be weighted at 90%. In this case the tracking information 407 of the acoustic analysis unit may again be weighted at 0%. However, a weight of 100% may be given to the stable-state tracking information from the acoustic analysis unit 406, with a weight of 0% given to the tracking information of the other analysis units.

  After the appropriate weights have been assigned, the input control data may be averaged to arrive at a weighted-average input control data set that is subsequently analyzed by the gesture recognition unit 505 and associated with a particular action in the game environment. The weight values may be defined in advance by the mixing unit 408 or by a specific game title. As will be described later, the values may also be the result of the mixing unit 408 identifying the quality of the data acquired from the various analysis units and adjusting the values dynamically. The adjustment may further be the result of building a historical knowledge base as to when specific data has a specific value in a specific environment, and/or of the particularities of a given game title.
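
  As an illustration only, the per-parameter weighting and averaging described above might be sketched as follows. The 90/10/0 percentages mirror the example given in the text, while the function name, the dictionary layout, and the analyzer labels are assumptions introduced for clarity; tracking is assumed to contain every analyzer's parameters.

WEIGHTS = {
    "accel_xy":     {"inertial": 0.9, "image": 0.1, "acoustic": 0.0},
    "accel_z":      {"inertial": 0.1, "image": 0.9, "acoustic": 0.0},
    "steady_state": {"inertial": 0.0, "image": 0.0, "acoustic": 1.0},
}

def mix(tracking, weights=WEIGHTS):
    # tracking: {analyzer: {parameter: value}} -> weighted-average parameter set
    mixed = {}
    for param, per_analyzer in weights.items():
        total = sum(per_analyzer.values())
        if total == 0.0:
            mixed[param] = 0.0
            continue
        mixed[param] = sum(w * tracking[name][param]
                           for name, w in per_analyzer.items()) / total
    return mixed

  The weighted-average set produced in this way would then be passed on for gesture recognition, in keeping with the description above.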

  The mixing unit 408 may be configured to operate dynamically during game play. For example, as the mixing unit 408 receives the various input control data, it may identify that certain data consistently falls outside an acceptable range or level of quality, or reflects erroneous data indicative of a processing error in the related input device.

  Furthermore, certain conditions in the real-world environment may change. For example, if the natural light in the user's home game environment increases from morning to noon, problems may arise in the acquisition of image data. Likewise, if the neighborhood or household becomes noisier as the day goes on, problems may arise in the acquisition of audio data. Similarly, if the user's reflexes become dull after playing for several hours, problems may arise in the interpretation of the inertial data.

  In these cases, or in any other case in which the quality of a particular form of input control data is in question, the mixing unit 408 may dynamically reassign the weights given to a particular set of data obtained from a particular device, as described above, so that certain input control data is given greater or lesser importance. Similarly, the game environment may change over the course of a game such that the needs of the particular game change, requiring a reassignment of values or of the need for particular input control data.
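
  Purely as a sketch of the kind of dynamic reassignment described above, a mixer could lower the weight of a channel whose recent samples repeatedly fall outside an acceptable range. The threshold values and all names below are illustrative assumptions, not a description of the claimed system.

def adjust_weights(weights, recent_samples, valid_range, bad_fraction_limit=0.2):
    # weights: {channel: weight}; recent_samples: {channel: [recent values]}
    lo, hi = valid_range
    adjusted = dict(weights)
    for channel, samples in recent_samples.items():
        if channel not in adjusted or not samples:
            continue
        bad = sum(1 for v in samples if not (lo <= v <= hi)) / len(samples)
        if bad > bad_fraction_limit:
            adjusted[channel] *= 0.5      # give this channel lower importance
    total = sum(adjusted.values())
    if total == 0.0:
        return adjusted
    # renormalize so the weights again sum to one
    return {c: w / total for c, w in adjusted.items()}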

  Similarly, the mixing unit 408 may recognize, based on processing errors or feedback data generated by the gesture recognition unit 505, that certain data passed to the gesture recognition unit 505 is being processed inaccurately, slowly, or not at all. In response to this feedback, or where such processing difficulties arise (for example, where the gesture recognition unit 505 generates errors when attempting to establish an association even though the image analysis data is otherwise acceptable), the mixing unit 408 may adjust which input control data it obtains from which analysis unit and when. The mixing unit 408 may further request that the appropriate analysis unit perform particular analysis and processing of the input control data before that data is transmitted to the mixing unit 408. For example, the data may be reprocessed (e.g., averaged) so that further accuracy is ensured and the data transmitted to the gesture recognition unit 505 can be processed effectively and appropriately.

  In some embodiments, the mixing unit 408 may recognize that certain data is invalid, erroneous, or outside the range of a particular variable, and may request from the analysis units particular input control data or variables related to that data. As a result, the erroneous data can be replaced, or the particular data related to the necessary variables can be properly analyzed and calculated.

  According to an embodiment of the present invention, a video game system and method of the type described above may be implemented as shown in FIG. 6. The video game system 600 may include a processor 601 and a memory 602 (e.g., RAM, DRAM, ROM, etc.). Furthermore, the video game system 600 may include a plurality of processors 601 if parallel processing is to be implemented. The memory 602 includes data and game program code 604, including portions configured as described above. In particular, the memory 602 may include inertial signal data 606, which may include the stored controller path information described above. The memory 602 may further include stored gesture data 608, such as data indicating one or more gestures associated with the game program 604. Coded instructions executed on the processor 601 may implement a multi-input mixing unit 605 configured and functioning as described above.

  The system 600 may further include well-known support functions such as input/output (I/O) elements 611, a power supply (P/S) 612, a clock (CLK) 613, and a cache 614. The system 600 may include a mass storage device 615, such as a disk drive, CD-ROM drive, or tape drive, for storing programs and/or data. The system 600 may further include a display unit 616 and a user interface unit 618 to facilitate interaction between the system 600 and a user. The display unit 616 may be in the form of a cathode ray tube (CRT) or flat panel screen that displays text, numerals, graphical symbols, or images. The user interface 618 may include a keyboard, mouse, joystick, light pen, or other device. In addition, the user interface 618 may include a microphone, video camera, or other signal conversion device to directly acquire the signal to be analyzed. As shown in FIG. 6, the processor 601, the memory 602, and the other components of the system 600 may exchange signals (e.g., code instructions and data) with each other via a system bus 620.

  A microphone array 622 may be connected to the system 600 via the input/output function 611. The microphone array may include about 2 to 8 microphones, preferably about 4, with adjacent microphones separated by a distance of about 4 centimeters or less, preferably about 1 to 2 centimeters. The microphones in the array 622 are preferably omnidirectional microphones. An optional image acquisition unit 623 (for example, a digital camera) may be connected to the system 600 via the input/output function 611. One or more pointing actuators (P/A) 625 mechanically connected to the camera may exchange signals with the processor 601 via the input/output function 611.

  As used herein, the term input/output generally refers to any program, operation, or device that transfers data to or from the system 600 and to or from a peripheral device. Every data transfer may be regarded as an output from one device and an input into another. Peripheral devices include input-only devices such as a keyboard and a mouse, output-only devices such as a printer, and devices capable of both input and output, such as a writable CD-ROM. The term “peripheral device” includes not only internal devices such as a CD-ROM drive, a CD-R drive, an internal modem, or other peripherals such as a flash memory reader/writer or a hard disk, but also external devices such as a mouse, keyboard, printer, monitor, microphone, game controller, camera, external Zip drive, or scanner.

  In some embodiments of the present invention, the system 600 may be a video game unit that includes a controller 630 connected to the processor via the input/output function 611 either by wire (e.g., a USB cable) or wirelessly. The controller 630 may have an analog joystick 631 and conventional buttons 633 that provide control signals commonly used during video game play. Such video games may be realized by processor-readable data and/or instructions from the program 604, which may be stored in the memory 602 or in other processor-readable media such as those associated with the mass storage device 615.

  The joystick 631 is generally configured so that moving the stick left or right signals movement along the X axis, and moving it forward (up) or back (down) signals movement along the Y axis. In a joystick configured for three-dimensional movement, twisting the stick to the left (counterclockwise) or to the right (clockwise) may signal movement along the Z axis. These three axes, X, Y, and Z, are often referred to as roll, pitch, and yaw, respectively, particularly in relation to aircraft.

  In addition to conventional features, the controller 630 may include one or more inertial sensors 632 that provide position and/or orientation information to the processor 601 via inertial signals. The orientation information may include angle information such as the tilt, roll, or yaw of the controller 630. For example, the inertial sensor 632 may include any number and/or combination of accelerometers, gyroscopes, or tilt sensors. In a preferred embodiment, the inertial sensor 632 includes a tilt sensor for detecting the orientation of the joystick controller with respect to the tilt and roll axes, a first accelerometer for detecting acceleration along the yaw axis, and a second accelerometer for detecting angular acceleration about the yaw axis. The accelerometer may be implemented, for example, as a MEMS device that includes a mass mounted by way of one or more springs, with sensors for detecting displacement of the mass in one or more directions. A signal from the sensors, which depends on the displacement of the mass, may be used to determine the acceleration of the joystick controller 630. Such a technique may be realized by instructions from the game program 604 stored in the memory 602 and executed by the processor 601.

  For example, an accelerometer suitable for the inertial sensor 632 may be a simple mass elastically coupled to a frame at three or four points, for example by springs. The pitch and roll axes lie in a plane that intersects the frame, which is mounted on the joystick controller 630. As the frame (and the joystick controller 630) rotates about the pitch and roll axes, the mass is displaced under the influence of gravity and the springs expand or contract in a manner that depends on the pitch and/or roll. The displacement of the mass is detected and converted into a signal that depends on the amount of pitch and/or roll. Angular acceleration about the yaw axis or linear acceleration along the yaw axis may also produce characteristic patterns of spring expansion or contraction, or of mass movement, that can be detected and converted into signals that depend on the amount of acceleration. Such an accelerometer can thus measure tilt about the pitch and roll axes, angular acceleration about the yaw axis, and linear acceleration along the yaw axis by tracking the movement of the mass or the expansion and contraction forces of the springs. There are many ways to track the position of the mass and/or the forces exerted on it, including resistive strain gauge materials, optical sensors, magnetic sensors, Hall effect devices, piezoelectric devices, capacitive sensors, and the like.
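
  Although the text above does not prescribe any particular computation, one common way of estimating static tilt from a three-axis accelerometer is to use the direction of gravity in the sensor readings when the controller is at rest. The sketch below uses one standard convention and is offered only as an illustration; the function and variable names are assumptions.

import math

def tilt_from_gravity(ax, ay, az):
    # Returns (pitch, roll) in radians from a static acceleration reading,
    # assuming the reading is dominated by gravity (controller at rest).
    pitch = math.atan2(-ax, math.sqrt(ay * ay + az * az))
    roll = math.atan2(ay, az)
    return pitch, roll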

  In addition, the joystick controller 630 may include one or more light sources 634, such as light emitting diodes (LEDs). The light sources 634 may be used to distinguish one controller from another, for example by flashing or holding an LED pattern code with one or more of the LEDs. For example, five LEDs may be provided on the controller 630 in a linear or two-dimensional pattern. Although a linear arrangement of the LEDs is preferred, the LEDs may alternatively be arranged in a rectangular or arcuate pattern to facilitate determination of the image plane of the LED array when analyzing an image of the LED pattern acquired by the image acquisition unit 623. Furthermore, the LED pattern code may be used to determine the position of the joystick controller 630 during game play. For example, the LEDs can help identify the tilt, yaw, and roll of the controller. This detection pattern is useful for improving the user's experience in games such as aircraft flight games. The image acquisition unit 623 may acquire images that include the joystick controller 630 and the light sources 634. By analyzing such images, the position and/or orientation of the joystick controller can be determined. Such analysis may be implemented by program code instructions 604 stored in the memory 602 and executed by the processor 601. To facilitate acquisition of images of the light sources 634 by the image acquisition unit 623, the light sources 634 may be placed on two or more different sides of the joystick controller 630, for example on the front and on the back (shown in dashed lines). With such an arrangement, the image acquisition unit 623 can acquire images of the light sources 634 even when the joystick controller 630 is held in different orientations, depending on how it is gripped by the user.

  Furthermore, the light sources 634 may provide telemetry signals to the processor 601, for example by pulse code, amplitude modulation, or frequency modulation. Such telemetry signals may indicate which joystick buttons are being pressed and/or how hard those buttons are being pressed. Telemetry signals may be encoded into the optical signal by pulse coding, pulse width modulation, frequency modulation, light intensity (amplitude) modulation, and the like. The processor 601 may decode the telemetry signal from the optical signal and execute a game command in response to the decoded telemetry signal. The telemetry signal may be decoded from analysis of images of the joystick controller 630 acquired by the image acquisition unit 623. Alternatively, the system 600 may include a separate optical sensor provided to receive telemetry signals from the light sources 634. The use of LEDs in conjunction with a computer program to determine an intensity amount is described, for example, in US patent application 11/429,414 (inventor: Richard L. Marks et al., title of invention: “Computer image and audio processing of intensity and input devices for interfacing with a computer program”, attorney docket number: SONYP052), which is incorporated herein by reference. Furthermore, analysis of images including the light sources 634 may be used both for telemetry and for determining the position and/or orientation of the joystick controller 630. Such a technique may be realized by instructions of the program 604 stored in the memory 602 and executed by the processor 601.

  The processor 601 may use inertial signals from the inertial sensor 632 in combination with optical signals from the light sources 634 detected by the image acquisition unit 623 and/or sound source position and characterization information from acoustic signals detected by the microphone array 622 to deduce information about the position and/or orientation of the controller 630 and/or its user. For example, “acoustic radar” sound source position and characterization may be used in conjunction with the microphone array 622 to track a moving voice while the movement of the joystick controller is tracked independently (by the inertial sensor 632 and/or the light sources 634). In acoustic radar, a pre-calibrated listening area is selected at run time, and sound emitted from sources outside the pre-calibrated listening area is filtered out. The pre-calibrated listening area may include a listening area corresponding to the focal volume or field of view of the image acquisition unit 623. An example of acoustic radar is described in US patent application 11/381,724 (inventor: Shadon Mao, title of invention: “Methods and apparatus for targeted sound detection and characterization”, filing date: May 4, 2006), which is incorporated herein by reference. Any number of different combinations of different modes of providing control signals to the processor 601 may be used in conjunction with embodiments of the present invention. Such techniques may be implemented by program code instructions 604 stored in the memory 602 and executed by the processor 601, and may include one or more instructions that direct one or more processors to select a pre-calibrated listening area at run time and to filter out sounds emitted from sources outside the pre-calibrated listening area. The pre-calibrated listening area may include a listening area corresponding to the focal volume or field of view of the image acquisition unit 623.

The program 604 may include one or more instructions that direct one or more processors to generate discrete time domain input signals xm(t) from the microphones M0 to MM of the microphone array 622, determine a listening area, and use the listening area in a semi-blind source separation to select finite impulse response filter coefficients that separate different sound sources from the input signals xm(t). The program 604 may include instructions for applying one or more fractional delays to selected input signals xm(t) other than the input signal x0(t) from a reference microphone M0. Each fractional delay may be selected so as to optimize the signal-to-noise ratio of the discrete time domain output signal y(t) of the microphone array. The fractional delays may be selected so that the signal from the reference microphone M0 is first in time relative to the signals from the other microphones of the array. The program 604 may include instructions for introducing a fractional time delay Δ into the output signal y(t) of the microphone array as follows:
y(t + Δ) = x(t + Δ)*b0 + x(t - 1 + Δ)*b1 + x(t - 2 + Δ)*b2 + ... + x(t - N + Δ)*bN
where Δ is between 0 and ±1. Examples of such techniques are described in US patent application 11/381,729 (inventor: Shadon Mao, title of invention: “Ultra small microphone array”, filing date: May 4, 2006), which is incorporated herein by reference.
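
  The filter expression above can be approximated in code as sketched below. Linear interpolation is used here for the fractional offset, delays between -1 and 0 are omitted for brevity, and the function names are assumptions; this is a sketch of the formula only, not of the referenced application.

def delayed_sample(x, t, delta):
    # Approximate x(t + delta) for 0 <= delta < 1 by linear interpolation,
    # clamping the index at the edges of the sequence.
    t = min(max(t, 0), len(x) - 1)
    if t + 1 < len(x):
        return (1.0 - delta) * x[t] + delta * x[t + 1]
    return x[t]

def fir_output(x, b, t, delta):
    # y(t + delta) = b0*x(t + delta) + b1*x(t - 1 + delta) + ... + bN*x(t - N + delta)
    return sum(bk * delayed_sample(x, t - k, delta) for k, bk in enumerate(b))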

  The program 604 may include one or more instructions that, when executed, cause the system 600 to select a pre-calibrated listening area that contains a sound source. Such instructions may cause the system to determine whether the sound source lies within an initial area or on a particular side of the initial area. If the sound source is not within the initial area, the instructions may, when executed, select a different area on the particular side of the initial area. The different area may be characterized by an attenuation of the input signals that is closest to an optimum value. These instructions may, when executed, calculate the attenuation of the input signals from the microphone array 622 relative to an optimum value. The instructions may cause the system 600 to determine attenuation values of the input signals for one or more areas and to select the area whose attenuation is closest to the optimum value. Examples of such techniques are described in US patent application 11/381,725 (inventor: Shadon Mao, title of invention: “Methods and apparatus for the targeted sound detection”, filing date: May 4, 2006), which is incorporated herein by reference.
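
  A minimal sketch of the selection rule described above simply keeps the listening area whose measured attenuation lies closest to the optimum value. The attenuation measure, the area representation, and all names are illustrative assumptions.

def select_listening_area(area_attenuations, optimum):
    # area_attenuations: {area_id: measured attenuation of the input signals}
    # Returns the area whose attenuation is closest to the optimum value.
    return min(area_attenuations,
               key=lambda area: abs(area_attenuations[area] - optimum))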

  The signal from the inertial sensor 632 may provide part of the tracking information input, and the signal generated by the image acquisition unit 623 from tracking the one or more light sources 634 may provide another part of the tracking information input. Such “mixed system” signals may be used, for example, in a football-type video game in which the quarterback fakes to the left with a head movement and then throws the ball to the right. Specifically, a game player holding the controller 630 may turn his head to the left and, while making a sound, swing the controller to the right as if throwing the football. The microphone array 622, working with the “acoustic radar” program code, can track the user's voice. The image acquisition unit 623 can track the movement of the user's head, or track other commands that require neither sound nor use of the controller. The inertial sensor 632 may track the movement of the joystick controller (representing the football). The image acquisition unit 623 may also track the light sources 634 on the controller 630. The user may release the “ball” when the acceleration of the joystick controller 630 reaches a certain amount and/or direction, or when a key command is generated by pressing a button on the controller 630.

  In some embodiments of the present invention, inertial signals, for example from an accelerometer or gyroscope, may be used to determine the position of the controller 630. Specifically, the acceleration signal from the accelerometer may be integrated over time to determine the change in velocity, and the velocity may be integrated over time to determine the change in position. If the values of the initial position and velocity at some point in time are known, the absolute position can be determined using these values and the changes in velocity and position. Position determination using the inertial sensor can be performed more quickly than using the image acquisition unit 623 and the light sources 634. However, the inertial sensor 632 accumulates errors over time, and is therefore subject to a kind of error known as “drift”, in which a discrepancy D arises between the position of the joystick 631 calculated from the inertial signal (shown in dashed lines) and the actual position of the joystick controller 630. Embodiments of the present invention allow a number of ways of dealing with such errors.

  For example, the drift may be cancelled manually by resetting the initial position of the controller 630 to be equal to the currently calculated position. The user may use one or more of the buttons on the controller 630 to trigger a command that resets the initial position. Alternatively, image-based drift compensation may be performed by resetting the current position with reference to a position determined from an image acquired by the image acquisition unit 623. Such image-based drift compensation may be performed manually, for example when the user activates one or more of the buttons of the joystick controller 630. Alternatively, image-based drift compensation may be performed automatically, for example periodically or in response to game play. Such techniques may be realized by program code instructions 604 stored in the memory 602 and executed by the processor 601.
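
  As an illustration only, drift compensation of the kind described above can be sketched as keeping an additive correction that is re-anchored whenever a reference position becomes available, for example one derived from the image acquisition unit or set when the user triggers a reset. The class and method names below are assumptions.

class DriftCompensator:
    def __init__(self):
        self.offset = (0.0, 0.0, 0.0)   # correction added to the inertial estimate

    def corrected(self, inertial_pos):
        # Return the drift-corrected position estimate.
        return tuple(p + o for p, o in zip(inertial_pos, self.offset))

    def reset(self, inertial_pos, reference_pos):
        # Re-anchor the inertial estimate to a reference position, e.g. one
        # determined from an image or chosen when a reset button is pressed.
        self.offset = tuple(r - p for r, p in zip(reference_pos, inertial_pos))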

  In certain embodiments it may be desirable to correct erroneous data in the inertial sensor signal. For example, the signal from the inertial sensor 632 may be oversampled, and a moving average (sliding average) may be calculated from the oversampled signal to remove erroneous data from the inertial sensor signal. In some situations it may be desirable to oversample the signal, remove the high and/or low values from a subset of data points, and calculate the moving average from the remaining data points. Furthermore, other data sampling and manipulation techniques may be used to adjust the signal from the inertial sensor so as to remove or reduce the significance of erroneous data. The choice of technique may depend on the nature of the signal, the computations to be performed with the signal, the nature of game play, or a combination of two or more of these. These techniques may be realized by instructions of the program 604 stored in the memory 602 and executed by the processor 601.
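
  The oversampling and sliding-average clean-up described above might look roughly like the following; the window size and the rule of dropping one high and one low value per window are illustrative assumptions.

def trimmed_sliding_average(samples, window=8):
    # Average each window of oversampled readings after discarding its
    # highest and lowest values, to suppress erroneous data points.
    out = []
    for i in range(len(samples) - window + 1):
        chunk = sorted(samples[i:i + window])
        trimmed = chunk[1:-1] if len(chunk) > 2 else chunk
        out.append(sum(trimmed) / len(trimmed))
    return out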

  As described above, the processor 601 may perform analysis of the inertial signal data 606 in conjunction with the data 606 and the program code instructions of the program 604, which are stored in and retrieved from the memory 602 and executed by the processor module 601. Parts of the code of the program 604 may conform to any of a number of different programming languages, such as assembler, C++, JAVA, or many other languages. The processor module 601 constitutes a general-purpose computer, which becomes a special-purpose computer when executing a program such as the program code 604. Although the case in which the program code 604 is realized as software executed on a general-purpose computer has been described here, those skilled in the art will understand that the task management method could also be realized using hardware such as an ASIC or other hardware circuits. Similarly, it is understood that some or all of the embodiments of the present invention can be realized in software, hardware, or a combination of the two.

  In certain embodiments, the program code 604 may include a set of processor-readable instructions that implement a method having features in common with the method 510 of FIG. 5B and the method 520 of FIG. 5C, or a combination of two or more of these. The program code 604 may generally include one or more instructions that direct one or more processors to analyze the signal from the inertial sensor 632 to generate position and/or orientation information and to use that information during play of the video game.

  The program code 604 may further include processor-readable instructions including one or more instructions that, when executed, cause the image acquisition unit 623 to monitor the field of view in front of the image acquisition unit 623, identify one or more of the light sources 634 within the field of view, detect a change in the light emitted from the light sources 634, and, upon detecting the change, trigger an input command to the processor 601. The use of LEDs in conjunction with an image acquisition device to trigger actions in a game controller is described in US patent application 10/759,782 (inventor: Richard L. Marks, filing date: January 16, 2004, title: “Methods and Apparatus for Optical Input Devices”), which is incorporated herein by reference.

  The program code 604 may further include processor-readable instructions including one or more instructions that, when executed, use signals from the inertial sensor and signals generated by the image acquisition unit from tracking the one or more light sources as inputs to the game system, as described above. The program code 604 may further include processor-readable instructions including one or more instructions that, when executed, compensate for drift in the inertial sensor 632.

  In the embodiments of the present invention, examples relating to the video game controller 630 have been described. However, embodiments of the present invention, including the system 600, can be used with any body, molded object, knob, structure, or the like that is manipulated by the user and that has inertial sensing capability and the capability of transmitting the inertial sensor signal wirelessly or by some other means.

  For example, embodiments of the present invention may be implemented on a parallel processing system. Such parallel processing systems typically include two or more processor elements configured to execute parts of a program in parallel on separate processors. As a non-limiting example, FIG. 7 shows one type of cell processor 700 according to an embodiment of the present invention. The cell processor 700 may be used as the processor 601 of FIG. 6 or as the processor 502 of FIG. 5A. In the example illustrated in FIG. 7, the cell processor 700 includes a main memory 702, a power processor element (PPE) 704, and a plurality of synergistic processor elements (SPEs) 706. In the example shown in FIG. 7, the cell processor 700 includes one PPE 704 and eight SPEs 706. In such a configuration, seven of the SPEs 706 may be used for parallel processing, and one may be reserved as a backup in case one of the other seven fails. Alternatively, the cell processor may include multiple groups of PPEs (PPE groups) and multiple groups of SPEs (SPE groups). In that case, hardware resources may be shared among the units within a group. From a software point of view, however, the SPEs and the PPE must be regarded as independent elements. Embodiments of the present invention are not limited to use with the configuration shown in FIG. 7.

  The main memory 702 typically includes both general-purpose and non-volatile storage, as well as special-purpose hardware registers or arrays used for functions such as system configuration, data transfer synchronization, memory-mapped I/O, and I/O subsystems. In an embodiment of the present invention, a video game program 703 may be resident in the main memory 702. The memory 702 may also include signal data 709. The video program 703 may include inertial, image, and acoustic analysis units and a mixing unit, or some combination of these, configured as described above in connection with FIG. 4, 5A, 5B, or 5C. The program 703 may run on the PPE. The program 703 may be divided into a plurality of signal processing tasks that can be executed on the SPEs and/or the PPE.

  For example, the PPE 704 may be a 64-bit PowerPC Processor Unit (PPU) with associated L1 and L2 caches. The PPE 704 is a general-purpose processing unit that can access system management resources such as a memory protection table. Hardware resources may be explicitly mapped into the real address space so that the PPE can reference them; the PPE can therefore address any of these resources directly by using an appropriate effective address value. A main function of the PPE 704 is the management and allocation of tasks for the SPEs 706 of the cell processor 700.

  Although only one PPE is shown in FIG. 7, in some cell processor implementations, such as the Cell Broadband Engine Architecture (CBEA), the cell processor 700 may have multiple PPEs organized into one or more PPE groups. These PPE groups may share access to the main memory 702. Furthermore, the cell processor 700 may include two or more groups of SPEs. The SPE groups may also share access to the main memory 702. Such configurations are within the scope of the present invention.

  Each SPE 706 includes a synergistic processor unit (SPU) and its own local storage area LS. The local storage area LS may include one or more separate memory storage areas, each associated with a particular SPU. Each SPU may be configured to execute only instructions (including data load and data store operations) from within its own associated local storage area. In such a configuration, data transfers between the local storage area LS and other parts of the system 700 may be performed by issuing direct memory access (DMA) commands from a memory flow controller (MFC) to transfer data to or from the local storage area (of an individual SPE). The SPUs are less complex computational units than the PPE 704 in that they do not perform system management functions. The SPUs generally have single instruction, multiple data (SIMD) capability and typically process data and initiate any required data transfers (subject to access properties set up by the PPE) in order to perform their assigned tasks. The purpose of the SPUs is to enable applications that require a higher computational unit density and that can make effective use of the provided instruction set. Managing a large number of SPEs in the system with the PPE 704 allows cost-effective processing over a wide range of applications.

  Each SPE 706 may include a dedicated memory flow controller (MFC) that includes a memory management unit capable of holding and processing memory protection and access permission information. The MFC provides the primary method for data transfer, protection, and synchronization between the main storage of the cell processor and the local storage of an SPE. An MFC command describes the transfer to be performed. Commands for transferring data are also referred to as MFC direct memory access (DMA) commands (MFC DMA commands).

  Each MFC supports multiple DMA transfers simultaneously, and can hold and process multiple MFC commands. Each MFC / DMA data transfer command request may include both a local storage address (LSA) and an effective address (EA). The local storage address may directly address only the local storage area of the associated SPE. The effective address may have a more general application, for example, may refer to the main storage including all SPE local storage areas as long as it is aliased to the real address space.

  To facilitate communication between SPEs 706 and / or between SPEs 706 and PPEs 704, SPEs 706 and PPEs 704 may include signal notification registers related to signaling events. PPE 704 and SPE 706 may be connected by a star topology where PPE 704 functions as a router for sending messages to SPE 706. Alternatively, each SPE 706 and PPE 704 may have a one-way signal notification register referred to as a mailbox. Mailboxes may be used for operating system (OS) synchronization.

  The cell processor 700 may include an input/output (I/O) function 708 through which the cell processor 700 can interface with peripheral devices such as a microphone array 712, an image acquisition unit 713, and a game controller 730. The game controller unit may include an inertial sensor 732 and light sources 734. In addition, an element interconnect bus 710 may connect the various components described above. Each SPE and the PPE can access the bus 710 via a bus interface unit BIU. The cell processor 700 may also include two controllers typically found in a processor: a memory interface controller MIC that controls the flow of data between the bus 710 and the main memory 702, and a bus interface controller BIC that controls the flow of data between the I/O 708 and the bus 710. Although the requirements for the MIC, BIC, BIU, and bus 710 may vary widely in different implementations, their functions and the circuits for implementing them are well known to those skilled in the art.

  The cell processor 700 may further include an internal interrupt controller IIC. The IIC component manages the priority of interrupts transmitted to the PPE. IIC can handle interrupts from other components of the cell processor 700 without using the main system interrupt controller. The IIC may be considered a second level controller. The main system interrupt controller may handle interrupts from outside the cell processor.

  In embodiments of the present invention, certain computations, such as the fractional delay calculations described above, may be performed in parallel using the PPE 704 and/or one or more of the SPEs 706. Each fractional delay calculation may be divided into one or more separate tasks that may be executed on different SPEs 706.

  While the above is a complete description of the preferred embodiment of the invention, various alternatives, modifications and equivalents may be used. The scope of the invention should, therefore, be determined not with reference to the above description, but instead should be determined with reference to the appended claims along with their full scope of equivalents. All features described herein may be combined with all other features described herein, whether preferred or not. In the claims, what follows an indefinite article refers to the quantity of one or more items, unless expressly specified otherwise. The appended claims should not be construed to include means plus function limitations, unless explicitly limited by use of the phrase “means for”.

Claims (25)

  1. A system for tracking user actions within an environment, the system comprising:
    a user-movable object having at least one light source;
    a first analysis unit that obtains first input channel information indicating the position of the object from an image of the light source in an image obtainable from an image acquisition device, analyzes the first input channel information, and, during execution of a program, generates first output information relating to an axis for registering a user operation of the object by the program;
    a second analysis unit that obtains, from a second input channel different from the first input channel, second input channel information that is a representation of the user operation of the object, analyzes the second input channel information, and, during execution of the program, generates second output information relating to the axis for registering the user operation of the object by the program;
    a third analysis unit that obtains, from a third input channel, third input channel information that is a representation of the user operation within the environment, analyzes the third input channel information, and, during execution of the program, generates third output information relating to the axis for registering the user operation; and
    a programmable mixing unit that selects information relating to a first aspect of the user operation from the first output information relating to the axis, selects information relating to a second aspect of the user operation from the second output information relating to the axis, and selects information relating to a third aspect of the user operation from the third output information relating to the axis, and that generates mixed output information relating to the axis to be supplied to the program, based on the selected first output information relating to the axis, the selected second output information relating to the axis, and the selected third output information relating to the axis.
  2.   The system of claim 1, wherein the mixing unit is configured to generate the mixed output information relating to the axis based further on a distribution value for the first output information relating to the axis and the second output information relating to the axis.
  3.   The system according to claim 2, wherein the distribution value is defined by the mixing unit.
  4.   The system according to claim 2, wherein the distribution value is defined by the program.
  5.   The system according to claim 2, wherein the distribution value is a result of the mixing unit identifying the quality of specific data from the first and second analysis units and dynamically adjusting the distribution value.
  6.   The system according to claim 2, wherein the distribution value is a result of building a historical knowledge base when specific data takes a specific value in a specific environment.
  7.   The system according to claim 2, wherein the distribution value is defined according to the particularity of the program.
  8.   The system according to claim 1, wherein the second input channel information includes information that can be obtained using a sensor of a different type from the image acquisition device.
  9.   The system of claim 1, wherein the second input channel information is available from an inertial sensor.
  10.   The system according to claim 1, wherein the second input channel information is available from a sound acquisition device.
  11.   The system of claim 10, wherein the sound acquisition device includes one or more microphones.
  12. The system according to claim 1, wherein the third input channel information includes information that can be obtained using a first type of sensor, and the second input channel information includes information that can be obtained using a second type of sensor different from the first type of sensor.
  13. The system according to claim 12, wherein the third input channel information includes information that can be obtained by tracking the movement of at least a part of the user's body.
  14.   The system of claim 1, wherein the second analysis unit includes an inertial analysis unit, and the second input channel information includes inertial information that can be obtained by an inertial sensor.
  15. The system according to claim 14, wherein the inertial sensor includes at least one of an accelerometer and a gyroscope.
  16.   The system according to claim 1, wherein the second input channel information includes information that can be obtained by converting audio in the environment.
  17.   The system according to claim 1, wherein the second input channel information includes information indicating a position of audio in the environment.
  18.   The system of claim 1, wherein the second input channel information includes information indicating a position of an acoustic signal generator within the environment.
  19.   The system of claim 1, wherein the second input channel information includes information indicating a position of an acoustic signal generator that can be moved by a user in the environment.
  20.   The system of claim 1, wherein at least one of the first input channel information and the second input channel information is a representation of an acceleration of an object movable by the user in the environment.
  21.   The system of claim 1, wherein at least one of the first input channel information and the second input channel information is a representation of a velocity of an object movable by the user in the environment.
  22.   The system according to claim 1, wherein at least one of the first input channel information and the second input channel information is a representation of a direction of an object movable by the user in the environment.
  23. The system according to claim 22, wherein the direction of the object movable by the user includes information indicating at least one of pitch, yaw, and roll.
  24.   The system according to claim 1, wherein at least one of the first input channel information and the second input channel information is a representation of the position of the object in a coordinate space defined by at least two orthogonal axes.
  25.   The system according to claim 1, wherein at least one of the first input channel information and the second input channel information is a representation of the position of the object in a coordinate space defined by at least three orthogonal axes.
JP2012080340A 2002-07-27 2012-03-30 A system for tracking user actions in an environment Active JP5668011B2 (en)

Priority Applications (54)

Application Number Priority Date Filing Date Title
US11/381,725 2006-05-04
US11/381,725 US7783061B2 (en) 2003-08-27 2006-05-04 Methods and apparatus for the targeted sound detection
US11/381,728 2006-05-04
US11/418,988 2006-05-04
US11/381,724 2006-05-04
US11/418,989 US8139793B2 (en) 2003-08-27 2006-05-04 Methods and apparatus for capturing audio signals based on a visual image
US11/429,133 2006-05-04
US11/381,727 2006-05-04
US11/381,728 US7545926B2 (en) 2006-05-04 2006-05-04 Echo and noise cancellation
US11/418,989 2006-05-04
US11/418,988 US8160269B2 (en) 2003-08-27 2006-05-04 Methods and apparatuses for adjusting a listening area for capturing sounds
US11/429,133 US7760248B2 (en) 2002-07-27 2006-05-04 Selective sound source listening in conjunction with computer interactive processing
US11/381,721 2006-05-04
US11/381,727 US7697700B2 (en) 2006-05-04 2006-05-04 Noise removal for electronic device with far field microphone on console
US11/429,414 US7627139B2 (en) 2002-07-27 2006-05-04 Computer image and audio processing of intensity and input devices for interfacing with a computer program
US11/429,047 2006-05-04
US11/429,047 US8233642B2 (en) 2003-08-27 2006-05-04 Methods and apparatuses for capturing an audio signal based on a location of the signal
US11/381,724 US8073157B2 (en) 2003-08-27 2006-05-04 Methods and apparatus for targeted sound detection and characterization
US11/381,729 2006-05-04
US11/381,721 US8947347B2 (en) 2003-08-27 2006-05-04 Controlling actions in a video game unit
US11/381,729 US7809145B2 (en) 2006-05-04 2006-05-04 Ultra small microphone array
US11/429,414 2006-05-04
US79803106P true 2006-05-06 2006-05-06
US11/382,038 2006-05-06
US60/798,031 2006-05-06
US29/259,349 2006-05-06
US11/382,032 2006-05-06
US29/259,350 2006-05-06
US29/259,350 USD621836S1 (en) 2006-05-06 2006-05-06 Controller face with tracking sensors
US11/382,034 2006-05-06
US29259349 2006-05-06
US11/382,033 2006-05-06
US11/382,035 US8797260B2 (en) 2002-07-27 2006-05-06 Inertially trackable hand-held controller
US29259348 2006-05-06
US11/382,036 2006-05-06
US11/382,031 US7918733B2 (en) 2002-07-27 2006-05-06 Multi-input game control mixer
US11/382,037 2006-05-06
US11/382,032 US7850526B2 (en) 2002-07-27 2006-05-06 System for tracking user manipulations within an environment
US11/382,036 US9474968B2 (en) 2002-07-27 2006-05-06 Method and system for applying gearing effects to visual tracking
US11/382,037 US8313380B2 (en) 2002-07-27 2006-05-06 Scheme for translating movements of a hand-held controller into inputs for a system
US11/382,038 US7352358B2 (en) 2002-07-27 2006-05-06 Method and system for applying gearing effects to acoustical tracking
US11/382,034 US20060256081A1 (en) 2002-07-27 2006-05-06 Scheme for detecting and tracking user manipulation of a game controller body
US11/382,033 US8686939B2 (en) 2002-07-27 2006-05-06 System, method, and apparatus for three-dimensional input control
US11/382,035 2006-05-06
US11/382,031 2006-05-06
US29/259,348 2006-05-06
US11/382,039 US9393487B2 (en) 2002-07-27 2006-05-07 Method for mapping movements of a hand-held controller to game commands
US11/382,041 US7352359B2 (en) 2002-07-27 2006-05-07 Method and system for applying gearing effects to inertial tracking
US11/382,039 2006-05-07
US11/382,040 US7391409B2 (en) 2002-07-27 2006-05-07 Method and system for applying gearing effects to multi-channel mixed input
US11/382,043 2006-05-07
US11/382,040 2006-05-07
US11/382,041 2006-05-07
US11/382,043 US20060264260A1 (en) 2002-07-27 2006-05-07 Detectable and trackable hand-held controller

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
JP2009509931 Division 2007-04-19

Publications (2)

Publication Number Publication Date
JP2012164330A JP2012164330A (en) 2012-08-30
JP5668011B2 true JP5668011B2 (en) 2015-02-12

Family

ID=38668432

Family Applications (4)

Application Number Title Priority Date Filing Date
JP2009509931A Active JP5219997B2 (en) 2002-07-27 2007-04-19 Multi-input game control mixer
JP2012080329A Active JP5145470B2 (en) 2002-07-27 2012-03-30 System and method for analyzing game control input data
JP2012080340A Active JP5668011B2 (en) 2002-07-27 2012-03-30 A system for tracking user actions in an environment
JP2012257118A Active JP5638592B2 (en) 2002-07-27 2012-11-26 System and method for analyzing game control input data

Family Applications Before (2)

Application Number Title Priority Date Filing Date
JP2009509931A Active JP5219997B2 (en) 2002-07-27 2007-04-19 Multi-input game control mixer
JP2012080329A Active JP5145470B2 (en) 2002-07-27 2012-03-30 System and method for analyzing game control input data

Family Applications After (1)

Application Number Title Priority Date Filing Date
JP2012257118A Active JP5638592B2 (en) 2002-07-27 2012-11-26 System and method for analyzing game control input data

Country Status (2)

Country Link
JP (4) JP5219997B2 (en)
WO (1) WO2007130791A2 (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6496543B1 (en) 1996-10-29 2002-12-17 Qualcomm Incorporated Method and apparatus for providing high speed data communications in a cellular environment
US8225343B2 (en) 2008-01-11 2012-07-17 Sony Computer Entertainment America Llc Gesture cataloging and recognition
GB2458297B (en) * 2008-03-13 2012-12-12 Performance Designed Products Ltd Pointing device

Family Cites Families (24)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH0682242A (en) * 1992-08-31 1994-03-22 Victor Co Of Japan Ltd Three-dimensional position/attitude detection method
JPH07284166A (en) * 1993-03-12 1995-10-27 Mitsubishi Electric Corp Remote controller
US6720949B1 (en) * 1997-08-22 2004-04-13 Timothy R. Pryor Man machine interfaces and applications
JPH11316646A (en) * 1998-05-01 1999-11-16 Nippon Telegr & Teleph Corp <Ntt> Virtual presence feeling method and system device
JPH11333139A (en) * 1998-05-26 1999-12-07 Fuji Electronics Co Ltd Moving image controlling device
JP2000259340A (en) * 1999-03-12 2000-09-22 Sony Corp Device and method for input, input system, and distribution medium
JP2001005600A (en) * 1999-06-24 2001-01-12 Nec Corp Handwritten input method and handwritten input device and recording medium recording program for allowing computer to realize handwritten input processing
US6426741B1 (en) * 1999-09-30 2002-07-30 Intel Corporation User input for a computer
JP3819416B2 (en) * 1999-10-04 2006-09-06 任天堂株式会社 Game system and game information storage medium used therefor
US6699123B2 (en) * 1999-10-14 2004-03-02 Sony Computer Entertainment Inc. Entertainment system, entertainment apparatus, recording medium, and program
JP2001246161A (en) * 1999-12-31 2001-09-11 Square Co Ltd Device and method for game using gesture recognizing technic and recording medium storing program to realize the method
JP4027031B2 (en) * 2000-11-16 2007-12-26 株式会社コナミデジタルエンタテインメント Play against type 3d video game device
US20020085097A1 (en) * 2000-12-22 2002-07-04 Colmenarez Antonio J. Computer vision-based wireless pointing system
JP2002306846A (en) * 2001-04-12 2002-10-22 Saibuaasu:Kk Controller for game machine
JP2002320772A (en) * 2001-04-25 2002-11-05 Pacific Century Cyberworks Japan Co Ltd Game device, its control method, recording medium, program and cellular phone
JP3841658B2 (en) * 2001-09-14 2006-11-01 株式会社タイトー Fighting game machine
JP4028708B2 (en) * 2001-10-19 2007-12-26 株式会社コナミデジタルエンタテインメント Game apparatus and a game system
JP3470119B2 (en) * 2002-02-14 2003-11-25 コナミ株式会社 Controller, controller of attitude telemetry device and the video game device
JP3602519B2 (en) * 2002-07-12 2004-12-15 コナミ株式会社 Video game apparatus, image processing method, and program
JP2004302993A (en) * 2003-03-31 2004-10-28 Sony Corp Information input system and input device
US20040212589A1 (en) * 2003-04-24 2004-10-28 Hall Deirdre M. System and method for fusing and displaying multiple degree of freedom positional input data from multiple input sources
US7233316B2 (en) * 2003-05-01 2007-06-19 Thomson Licensing Multimedia user interface
JP2005021563A (en) * 2003-07-01 2005-01-27 Namco Ltd Game device, program and information storage medium
JP2005021458A (en) * 2003-07-03 2005-01-27 Tamura Seisakusho Co Ltd Indicated position specifying device and method therefor, and image display system

Also Published As

Publication number Publication date
JP2012166036A (en) 2012-09-06
JP5145470B2 (en) 2013-02-20
JP5638592B2 (en) 2014-12-10
WO2007130791A3 (en) 2008-11-13
WO2007130791A2 (en) 2007-11-15
JP2012164330A (en) 2012-08-30
JP2013084281A (en) 2013-05-09
JP2009535172A (en) 2009-10-01
JP5219997B2 (en) 2013-06-26

Similar Documents

Publication Publication Date Title
US9405372B2 (en) Self-contained inertial navigation system for interactive control using movable controllers
US8009022B2 (en) Systems and methods for immersive interaction with virtual objects
US9433866B2 (en) Method and apparatus for adjustment of game parameters based on measurement of user performance
US8761434B2 (en) Tracking system calibration by reconciling inertial data with computed acceleration of a tracked object in the three-dimensional coordinate system
US8409004B2 (en) System and method for using accelerometer outputs to control an object rotating on a display
US9746921B2 (en) Signal generation and detector systems and methods for determining positions of fingers of a user
JP4471910B2 (en) Virtual position determining program
US10137374B2 (en) Method for an augmented reality character to maintain and exhibit awareness of an observer
EP1900406B1 (en) Game device and storage medium storing game program
US8947347B2 (en) Controlling actions in a video game unit
US8419539B2 Game apparatus and recording medium recording game program for displaying a motion matching a player's intention when moving an input device
US9474968B2 (en) Method and system for applying gearing effects to visual tracking
US8672760B2 (en) Game apparatus and storage medium storing game program
CN101484933B (en) Method and apparatus based on the one or more visual, the inertia of the mixed data and the auditory effect to apply to the input of the transmission
EP2352149B1 (en) Selective sound source listening in conjunction with computer interactive processing
US20080174550A1 (en) Motion-Input Device For a Computing Terminal and Method of its Operation
JP5325771B2 (en) System, method, apparatus, and program for detecting timing of swing impact and / or strength of swing based on accelerometer data
US7826641B2 (en) Apparatus and method for determining an absolute pose of a manipulated object in a real three-dimensional environment with invariant features
US8192285B2 (en) Method and apparatus for simulating games involving a ball
US7783061B2 (en) Methods and apparatus for the targeted sound detection
US20110175809A1 (en) Tracking Groups Of Users In Motion Capture System
KR101231989B1 (en) Game controller and game system
CN1923325B (en) Game system
CN101367015B (en) Self contained inertial navigation system for interactive control using movable controller
CN1931410B (en) Information processing device and method executed by the information processing device

Legal Events

Date Code Title Description
A977 Report on retrieval

Free format text: JAPANESE INTERMEDIATE CODE: A971007

Effective date: 20130828

A131 Notification of reasons for refusal

Free format text: JAPANESE INTERMEDIATE CODE: A131

Effective date: 20130910

A601 Written request for extension of time

Free format text: JAPANESE INTERMEDIATE CODE: A601

Effective date: 20131210

A602 Written permission of extension of time

Free format text: JAPANESE INTERMEDIATE CODE: A602

Effective date: 20131213

A521 Written amendment

Free format text: JAPANESE INTERMEDIATE CODE: A523

Effective date: 20140110

A02 Decision of refusal

Free format text: JAPANESE INTERMEDIATE CODE: A02

Effective date: 20140610

A521 Written amendment

Free format text: JAPANESE INTERMEDIATE CODE: A523

Effective date: 20141003

A911 Transfer of reconsideration by examiner before appeal (zenchi)

Free format text: JAPANESE INTERMEDIATE CODE: A911

Effective date: 20141014

TRDD Decision of grant or rejection written
A01 Written decision to grant a patent or to grant a registration (utility model)

Free format text: JAPANESE INTERMEDIATE CODE: A01

Effective date: 20141209

A61 First payment of annual fees (during grant procedure)

Free format text: JAPANESE INTERMEDIATE CODE: A61

Effective date: 20141215

R150 Certificate of patent or registration of utility model

Ref document number: 5668011

Country of ref document: JP

Free format text: JAPANESE INTERMEDIATE CODE: R150

R250 Receipt of annual fees

Free format text: JAPANESE INTERMEDIATE CODE: R250

R250 Receipt of annual fees

Free format text: JAPANESE INTERMEDIATE CODE: R250