CN102989174A - Method for obtaining inputs used for controlling operation of game program

Method for obtaining inputs used for controlling operation of game program

Info

Publication number
CN102989174A
Authority
CN
China
Prior art keywords
controller
input information
value
information
additional input
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN2012104967128A
Other languages
Chinese (zh)
Other versions
CN102989174B (en)
Inventor
X. Mao
R. L. Marks
G. M. Zalewski
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Sony Interactive Entertainment Inc
Sony Interactive Entertainment LLC
Original Assignee
Sony Computer Entertainment America LLC
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from US11/429,047 external-priority patent/US8233642B2/en
Priority claimed from US11/418,989 external-priority patent/US8139793B2/en
Priority claimed from US11/429,133 external-priority patent/US7760248B2/en
Priority claimed from US11/381,728 external-priority patent/US7545926B2/en
Priority claimed from PCT/US2006/017483 external-priority patent/WO2006121896A2/en
Priority claimed from US11/381,725 external-priority patent/US7783061B2/en
Priority claimed from US11/418,988 external-priority patent/US8160269B2/en
Priority claimed from US11/381,724 external-priority patent/US8073157B2/en
Priority claimed from US11/381,729 external-priority patent/US7809145B2/en
Priority claimed from US11/381,727 external-priority patent/US7697700B2/en
Priority claimed from US11/381,721 external-priority patent/US8947347B2/en
Priority claimed from US11/429,414 external-priority patent/US7627139B2/en
Priority claimed from US11/382,036 external-priority patent/US9474968B2/en
Priority claimed from US11/382,035 external-priority patent/US8797260B2/en
Priority claimed from US11/382,037 external-priority patent/US8313380B2/en
Priority claimed from US11/382,038 external-priority patent/US7352358B2/en
Priority claimed from US11/382,031 external-priority patent/US7918733B2/en
Priority claimed from US11/382,032 external-priority patent/US7850526B2/en
Priority claimed from US11/382,033 external-priority patent/US8686939B2/en
Priority claimed from US29/259,350 external-priority patent/USD621836S1/en
Priority claimed from US11/382,034 external-priority patent/US20060256081A1/en
Priority claimed from US11/382,039 external-priority patent/US9393487B2/en
Priority claimed from US11/382,043 external-priority patent/US20060264260A1/en
Priority claimed from US11/382,040 external-priority patent/US7391409B2/en
Priority claimed from US11/382,041 external-priority patent/US7352359B2/en
Priority claimed from US11/430,593 external-priority patent/US20070261077A1/en
Priority claimed from US11/382,256 external-priority patent/US7803050B2/en
Priority claimed from US11/382,252 external-priority patent/US10086282B2/en
Priority claimed from US11/382,259 external-priority patent/US20070015559A1/en
Priority claimed from US11/382,258 external-priority patent/US7782297B2/en
Priority claimed from US11/382,250 external-priority patent/US7854655B2/en
Priority claimed from US29/246,764 external-priority patent/USD629000S1/en
Priority claimed from US11/382,251 external-priority patent/US20060282873A1/en
Priority claimed from US29/246,743 external-priority patent/USD571367S1/en
Priority claimed from US11/430,594 external-priority patent/US20070260517A1/en
Priority claimed from US29/246,744 external-priority patent/USD630211S1/en
Priority claimed from US29/246,768 external-priority patent/USD571806S1/en
Priority claimed from US29/246,767 external-priority patent/USD572254S1/en
Application filed by Sony Computer Entertainment America LLC filed Critical Sony Computer Entertainment America LLC
Publication of CN102989174A publication Critical patent/CN102989174A/en
Application granted granted Critical
Publication of CN102989174B publication Critical patent/CN102989174B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04R: LOUDSPEAKERS, MICROPHONES, GRAMOPHONE PICK-UPS OR LIKE ACOUSTIC ELECTROMECHANICAL TRANSDUCERS; DEAF-AID SETS; PUBLIC ADDRESS SYSTEMS
    • H04R 3/00: Circuits for transducers, loudspeakers or microphones
    • H04R 3/005: Circuits for transducers, loudspeakers or microphones for combining the signals of two or more microphones

Landscapes

  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Otolaryngology (AREA)
  • Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Acoustics & Sound (AREA)
  • Signal Processing (AREA)
  • User Interface Of Digital Computer (AREA)
  • Processing Or Creating Images (AREA)
  • Position Input By Displaying (AREA)
  • Image Analysis (AREA)

Abstract

Disclosed is a method for obtaining inputs used for controlling the operation of a game program. In an embodiment of the present invention, controller path information obtained from inertial, image-capture, and acoustic sources is mixed prior to analysis for gesture recognition.

Description

Obtaining Input for Controlling the Operation of a Game Program
Claim of Priority
This application claims the benefit of: U.S. Patent Application No. 11/381,729, to Xiao Dong Mao, entitled "Ultra Small Microphone Array" (attorney docket SCEA05062US00), filed May 4, 2006; Application No. 11/381,728, to Xiao Dong Mao, entitled "Echo and Noise Cancellation" (attorney docket SCEA05064US00), filed May 4, 2006; U.S. Patent Application No. 11/381,725, to Xiao Dong Mao, entitled "Methods and Apparatus for Targeted Sound Detection" (attorney docket SCEA05072US00), filed May 4, 2006; U.S. Patent Application No. 11/381,727, to Xiao Dong Mao, entitled "Noise Removal for Electronic Device with Far Field Microphone on Console" (attorney docket SCEA05073US00), filed May 4, 2006; U.S. Patent Application No. 11/381,724, to Xiao Dong Mao, entitled "Methods and Apparatus for Targeted Sound Detection and Characterization" (attorney docket SCEA05079US00), filed May 4, 2006; and U.S. Patent Application No. 11/381,721, to Xiao Dong Mao, entitled "Selective Sound Source Listening in Conjunction with Computer Interactive Processing" (attorney docket SCEA04005JUMBOUS), filed May 4, 2006; all of which are incorporated herein by reference.
This application also claims the benefit of: co-pending Application No. 11/418,988, to Xiao Dong Mao, entitled "Methods and Apparatuses for Adjusting a Listening Area for Capturing Sounds" (attorney docket SCEA-00300), filed May 4, 2006; co-pending Application No. 11/418,989, to Xiao Dong Mao, entitled "Methods and Apparatuses for Capturing an Audio Signal Based on a Visual Image" (attorney docket SCEA-00400), filed May 4, 2006; co-pending Application No. 11/429,047, to Xiao Dong Mao, entitled "Methods and Apparatuses for Capturing an Audio Signal Based on a Location of the Signal" (attorney docket SCEA-00500), filed May 4, 2006; co-pending Application No. 11/429,133, to Richard Marks et al., entitled "Selective Sound Source Listening in Conjunction with Computer Interactive Processing" (attorney docket SCEA04005US01-SONYP045), filed May 4, 2006; and co-pending Application No. 11/429,414, to Richard Marks et al., entitled "Computer Image and Audio Processing of Intensity and Input Devices for Interfacing with a Computer Program" (attorney docket SONYP052), filed May 4, 2006; the entire disclosures of all of which are incorporated herein by reference.
This application also claims the benefit of: U.S. Patent Application No. 11/382,031, entitled "Multi-Input Game Control Mixer" (attorney docket SCEA06MXR1), filed May 6, 2006; U.S. Patent Application No. 11/382,032, entitled "System for Tracking User Manipulations Within an Environment" (attorney docket SCEA06MXR2), filed May 6, 2006; U.S. Patent Application No. 11/382,033, entitled "System, Method, and Apparatus for Three-Dimensional Input Control" (attorney docket SCEA06INRT1), filed May 6, 2006; U.S. Patent Application No. 11/382,035, entitled "Inertially Trackable Hand-Held Controller" (attorney docket SCEA06INRT2), filed May 6, 2006; U.S. Patent Application No. 11/382,036, entitled "Method and System for Applying Gearing Effects to Visual Tracking" (attorney docket SONYP058A), filed May 6, 2006; U.S. Patent Application No. 11/382,041, entitled "Method and System for Applying Gearing Effects to Inertial Tracking" (attorney docket SONYP058B), filed May 7, 2006; U.S. Patent Application No. 11/382,038, entitled "Method and System for Applying Gearing Effects to Acoustical Tracking" (attorney docket SONYP058C), filed May 6, 2006; U.S. Patent Application No. 11/382,040, entitled "Method and System for Applying Gearing Effects to Multi-Channel Mixed Input" (attorney docket SONYP058D), filed May 7, 2006; U.S. Patent Application No. 11/382,034, entitled "Scheme for Detecting and Tracking User Manipulation of a Game Controller Body" (attorney docket 86321 SCEA05082US00), filed May 6, 2006; U.S. Patent Application No. 11/382,037, entitled "Scheme for Translating Movements of a Hand-Held Controller into Inputs for a System" (attorney docket 86324), filed May 6, 2006; U.S. Patent Application No. 11/382,043, entitled "Detectable and Trackable Hand-Held Controller" (attorney docket 86325), filed May 7, 2006; U.S. Patent Application No. 11/382,039, entitled "Method for Mapping Movements of a Hand-Held Controller to Game Commands" (attorney docket 86326), filed May 7, 2006; U.S. Design Patent Application No. 29/259,349, entitled "Controller with an Infrared Port" (attorney docket SCEA06007US00), filed May 6, 2006; U.S. Design Patent Application No. 29/259,350, entitled "Controller with a Tracking Sensor" (attorney docket SCEA06008US00), filed May 6, 2006; U.S. Patent Application No. 60/798,031, entitled "Dynamic Target Interface" (attorney docket SCEA06009US00), filed May 6, 2006; U.S. Design Patent Application No. 29/259,348, entitled "Tracked Controller Device" (attorney docket SCEA06010US00), filed May 6, 2006; and U.S. Patent Application No. 11/382,250, entitled "Obtaining Input for Controlling the Operation of a Game Program" (attorney docket SCEA06018US00), filed May 8, 2006; all of which are incorporated herein by reference in their entirety.
This application also claims the benefit of co-pending U.S. Patent Application No. 11/430,594, to Gary Zalewski and Riley R. Russell, entitled "System and Method for Using a User's Audio-Visual Environment to Select Advertising" (attorney docket SCEA05059US00), filed May 8, 2006, the entire disclosure of which is incorporated herein by reference.
This application also claims the benefit of co-pending U.S. Patent Application No. 11/430,593, to Gary Zalewski and Riley R. Russell, entitled "Using Audio-Visual Environment to Select Ads on a Game Platform" (attorney docket SCEAUS3.0-011), filed May 8, 2006, the entire disclosure of which is incorporated herein by reference.
This application also claims the benefit of co-pending U.S. Patent Application No. 11/382,259, to Gary Zalewski et al., entitled "Method and Apparatus for Determining the Absence of User Activity in Relation to a System" (attorney docket 86327), filed May 8, 2006, the entire disclosure of which is incorporated herein by reference.
This application also claims the benefit of co-pending U.S. Patent Application No. 11/382,258, to Gary Zalewski et al., entitled "Method and Apparatus for Determining a Level of User Activity in Relation to a System" (attorney docket 86328), filed May 8, 2006, the entire disclosure of which is incorporated herein by reference.
This application also claims the benefit of co-pending U.S. Patent Application No. 11/382,251, to Gary Zalewski et al., entitled "Hand-Held Controller with Detectable Elements for Tracking" (attorney docket 86329), filed May 8, 2006, the entire disclosure of which is incorporated herein by reference.
This application also claims the benefit of co-pending U.S. Patent Application No. 11/382,252, entitled "Tracking Device for Use in Obtaining Information for Controlling Game Program Operation" (attorney docket SCEA06INRT3), filed May 8, 2006, the entire disclosure of which is incorporated herein by reference.
This application also claims the benefit of co-pending U.S. Patent Application No. 11/382,256, entitled "Tracking Device with Sound Emitter for Use in Obtaining Information for Controlling Game Program Operation" (attorney docket SCEA06ACRA2), filed May 8, 2006, the entire disclosure of which is incorporated herein by reference.
This application also claims the benefit of co-pending U.S. Design Patent Application No. 29/246,744, entitled "Video Game Controller Front Face" (attorney docket SCEACTR-D3), filed May 8, 2006, the entire disclosure of which is incorporated herein by reference.
This application also claims the benefit of co-pending U.S. Design Patent Application No. 29/246,743, entitled "Video Game Controller" (attorney docket SCEACTRL-D2), filed May 8, 2006, the entire disclosure of which is incorporated herein by reference.
This application also claims the benefit of co-pending U.S. Design Patent Application No. 29/246,767, entitled "Video Game Controller" (attorney docket SONYP059A), filed May 8, 2006, the entire disclosure of which is incorporated herein by reference.
This application also claims the benefit of co-pending U.S. Design Patent Application No. 29/246,768, entitled "Video Game Controller" (attorney docket SONYP059B), filed May 8, 2006, the entire disclosure of which is incorporated herein by reference.
This application also claims the benefit of co-pending U.S. Design Patent Application No. 29/246,763, entitled "Ergonomic Game Controller Device with LEDs and Optical Ports" (attorney docket PA3760US), filed May 8, 2006, the entire disclosure of which is incorporated herein by reference.
This application also claims the benefit of co-pending U.S. Design Patent Application No. 29/246,759, entitled "Game Controller Device with LEDs and Optical Ports" (attorney docket PA3761US), filed May 8, 2006, the entire disclosure of which is incorporated herein by reference.
This application also claims the benefit of co-pending U.S. Design Patent Application No. 29/246,765, entitled "Design for an Optical Game Controller Interface" (attorney docket PA3762US), filed May 8, 2006, the entire disclosure of which is incorporated herein by reference.
This application also claims the benefit of co-pending U.S. Design Patent Application No. 29/246,766, entitled "Dual Grip Game Control Device with LEDs and Optical Ports" (attorney docket PA3763US), filed May 8, 2006, the entire disclosure of which is incorporated herein by reference.
This application also claims the benefit of co-pending U.S. Design Patent Application No. 29/246,764, entitled "Game Interface Device with LEDs and Optical Ports" (attorney docket PA3764US), filed May 8, 2006, the entire disclosure of which is incorporated herein by reference.
This application also claims the benefit of co-pending U.S. Design Patent Application No. 29/246,762, entitled "Ergonomic Game Interface Device with LEDs and Optical Ports" (attorney docket PA3765US), filed May 8, 2006, the entire disclosure of which is incorporated herein by reference.
Cross-Reference to Related Applications
This application is related to U.S. Provisional Patent Application No. 60/718,145, entitled "Audio, Video, Simulation, and User Interface Paradigms", filed September 15, 2005, which is incorporated herein by reference.
This application is related to: U.S. Patent Application No. 10/207,677, entitled "Man-Machine Interface Using a Deformable Device", filed July 27, 2002; U.S. Patent Application No. 10/650,409, entitled "Audio Input System", filed August 27, 2003; U.S. Patent Application No. 10/663,236, entitled "Method and Apparatus for Adjusting a View of a Scene Being Displayed According to Tracked Head Motion", filed September 15, 2003; U.S. Patent Application No. 10/759,782, entitled "Method and Apparatus for Light Input Device", filed January 16, 2004; U.S. Patent Application No. 10/820,469, entitled "Method and Apparatus to Detect and Remove Audio Disturbances", filed April 7, 2004; U.S. Patent Application No. 11/301,673, entitled "Method for Using Relative Head and Hand Positions to Enable a Pointing Interface via Camera Tracking", filed December 12, 2005; and U.S. Patent Application No. 11/165,473, entitled "Delay Matching in Audio/Video Systems", filed June 22, 2005; all of which are incorporated herein by reference.
This application is also related to co-pending U.S. Patent Application No. 11/400,997, filed April 10, 2006, entitled "System and Method for Obtaining User Information from Voices" (attorney docket SCEA05040US00), the entire disclosure of which is incorporated herein by reference.
Technical Field
The present invention relates generally to human-computer interfaces and, more specifically, to processing multi-channel inputs for tracking user manipulations of one or more controllers.
Background
Computer entertainment systems typically include a hand-held controller, game controller, or other controller. A user or player uses the controller to send commands or other instructions to the entertainment system in order to control a video game or other simulation being played. For example, the controller may be provided with a manipulator, such as a joystick, that is operated by the user. The manipulated variable of the joystick is converted from an analog value into a digital value, which is sent to the game console. The controller may also be provided with buttons that can be operated by the user.
It is with respect to these and other background information factors that the present invention has been developed.
Brief Description of the Drawings
The teachings of the present invention can readily be understood by considering the following detailed description in conjunction with the accompanying drawings, in which:
Fig. 1 is a pictorial diagram illustrating a video game system that operates in accordance with an embodiment of the present invention;
Fig. 2 is a perspective view of a controller made in accordance with an embodiment of the present invention;
Fig. 3 is a three-dimensional schematic diagram illustrating an accelerometer that may be used in a controller according to an embodiment of the present invention;
Fig. 4 is a block diagram of a system for mixing various control inputs according to an embodiment of the present invention;
Fig. 5A is a block diagram of a portion of the video game system of Fig. 1;
Fig. 5B is a flow diagram of a method for tracking a controller of a video game system according to an embodiment of the present invention;
Fig. 5C is a flow diagram illustrating a method for utilizing position and/or orientation information during game play on a video game system according to an embodiment of the present invention;
Fig. 6 is a block diagram illustrating a video game system according to an embodiment of the present invention; and
Fig. 7 is a block diagram of a Cell processor implementation of a video game system according to an embodiment of the present invention.
Description of the Specific Embodiments
Although the following detailed description contains many specific details for the purposes of illustration, those of ordinary skill in the art will appreciate that many variations and alterations to the following details are within the scope of the invention. Accordingly, the exemplary embodiments of the invention described below are set forth without any loss of generality to, and without imposing limitations upon, the claimed invention.
Various embodiments of the methods, apparatus, schemes, and systems described herein provide for the detection, capture, and tracking of the movements, motions, and/or manipulations of the entire controller body itself by the user. The detected movements, motions, and/or manipulations of the entire controller body by the user may be used as additional commands to control various aspects of the game or other simulation being played.
Detecting and tracking a user's manipulations of a game controller body may be implemented in different ways. For example, an inertial sensor, such as an accelerometer or gyroscope, or an image capture unit, such as a digital camera, can be used with the computer entertainment system to detect motions of the hand-held controller body and transfer them into actions in the game. Examples of tracking a controller with an inertial sensor are described, e.g., in U.S. Patent Application No. 11/382,033, entitled "System, Method, and Apparatus for Three-Dimensional Input Control" (attorney docket SCEA06INRT1), which is incorporated herein by reference. Examples of tracking a controller using image capture are described, e.g., in U.S. Patent Application No. 11/382,034, entitled "Scheme for Detecting and Tracking User Manipulation of a Game Controller Body" (attorney docket SCEA05082US00), which is incorporated herein by reference. In addition, the controller and/or the user may also be tracked acoustically using a microphone array and appropriate signal processing. Examples of such acoustic tracking are described in U.S. Patent Application No. 11/381,721, which is incorporated herein by reference.
Acoustic sensing, inertial sensing, and image capture can be used individually or in any combination to detect many different types of motions of the controller, such as up-and-down movements, twisting movements, side-to-side movements, jerking movements, wand-like motions, plunging motions, and the like. Such motions may correspond to various commands so that the motions are transferred into actions in a game. Detecting and tracking the user's manipulations of a game controller body can be used to implement many different types of games, simulations, and the like, which allow the user, for example, to engage in a sword or light-saber fight, use a wand to trace the shape of items, engage in many different types of sporting events, engage in on-screen fights or other encounters, and so forth. A game program may be configured to track the motion of the controller and to recognize certain pre-recorded gestures from that tracked motion. Recognition of one or more of these gestures may trigger a change in game state.
In embodiments of the present invention, controller path information obtained from these different sources may be mixed prior to analysis for gesture recognition. The tracking data from the different sources (e.g., acoustic, inertial, and image capture) may be mixed in a way that improves the likelihood of recognition of a gesture.
Referring to Fig. 1, a system 100 that operates in accordance with an embodiment of the present invention is illustrated. As shown, a computer entertainment console 102 may be coupled to a television or other video display 104 so that the images of a video game or other simulation are displayed thereon. The game or other simulation may be stored on a DVD, CD, flash memory, USB memory, or other memory medium 106 that is inserted into the console 102. A user or player 108 manipulates a game controller 110 to control the video game or other simulation. As seen in Fig. 2, the game controller 110 includes an inertial sensor 112 that produces signals in response to the position, motion, orientation, or change in orientation of the game controller 110. In addition to the inertial sensor, the game controller 110 may include conventional control input devices, e.g., joysticks 111, buttons 113, R1, L1, and the like.
In operation, the user 108 physically moves the controller 110. For example, the controller 110 may be moved in any direction by the user 108, such as up, down, to one side, to the other side, twisted, rolled, shaken, jerked, plunged, etc. These movements of the controller 110 itself may be detected and captured by way of tracking through analysis of signals from the inertial sensor 112, in a manner described below.
Referring again to Fig. 1, the system 100 may optionally include a camera or other video image capturing device 114, which may be positioned so that the controller 110 is within the camera's field of view 116. Analysis of images from the image capturing device 114 may be used in conjunction with analysis of data from the inertial sensor 112. As shown in Fig. 2, the controller 110 may optionally be equipped with light sources such as light emitting diodes (LEDs) 202, 204, 206, 208 to facilitate tracking by video analysis. These may be mounted on the body of the controller 110. As used herein, the term "body" describes the part of the game controller 110 that one would hold by hand (or wear, if it were a wearable game controller).
Analysis of such video images for the purpose of tracking the controller 110 is described, e.g., in U.S. Patent Application No. 11/382,034, to inventor Gary M. Zalewski, entitled "Scheme for Detecting and Tracking User Manipulation of a Game Controller Body" (attorney docket SCEA05082US00), which is incorporated herein by reference. The console 102 may include an acoustic transducer, such as a microphone array 118. The controller 110 may also include an acoustic signal generator 210 (e.g., a speaker) to provide a source of sound to facilitate acoustic tracking of the controller 110 with the microphone array 118 and appropriate acoustic signal processing, e.g., as described in U.S. Patent Application No. 11/381,724, which is incorporated herein by reference.
In general, signals from the inertial sensor 112 are used to generate position and orientation data for the controller 110. Such data may be used to calculate many physical aspects of the movement of the controller 110, such as its acceleration and velocity along any axis, its tilt, pitch, yaw, and roll, and any telemetry points of the controller 110. As used herein, "telemetry" generally refers to remote measurement and reporting of information of interest to a system or to the system's designer or operator.
The ability to detect and track the movements of the controller 110 makes it possible to determine whether any predefined movements of the controller 110 are performed. That is, certain movement patterns or gestures of the controller 110 may be predefined and used as input commands for the game or other simulation. For example, a plunging downward gesture of the controller 110 may be defined as one command, a twisting gesture of the controller 110 may be defined as another command, a shaking gesture of the controller 110 may be defined as yet another command, and so on. In this way, the manner in which the user 108 physically moves the controller 110 serves as another input for controlling the game, which provides a more stimulating and entertaining experience for the user.
By way of example and without limitation, the inertial sensor 112 may be an accelerometer. Fig. 3 depicts an example of an accelerometer 300 in the form of a simple mass 302 elastically coupled at four points to a frame 304 by springs 306, 308, 310, 312. Pitch and roll axes (indicated by X and Y, respectively) lie in a plane that intersects the frame. A yaw axis Z is oriented perpendicular to the plane containing the pitch axis X and the roll axis Y. The frame 304 may be mounted to the controller 110 in any suitable fashion. As the frame 304 (and the game controller 110) accelerates and/or rotates, the mass 302 may be displaced relative to the frame 304, and the springs 306, 308, 310, 312 may elongate or compress in a way that depends on the amount and direction of translation and/or rotation in pitch and/or roll and/or yaw. The displacement of the mass 302 and/or the compression or elongation of the springs 306, 308, 310, 312 may be sensed, e.g., with appropriate sensors 314, 316, 318, 320, and converted into signals that relate in a known or predetermined way to the amount of pitch and/or roll acceleration.
There are a number of different ways to track the position of the mass and/or the forces exerted on it, including resistive strain gauge material, photonic sensors, magnetic sensors, Hall-effect devices, piezoelectric devices, capacitive sensors, and the like. Embodiments of the invention may include any number and type, or combination of types, of sensors. By way of example and without limitation, the sensors 314, 316, 318, 320 may be gap-closing electrodes placed above the mass 302. A capacitance between the mass and each electrode changes as the position of the mass changes relative to each electrode. Each electrode may be connected to a circuit that produces a signal related to the capacitance (and therefore to the proximity) of the mass 302 relative to that electrode. In addition, the springs 306, 308, 310, 312 may include resistive strain gauge sensors that produce signals related to the compression or elongation of the springs.
In some embodiments, the frame 304 may be gimbal-mounted to the controller 110 so that the accelerometer 300 maintains a fixed orientation with respect to the pitch and/or roll and/or yaw axes. In this way, the controller axes X, Y, Z may be mapped directly to corresponding axes in real space without having to take into account a tilting of the controller axes with respect to the real-space coordinate axes.
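By way of illustration only, the following is a minimal C++ sketch of how four corner displacement readings of the kind described above might be reduced to estimates of vertical, pitch, and roll acceleration for a spring-mass accelerometer such as the one shown in Fig. 3; the sensor arrangement, calibration constants, and all names are assumptions for illustration, not part of this disclosure.

```cpp
#include <cstdio>

// Hypothetical displacement readings from the four corner sensors
// (314, 316, 318, 320), expressed as signed offsets of the mass from rest.
struct CornerDisplacements {
    double frontLeft, frontRight, rearLeft, rearRight;  // meters
};

struct AccelEstimate {
    double az;     // linear acceleration along the yaw (Z) axis, m/s^2
    double pitch;  // angular acceleration about the pitch (X) axis, rad/s^2
    double roll;   // angular acceleration about the roll (Y) axis, rad/s^2
};

// Assumed calibration constants for the sketch.
constexpr double kSpring = 200.0;   // spring stiffness per corner, N/m
constexpr double kMass   = 0.005;   // mass of the proof mass, kg
constexpr double kArm    = 0.01;    // lever arm from center to each corner, m

// The sum of the displacements tracks linear acceleration along Z, while
// front/rear and left/right differences track pitch and roll (F = k*x, a = F/m).
AccelEstimate estimateAcceleration(const CornerDisplacements& d) {
    const double sum       = d.frontLeft + d.frontRight + d.rearLeft + d.rearRight;
    const double pitchDiff = (d.frontLeft + d.frontRight) - (d.rearLeft + d.rearRight);
    const double rollDiff  = (d.frontLeft + d.rearLeft)   - (d.frontRight + d.rearRight);

    AccelEstimate a;
    a.az    = kSpring * sum / kMass;
    a.pitch = kSpring * pitchDiff / (kMass * kArm);  // torque / (approx. moment of inertia)
    a.roll  = kSpring * rollDiff  / (kMass * kArm);
    return a;
}

int main() {
    AccelEstimate a = estimateAcceleration({1e-4, 1e-4, -1e-4, -1e-4});
    std::printf("az=%.2f m/s^2  pitch=%.2f rad/s^2  roll=%.2f rad/s^2\n",
                a.az, a.pitch, a.roll);
}
```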
As noted above, data from inertial, image capture, and acoustic sources may be analyzed to generate a path that tracks the position and/or orientation of the controller 110. As shown in the block diagram of Fig. 4, a system 400 according to an embodiment of the invention may include an inertial analyzer 402, an image analyzer 404, and an acoustic analyzer 406. Each of these analyzers receives signals from a sensed environment 401. The analyzers 402, 404, 406 may be implemented in hardware, in software (or firmware), or in some combination of two or more of these. Each of the analyzers produces tracking information related to the position and/or orientation of an object of interest. By way of example, the object of interest may be the controller 110 referred to above. The image analyzer 404 may operate in connection with, be formed according to, and operate with respect to the methods described in U.S. Patent Application No. 11/382,034 (attorney docket SCEA05082US00). The inertial analyzer 402 may operate in connection with, be formed according to, and operate with respect to the methods described in U.S. Patent Application No. 11/382,033, entitled "System, Method, and Apparatus for Three-Dimensional Input Control" (attorney docket SCEA06INRT1). The acoustic analyzer 406 may operate in connection with, be formed according to, and operate with respect to the methods described in U.S. Patent Application No. 11/381,724.
The analyzers 402, 404, and 406 may be regarded as being associated with different channels of input of position and/or orientation information. The mixer 408 may accept multiple input channels, and such channels may contain sample data characterizing the sensed environment 401, typically from the perspective of the channel. The position and/or orientation information generated by the inertial analyzer 402, the image analyzer 404, and the acoustic analyzer 406 may be coupled into the input of the mixer 408. The mixer 408 and the analyzers 402, 404, 406 may be queried by a game software program 410 and may be configured to interrupt the game software in response to events. Events may include gesture recognition events, gearing changes, configuration changes, setting noise levels, setting sampling rates, changing mapping chains, etc., examples of which are discussed below. The mixer 408 may operate in connection with, be formed according to, and operate with respect to the methods described herein.
As noted above, signals from different input channels, e.g., the inertial sensor, video images, and/or acoustic sensors, may be analyzed by the inertial analyzer 402, the image analyzer 404, and the acoustic analyzer 406, respectively, to determine the motion and/or orientation of the controller 110 during play of a video game according to an inventive method. Such a method may be implemented as a series of processor-executable program code instructions stored in a processor-readable medium and executed on a digital processor. For example, as depicted in Fig. 5A, the video game system 100 may include a console 102 having the inertial analyzer 402, the image analyzer 404, and the acoustic analyzer 406 implemented either in hardware or in software. By way of example, the analyzers 402, 404, 406 may be implemented as software instructions running on a suitable processor unit 502. By way of example, the processor unit 502 may be a digital processor, e.g., a microprocessor of a type commonly used in video game consoles. A portion of the instructions may be stored in a memory 506. Alternatively, the inertial analyzer 402, the image analyzer 404, and the acoustic analyzer 406 may be implemented in hardware, e.g., as an application-specific integrated circuit (ASIC). Such analyzer hardware may be located on the controller 110 or on the console 102, or may be remotely located elsewhere. In hardware implementations, the analyzers 402, 404, 406 may be programmable in response to external signals, e.g., from the processor 502 or from some other remotely located source, e.g., connected by a USB cable, a wireless connection, or over a network.
The inertial analyzer 402 may include or implement instructions that analyze the signals generated by the inertial sensor 112 and utilize information regarding the position and/or orientation of the controller 110. Similarly, the image analyzer 404 may implement instructions that analyze images captured by the image capture unit 114. In addition, the acoustic analyzer may implement instructions that analyze sound captured by the microphone array 118. As shown in the flow diagram 510 of Fig. 5B, these signals and/or images may be received by the analyzers 402, 404, 406, as indicated at block 512. The signals and/or images may be analyzed by the analyzers 402, 404, 406 to determine inertial tracking information 403, image tracking information 405, and acoustic tracking information 407 regarding the position and/or orientation of the controller 110, as indicated at block 514. The tracking information 403, 405, 407 may be related to one or more degrees of freedom. It is preferred that six degrees of freedom be tracked to characterize the manipulation of the controller 110 or of another tracked object. Such degrees of freedom may relate to the controller's tilt, yaw, and roll, and to its position, velocity, or acceleration along the x, y, and z axes.
As indicated at block 516, the mixer 408 mixes the inertial information 403, image information 405, and acoustic information 407 to generate refined position and/or orientation information 409. By way of example, the mixer 408 may apply different weights to the inertial, image, and acoustic tracking information 403, 405, 407 based on game or environmental conditions and then take a weighted average. In addition, the mixer 408 may include its own mixer analyzer 412, which analyzes the combined position/orientation information and generates its own resulting "mixer" information involving combinations of the information generated by the other analyzers.
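The following is a minimal sketch of how a mixer of this kind might form a weighted average of the three tracking channels; the class and field names, and the weights used, are assumptions for illustration and do not reflect any particular implementation.

```cpp
#include <array>
#include <cstdio>

// One channel's tracking estimate for a single frame (six degrees of freedom).
struct TrackingInfo {
    double x, y, z;           // position
    double pitch, yaw, roll;  // orientation
};

// A minimal mixer: weighted average of the inertial, image, and acoustic
// channels, loosely analogous to mixer 408 combining information 403, 405, 407.
class Mixer {
public:
    void setWeights(double inertial, double image, double acoustic) {
        w_ = {inertial, image, acoustic};
    }
    TrackingInfo mix(const std::array<TrackingInfo, 3>& channel) const {
        TrackingInfo out{};
        const double total = w_[0] + w_[1] + w_[2];
        for (int i = 0; i < 3; ++i) {
            const double w = w_[i] / total;   // normalized channel weight
            out.x     += w * channel[i].x;
            out.y     += w * channel[i].y;
            out.z     += w * channel[i].z;
            out.pitch += w * channel[i].pitch;
            out.yaw   += w * channel[i].yaw;
            out.roll  += w * channel[i].roll;
        }
        return out;
    }
private:
    std::array<double, 3> w_{1.0, 1.0, 1.0};  // default: equal weights
};

int main() {
    Mixer mixer;
    mixer.setWeights(0.9, 0.1, 0.0);  // e.g., trust the inertial channel most
    std::array<TrackingInfo, 3> channels{{{1.0, 2.0, 3.0, 0, 0, 0},
                                          {1.2, 1.9, 3.1, 0, 0, 0},
                                          {0.0, 0.0, 0.0, 0, 0, 0}}};
    TrackingInfo refined = mixer.mix(channels);
    std::printf("refined x=%.2f y=%.2f z=%.2f\n", refined.x, refined.y, refined.z);
}
```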
In an embodiment of the present invention, the mixer 408 may assign a distribution value to the tracking information 403, 405, 407 from the analyzers 402, 404, 406. As noted above, certain sets of input control data may be averaged. In the present embodiment, however, the input control data is assigned a value prior to averaging, such that input control data from some analyzers is given greater analytical importance than input control data from other analyzers.
The mixer 408 may take on a number of functions in the context of the present system, including observation, correction, stabilization, derivation, combination, routing, mixing, reporting, buffering, interrupting other processes, and analysis. These may be performed with respect to one or more of the tracking information 403, 405, 407 received from the analyzers 402, 404, 406. While each of the analyzers 402, 404, 406 may receive and/or derive certain tracking information, the mixer 408 may be implemented to optimize the use of the received tracking information 403, 405, 407 and to generate refined tracking information 409.
The analyzers 402, 404, 406 and the mixer 408 are preferably configured to provide tracking information in a similar output format. Tracking information parameters from any analyzer element 402, 404, 406 may be mapped to a single parameter in an analyzer. Alternatively, the mixer 408 may form tracking information for any of the analyzers 402, 404, 406 by processing one or more tracking information parameters from one or more of the analyzers 402, 404, 406. The mixer may combine two or more elements of tracking information of the same parameter type taken from the analyzers 402, 404, 406, and/or perform functions across multiple parameters of tracking information generated by the analyzers, to create a synthetic set of output having the benefit of being generated from multiple channels of input.
The refined tracking information 409 may be used during play of a video game with the system 100, as indicated at block 518. In certain embodiments, the position and/or orientation information may be used in relation to gestures made by the user 108 during game play. In certain embodiments, the mixer 408 may operate in conjunction with a gesture recognizer 505 to associate at least one action in a game environment with one or more user actions from the user (e.g., a manipulation of the controller in space).
As shown in the flow diagram 520 of Fig. 5C, the position and/or orientation information may be used to track a path of the controller 110, as indicated at block 522. By way of example and without limitation, the path may include a set of points representing the position of the center of mass of the controller with respect to some coordinate system. Each position point may be represented by one or more coordinates, e.g., X, Y, and Z coordinates in a Cartesian coordinate system. A time may be associated with each point on the path so that both the shape of the path and the progress of the controller along the path may be monitored. In addition, each point in the set may have associated with it data representing an orientation of the controller, e.g., one or more angles of rotation of the controller about its center of mass. Furthermore, each point on the path may have associated with it values of the velocity and acceleration of the center of mass of the controller and values of the rates of angular rotation and angular acceleration of the controller about its center of mass.
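As an illustration of the path representation described above, the following sketch (all names assumed) shows one possible data structure for a single path sample and for the path as an ordered set of such samples.

```cpp
#include <vector>

// One sample along the tracked controller path: the position of the
// controller's center of mass, its orientation, their first and second
// derivatives, and the time at which the sample was taken.
struct PathPoint {
    double t;                                // time associated with this point
    double x, y, z;                          // position (e.g., Cartesian coordinates)
    double vx, vy, vz;                       // velocity of the center of mass
    double ax, ay, az;                       // acceleration of the center of mass
    double pitch, yaw, roll;                 // orientation about the center of mass
    double pitchRate, yawRate, rollRate;     // angular velocity
    double pitchAccel, yawAccel, rollAccel;  // angular acceleration
};

// The path is simply an ordered set of such points, which lets a recognizer
// monitor both the shape of the path and the progress along it over time.
using ControllerPath = std::vector<PathPoint>;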
As indicated at block 524, the tracked path may be compared with one or more stored paths corresponding to known and/or pre-recorded gestures 508 that are relevant to the context of the video game being played. The recognizer 505 may be configured to recognize a user, to process audio-authenticated gestures, and the like. For example, a user may be identified by the recognizer 505 through a gesture, and a gesture may be specific to a user. Such a specific gesture may be recorded and included among the pre-recorded gestures 508 stored in the memory 506. The recording process may optionally store audio generated during the recording of a gesture. The sensed environment is sampled into a multi-channel analyzer and processed. The processor may refer to gesture models to determine, authenticate, and/or identify a user or object based on voice or acoustic patterns, with high accuracy and performance.
As indicated in Fig. 5A, data 508 representing the gestures may be stored in the memory 506. Examples of gestures include, but are not limited to: throwing an object such as a ball; swinging an object such as a bat or golf club; pumping a hand pump; opening or closing a door or window; turning a steering wheel or other vehicle control; martial-arts moves such as punches; sanding movements; waxing on and waxing off; painting a house; shaking hands; making a laughing sound; rolling; throwing a football; a crank-turning motion; 3D mouse movements; scrolling movements; a movement of a known profile; any recordable movement; a movement back and forth along any vector, i.e., pumping up a tire, but performed at some arbitrary orientation in space; a movement along a path; a movement having precise start and stop times; any time-based user manipulation that can be recorded within the noise floor or with splines, tracked, and repeated; and the like. Each of these gestures may be pre-recorded from path data and stored as a time-based model. The comparison of the path and the stored gestures may start from an assumption of a steady state; if the path deviates from the steady state, the path may be compared against the stored gestures by a process of elimination. At block 526, if there is no match, the analyzer may continue tracking the path of the controller 110 at block 522. If there is a sufficient match between the path (or a portion thereof) and a stored gesture, the state of the game may be changed, as indicated at 528. Changes of game state may include, but are not limited to, interrupts, sending control signals, changing variables, and the like.
Here is one example of how this may occur. Upon determining that the controller 110 has left a steady state, the analyzer 402, 404, 406, or 412 tracks the movement of the controller 110. As long as the path of the controller 110 complies with a path defined in the stored gesture models 508, those gestures are possible "hits". If the path of the controller 110 deviates (within the noise tolerance setting) from any gesture model 508, that gesture model is removed from the hit list. Each gesture reference model includes a time base in which the gesture is recorded. The analyzer 402, 404, 406, or 412 compares the controller path data against the stored gestures 508 at the appropriate time index. The occurrence of a steady-state condition resets the clock. When the controller departs from steady state (i.e., when movements are tracked outside of the noise threshold), the hit list is populated with all potential gesture models. The clock is started, and the movements of the controller are compared against the hit list. Again, the comparison is a walk-through over time. If any gesture in the hit list reaches the end of its gesture, it is a hit.
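The following sketch illustrates, under assumed data structures and without claiming to reproduce any particular implementation, the hit-list style of matching described above: leaving steady state resets the clock and loads every gesture model, deviation beyond the noise tolerance removes a model from the hit list, and a model whose time base is walked through to completion counts as a hit.

```cpp
#include <cmath>
#include <string>
#include <utility>
#include <vector>

// Hypothetical structures: a recorded gesture model is a time-indexed path.
struct Sample { double t, x, y, z; };

struct GestureModel {
    std::string name;
    std::vector<Sample> path;   // recorded with its own time base (non-empty)
    double tolerance;           // allowed deviation ("noise tolerance" setting)
};

class GestureMatcher {
public:
    explicit GestureMatcher(std::vector<GestureModel> models)
        : models_(std::move(models)) {}

    // Called when the controller leaves steady state: reset the clock and
    // reload the hit list with every possible gesture model.
    void onLeaveSteadyState(double now) {
        startTime_ = now;
        hits_.clear();
        for (size_t i = 0; i < models_.size(); ++i) hits_.push_back(i);
    }

    // Called with each new controller sample; returns the name of a gesture
    // whose model has been walked through to completion, or "" if none yet.
    std::string update(const Sample& s) {
        const double elapsed = s.t - startTime_;
        std::vector<size_t> stillAlive;
        for (size_t idx : hits_) {
            const GestureModel& m = models_[idx];
            const Sample& ref = sampleAt(m, elapsed);
            const double d = std::hypot(s.x - ref.x,
                                        std::hypot(s.y - ref.y, s.z - ref.z));
            if (d > m.tolerance) continue;                  // deviated: drop from hit list
            if (elapsed >= m.path.back().t) return m.name;  // end of gesture reached: hit
            stillAlive.push_back(idx);
        }
        hits_ = stillAlive;
        return "";
    }

private:
    // Nearest stored sample at or after the matching time index (no interpolation).
    static const Sample& sampleAt(const GestureModel& m, double t) {
        for (const Sample& s : m.path)
            if (s.t >= t) return s;
        return m.path.back();
    }
    std::vector<GestureModel> models_;
    std::vector<size_t> hits_;
    double startTime_ = 0.0;
};
```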
In certain embodiments, the mixer 408 and/or the individual analyzers 402, 404, 406, 412 may inform the game program when certain events occur. Examples of such events include the following:
Interrupt upon reaching a zero-acceleration point (X and/or Y and/or Z axes). In certain game situations, the analyzer may notify or interrupt a routine within the game program when the acceleration of the controller changes at an inflection point. For example, the user 108 may use the controller 110 to control a game avatar representing a quarterback in a football-simulation game. The analyzer may track the controller (representing the football) via a path generated from the signals of the inertial sensor 112. A particular change in the acceleration of the controller 110 may signal the release of the ball. At this point, the analyzer may trigger another routine within the program (e.g., a physics simulation package) to simulate the trajectory of the football based on the position and/or velocity and/or orientation of the controller at the point of release.
Interrupt upon recognition of a new gesture.
In addition, the analyzer may be configured via one or more inputs. Examples of such inputs include, but are not limited to:
Setting the noise level (X, Y, or Z axes). The noise level is a reference tolerance used when analyzing jitter of the user's hands in the game.
Setting the sampling rate. As used herein, "sampling rate" may refer to the frequency at which the analyzer samples the signals from the inertial sensor. The sampling rate may be set such that the signal is oversampled or averaged.
Setting the gearing. As used herein, "gearing" generally refers to the ratio of controller movement to the movement that occurs within the game. Examples of such "gearing" in the context of controlling a video game may be found in U.S. Patent Application No. 11/382,040, filed May 7, 2006 (attorney docket SONYP058D), which is incorporated herein by reference.
Setting the mapping chain. As used herein, a "mapping chain" refers to a map of gesture models. The gesture model maps may be adapted to a specific input channel (e.g., path data generated from inertial sensor signals only) or to a hybrid channel formed in a mixer unit. (An illustrative sketch of these configuration inputs follows this list.)
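For illustration, the configuration inputs listed above might be grouped into a structure such as the following sketch; the field names and default values are assumptions, not part of this disclosure.

```cpp
#include <string>

// A sketch of the per-analyzer configuration inputs described above.
struct AnalyzerConfig {
    // Reference tolerance used when analyzing jitter of the user's hands.
    double noiseLevelX = 0.0, noiseLevelY = 0.0, noiseLevelZ = 0.0;

    // How often the analyzer samples the inertial sensor signal; a high rate
    // allows the signal to be oversampled or averaged.
    double sampleRateHz = 100.0;

    // Ratio of controller movement to the movement occurring in the game.
    double gearing = 1.0;

    // Which map of gesture models this analyzer (or the mixer's hybrid
    // channel) should use; a game may swap this out during game play.
    std::string mappingChain = "default";
};
```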
Three input channels may be served by two or more different analyzers that are similar to the inertial analyzer 402. Specifically, these may include: the inertial analyzer 402 as described herein; a video analyzer as described, e.g., in U.S. Patent Application No. 11/382,034, to inventor Gary M. Zalewski, entitled "Scheme for Detecting and Tracking User Manipulation of a Game Controller Body" (attorney docket SCEA05082US00), which is incorporated herein by reference; and an acoustic analyzer, e.g., as described in U.S. Patent Application No. 11/381,721, which is incorporated herein by reference. The analyzers may be configured with a mapping chain. Mapping chains may be swapped out by the game during game play, e.g., they may be set in the analyzer or in the mixer.
Referring again to block 512 of Fig. 5B, those of skill in the art will recognize that there are a number of ways to generate signals from the inertial sensor 112. A few examples are described herein. Referring to block 514, there are likewise a number of ways to analyze the sensor signals generated at block 512 to obtain tracking information related to the position and/or orientation of the controller 110. By way of example and without limitation, the tracking information may include, but is not limited to, information regarding the following parameters, individually or in any combination:
Controller orientation. The orientation of the controller 110 may be expressed in terms of pitch, roll, or yaw angles relative to some reference orientation, e.g., in radians. Rates of change of the controller orientation (e.g., angular velocities or angular accelerations) may also be included in the position and/or orientation information. Where the inertial sensor 112 includes a gyroscopic sensor, for example, controller orientation information may be obtained directly in the form of one or more output values proportional to the angles of pitch, roll, or yaw.
Controller position (e.g., Cartesian coordinates X, Y, Z of the controller 110 in some frame of reference)
Controller X-axis velocity
Controller Y-axis velocity
Controller Z-axis velocity
Controller X-axis acceleration
Controller Y-axis acceleration
Controller Z-axis acceleration
It is noted that with respect to position, velocity, and acceleration, the position and/or orientation information may be expressed in terms of coordinate systems other than Cartesian. For example, cylindrical or spherical coordinates may be used for position, velocity, and acceleration. Acceleration information with respect to the X, Y, and Z axes may be obtained directly from an accelerometer-type sensor, as described herein. The X, Y, and Z accelerations may be integrated with respect to time from some initial instant to determine changes in the X, Y, and Z velocities. These velocities may be computed by adding the velocity changes to known values of the X, Y, and Z velocities at the initial instant. The X, Y, and Z velocities may in turn be integrated with respect to time to determine the X, Y, and Z displacements of the controller. The X, Y, and Z positions may then be determined by adding the displacements to known X, Y, and Z positions at the initial instant.
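A minimal sketch of the double integration described above, assuming a fixed sample period and known initial values (zero here), is given below; a trapezoidal or higher-order integration scheme could equally be used.

```cpp
#include <cstdio>

// Accelerations sampled from the inertial sensor are integrated once to
// update velocity and again to update position, starting from known
// initial values. dt is the sample period in seconds.
struct State {
    double x = 0, y = 0, z = 0;     // position
    double vx = 0, vy = 0, vz = 0;  // velocity
};

void integrate(State& s, double ax, double ay, double az, double dt) {
    s.vx += ax * dt;   s.vy += ay * dt;   s.vz += az * dt;   // dv = a * dt
    s.x  += s.vx * dt; s.y  += s.vy * dt; s.z  += s.vz * dt; // dx = v * dt
}

int main() {
    State s;                        // initial position and velocity assumed zero
    for (int i = 0; i < 100; ++i)   // 100 samples at 1 ms with 1 m/s^2 along X
        integrate(s, 1.0, 0.0, 0.0, 0.001);
    std::printf("x=%.4f m  vx=%.4f m/s\n", s.x, s.vx);
}
```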
Steady state (Y/N): this particular piece of information indicates whether the controller is in a steady state, which may be defined as any position and which may be subject to change. In a preferred embodiment, the steady-state position may be one in which the controller is held at roughly waist level in an orientation that is approximately horizontal.
" from the time of last stable state " generally refer to from detecting for the last time stable state (as mentioned above) since through how long section relevant data.As previously described, the time determine can be in real time, calculate by processor cycle or sampling period.With the personage that guarantees to shine upon in the game environment or the degree of accuracy of object, " from the time of last stable state " can be important for the tracking of the controller of resetting with respect to initial point.For actions available/posture of determining may to move subsequently in the game environment (foreclose or be included in), these data also can be important.
" the last posture of identification " generally refers to the last posture by gesture recognizers 505 (can realize by hardware or software) identification.For previous posture can with subsequently discernible may posture or game environment in the relevant fact of other certain action of occuring, the sign of the last posture of identification can be important.
Time at which the last gesture was recognized.
The above outputs may be sampled at any time by a game program or by other software.
In one embodiment of the present invention, the mixer 408 may assign a distribution value to the tracking information 403, 405, 407 from the analyzers 402, 404, 406. As noted above, certain sets of input control data may be averaged. In the present embodiment, however, the input control data is assigned a value prior to averaging, such that the input control data from some analyzers is given greater analytical importance than that from others.
For example, the mixer 408 may weight tracking information related to acceleration and steady state. The mixer 408 then receives the tracking information 403, 405, 407 as described above. The tracking information may include parameters related to acceleration and steady state, e.g., as described above. Prior to averaging the data representing this information, the mixer 408 may assign distribution values to the tracking information data sets 403, 405, 407. For example, the x- and y-axis acceleration parameters from the inertial analyzer 402 may be weighted at a value of 90%. The x- and y-axis acceleration parameters from the image analyzer 404, however, may be weighted at only 10%. The acoustic analyzer tracking information 407 may be weighted at 0% as it pertains to the acceleration parameters; that is, that data has no value.
Similarly, the Z-axis tracking information parameter from the inertial analyzer 402 may be weighted at 10%, whereas the image analyzer Z-axis tracking information may be weighted at 90%. The acoustic analyzer tracking information 407 may likewise be weighted at a value of 0%; however, steady-state tracking information from the acoustic analyzer 406 may be weighted at 100%, with the remaining analyzers' tracking information being weighted at 0%.
After the appropriate distribution weights have been assigned, the input control data may be averaged in conjunction with those weights to arrive at a weighted-average input control data set, which is subsequently analyzed by the gesture recognizer 505 and associated with a particular action in the game environment. The values associated may be defined by the mixer 408 or by a particular game title. The values may also be the result of the mixer 408 identifying the particular quality of data coming from the various analyzers and thus making a dynamic adjustment, as is further discussed below. The adjustment may also be the result of building a historical knowledge base of when particular data has particular value in a particular environment, and/or of responding to the particularities of a given game title.
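By way of illustration, the per-parameter weighting described in the preceding paragraphs might be expressed as in the following sketch; unlike the single-weight-per-channel example shown earlier, each tracked parameter here carries its own weight vector over the three channels. The weight values mirror the example above; the data values and names are assumptions.

```cpp
#include <cstdio>

// Per-channel values for one tracked parameter, in the order
// {inertial 403, image 405, acoustic 407}, and the matching weights.
struct ChannelValues { double inertial, image, acoustic; };
struct Weights       { double inertial, image, acoustic; };

// Weighted average of a single parameter across the three channels.
double mixParameter(const ChannelValues& v, const Weights& w) {
    const double total = w.inertial + w.image + w.acoustic;
    return (v.inertial * w.inertial + v.image * w.image + v.acoustic * w.acoustic) / total;
}

int main() {
    const Weights xyAccel {0.90, 0.10, 0.0};  // trust inertial x/y acceleration most
    const Weights zAxis   {0.10, 0.90, 0.0};  // trust image data for the z axis
    const Weights steady  {0.0,  0.0,  1.0};  // steady-state detection from acoustics only

    const double xAccel      = mixParameter({2.00, 2.40, 0.0}, xyAccel);
    const double zPosition   = mixParameter({0.52, 0.50, 0.0}, zAxis);
    const double steadyState = mixParameter({0.0, 0.0, 1.0}, steady);
    std::printf("xAccel=%.3f zPos=%.3f steady=%.1f\n", xAccel, zPosition, steadyState);
}
```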
The mixer 408 may be configured to operate dynamically during game play. For example, as the mixer 408 receives various input control data, it may recognize that certain data consistently falls outside an acceptable range or quality, or reflects corrupt data that may indicate a processing error at the related input device.
In addition, certain conditions in the real-world environment may change. For example, the natural light in the user's at-home game environment may increase as morning turns to midday, causing problems with image-data capture. Further, a neighborhood or household may become noisier as the day progresses, causing problems with audio-data capture. Likewise, if a user has been playing for several hours, their reflexes may become less sharp, causing problems with the interpretation of inertial data.
In these instances, or in any other instance in which the quality of a particular form of input control data becomes an issue, the mixer 408 may dynamically reassign distribution weights to the particular set of data coming from a particular device, so that more or less importance is given to particular input control data, as described above. Similarly, the game environment may change over the course of the game as the needs of a particular game change, requiring a reassignment of values or a need for particular input control data.
Similarly, the mixer 408 may recognize that certain data being passed to the gesture recognizer 505 is being processed incorrectly, slowly, or not at all, based on processing errors or on feedback data that may be generated by the gesture recognizer 505. In response to this feedback, or upon recognizing these processing difficulties (e.g., errors arising when the gesture recognizer 505 makes an association even though the image analysis data is within an acceptable range), the mixer 408 may adjust which input control data it seeks, from which analyzer, and when, if at all. The mixer 408 may also require that certain analysis and processing of the input control data be performed by the appropriate analyzer before the data is passed to the mixer 408, which may re-process the data (e.g., average the data), thereby providing a further layer of assurance that the data passed to the gesture recognizer 505 is processed effectively and appropriately.
In certain embodiments, the mixer 408 may recognize that certain data is corrupt, invalid, or outside a particular range of values, and may call on particular input control data or variables related to that data so that it can replace the incorrect data or properly analyze and calculate certain data with respect to the necessary variables.
According to embodiments of the invention, a video game system and method of the type described above may be implemented as shown in FIG. 6. A video game system 600 may include a processor 601 and a memory 602 (e.g., RAM, DRAM, ROM, and the like). In addition, the video game system 600 may have multiple processors 601 if parallel processing is to be implemented. The memory 602 includes data and game program code 604, which may include portions configured as described above. Specifically, the memory 602 may include inertial signal data 606, which may include stored controller path information as described above. The memory 602 may also include stored gesture data 608, e.g., data representing one or more gestures relevant to the game program 604. Coded instructions executed on the processor 601 may implement a multi-input mixer 605, which may be configured and may operate as described above.
The system 600 may also include well-known support functions 610, such as input/output (I/O) elements 611, power supplies (P/S) 612, a clock (CLK) 613, and a cache 614. The apparatus 600 may optionally include a mass storage device 615, such as a disk drive, CD-ROM drive, tape drive, or the like, to store programs and/or data. The controller may also optionally include a display unit 616 and a user interface unit 618 to facilitate interaction between the controller 600 and a user. The display unit 616 may be in the form of a cathode ray tube (CRT) or flat panel screen that displays text, numerals, graphical symbols, or images. The user interface 618 may include a keyboard, mouse, joystick, light pen, or other device. In addition, the user interface 618 may include a microphone, video camera, or other signal transducing device to provide for direct capture of a signal to be analyzed. The processor 601, memory 602, and other components of the system 600 may exchange signals (e.g., code instructions and data) with one another via a system bus 620, as shown in FIG. 6.
A microphone array 622 may be coupled to the system 600 through the I/O functions 611. The microphone array may include from about 2 to about 8 microphones, preferably about 4, with neighboring microphones separated by a distance of less than about 4 centimeters, preferably between about 1 centimeter and about 2 centimeters. Preferably, the microphones in the array 622 are omni-directional microphones. An optional image capture unit 623 (e.g., a video camera) may be coupled to the apparatus 600 through the I/O functions 611. One or more pointing actuators 625 mechanically coupled to the camera may exchange signals with the processor 601 via the I/O functions 611.
As used herein, the term "I/O" generally refers to any program, operation, or device that transfers data to or from the system 600 and to or from a peripheral device. Every data transfer may be regarded as an output from one device and an input into another. Peripheral devices include input-only devices such as keyboards and mice, output-only devices such as printers, and devices such as writable CD-ROM drives that can act as both input and output devices. The term "peripheral device" includes external devices such as a mouse, keyboard, printer, monitor, microphone, game controller, camera, external Zip drive, or scanner, as well as internal devices such as a CD-ROM drive, CD-R drive, or internal modem, and other peripherals such as flash memory readers/writers, hard drives, and the like.
In certain embodiments of the invention, the apparatus 600 may be a video game unit, which may include a controller 630 coupled to the processor via the I/O functions 611 either through wires (e.g., a USB cable) or wirelessly. The controller 630 may have analog joystick controls 631 and conventional buttons 633 that provide control signals commonly used during the playing of video games. Such video games may be implemented as processor-readable data and/or instructions of the program 604, which may be stored in the memory 602 or in other processor-readable media, such as media associated with the mass storage device 615. In certain embodiments, the mixer 605 may receive inputs from the analog joystick controls 631 and the buttons 633.
The joystick controls 631 may generally be configured so that moving a control stick left or right signals movement along the X axis, and moving it forward (up) or backward (down) signals movement along the Y axis. In joysticks configured for three-dimensional movement, twisting the stick to the left (counter-clockwise) or to the right (clockwise) may signal movement along the Z axis. These three axes, X, Y, and Z, are often referred to as roll, pitch, and yaw, respectively, particularly in relation to an aircraft.
The game controller 630 may include a communication interface operable to conduct digital communications with at least one of the processor 601, the game controller 630, or both. The communication interface may include a universal asynchronous receiver transmitter ("UART"). The UART may be operable to receive control signals for controlling operation of a tracking device, or for transmitting signals from the tracking device for communication with another device. Alternatively, the communication interface may include a universal serial bus ("USB") controller. The USB controller may be operable to receive control signals for controlling operation of the tracking device, or for transmitting signals from the tracking device for communication with another device.
In addition, the controller 630 may include one or more inertial sensors 632, which may provide position and/or orientation information to the processor 601 via inertial signals. The orientation information may include angular information such as tilt, roll, or yaw of the controller 630. By way of example, the inertial sensors 632 may include any number of accelerometers, gyroscopes, or tilt sensors, or any combination thereof. In a preferred embodiment, the inertial sensors 632 include a tilt sensor adapted to sense the orientation of the game controller 630 with respect to tilt and roll axes, a first accelerometer adapted to sense acceleration along a yaw axis, and a second accelerometer adapted to sense angular acceleration with respect to the yaw axis. An accelerometer may be implemented, for example, as a MEMS device including a mass mounted by one or more springs, with sensors for sensing displacement of the mass relative to one or more directions. Signals from the sensors that depend on the displacement of the mass may be used to determine the acceleration of the game controller 630. Such techniques may be implemented by instructions of the game program 604, which may be stored in the memory 602 and executed by the processor 601.
By way of example, an accelerometer suitable as the inertial sensor 632 may be a simple mass elastically coupled to a frame at three or four points, e.g., by springs. The pitch and roll axes lie in a plane that intersects the frame, which is mounted to the game controller 630. As the frame (and the game controller 630) rotates about the pitch and roll axes, the mass will be displaced under the influence of gravity, and the springs will elongate or compress in a way that depends on the angle of pitch and/or roll. The displacement of the mass can be sensed and converted to a signal that depends on the amount of pitch and/or roll. Angular acceleration about the yaw axis or linear acceleration along the yaw axis may also produce characteristic patterns of compression and/or elongation of the springs, or of motion of the mass, that can be sensed and converted to signals that depend on the amount of angular or linear acceleration. Such an accelerometer device can measure tilt, roll, angular acceleration about the yaw axis, and linear acceleration along the yaw axis by tracking the movement of the mass or the compression and expansion forces of the springs. There are many different ways to track the position of the mass and/or the forces exerted on it, including resistive strain-gauge material, photonic sensors, magnetic sensors, Hall-effect devices, piezoelectric devices, capacitive sensors, and the like.
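As a rough sketch of how displacement-dependent accelerometer signals can be turned into tilt angles, the following assumes readings already converted to units of g and uses standard trigonometric tilt formulas; the patent does not prescribe these particular formulas, and they hold only when the controller is approximately at rest.

```cpp
#include <cmath>
#include <cstdio>

const double kPi = 3.14159265358979323846;

// Estimate pitch and roll (in radians) from a 3-axis accelerometer reading in units of g.
// Assumes gravity is the only acceleration, i.e. the controller is held roughly still.
void tiltFromAccel(double ax, double ay, double az, double* pitch, double* roll) {
    *pitch = std::atan2(-ax, std::sqrt(ay * ay + az * az));
    *roll  = std::atan2(ay, az);
}

int main() {
    double pitch = 0.0, roll = 0.0;
    tiltFromAccel(0.0, 0.5, 0.866, &pitch, &roll);  // controller rolled roughly 30 degrees
    std::printf("pitch %.1f deg, roll %.1f deg\n",
                pitch * 180.0 / kPi, roll * 180.0 / kPi);
    return 0;
}
```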
In addition, the game controller 630 may include one or more light sources 634, such as light emitting diodes (LEDs). The light sources 634 may be used to distinguish one controller from another. For example, one or more LEDs may accomplish this by flashing or holding an LED pattern code. By way of example, five LEDs may be provided on the game controller 630 in a linear or two-dimensional pattern. Although a linear array of LEDs is preferred, the LEDs may alternatively be arranged in a rectangular or arcuate pattern to facilitate determination of the image plane of the LED array when analyzing an image of the LED pattern obtained by the image capture unit 623. Furthermore, the LED pattern code may also be used to determine the positioning of the game controller 630 during game play. For instance, the LEDs can assist in identifying the tilt, yaw, and roll of the controller. This detection pattern can assist in providing a better user feel in games, such as aircraft flying games. The image capture unit 623 may capture images containing the game controller 630 and the light sources 634. Analysis of such images can determine the location and/or orientation of the game controller. Such analysis may be implemented by program code instructions 604 stored in the memory 602 and executed by the processor 601. To facilitate capture of images of the light sources 634 by the image capture unit 623, the light sources 634 may be placed on two or more different sides of the game controller 630, e.g., on the front and on the back (as shown in phantom). Such placement allows the image capture unit 623 to obtain images of the light sources 634 for different orientations of the game controller 630, depending on how the game controller 630 is held by the user.
In addition, the light sources 634 may provide telemetry signals to the processor 601, e.g., in pulse-code, amplitude-modulation, or frequency-modulation format. Such telemetry signals may indicate which joystick buttons are being pressed and/or how hard such buttons are being pressed. Telemetry signals may be encoded into the optical signal, e.g., by pulse coding, pulse-width modulation, frequency modulation, or light intensity (amplitude) modulation. The processor 601 may decode the telemetry signal from the optical signal and execute a game command in response to the decoded telemetry signal. Telemetry signals may be decoded from analysis of images of the game controller 630 obtained by the image capture unit 623. Alternatively, the apparatus 600 may include a separate optical sensor dedicated to receiving telemetry signals from the light sources 634. The use of LEDs in conjunction with determining an intensity amount for interfacing with a computer program is described, e.g., in U.S. patent application Ser. No. 11/429,414, to Richard L. Marks et al., entitled "Computer Image and Audio Processing of Intensity and Input Devices for Interfacing with a Computer Program" (Attorney Docket No. SONYP052), filed May 4, 2006, which is incorporated herein by reference in its entirety. In addition, analysis of images containing the light sources 634 may be used both for telemetry and for determining the position and/or orientation of the game controller 630. Such techniques may be implemented by instructions of the program 604, which may be stored in the memory 602 and executed by the processor 601.
The processor 601 may use inertial signals from the inertial sensor 632 in conjunction with optical signals from the light sources 634 detected by the image capture unit 623, and/or sound source location and characterization information derived from acoustic signals detected by the microphone array 622, to deduce information on the location and/or orientation of the controller 630 and/or its user. For example, "acoustic radar" sound source location and characterization may be used in conjunction with the microphone array 622 to track a moving voice while the motion of the game controller is independently tracked (through the inertial sensor 632 and/or the light sources 634). In acoustic radar, a pre-calibrated listening zone is selected at runtime, and sounds originating from sources outside the pre-calibrated listening zone are filtered out. The pre-calibrated listening zones may include a listening zone that corresponds to a volume of focus or field of view of the image capture unit 623. One example of acoustic radar is described in detail in U.S. patent application Ser. No. 11/381,724, to Xiadong Mao, entitled "Methods and Apparatus for Targeted Sound Detection and Characterization", filed May 4, 2006, which is incorporated herein by reference. Any number of different combinations of different modes of providing control signals to the processor 601 may be used in conjunction with embodiments of the invention. Such techniques may be implemented by program code instructions 604, which may be stored in the memory 602 and executed by the processor 601, and may optionally include one or more instructions directing one or more processors to select a pre-calibrated listening zone at runtime and to filter out sounds originating from sources outside the pre-calibrated listening zone. The pre-calibrated listening zones may include a listening zone that corresponds to a volume of focus or field of view of the image capture unit 623.
The program 604 may optionally include one or more instructions directing one or more processors to produce discrete time-domain input signals x_m(t) from the microphones M_0 ... M_M of the microphone array 622, determine a listening sector, and use the listening sector in a semi-blind source separation to select finite impulse response filter coefficients to separate out different sound sources from the input signals x_m(t). The program 604 may also include instructions to apply one or more fractional delays to selected input signals x_m(t) other than an input signal x_0(t) from a reference microphone M_0. Each fractional delay may be selected to optimize a signal-to-noise ratio of a discrete time-domain output signal y(t) from the microphone array. The fractional delays may be selected so that a signal from the reference microphone M_0 is first in time relative to the signals from the other microphone(s) of the array. The program 604 may also include instructions to introduce a fractional time delay Δ into the output signal y(t) of the microphone array so that y(t+Δ) = x(t+Δ)*b_0 + x(t−1+Δ)*b_1 + x(t−2+Δ)*b_2 + ... + x(t−N+Δ)*b_N, where Δ lies between 0 and ±1. Examples of such techniques are described in detail in U.S. patent application Ser. No. 11/381,729, to Xiadong Mao, entitled "Ultra Small Microphone Array", filed May 4, 2006, the entire disclosure of which is incorporated by reference.
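The cited application describes the actual filter design; as a hedged sketch only, the fractionally delayed FIR output y(t+Δ) above might be approximated as follows, using simple linear interpolation to realize the fractional part of each delay. The coefficient values, the interpolation scheme, and the signal lengths are assumptions made for the example.

```cpp
#include <cstdio>
#include <vector>

// Read x(t - k + delta) from a sampled signal using linear interpolation,
// where 0 <= delta < 1 is the fractional part of the delay (a simple stand-in
// for whatever interpolator an actual implementation would use).
double sampleAt(const std::vector<double>& x, int t, int k, double delta) {
    int i = t - k;
    if (i < 0 || i + 1 >= static_cast<int>(x.size())) return 0.0;
    return (1.0 - delta) * x[i] + delta * x[i + 1];
}

// y(t + delta) = sum over k of b[k] * x(t - k + delta)
double fractionalDelayFir(const std::vector<double>& x, const std::vector<double>& b,
                          int t, double delta) {
    double y = 0.0;
    for (std::size_t k = 0; k < b.size(); ++k)
        y += b[k] * sampleAt(x, t, static_cast<int>(k), delta);
    return y;
}

int main() {
    std::vector<double> x = {0.0, 0.2, 0.4, 0.6, 0.8, 1.0};  // example input samples
    std::vector<double> b = {0.5, 0.3, 0.2};                 // example filter coefficients
    std::printf("y = %f\n", fractionalDelayFir(x, b, 4, 0.25));
    return 0;
}
```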
The program 604 may include one or more instructions that, when executed, cause the system 600 to select a pre-calibrated listening sector containing a source of sound. Such instructions may cause the apparatus to determine whether a source of sound lies within an initial sector or on a particular side of the initial sector. If the source of sound does not lie within the default sector, the instructions may, when executed, select a different sector on the particular side of the default sector. The different sector may be characterized by an attenuation of the input signals that is closest to an optimum value. These instructions may, when executed, calculate an attenuation of the input signals from the microphone array 622 and compare the attenuation to an optimum value. The instructions may, when executed, cause the apparatus 600 to determine a value of the attenuation of the input signals for one or more sectors and select the sector for which the attenuation is closest to the optimum value. An example of such a technique is described, e.g., in U.S. patent application Ser. No. 11/381,725, to Xiadong Mao, entitled "Methods and Apparatus for Targeted Sound Detection", filed May 4, 2006, the disclosure of which is incorporated herein by reference.
Signals from the inertial sensor 632 may provide part of a tracking information input, and signals generated by the image capture unit 623 from tracking the one or more light sources 634 may provide another part of the tracking information input. By way of example, and without limitation, such "mixed mode" signals may be used in a football-type video game in which a quarterback pitches the ball to the right after a head fake to the left. Specifically, a game player holding the controller 630 may turn his head to the left and make a sound while swinging the controller out to the right as if it were the football, making a throwing motion. The microphone array 622, in conjunction with "acoustic radar" program code, can track the user's voice. The image capture unit 623 can track the motion of the user's head, or track other commands that do not require sound or use of the controller. The sensor 632 may track the motion of the game controller (representing the football). The image capture unit 623 may also track the light sources 634 on the controller 630. The user may release the "ball" upon reaching a certain amount and/or direction of acceleration of the game controller 630, or upon a key command triggered by pressing a button on the controller 630.
In certain embodiments of the present invention, an inertial signal, e.g., from an accelerometer or gyroscope, may be used to determine the location of the controller 630. Specifically, an acceleration signal from an accelerometer may be integrated once with respect to time to determine a change in velocity, and the velocity may be integrated with respect to time to determine a change in position. If the values of the initial position and velocity at some time are known, the absolute position may be determined using these values together with the changes in velocity and position. Although position determination using an inertial sensor may be made more quickly than using the image capture unit 623 and the light sources 634, the inertial sensor 632 may be subject to a type of error known as "drift", in which errors accumulated over time can lead to a discrepancy between the position of the controller calculated from the inertial signal (shown in phantom) and the actual position of the game controller 630. Embodiments of the present invention allow a number of ways to deal with such errors.
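A minimal sketch of the double integration just described follows; the sampling rate, the forward-Euler integration scheme, and the constant test acceleration are assumptions for the example, and the comment about drift reflects the error source discussed in the text.

```cpp
#include <cstdio>

// Minimal dead-reckoning state for one axis: integrate acceleration once to get the
// change in velocity and again to get the change in position.  Small sensor errors
// accumulate over time, which is the "drift" discussed in the text.
struct DeadReckoning {
    double velocity = 0.0;
    double position = 0.0;

    void step(double accel, double dt) {
        velocity += accel * dt;     // dv = a * dt
        position += velocity * dt;  // dx = v * dt
    }
};

int main() {
    DeadReckoning dr;
    const double dt = 0.01;         // e.g. 100 Hz sampling
    for (int i = 0; i < 100; ++i)
        dr.step(0.5, dt);           // constant 0.5 m/s^2 for one second
    std::printf("v = %.3f m/s, x = %.3f m\n", dr.velocity, dr.position);
    return 0;
}
```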
For example, the drift may be cancelled out manually by re-setting the initial position of the controller 630 to be equal to the currently calculated position. A user may use one or more of the buttons on the controller 630 to trigger a command to re-set the initial position. Alternatively, image-based drift compensation may be implemented by re-setting the current position to a position determined, as a reference, from an image obtained by the image capture unit 623. Such image-based drift compensation may be implemented manually, e.g., when the user triggers one or more of the buttons on the game controller 630. Alternatively, image-based drift compensation may be implemented automatically, e.g., at regular intervals of time or in response to game play. Such techniques may be implemented by program code instructions 604, which may be stored in the memory 602 and executed by the processor 601.
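As an illustrative sketch (not the patented method), the manual reset and the image-based correction could be structured as below. The `TrackedPosition` structure, the blend factor, and the sample values are assumptions introduced only for the example.

```cpp
#include <cstdio>

// Sketch of drift compensation: whenever a camera-derived reference position is
// available (or the user presses a reset button), snap or blend the inertially
// integrated position back toward that reference.
struct TrackedPosition {
    double inertial = 0.0;   // position integrated from inertial signals

    // Hard reset, e.g. triggered by a button press.
    void resetTo(double reference) { inertial = reference; }

    // Soft correction toward a camera-derived reference; blend = 1.0 is a full reset.
    void correctToward(double reference, double blend) {
        inertial += blend * (reference - inertial);
    }
};

int main() {
    TrackedPosition p;
    p.inertial = 1.30;               // drifted estimate
    p.correctToward(1.00, 0.25);     // automatic periodic correction
    std::printf("after soft correction: %.3f\n", p.inertial);
    p.resetTo(1.00);                 // user-triggered reset
    std::printf("after hard reset:      %.3f\n", p.inertial);
    return 0;
}
```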
In certain embodiments, it may be desirable to compensate for spurious data in the inertial sensor signal. For example, the signal from the inertial sensor 632 may be oversampled, and a sliding average may be computed from the oversampled signal to remove spurious data from the inertial sensor signal. In some situations it may be desirable to oversample the signal, reject high and/or low values from some subset of the data points, and compute the sliding average from the remaining data points. Furthermore, other data sampling and manipulation techniques may be used to adjust the signal from the inertial sensor so as to remove or reduce the significance of spurious data. The choice of technique may depend on the nature of the signal, the computations to be performed with the signal, the nature of game play, or some combination of two or more of these. Such techniques may be implemented by instructions of the program 604, which may be stored in the memory 602 and executed by the processor 601.
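One simple way to realize the "reject the extremes, then average" idea is a trimmed mean over each oversampled window, sketched below; the window size, the choice to drop exactly one low and one high sample, and the test data are assumptions for the example.

```cpp
#include <algorithm>
#include <cstdio>
#include <vector>

// Smooth an oversampled inertial signal by discarding the highest and lowest readings
// in each window and averaging what remains (one simple way to suppress spurious samples).
double trimmedMean(std::vector<double> window) {
    if (window.size() <= 2) return window.empty() ? 0.0 : window.front();
    std::sort(window.begin(), window.end());
    double sum = 0.0;
    // Drop the single lowest and single highest sample in the window.
    for (std::size_t i = 1; i + 1 < window.size(); ++i) sum += window[i];
    return sum / static_cast<double>(window.size() - 2);
}

int main() {
    // A burst of oversampled readings containing one spurious spike (0.95).
    std::vector<double> window = {0.11, 0.10, 0.12, 0.95, 0.11, 0.10};
    std::printf("smoothed value = %.3f\n", trimmedMean(window));
    return 0;
}
```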
The processor 601 may perform analysis of the inertial signal data 606 as described above in response to the data 606 and to program code instructions of the program 604, which are stored in the memory 602 and retrieved and executed by the processor module 601. Code portions of the program 604 may conform to any one of a number of different programming languages, such as assembly, C++, JAVA, or a number of other languages. The processor module 601 forms a general-purpose computer that becomes a specific-purpose computer when executing programs such as the program code 604. Although the program code 604 is described herein as being implemented in software and executed on a general-purpose computer, those skilled in the art will realize that the method of task management could alternatively be implemented using hardware such as an application-specific integrated circuit (ASIC) or other hardware circuitry. As such, it should be understood that embodiments of the invention can be implemented, in whole or in part, in software, hardware, or some combination of both.
In one embodiment, among others, the program code 604 may include a set of processor-readable instructions that implement a method having features in common with the method 510 of FIG. 5B and the method 520 of FIG. 5C, or some combination of two or more of these. The program code 604 may generally include one or more instructions directing one or more processors to analyze signals from the inertial sensor 632 during play of a video game in order to generate position and/or orientation information, and to utilize that information.
The program code 604 may optionally include processor-executable instructions including one or more instructions that, when executed, cause the image capture unit 623 to monitor a field of view in front of the image capture unit 623, identify one or more of the light sources 634 within the field of view, detect a change in the light emitted from the light source(s) 634, and, in response to detecting the change, trigger an input command to the processor 601. The use of LEDs in conjunction with an image capture device to trigger actions in a game controller is described, e.g., in U.S. patent application Ser. No. 10/759,782, to Richard L. Marks, filed January 16, 2004, entitled "Method and Apparatus for Light Input Device", which is incorporated herein by reference in its entirety.
The program code 604 may optionally include processor-executable instructions including one or more instructions that, when executed, use signals from the inertial sensor and signals generated by the image capture unit from tracking the one or more light sources as inputs to the game system, as described above. The program code 604 may optionally include processor-executable instructions including one or more instructions that, when executed, compensate for drift in the inertial sensor 632.
In addition, the program code 604 may optionally include processor-executable instructions including one or more instructions that, when executed, adjust the gearing and mapping of controller manipulations to the game environment. Such a feature allows a user to change the "gearing" of manipulations of the game controller 630 to game state. For example, a 45-degree rotation of the game controller 630 may be geared to a 45-degree rotation of a game object. However, this 1:1 gearing ratio may be modified so that an X-degree rotation (or tilt or yaw or "manipulation") of the controller translates to a Y-degree rotation (or tilt or yaw or "manipulation") of the game object. Gearing may be a 1:1 ratio, a 1:2 ratio, a 1:X ratio, or an X:Y ratio, where X and Y may take on arbitrary values. Additionally, the mapping of input channel to game control may be modified over time or instantly. Modifications may include changing gesture trajectory models, modifying the positioning, scale, or thresholds of gestures, and so on. Such mapping may be programmed, random, tiered, staggered, etc., in order to provide the user with a dynamic range of manipulation. Modification of the mapping, gearing, or ratios can be adjusted by the game program 604 according to game play or game state, through a user modifier button (keypad, etc.) located on the game controller 630, or broadly in response to the input channel. The input channel may include, but is not limited to, user audio, audio generated by the controller, tracking audio generated by the controller, controller button state, video camera output, and controller telemetry data, including accelerometer data, tilt, yaw, roll, position, acceleration, and any other data from sensors capable of tracking the user or the user's manipulation of an object.
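As a small hedged sketch of the gearing idea, the following applies an X:Y ratio to a controller rotation; the function name and the sample ratios are illustrative only.

```cpp
#include <cstdio>

// Apply a gearing ratio: X degrees of controller rotation become Y degrees of
// in-game rotation, with ratio = Y / X.  A ratio of 1.0 is the 1:1 case in the text.
double applyGearing(double controllerDeltaDegrees, double ratio) {
    return controllerDeltaDegrees * ratio;
}

int main() {
    std::printf("1:1   -> %.1f deg\n", applyGearing(45.0, 1.0));   // 45 -> 45
    std::printf("1:2   -> %.1f deg\n", applyGearing(45.0, 2.0));   // 45 -> 90
    std::printf("2:1   -> %.1f deg\n", applyGearing(45.0, 0.5));   // 45 -> 22.5
    return 0;
}
```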
In certain embodiments, the game program 604 may change the mapping or gearing over time from one scheme or ratio to another in a predetermined, time-dependent manner. Gearing and mapping changes can be applied to the game environment in various ways. In one example, a video game character may be controlled under one gearing scheme while the character is healthy, and as the character's health deteriorates the system may adjust the gearing of the controller commands so that the user is forced to exaggerate the movements of the controller in order to gesture commands to the character. A video game character who becomes disoriented may force a change of mapping of the input channel, for example requiring the user to adjust their input in order to regain control of the character under the new mapping. Mapping schemes that modify the translation of the input channel into game commands may also change during game play. Such translation changes may occur in various ways, in response to one or more elements of game state or in response to modifier commands issued under one or more elements of the input channel. Gearing and mapping may also be configured to influence the configuration and/or processing of one or more elements of the input channel.
In addition, a sound emitter 636, such as a speaker, buzzer, horn, pipe, or the like, may be mounted to the joystick controller 630. In certain embodiments, the sound emitter may be detachably mounted to a "body" of the joystick controller 630. In "acoustic radar" embodiments in which the program code 604 locates and characterizes sounds detected with the microphone array 622, the sound emitter 636 may provide an audio signal that can be detected by the microphone array 622 and used by the program code 604 to track the position of the game controller 630. The sound emitter 636 may also be used to provide an additional "input channel" from the game controller 630 to the processor 601. Audio signals from the sound emitter 636 may be periodically pulsed to provide a beacon for the acoustic radar to track location. The audio signals (pulsed or otherwise) may be audible or ultrasonic. The acoustic radar may track the user's manipulation of the game controller 630, and such manipulation tracking may include information about the position and orientation (e.g., pitch, roll, or yaw angle) of the game controller 630. The pulses may be triggered at an appropriate duty cycle, as one skilled in the art is capable of applying. Pulses may be initiated based on a control signal arbitrated by the system. The system 600 (through the program code 604) may coordinate the dispatch of control signals among two or more joystick controllers 630 coupled to the processor 601, to assure that multiple controllers can be tracked.
In certain embodiments, the mixer 605 may be configured to obtain inputs for controlling execution of the game program 604 using inputs received from conventional controls on the game controller 630, such as the analog joystick controls 631 and the buttons 633. Specifically, the mixer 605 may receive controller input information from the controller 630. The controller input information may include at least one of: a) information identifying a current position of a user-movable control stick of the game controller relative to a rest position of the control stick, or b) information identifying whether a switch included in the game controller is active. The mixer 605 may also receive additional input information from the environment in which the controller 630 is being used. By way of example, and without limitation, the additional input information may include one or more of: i) information obtained from an image capture device in the environment (e.g., the image capture unit 623); and/or ii) information from at least one inertial sensor associated with the game controller or the user (e.g., the inertial sensor 632); and/or iii) acoustic information obtained from an acoustic transducer in the environment (e.g., from the microphone array 622, possibly in conjunction with acoustic signals generated by the sound emitter 636).
The controller input information may also include information identifying whether a pressure-sensitive button is active. The mixer 605 may obtain a combined input for controlling execution of the game program 604 by processing the controller input information and the additional input information to produce the combined input.
The combined input may include respective merged inputs for controlling each of a plurality of functions during execution of the game program 604. At least some of the merged inputs may be obtained by merging controller input information relating to a particular individual function with additional input information relating to that same function. The combined input may include a merged input for controlling a certain function during execution of the game program 604, and at least some of the merged input may be obtained by merging controller input information relating to that function with additional input information relating to that function. In such cases, the merging may be performed by taking an average of a value representing the controller input information and a value representing the additional input information. By way of example, the average of the value of the controller input information and the value of the additional input information may be taken according to a one-to-one ratio. Alternatively, the controller input information and the additional input information may each be given different weights, and the averaging may be performed as a weighted average of the values of the controller input information and the additional input information according to the assigned weights.
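A hedged sketch of this merging step follows; the function name, the normalization of the two inputs, and the example weights are assumptions made for the illustration.

```cpp
#include <cstdio>

// Merge a joystick value and an inertially derived value for the same game function.
// wController + wInertial is expected to be 1.0; 0.5/0.5 gives the one-to-one average case.
double mergeInputs(double controllerValue, double inertialValue,
                   double wController, double wInertial) {
    return wController * controllerValue + wInertial * inertialValue;
}

int main() {
    double stickPitch  = 0.40;  // normalized control stick deflection
    double sensorPitch = 0.60;  // normalized pitch from the inertial sensor
    std::printf("1:1 average      = %.2f\n", mergeInputs(stickPitch, sensorPitch, 0.5, 0.5));
    std::printf("weighted (70/30) = %.2f\n", mergeInputs(stickPitch, sensorPitch, 0.7, 0.3));
    return 0;
}
```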
In certain embodiments, a value of a first one of the controller input information or the additional input information may be used as a modifying input to the game program, for modifying a control of a function that was activated or remains active according to at least a second one of the controller input information or the additional input information. The additional input information may include inertial sensor information obtained by operation of the inertial sensor 632 and/or orientation information representing an orientation of a user-movable object. Alternatively, the additional input information may include information indicative of at least one of a position or an orientation of a user-movable object. As used here, a "user-movable object" may refer to the controller 630 or to an article mounted to the body of the controller 630, and the additional input information may include information indicative of the orientation of the user-movable object. By way of example, such orientation information may include information indicative of at least one of pitch, yaw, or roll.
In certain embodiments, the combined input may be obtained by merging a value of the controller input information representing a position of a control stick (e.g., one of the analog joysticks 631) with a value of the additional input information representing the orientation of the user-movable object. As noted above, the user-movable object may include an object mounted to the game controller 630 and/or the game controller 630 itself. When the control stick is moved backward while pitch is increasing toward a positive (nose-up) value, the combined input may reflect an enhanced nose-up input. Similarly, when the control stick is moved forward while pitch is decreasing toward a negative (nose-down) value, the combined input may reflect an enhanced nose-down input.
The combined input may be obtained by designating a value of the controller input information representing the position of the control stick as coarse control information and designating a value of the additional input information representing the orientation of the user-movable object as fine control information. Alternatively, the combined input may be obtained by designating a value of the controller input information identifying whether a switch of the game controller is active as coarse control information and designating a value of the additional input information representing the orientation of the user-movable object as fine control information. Further, the combined input may be obtained by designating a value of the additional input information representing the orientation of the user-movable object as coarse control information and designating a value of the controller input information representing the position of the control stick as fine control information. In addition, the combined input may also be obtained by designating a value of the controller input information identifying whether a switch of the game controller is active as fine control information and designating a value of the additional input information representing the orientation of the user-movable object as coarse control information. In any or all of these cases, the combined input may represent a value of the coarse control information adjusted by a smaller amount according to the fine control information.
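One possible reading of "coarse control adjusted by a smaller amount according to the fine control" is sketched below; the 10% scale factor and the sample values are arbitrary assumptions for the example, not values taken from the patent.

```cpp
#include <cstdio>

// Combine a coarse control value with a fine control value that adjusts it by a
// smaller amount, as in the coarse/fine pairing described above.
double coarsePlusFine(double coarse, double fine, double fineScale = 0.1) {
    return coarse + fineScale * fine;
}

int main() {
    double stickPosition = 0.80;   // coarse: control stick deflection
    double pitchAngle    = -0.25;  // fine: normalized controller pitch
    std::printf("combined input = %.3f\n", coarsePlusFine(stickPosition, pitchAngle));
    return 0;
}
```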
In certain embodiments, the combined input may be obtained by additively combining a value represented by the controller input information and a value represented by the additional input information, such that the combined input provides the game program 604 with a signal having a value higher or lower than either of the values taken separately by the controller input information or the additional input information. Alternatively, the combined input may provide the game program 604 with a signal having a smoothed value, the smoothed-value signal varying over time more slowly than either of the values taken separately by the controller input information or the additional input information. The combined input may also provide the game program with a high-resolution signal having increased signal content; the high-resolution signal may vary over time more rapidly than either of the values taken separately by the controller input information or the additional input information.
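As one hedged way to obtain a "smoothed" combined signal (the patent does not specify a filter), an exponential moving average could be applied to the additive combination of the two inputs; the `Smoother` structure, the alpha value, and the sample sequences are assumptions for the example.

```cpp
#include <cstdio>

// One way to obtain a smoothed combined signal: an exponential moving average over
// the sum of the two inputs.  Smaller alpha makes the output change more slowly
// than either input taken separately.
struct Smoother {
    double state = 0.0;
    double update(double controllerValue, double additionalValue, double alpha) {
        double combined = controllerValue + additionalValue;   // additive combination
        state = alpha * combined + (1.0 - alpha) * state;      // low-pass the result
        return state;
    }
};

int main() {
    Smoother s;
    const double controller[] = {0.2, 0.8, 0.1, 0.9};
    const double additional[] = {0.1, 0.2, 0.1, 0.2};
    for (int i = 0; i < 4; ++i)
        std::printf("t=%d smoothed = %.3f\n", i, s.update(controller[i], additional[i], 0.3));
    return 0;
}
```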
Although embodiments of the invention have been described in terms of examples related to game play with the game controller 630, embodiments of the invention, including the system 600, may be used with any user-manipulable body, molded object, knob, structure, or the like that has inertial sensing capability and the capability of transmitting inertial sensor signals, wirelessly or otherwise.
By way of example, embodiments of the invention may be implemented on parallel processing systems. Such parallel processing systems typically include two or more processor elements configured to execute parts of a program in parallel using separate processors. By way of example, and without limitation, FIG. 7 illustrates a type of Cell processor 700 according to an embodiment of the present invention. The Cell processor 700 may be used as the processor 601 of FIG. 6 or the processor 502 of FIG. 5A. In the example depicted in FIG. 7, the Cell processor 700 includes a main memory 702, a power processor element (PPE) 704, and a number of synergistic processor elements (SPEs) 706. In the example depicted in FIG. 7, the Cell processor 700 includes a single PPE 704 and eight SPEs 706. In such a configuration, seven of the SPEs 706 may be used for parallel processing and one may be reserved as a back-up in case one of the other seven fails. A Cell processor may alternatively include multiple groups of PPEs (PPE groups) and multiple groups of SPEs (SPE groups). In that case, hardware resources can be shared between units within a group. However, the SPEs and PPEs must appear to software as independent elements. As such, embodiments of the present invention are not limited to use with the configuration shown in FIG. 7.
The main memory 702 typically includes both general-purpose and non-volatile storage, as well as special-purpose hardware registers or arrays used for functions such as system configuration, data-transfer synchronization, memory-mapped I/O, and I/O subsystems. In embodiments of the present invention, a video game program 703 may be resident in the main memory 702. The memory 702 may also contain signal data 709. The video program 703 may include an inertial analyzer, an image analyzer, an acoustic analyzer, and a mixer configured as described above with respect to FIG. 4, FIG. 5A, FIG. 5B, or FIG. 5C, or some combination of these. The program 703 may run on the PPE. The program 703 may be divided into multiple signal-processing tasks that can be executed on the SPEs and/or the PPE.
By way of example, the PPE 704 may be a 64-bit PowerPC Processor Unit (PPU) with associated L1 and L2 caches. The PPE 704 is a general-purpose processing unit that can access system management resources (such as memory-protection tables, for example). Hardware resources may be mapped explicitly to the real address space as seen by the PPE. Therefore, the PPE can address any of these resources directly by using an appropriate effective address value. A primary function of the PPE 704 is the management and allocation of tasks for the SPEs 706 in the Cell processor 700.
Although only a single PPE is shown in FIG. 7, some Cell processor implementations, such as the Cell Broadband Engine Architecture (CBEA), may have multiple PPEs organized into PPE groups, of which there may be more than one PPE per group. These PPE groups may share access to the main memory 702. Furthermore, the Cell processor 700 may include two or more groups of SPEs. The SPE groups may also share access to the main memory 702. Such configurations are within the scope of the present invention.
Each SPE 706 includes a synergistic processor unit (SPU) and its own local storage area LS. The local storage LS may include one or more separate areas of memory storage, each associated with a specific SPU. Each SPU may be configured to execute only instructions (including data load and data store operations) from within its own associated local storage domain. In such a configuration, data transfers between the local storage LS and elsewhere in the system 700 may be performed by issuing direct memory access (DMA) commands from the memory flow controller (MFC) to transfer data to or from the local storage domain of the individual SPE. The SPUs are less complex computational units than the PPE 704, in that they do not perform any system management functions. The SPUs generally have single-instruction, multiple-data (SIMD) capability and typically process data and initiate any required data transfers (subject to access properties set up by the PPE) in order to perform their allocated tasks. The purpose of the SPU is to enable applications that require a higher computational unit density and that can effectively use the provided instruction set. A significant number of SPEs in a system managed by the PPE 704 allows for cost-effective processing over a wide range of applications.
Each SPE 706 may include a dedicated memory flow controller (MFC) that includes an associated memory management unit capable of holding and processing memory-protection and access-permission information. The MFC provides the primary method for data transfer, protection, and synchronization between the main storage of the Cell processor and the local storage of an SPE. An MFC command describes the transfer to be performed. Commands for transferring data are sometimes referred to as MFC direct memory access (DMA) commands (or MFC DMA commands).
Each MFC may support multiple DMA transfers at the same time and can maintain and process multiple MFC commands. Each MFC DMA data-transfer command request may involve both a local storage address (LSA) and an effective address (EA). The local storage address may directly address only the local storage area of its associated SPE. The effective address may have a more general application; for example, it may be able to reference main storage, including all the SPE local storage areas, if they are aliased into the real address space.
To facilitate communication between the SPEs 706 and/or between the SPEs 706 and the PPE 704, the SPEs 706 and the PPE 704 may include signal-notification registers tied to signaling events. The PPE 704 and the SPEs 706 may be coupled by a star topology in which the PPE 704 acts as a router to transmit messages to the SPEs 706. Alternatively, each SPE 706 and the PPE 704 may have a one-way signal-notification register referred to as a mailbox. The mailbox can be used by an SPE 706 to host operating system (OS) synchronization.
The Cell processor 700 may include an input/output (I/O) function 708 through which the Cell processor 700 may interface with peripheral devices, such as a microphone array 712, an optional image capture unit 713, and a game controller 730. The game controller unit may include an inertial sensor 732 and light sources 734. In addition, an Element Interconnect Bus 710 may connect the various components listed above. Each SPE and the PPE can access the bus 710 through a bus interface unit BIU. The Cell processor 700 may also include two controllers typically found in a processor: a Memory Interface Controller MIC that controls the flow of data between the bus 710 and the main memory 702, and a Bus Interface Controller BIC that controls the flow of data between the I/O 708 and the bus 710. Although the requirements for the MIC, BIC, BIUs, and bus 710 may vary widely for different implementations, those of skill in the art will be familiar with their functions and with circuits for implementing them.
The Cell processor 700 may also include an internal interrupt controller IIC. The IIC component manages the priority of the interrupts presented to the PPE. The IIC allows interrupts from the other components of the Cell processor 700 to be handled without using a main system interrupt controller. The IIC may be regarded as a second-level controller. The main system interrupt controller may handle interrupts originating external to the Cell processor.
In embodiments of the present invention, certain computations, such as the fractional delays described above, may be performed in parallel using the PPE 704 and/or one or more of the SPEs 706. Each fractional delay calculation may be run as one or more separate tasks that different SPEs 706 may take up as they become available.
While the above is a complete description of the preferred embodiments of the present invention, it is possible to use various alternatives, modifications, and equivalents. Therefore, the scope of the present invention should be determined not with reference to the above description, but instead with reference to the appended claims, along with their full scope of equivalents. Any feature described herein, whether preferred or not, may be combined with any other feature described herein, whether preferred or not. In the claims that follow, the indefinite article "a" or "an" refers to a quantity of one or more of the items following the article, except where expressly stated otherwise. The appended claims are not to be interpreted as including means-plus-function limitations unless such a limitation is explicitly recited in a given claim using the phrase "means for".

Claims (27)

1. A controller for controlling operation of a program, comprising:
a source of controller input information from user manipulation of the controller, the controller input information including information identifying a current state of a user-movable switch or control stick on the controller;
a source of additional input information from the controller, wherein the additional input information includes information indicative of three-dimensional motion of the controller; and
wherein the controller input information and the additional input information are configured to be combined, by processing the controller input information and the additional input information, to obtain a combined input for controlling operation of the program, and wherein the combined input is obtained by designating a value of the controller input information as coarse control information and a value of the additional input information representing an orientation of the user-movable object as fine control information, or vice versa.
2. The controller of claim 1, wherein the controller input information and the additional input information are arranged such that the combined input includes a merged input for controlling a certain function during execution of the program, and at least some of the merged input is obtained by merging the controller input information relating to that function with the additional input information relating to that function.
3. The controller of claim 2, wherein the merging is performed by taking an average of a value representing the controller input information and a value representing the additional input information.
4. The controller of claim 3, wherein the average of the value of the controller input information and the value of the additional input information is taken according to a one-to-one ratio.
5. The controller of claim 3, wherein the controller input information and the additional input information are each given different weights, and the averaging is performed as a weighted average of the values of the controller input information and the additional input information according to the assigned weights.
6. The controller of claim 1, wherein a value of a first one of the controller input information or the additional input information is configured as a modifying input to the program, for modifying a control of a function that was activated or remains active according to at least a second one of the controller input information or the additional input information.
7. The controller of claim 1, wherein the source of the additional information includes an LED light source and a diffuser on the controller, wherein the diffuser is configured to diffuse light from the light source, and wherein the additional input information is obtained from a diffused image of the LED light source in an image obtained from an image capture device.
8. The controller of claim 1, wherein the additional input information further includes at least one of inertial sensor information obtained by operation of an inertial sensor or orientation information representing an orientation of a user-movable object.
9. The controller of claim 8, wherein the inertial sensor is mounted to the controller, and the inertial sensor includes at least one of an accelerometer or a gyroscope.
10. The controller of claim 1, wherein the additional input information includes information indicative of at least one of a position or an orientation of a user-movable object.
11. The controller of claim 10, wherein the user-movable object includes at least one of the controller or an article mounted to a body of the controller, and the additional input information includes information indicative of the orientation of the user-movable object.
12. The controller of claim 10, wherein the additional input information includes information indicative of at least one of pitch, yaw, or roll.
13. The controller of claim 12, wherein the additional input information includes information indicative of pitch, yaw, and roll.
14. The controller of claim 10, wherein the combined input is obtained by merging a value of the controller input information representing the state of the switch or control stick with a value of the additional input information representing the orientation of the user-movable object.
15. The controller of claim 14, wherein, when the control stick is moved backward while pitch is increasing toward a positive (nose-up) value, the combined input reflects an enhanced nose-up input.
16. The controller of claim 15, wherein, when the control stick is moved forward while pitch is decreasing toward a negative (nose-down) value, the combined input reflects an enhanced nose-down input.
17. The controller of claim 14, wherein the combined input represents a value of the coarse control information adjusted by a smaller amount according to the fine control information.
18. The controller of claim 14, wherein the combined input is obtained by designating a value of the controller input information identifying whether the switch or control stick on the controller is active as coarse control information and designating a value of the additional input information representing the orientation of the user-movable object as fine control information, wherein the combined input represents a value of the coarse control information adjusted by a smaller amount according to the fine control information.
19. The controller of claim 14, wherein the controller input information and the additional information are arranged such that the combined input is obtained by designating a value of the controller input information identifying whether the switch or control stick on the controller is active as fine control information and designating a value of the additional input information representing the orientation of the user-movable object as coarse control information, wherein the combined input represents a value of the coarse control information adjusted by a smaller amount according to the fine control information.
20. The controller of claim 1, wherein the controller input information and the additional information are arranged such that the combined input is obtained by additively combining a value represented by the controller input information and a value represented by the additional input information, so that the combined input provides the program with a signal having a value higher than either of the values taken separately by the controller input information or the additional input information.
21. The controller of claim 1, wherein the controller input information and the additional information are arranged such that the combined input is obtained by subtractively combining a value represented by the controller input information and a value represented by the additional input information, so that the combined input provides the program with a signal having a value lower than either of the values taken separately by the controller input information or the additional input information.
22. The controller of claim 1, wherein the controller input information and the additional information are arranged such that the combined input provides the program with a signal having a smoothed value, the smoothed-value signal varying over time more slowly than either of the values taken separately by the controller input information or the additional input information.
23. The controller of claim 1, wherein the controller input information and the additional information are arranged such that the combined input provides the program with a high-resolution signal having increased signal content, the high-resolution signal varying over time more rapidly than either of the values taken separately by the controller input information or the additional input information.
24. The controller of claim 1, wherein the additional input information includes acoustic information obtained from an acoustic transducer in the environment in response to sound emitted from a sound source on the controller.
25. The controller of claim 1, wherein the controller input information includes information identifying whether a pressure-sensitive button is active.
26. The controller of claim 1, wherein the additional input information includes at least one of: i) information obtained from an image capture device in the environment, ii) information from at least one inertial sensor associated with the controller or the user, or iii) information from an acoustic transducer in the environment.
27. The controller of claim 1, wherein the additional input information includes information obtained from an image capture device in the environment, information from at least one inertial sensor associated with the controller or the user, and information from an acoustic transducer in the environment.
CN201210496712.8A 2006-05-04 2007-04-14 Obtaining the input used for controlling the operation of a game program Active CN102989174B (en)

Applications Claiming Priority (137)

Application Number Priority Date Filing Date Title
PCT/US2006/017483 WO2006121896A2 (en) 2005-05-05 2006-05-04 Microphone array based selective sound source listening and video game control
US11/429047 2006-05-04
US11/381,729 2006-05-04
US11/381,725 US7783061B2 (en) 2003-08-27 2006-05-04 Methods and apparatus for the targeted sound detection
US11/418,988 US8160269B2 (en) 2003-08-27 2006-05-04 Methods and apparatuses for adjusting a listening area for capturing sounds
US11/418989 2006-05-04
US11/429133 2006-05-04
US11/418,988 2006-05-04
US11/381,724 US8073157B2 (en) 2003-08-27 2006-05-04 Methods and apparatus for targeted sound detection and characterization
US11/429,047 2006-05-04
US11/381,724 2006-05-04
US11/381,725 2006-05-04
US11/429,414 2006-05-04
US11/381724 2006-05-04
US11/381,729 US7809145B2 (en) 2006-05-04 2006-05-04 Ultra small microphone array
US11/429,133 2006-05-04
US11/381,727 US7697700B2 (en) 2006-05-04 2006-05-04 Noise removal for electronic device with far field microphone on console
US11/381,728 2006-05-04
US11/381727 2006-05-04
US11/381,727 2006-05-04
US11/381,721 US8947347B2 (en) 2003-08-27 2006-05-04 Controlling actions in a video game unit
US11/429,414 US7627139B2 (en) 2002-07-27 2006-05-04 Computer image and audio processing of intensity and input devices for interfacing with a computer program
US11/429,047 US8233642B2 (en) 2003-08-27 2006-05-04 Methods and apparatuses for capturing an audio signal based on a location of the signal
US11/381,721 2006-05-04
US11/429414 2006-05-04
US11/381,728 US7545926B2 (en) 2006-05-04 2006-05-04 Echo and noise cancellation
US11/381729 2006-05-04
US11/418,989 2006-05-04
US11/418988 2006-05-04
US11/381728 2006-05-04
USPCT/US2006/017483 2006-05-04
US11/381725 2006-05-04
US11/429,133 US7760248B2 (en) 2002-07-27 2006-05-04 Selective sound source listening in conjunction with computer interactive processing
US11/418,989 US8139793B2 (en) 2003-08-27 2006-05-04 Methods and apparatus for capturing audio signals based on a visual image
US11/381721 2006-05-04
US79803106P 2006-05-06 2006-05-06
US29/259349 2006-05-06
US29259349 2006-05-06
US11/382,031 US7918733B2 (en) 2002-07-27 2006-05-06 Multi-input game control mixer
US29/259,350 2006-05-06
US11/382,032 US7850526B2 (en) 2002-07-27 2006-05-06 System for tracking user manipulations within an environment
US11/382,033 US8686939B2 (en) 2002-07-27 2006-05-06 System, method, and apparatus for three-dimensional input control
US11/382037 2006-05-06
US29/259,350 USD621836S1 (en) 2006-05-06 2006-05-06 Controller face with tracking sensors
US11/382035 2006-05-06
US11/382031 2006-05-06
US11/382032 2006-05-06
US29259348 2006-05-06
US29/259350 2006-05-06
US29/259348 2006-05-06
US11/382,032 2006-05-06
US11/382,034 US20060256081A1 (en) 2002-07-27 2006-05-06 Scheme for detecting and tracking user manipulation of a game controller body
US11/382,037 2006-05-06
US11/382033 2006-05-06
US11/382,035 US8797260B2 (en) 2002-07-27 2006-05-06 Inertially trackable hand-held controller
US11/382038 2006-05-06
US11/382,036 2006-05-06
US11/382,031 2006-05-06
US11/382,033 2006-05-06
US11/382,036 US9474968B2 (en) 2002-07-27 2006-05-06 Method and system for applying gearing effects to visual tracking
US11/382,038 2006-05-06
US29/259,349 2006-05-06
US11/382036 2006-05-06
US11/382,038 US7352358B2 (en) 2002-07-27 2006-05-06 Method and system for applying gearing effects to acoustical tracking
US11/382,034 2006-05-06
US11/382034 2006-05-06
US11/382,037 US8313380B2 (en) 2002-07-27 2006-05-06 Scheme for translating movements of a hand-held controller into inputs for a system
US60/798,031 2006-05-06
US29/259,348 2006-05-06
US11/382,035 2006-05-06
US60/798031 2006-05-06
US11/382,041 2006-05-07
US11/382,043 US20060264260A1 (en) 2002-07-27 2006-05-07 Detectable and trackable hand-held controller
US11/382040 2006-05-07
US11/382,041 US7352359B2 (en) 2002-07-27 2006-05-07 Method and system for applying gearing effects to inertial tracking
US11/382,039 US9393487B2 (en) 2002-07-27 2006-05-07 Method for mapping movements of a hand-held controller to game commands
US11/382,040 US7391409B2 (en) 2002-07-27 2006-05-07 Method and system for applying gearing effects to multi-channel mixed input
US11/382041 2006-05-07
US11/382039 2006-05-07
US11/382043 2006-05-07
US11/382,043 2006-05-07
US11/382,039 2006-05-07
US11/382,040 2006-05-07
US11/382252 2006-05-08
US29/246,762 2006-05-08
US29/246759 2006-05-08
US11/382,250 2006-05-08
US11/382,251 2006-05-08
US11/382251 2006-05-08
US29/246,759 2006-05-08
US29/246762 2006-05-08
US29/246,744 2006-05-08
US11/430,593 2006-05-08
US29/246765 2006-05-08
US29/246,766 2006-05-08
US11/382259 2006-05-08
US29/246768 2006-05-08
US11/382,258 US7782297B2 (en) 2002-07-27 2006-05-08 Method and apparatus for use in determining an activity level of a user in relation to a system
US11/382,250 US7854655B2 (en) 2002-07-27 2006-05-08 Obtaining input for controlling execution of a game program
US29/246,764 USD629000S1 (en) 2006-05-08 2006-05-08 Game interface device with optical port
US11/382,251 US20060282873A1 (en) 2002-07-27 2006-05-08 Hand-held controller having detectable elements for tracking purposes
US29/246,743 USD571367S1 (en) 2006-05-08 2006-05-08 Video game controller
US29/246,767 2006-05-08
US29246762 2006-05-08
US11/382,259 2006-05-08
US29/246743 2006-05-08
US29246759 2006-05-08
US11/382,256 US7803050B2 (en) 2002-07-27 2006-05-08 Tracking device with sound emitter for use in obtaining information for controlling game program execution
US11/430594 2006-05-08
US29246766 2006-05-08
US29/246,744 USD630211S1 (en) 2006-05-08 2006-05-08 Video game controller front face
US11/382250 2006-05-08
US11/382258 2006-05-08
US11/382,256 2006-05-08
US29/246744 2006-05-08
US29/246,763 2006-05-08
US29/246,765 2006-05-08
US11/382256 2006-05-08
US29/246763 2006-05-08
US29/246,768 USD571806S1 (en) 2006-05-08 2006-05-08 Video game controller
US11/430,593 US20070261077A1 (en) 2006-05-08 2006-05-08 Using audio/visual environment to select ads on game platform
US29/246,767 USD572254S1 (en) 2006-05-08 2006-05-08 Video game controller
US29/246766 2006-05-08
US29/246,764 2006-05-08
US11/430,594 US20070260517A1 (en) 2006-05-08 2006-05-08 Profile detection
US11/382,259 US20070015559A1 (en) 2002-07-27 2006-05-08 Method and apparatus for use in determining lack of user activity in relation to a system
US29246763 2006-05-08
US29/246,743 2006-05-08
US11/382,252 2006-05-08
US11/382,258 2006-05-08
US11/430593 2006-05-08
US29246765 2006-05-08
US11/430,594 2006-05-08
US29/246767 2006-05-08
US29/246,768 2006-05-08
US11/382,252 US10086282B2 (en) 2002-07-27 2006-05-08 Tracking device for use in obtaining information for controlling game program execution
US29/246764 2006-05-08

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
CN200780025400.6A Division CN101484221B (en) 2006-05-04 2007-04-14 Obtaining input for controlling execution of a game program

Publications (2)

Publication Number Publication Date
CN102989174A true CN102989174A (en) 2013-03-27
CN102989174B CN102989174B (en) 2016-06-29

Family

ID=46469882

Family Applications (2)

Application Number Title Priority Date Filing Date
CN201210496712.8A Active CN102989174B (en) 2006-05-04 2007-04-14 Obtaining input for controlling the operation of a game program
CN201210037498.XA Active CN102580314B (en) 2006-05-04 2007-04-14 Obtaining input for controlling execution of a game program

Family Applications After (1)

Application Number Title Priority Date Filing Date
CN201210037498.XA Active CN102580314B (en) 2006-05-04 2007-04-14 Obtaining input for controlling execution of a game program

Country Status (3)

Country Link
JP (3) JP2009535173A (en)
CN (2) CN102989174B (en)
WO (2) WO2007130793A2 (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108604454A (en) * 2016-03-16 2018-09-28 华为技术有限公司 Audio signal processor and input audio signal processing method
CN110891658A (en) * 2017-05-05 2020-03-17 安德烈·瓦莱里维赫·格鲁兹杰夫 Control equipment of motion system

Families Citing this family (25)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10279254B2 (en) * 2005-10-26 2019-05-07 Sony Interactive Entertainment Inc. Controller having visually trackable object for interfacing with a gaming system
JP5659453B2 (en) 2007-11-15 2015-01-28 セイコーエプソン株式会社 Ink composition
US20090221368A1 (en) * 2007-11-28 2009-09-03 Ailive Inc., Method and system for creating a shared game space for a networked game
US8419545B2 (en) * 2007-11-28 2013-04-16 Ailive, Inc. Method and system for controlling movements of objects in a videogame
GB2458297B (en) * 2008-03-13 2012-12-12 Performance Designed Products Ltd Pointing device
JP4628483B2 (en) * 2008-07-15 2011-02-09 パナソニック株式会社 Portable device and position specifying method thereof
WO2010062521A1 (en) * 2008-10-27 2010-06-03 Sony Computer Entertainment Inc. Spherical ended controller with configurable modes
KR20100138725A (en) * 2009-06-25 2010-12-31 삼성전자주식회사 Method and apparatus for processing virtual world
JP5534729B2 (en) * 2009-07-16 2014-07-02 株式会社タイトー Screen coordinate position detection method, screen coordinate position detection apparatus and gun game apparatus using double circle index
CN106943742B (en) * 2011-02-11 2024-04-26 漳州市阿思星谷电子科技有限公司 Action amplifying system
US20120277001A1 (en) * 2011-04-28 2012-11-01 Microsoft Corporation Manual and Camera-based Game Control
US10960300B2 (en) 2011-11-23 2021-03-30 Sony Interactive Entertainment LLC Sharing user-initiated recorded gameplay with buffered gameplay
US8870654B2 (en) * 2011-11-23 2014-10-28 Sony Computer Entertainment America Llc Gaming controller
US10525347B2 (en) 2012-03-13 2020-01-07 Sony Interactive Entertainment America Llc System and method for capturing and sharing console gaming data
US10486064B2 (en) 2011-11-23 2019-11-26 Sony Interactive Entertainment America Llc Sharing buffered gameplay in response to an input request
US8672765B2 (en) * 2012-03-13 2014-03-18 Sony Computer Entertainment America Llc System and method for capturing and sharing console gaming data
US9116555B2 (en) 2011-11-23 2015-08-25 Sony Computer Entertainment America Llc Gaming controller
CN103974752B (en) * 2011-12-19 2016-05-18 英派尔科技开发有限公司 Be used for the time-out of the game based on posture and restart scheme
CN104704455B (en) 2012-10-15 2017-09-01 索尼电脑娱乐公司 Operation device
EP3741438B1 (en) * 2012-10-15 2023-08-30 Sony Interactive Entertainment Inc. Operating device
GB2533394A (en) * 2014-12-19 2016-06-22 Gen Electric Method and system for generating a control signal for a medical device
AU2016280269A1 (en) * 2015-06-17 2017-10-05 Crown Equipment Corporation Dynamic vehicle performance analyzer with smoothing filter
JP6957218B2 (en) * 2017-06-12 2021-11-02 株式会社バンダイナムコエンターテインメント Simulation system and program
JP6822906B2 (en) 2017-06-23 2021-01-27 株式会社東芝 Transformation matrix calculation device, position estimation device, transformation matrix calculation method and position estimation method
KR102480310B1 (en) * 2017-11-06 2022-12-23 삼성전자주식회사 Display apparatus and control method of the same

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5554980A (en) * 1993-03-12 1996-09-10 Mitsubishi Denki Kabushiki Kaisha Remote control system
US6069594A (en) * 1991-07-29 2000-05-30 Logitech, Inc. Computer input device with multiple switches using single line
US6489948B1 (en) * 2000-04-20 2002-12-03 Benny Chi Wah Lau Computer mouse having multiple cursor positioning inputs and method of operation
US20040029640A1 (en) * 1999-10-04 2004-02-12 Nintendo Co., Ltd. Game system and game information storage medium used for same

Family Cites Families (19)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0461761B1 (en) * 1990-05-18 1994-06-22 British Aerospace Public Limited Company Inertial sensors
US5181181A (en) 1990-09-27 1993-01-19 Triton Technologies, Inc. Computer apparatus input device for three-dimensional information
JP3907213B2 (en) * 1992-09-11 2007-04-18 伸壹 坪田 Game control device
US6022274A (en) * 1995-11-22 2000-02-08 Nintendo Co., Ltd. Video game system using memory module
ES2231754T3 (en) 1996-03-05 2005-05-16 Sega Enterprises, Ltd. CONTROLLER AND EXPANSION UNIT FOR THE CONTRALOR.
US5992233A (en) * 1996-05-31 1999-11-30 The Regents Of The University Of California Micromachined Z-axis vibratory rate gyroscope
JPH1021000A (en) * 1996-06-28 1998-01-23 Sumitomo Metal Ind Ltd Signal input device
US6400374B2 (en) * 1996-09-18 2002-06-04 Eyematic Interfaces, Inc. Video superposition system and method
US6720949B1 (en) * 1997-08-22 2004-04-13 Timothy R. Pryor Man machine interfaces and applications
JPH11253656A (en) * 1998-03-09 1999-09-21 Omron Corp Attachment of game controller
JP4805433B2 (en) * 1999-03-31 2011-11-02 株式会社カプコン Signal input device and regulating member
US6417836B1 (en) * 1999-08-02 2002-07-09 Lucent Technologies Inc. Computer input device having six degrees of freedom for controlling movement of a three-dimensional object
JP2002090384A (en) * 2000-09-13 2002-03-27 Microstone Corp Structure of motion sensor and internal connecting method
JP3611807B2 (en) * 2001-07-19 2005-01-19 コナミ株式会社 Video game apparatus, pseudo camera viewpoint movement control method and program in video game
JP2003131796A (en) * 2001-10-22 2003-05-09 Sony Corp Information input device, its method and computer program
WO2003088204A1 (en) * 2002-04-12 2003-10-23 Obermeyer Henry K Multi-axis joystick and transducer means therefore
JP4179162B2 (en) * 2003-12-26 2008-11-12 株式会社セガ Information processing device, game device, image generation method, and game image generation method
JP2006031515A (en) * 2004-07-20 2006-02-02 Vodafone Kk Mobile communication terminal, application program, image display control device, and image display control method
JP4610971B2 (en) * 2004-09-07 2011-01-12 任天堂株式会社 Game program

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6069594A (en) * 1991-07-29 2000-05-30 Logitech, Inc. Computer input device with multiple switches using single line
US5554980A (en) * 1993-03-12 1996-09-10 Mitsubishi Denki Kabushiki Kaisha Remote control system
US20040029640A1 (en) * 1999-10-04 2004-02-12 Nintendo Co., Ltd. Game system and game information storage medium used for same
US6489948B1 (en) * 2000-04-20 2002-12-03 Benny Chi Wah Lau Computer mouse having multiple cursor positioning inputs and method of operation

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108604454A (en) * 2016-03-16 2018-09-28 华为技术有限公司 Audio signal processor and input audio signal processing method
CN108604454B (en) * 2016-03-16 2020-12-15 华为技术有限公司 Audio signal processing apparatus and input audio signal processing method
CN110891658A (en) * 2017-05-05 2020-03-17 安德烈·瓦莱里维赫·格鲁兹杰夫 Control equipment of motion system

Also Published As

Publication number Publication date
CN102580314A (en) 2012-07-18
WO2007130793A2 (en) 2007-11-15
JP2009535173A (en) 2009-10-01
WO2007130793A3 (en) 2008-12-11
JP4553917B2 (en) 2010-09-29
JP2009254888A (en) 2009-11-05
JP2007296367A (en) 2007-11-15
WO2007130792A2 (en) 2007-11-15
CN102989174B (en) 2016-06-29
WO2007130792A3 (en) 2008-09-12
JP5465948B2 (en) 2014-04-09
CN102580314B (en) 2015-05-20

Similar Documents

Publication Publication Date Title
CN102580314B (en) Obtaining input for controlling execution of a game program
CN101484221B (en) Obtaining input for controlling execution of a game program
KR101036403B1 (en) Object detection using video input combined with tilt angle information
CN101438340B (en) System, method, and apparatus for three-dimensional input control
US7854655B2 (en) Obtaining input for controlling execution of a game program
US7850526B2 (en) System for tracking user manipulations within an environment
US8723794B2 (en) Remote input device
US8427426B2 (en) Remote input device
US7918733B2 (en) Multi-input game control mixer
US10086282B2 (en) Tracking device for use in obtaining information for controlling game program execution
US9009747B2 (en) Gesture cataloging and recognition
US20060287085A1 (en) Inertially trackable hand-held controller
US20070265075A1 (en) Attachable structure for use with hand-held controller having tracking ability
KR101020510B1 (en) Multi-input game control mixer
KR101020509B1 (en) Obtaining input for controlling execution of a program
CN102058976A (en) System for tracking user manipulations within an environment
EP2351604A2 (en) Obtaining input for controlling execution of a game program

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant