JP5726793B2 - A method for detecting and tracking user operations on the main body of the game controller and converting the movement into input and game commands

Info

Publication number
JP5726793B2
Authority
JP
Japan
Prior art keywords
controller
command
position
game
step
Prior art date
Legal status
Active
Application number
JP2012057132A
Other languages
Japanese (ja)
Other versions
JP2012135643A
Inventor
Zalewski, Gary M.
Marks, Richard
Mao, Xiadong
Original Assignee
Sony Computer Entertainment America LLC
Priority date
Filing date
Publication date
Application filed by Sony Computer Entertainment America LLC
Publication of JP2012135643A
Publication of JP5726793B2
Application granted
Application status: Active
Anticipated expiration

Classifications

    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/02 Accessories
    • A63F13/06 Accessories using player-operated means for controlling the position of a specific area display
    • A63F13/10 Control of the course of the game, e.g. start, progress, end
    • A63F13/40 Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment
    • A63F13/42 Processing input control signals of video game devices by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle
    • A63F2300/10 Features of games using an electronically generated display having two or more dimensions, characterized by input arrangements for converting player-generated signals into game device control signals
    • A63F2300/1087 Input arrangements comprising photodetecting means, e.g. a camera
    • A63F2300/1093 Input arrangements comprising photodetecting means using visible light

Description

  The present invention relates generally to computer entertainment systems, and more particularly to user operation of a controller for such computer entertainment systems.

  Computer entertainment systems typically include a handheld controller, game controller, or other controller. A user or player uses the controller to send commands or other instructions to the entertainment system in order to control the video game or other simulation being played. For example, the controller may include a manipulator (such as a joystick) operated by the user. The displacement of the joystick is converted from an analog value to a digital value and sent to the main unit of the game machine. The controller may also include buttons that can be operated by the user.

  It is with respect to these and other background considerations that the present invention has been devised.

  One embodiment provides a method for use in obtaining information. The method includes receiving, at an image plane of a camera, a projection of a geometric shape established on a controller; analyzing movement and deformation in the projection of the geometric shape; and determining position information of the controller based on the analysis of the movement and deformation in the projection of the geometric shape.

  Another embodiment provides a system for use in obtaining information. The system includes a controller having a geometric shape established on its body, a camera configured to receive on its image plane a projection of the geometric shape established on the controller, and an image analyzer configured to analyze movement and deformation in the projection of the geometric shape and to determine position information of the controller based on the analysis.

  Another embodiment provides a system for use in obtaining information. The system includes means for receiving, at an image plane of a camera, a projection of a geometric shape established on a controller; means for analyzing movement and deformation in the projection of the geometric shape; and means for determining position information of the controller based on the analysis of the movement and deformation in the projection of the geometric shape.

  Another embodiment provides a computer program product comprising a medium for embodying a computer program for input to a computer, and a computer program embodied on the medium for causing the computer to perform the following steps: receiving, at an image plane of a camera, a projection of a geometric shape established on a controller; analyzing movement and deformation in the projection of the geometric shape; and determining position information of the controller based on the analysis of the movement and deformation in the projection of the geometric shape.

  Another embodiment provides a method for use in providing input to a system. The method includes determining position information of a controller of the system, comparing the determined position information of the controller with predetermined position information associated with a command, and providing the command to the system if the determined position information matches the predetermined position information for the command.

  Another embodiment provides a system for use in providing input to the system. The system includes means for determining position information of a controller of the system, means for comparing the determined position information of the controller with predetermined position information associated with a command, and means for providing the command to the system if the determined position information matches the predetermined position information for the command.

  Another embodiment provides a computer program product comprising a medium for embodying a computer program for input to a computer, and a computer program embodied on the medium for causing the computer to perform the following steps: determining position information of a controller of a system; comparing the determined position information of the controller with predetermined position information associated with a command; and providing the command to the system if the determined position information matches the predetermined position information for the command.

  Another embodiment provides a method for use in a game. The method includes receiving position information of a controller being operated by a user; analyzing the position information to determine whether a predetermined movement of the controller associated with a command has been performed; and executing the command when the predetermined movement of the controller associated with the command has been performed.

  Another embodiment provides a computer program product comprising a medium for embodying a computer program for input to a computer, and a computer program embodied on the medium for causing the computer to perform the following steps: receiving position information of a controller being operated by a user; analyzing the position information to determine whether a predetermined movement of the controller associated with a command has been performed; and executing the command when the predetermined movement of the controller associated with the command has been performed.

  The features and advantages of various embodiments of the present invention will be better understood by reference to the following detailed description and accompanying drawings, which set forth illustrative embodiments in which the principles of embodiments of the invention are utilized.

  The above aspects and other aspects, features, and advantages of embodiments of the present invention will become more apparent from the following more detailed description, presented in conjunction with the following drawings.

FIG. 1A is a schematic diagram illustrating a system operating in accordance with one embodiment of the present invention.
FIG. 1B is a perspective view of a controller made in accordance with one embodiment of the present invention.
FIG. 2A is a schematic diagram illustrating a method for determining position information of a controller according to an embodiment of the present invention.
FIG. 2B is a plan view of an image plane illustrating a method for determining controller position information according to an embodiment of the present invention.
FIG. 3A is a flow diagram illustrating a method used in obtaining information according to an embodiment of the present invention.
FIG. 3B is a flow diagram illustrating a method used in providing input to a system in accordance with an embodiment of the present invention.
FIG. 4 is a block diagram illustrating a system that can be used to run, implement, and/or perform the methods and techniques described herein in accordance with one embodiment of the present invention.
FIG. 5 is a block diagram illustrating a processor that can be used to run, implement, and/or perform the methods and techniques described herein in accordance with an embodiment of the present invention.

  A user or player of a video game generally holds the game controller with one hand or both hands in order to operate buttons, joysticks, and the like arranged on the controller. While playing a game, the user often moves the controller itself in the air at the same time as operating a button, joystick, or the like. Some users are excited during the game and tend to try to control the behavior or situation of the game by moving the controller itself in the air.

  Various embodiments of the methods, apparatus, schemes, and systems described herein provide for the detection, capture, and tracking of the movements, motions, and/or manipulations of the entire controller body by the user. The detected movements, motions, and/or manipulations of the entire controller body by the user can be used as additional commands to control various aspects of the game or other simulation being played.

  The detection and tracking of the user's manipulations of the game controller body can be implemented in various ways. For example, in some embodiments a camera peripheral can be used with a computer entertainment system to detect the motion of the handheld controller body and transfer that motion into actions in the game. The camera can be used to detect many different types of motion of the controller, for example up-and-down movements, twisting movements, side-to-side movements, jerking movements, wand-like motions, plunging motions, and the like. Such motions may correspond to various commands that are converted into actions in the game.

  Detecting and tracking the user's manipulations of the game controller body can be used to implement many different types of games, simulations, and the like. For example, it allows the user to engage in a sword or lightsaber fight, use a wand to trace the shape of an item, take part in many different types of sporting events, and engage in on-screen fights or other encounters.

  Referring to FIG. 1A, a system 100 that operates according to one embodiment of the present invention is shown. As shown, the computer entertainment system or console 102 uses a television or other video display 104 to display images of a video game or other simulation. The game or other simulation may be stored on a DVD, CD, flash memory, USB memory, or other memory medium 106 inserted into the console 102. A user or player 108 operates the game controller 110 to control a video game or other simulation.

  A camera or other video image acquisition device 112 is arranged so that the controller 110 is within the camera field of view 114. As shown, the camera 112 may be placed on the video display 104, but it should be appreciated that the camera may be located anywhere. As an example, the camera 112 may be a camera peripheral device such as a commercially available EyeToy (registered trademark) product. However, it should be appreciated that any type or brand of camera can be used. For example, a web camera, an add-on USB camera, an infrared (IR) camera, a high frame capture rate camera, and the like.

  In operation, the user 108 physically moves the controller 110 itself. That is, the user 108 physically moves the entire controller 110 through the air. For example, the controller 110 can be moved by the user 108 in any direction: up and down, side to side, twisted, rotated, shaken, jerked, or plunged. These movements of the controller 110 itself can be detected and captured by the camera 112 through image analysis, and tracked in the manner described below.

  In general, the detected and captured motion of the controller 110 can be used to generate position and orientation data for the controller 110. Because this data is gathered for each image frame, it can be used to calculate many physical aspects of the movement of the controller 110, such as its acceleration and velocity along any axis, its tilt, pitch, yaw, and roll, and any telemetry points of the controller 110.
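  As a rough illustration of how per-frame position data of this kind can yield velocity and acceleration estimates, the following sketch applies simple finite differences to a sequence of per-frame positions. The frame rate, the axis layout, and the estimate_motion helper are illustrative assumptions, not part of the described system.

```python
import numpy as np

def estimate_motion(positions, frame_rate=120.0):
    """Estimate per-axis velocity and acceleration from per-frame positions.

    positions: array of shape (num_frames, 3) holding X, Y, Z for each frame.
    Returns (velocity, acceleration) arrays computed by finite differences.
    """
    dt = 1.0 / frame_rate
    positions = np.asarray(positions, dtype=float)
    velocity = np.gradient(positions, dt, axis=0)      # first time derivative
    acceleration = np.gradient(velocity, dt, axis=0)   # second time derivative
    return velocity, acceleration

# Example: a controller drifting along X while accelerating upward along Y.
t = np.linspace(0.0, 1.0, 121)
positions = np.stack([0.5 * t, t ** 2, np.zeros_like(t)], axis=1)
vel, acc = estimate_motion(positions)
print(vel[60], acc[60])   # mid-sequence: velocity ~ (0.5, 1.0, 0.0), acceleration ~ (0.0, 2.0, 0.0)
```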

  The ability to detect and track the movements of the controller 110 makes it possible to determine whether any predefined movements of the controller 110 have been performed. That is, specific movement patterns or gestures of the controller 110 can be predefined and used as input commands for the game or other simulation. For example, a dropping gesture of the controller 110 can be defined as one command, a twisting gesture of the controller 110 can be defined as another command, a shaking gesture of the controller 110 can be defined as yet another command, and so on. In this way, the manner in which the user 108 physically moves the controller 110 itself is used as another input for controlling the game, which provides the user with a more stimulating and entertaining experience. A method for mapping movements of the controller 110 to game input commands is described below.

  Referring to FIG. 1B, a more detailed view of the controller 110 made in accordance with an embodiment of the present invention is shown. The controller 110 includes a main body 111. The main body 111 is the part of the game controller 110 that a person holds by hand (or wears, in the case of a wearable game controller). An input device that can be operated by the user is, for example, a button or a multi-axis control stick on the controller. One or more buttons may be disposed on the main body 111. The main body may include a housing that can be held by hand, and the housing may include a hand-grippable grip. Thus, when the user 108 physically moves the controller 110 itself, the user 108 is physically moving the main body 111 of the controller 110. The user moves the main body 111 through the air, that is, through free space.

  The main body 111 may have a forward portion that faces a screen when progress of a game controlled by the game controller is displayed on the screen. At least one input device operable by the user may be assembled with the main body 111 in order to register inputs from the user.

  One or more light-emitting diodes (LEDs) may be disposed on the main body and arranged in a geometric shape. Alternatively, another type of photonically detectable (PD) element may be assembled with the main body 111. The position of the PD element may be within an image being recorded by an image acquisition device when the forward portion is oriented at least generally toward the screen. The positions of the PD element at different points in time may be quantifiable in order to quantify the movement of the main body 111 in space.

  In this embodiment, the controller 110 includes four light-emitting diodes (LEDs) 122, 124, 126, 128. As shown, the four LEDs 122, 124, 126, 128 may be arranged in a substantially square or rectangular pattern and may be located on the bridge of the controller 110 between the R1 and L1 buttons. Thus, in this embodiment the geometric shape is a substantially square or rectangular pattern. The square or rectangular pattern formed by the four LEDs 122, 124, 126, 128 is referred to herein as the “bounding box” formed by the LEDs.

  It should be appreciated that the geometric shape may be many different shapes. For example, the geometric shape may be a linear pattern or a two-dimensional pattern. While a linear arrangement of the LEDs is preferred, the LEDs may alternatively be arranged in a rectangular pattern or an arcuate pattern to facilitate determination of the image plane of the LED array when analyzing an image of the LED pattern acquired by the image acquisition camera.

  While the illustrated embodiment of the controller uses four LEDs, it should be appreciated that other embodiments may use more or fewer than four LEDs. For example, three LEDs will work, and two LEDs will also work to provide tracking information. Even a single LED can provide position information. Furthermore, the LEDs may be located on different parts of the controller 110.

  The four LEDs 122, 124, 126, 128 produce four points or dots that are perceived by the camera 112 (FIG. 1A). Because the camera 112 is looking at the player 108 holding the controller 110 in hand, the camera 112 can track the movement of the controller 110 by tracking the dots produced by the four LEDs 122, 124, 126, 128 and the movement of the bounding box that they form.

  Specifically, as the user 108 twists and rotates the controller body 110, the projection of the four dots is cast onto the image plane of the output of the camera 112. Image analysis is used to track the user's manipulations of the controller and to determine the position and orientation of the controller. Thus, the four LEDs 122, 124, 126, 128 produce information regarding the movement of the main body. The positions of one or two controllers can be determined, and the relative movements of two controllers can be tracked.

  FIG. 2A shows an example of a method of tracking controller movement using a bounding box. Specifically, the controller and the four LEDs 122, 124, 126, 128 are disposed within the field of view 114 of the camera 112. When the controller is in the initial position, the four LEDs 122, 124, 126, 128 form a bounding box 202. As the controller moves to the second position, the four LEDs 122, 124, 126, 128 form a second bounding box 204. In addition, when the controller moves from the initial position to the second position, an intermediate position of the bounding box is also obtained depending on the moving speed and the frame rate of the camera 112.

  The bounding boxes 202, 204 formed by the four LEDs 122, 124, 126, 128 are captured in the image plane of the camera 112. FIG. 2B shows an example of the image plane 220 of the camera 112 on which the bounding boxes 202 and 204 appear. A physics analysis is performed to derive and determine the motion of the bounding box and how the rectangle of the bounding box deforms into different shapes based on the tilt, yaw, and so on of the controller. By projecting the bounding box onto the image plane, the position, orientation, acceleration, velocity, and other characteristics of the controller can be determined, and these can in turn be used to track the user's manipulations of the game controller.
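  One simple way to see how the projected bounding box can yield position information is the pinhole-camera relation between apparent size and distance. The sketch below is only illustrative: the focal length, the physical LED spacing, the image center, and the helper name are assumed values, and the patent does not prescribe this particular formula.

```python
def position_from_bounding_box(dots_px, led_spacing_m=0.06,
                               focal_px=600.0, image_center=(320.0, 240.0)):
    """Estimate a controller position from the projected LED bounding box.

    dots_px: four (u, v) pixel coordinates of the LED dots in the image plane.
    Returns an (x, y, z) estimate in metres under a pinhole-camera model.
    """
    us = [p[0] for p in dots_px]
    vs = [p[1] for p in dots_px]
    width_px = max(us) - min(us)                 # apparent width of the box
    if width_px <= 0:
        raise ValueError("degenerate bounding box")
    z = focal_px * led_spacing_m / width_px      # distance, by similar triangles
    cu = sum(us) / 4.0                           # centroid of the four dots
    cv = sum(vs) / 4.0
    x = (cu - image_center[0]) * z / focal_px    # lateral offset from the optical axis
    y = (cv - image_center[1]) * z / focal_px    # vertical offset from the optical axis
    return x, y, z

# Four dots roughly centred in a 640x480 image, 40 pixels wide -> about 0.9 m away.
print(position_from_bounding_box([(300, 230), (340, 230), (300, 250), (340, 250)]))
```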

  With reference to FIG. 3A, illustrated is a method 300 used in obtaining information from a controller in accordance with one embodiment of the present invention. The method 300 can be implemented and performed on many different types of systems and devices (eg, entertainment systems, consoles, computers, consumer electronics devices, etc.). Examples of systems that can be used to perform the method 300 are described below.

  The method 300 begins at step 302 where a projection of a geometric shape established on a controller is received at the image plane of the camera. This step can be performed as described above.

  At step 304, the movement and deformation in the projection of the geometric shape are analyzed. That is, the four dots of the bounding box are tracked and analyzed. Field and frame analysis is performed on the image plane of the camera output to analyze the movement of the four reference points and to determine the position, orientation, tilt, yaw, roll, and so on of the controller. In addition, the acceleration of the controller can be tracked in any direction, and by analyzing the image frames the acceleration along any axis can be obtained. Telemetry points of the controller can also be computed. It can also be determined whether the controller is in a resting position or resting state, for example whether the controller is in a neutral or steady state near the user's waist.

  As the controller rotates, the image deforms on the image plane. A change in the width of the rectangle of the bounding box indicates that the controller is rotating: the width of the rectangle changes as the yaw of the controller is adjusted, so yaw maps to the width of the rectangle. The tilt of the controller influences the height of the rectangle.
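  A minimal sketch of this width-to-yaw and height-to-tilt mapping, under the simplifying assumption of pure foreshortening and with made-up reference dimensions, might look as follows. Note that a mapping of this kind cannot by itself distinguish positive from negative yaw or tilt, which is exactly the ambiguity discussed in the following paragraphs.

```python
import math

def estimate_yaw_and_tilt(width_px, height_px, ref_width_px, ref_height_px):
    """Estimate yaw and tilt (in radians) from bounding-box foreshortening.

    ref_width_px / ref_height_px: the box dimensions when the controller
    squarely faces the camera.  A narrower box suggests yaw; a shorter box
    suggests tilt.  The sign of the angle is not recoverable from size alone.
    """
    w_ratio = min(max(width_px / ref_width_px, 0.0), 1.0)
    h_ratio = min(max(height_px / ref_height_px, 0.0), 1.0)
    yaw = math.acos(w_ratio)    # width shrinks with rotation about the vertical axis
    tilt = math.acos(h_ratio)   # height shrinks as the controller is tilted
    return yaw, tilt

# A box at 80% of its reference width and full height: yawed, but not tilted.
print(estimate_yaw_and_tilt(32.0, 40.0, 40.0, 40.0))   # about (0.64, 0.0)
```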

  For example, bounding box 202 (FIG. 2B) indicates that the controller was first placed to look straight ahead at the camera. The bounding box 204 indicates that the controller has since been moved down, rotated and turned to the left of the user.

  Because the image plane sees only a deformed rectangle, it can be difficult to know on which side of the “base plane” the controller is. This problem can arise, for example, if someone walks in front of the camera and obstructs it while the user is manipulating the controller and moves the controller an equal distance to the other side of the axis horizon line, causing the bounding box to appear the same in the image plane. It can also happen if the controller moves outside the field of view of the image acquisition device.

  Thus, it needs to be determined whether the deformation is caused by positive or negative tilt or roll (positive or negative being with respect to the up/down and left/right movements away from the original, steady-state position). This problem can be solved by reading other telemetry from the controller, or by strobing or modulating the LEDs so that the video analysis system can discern the individual corners of the bounding box rectangle for tracking purposes. The LEDs may be strobed or modulated to help distinguish the different corners of the bounding box; alternatively, each LED may be given its own frequency to help distinguish the different corners. By identifying each specific corner of the bounding region, i.e. each LED, it can be determined on which side of the horizon line the controller is at any given time. In this way, problems associated with the controller passing through the camera plane can be handled.
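  As one hedged illustration of how corner identification might work when each LED is modulated at its own frequency, the sketch below recovers the dominant blink frequency of a tracked dot from its brightness over a window of frames and matches it to a corner. The frequencies, the window length, and the helper name are assumptions for illustration only.

```python
import numpy as np

# Hypothetical blink frequencies (Hz) assigned to the four corner LEDs.
CORNER_FREQS = {"top_left": 10.0, "top_right": 15.0,
                "bottom_left": 20.0, "bottom_right": 25.0}

def identify_corner(brightness, frame_rate=120.0):
    """Match a dot's brightness-over-time signal to the nearest LED frequency.

    brightness: 1-D array of the dot's intensity over a window of frames.
    """
    brightness = np.asarray(brightness, dtype=float)
    spectrum = np.abs(np.fft.rfft(brightness - brightness.mean()))
    freqs = np.fft.rfftfreq(len(brightness), d=1.0 / frame_rate)
    dominant = freqs[np.argmax(spectrum)]
    return min(CORNER_FREQS, key=lambda name: abs(CORNER_FREQS[name] - dominant))

# A dot whose brightness oscillates at about 20 Hz maps to the bottom-left LED.
t = np.arange(120) / 120.0
print(identify_corner(np.sin(2 * np.pi * 20.0 * t)))
```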

  Tracking the movement and rotation of the bounding box on the screen is based on a frame-by-frame analysis. The camera's output produces frames of image data. The projection of the bounding box is captured in software. The movement of the controller across the frames is based on the transformation of the box.

  Using a high frame rate makes it possible to accurately track the acceleration of the controller's movement and the changes in that acceleration. That is, by projecting the image onto the plane at a high rate, the delta movements of the controller can be tracked. This makes it possible to plot the acceleration, the points of peak acceleration, the points of zero acceleration, and the inflection points. An inflection point is a transition point at which the controller stops and changes direction. All of this analysis is performed by analyzing the image frames to determine the position and orientation of the bounding box. As an example, a frame rate of 120 frames per second or higher may be used, although it should be appreciated that any frame rate can be used.
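  As a small sketch of how inflection points might be picked out of a high-rate position history, the following looks for sign changes in the finite-difference velocity along one axis. The frame rate and the helper name are illustrative assumptions.

```python
def find_inflection_frames(positions, frame_rate=120.0):
    """Return frame indices at which motion along one axis stops and reverses.

    positions: sequence of scalar positions along a single axis, one per frame.
    """
    dt = 1.0 / frame_rate
    velocities = [(b - a) / dt for a, b in zip(positions, positions[1:])]
    inflections = []
    for i in range(1, len(velocities)):
        if velocities[i - 1] * velocities[i] < 0:   # velocity changed sign
            inflections.append(i)                    # the controller reversed here
    return inflections

# A back-and-forth swing along one axis reverses around frames 3 and 6.
print(find_inflection_frames([0, 1, 2, 3, 2, 1, 0, 1, 2]))   # -> [3, 6]
```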

  As described below, a history of previous frames may be mapped. This makes it possible to look at the controller's previous telemetry in order to determine certain parameters, for example to track acceleration and velocity and to determine stopping points.

  At step 306 (FIG. 3A), controller position information is determined based on the motion and deformation analysis in the projection of the geometric shape. As an example, one or both of steps 304 and 306 can be performed using an image analyzer. That is, an analysis of the bounding box movement and deformation in the image plane of the camera can be performed using the image analyzer. The output of the video camera is connected to the input of the image analyzer. An example of a system that incorporates an image analyzer to implement one or more methods, schemes, and functions described herein is described below.

  The image analyzer monitors the bounding box formed by the reference LEDs as captured in the image plane of the camera. The image analyzer analyzes the position, rotation, and horizontal and vertical deformations of the bounding box to determine the physical user manipulation of the controller and its position, roll, tilt, and yaw coordinates. At the end of the image analysis, the data may be output in the form of an output ID or the like. Such output IDs from the image analysis may include data such as the x, y, z coordinates, acceleration and velocity along any axis, and whether the controller is in a resting position or resting state. Thus, at the end of the image analysis the image analyzer can indicate where the controller is and whether a command has been issued, and the image analyzer can be pinged at any moment to provide the position, orientation, last command, and so on.

As an example, the image analyzer can provide, but is not limited to, the following outputs:
Controller position (X, Y, Z coordinates)
Controller orientation (alpha, beta, gamma, in radians)
Controller X-axis velocity
Controller Y-axis velocity
Controller Z-axis velocity
Controller X-axis acceleration
Controller Y-axis acceleration
Controller Z-axis acceleration
Steady state position Y/N (at the waist position described above, although any position can be defined as the steady state)
Time since the last steady state
Last gesture recognized
Time at which the last gesture was recognized
Interrupt for reaching a zero-acceleration point
Each of these outputs can be generated by analyzing the movements and deformations of the bounding box as described above. These outputs may be further processed in order to track the movement of the controller. Such tracking makes it possible to recognize specific movements of the controller and to use them to trigger specific commands, as described below. It should be appreciated that many other outputs can be used in addition to, or instead of, the above outputs.
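  To make the shape of such an output concrete, a minimal sketch of one possible per-frame output record is given below. The field names and types are invented for illustration; they are not the analyzer's actual interface.

```python
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class AnalyzerOutput:
    """One frame's worth of illustrative image-analyzer outputs."""
    position: Tuple[float, float, float]        # X, Y, Z coordinates
    orientation: Tuple[float, float, float]     # alpha, beta, gamma in radians
    velocity: Tuple[float, float, float]        # per-axis velocity
    acceleration: Tuple[float, float, float]    # per-axis acceleration
    in_steady_state: bool                       # steady state Y/N
    time_since_steady_state: float              # seconds since the last steady state
    last_gesture: Optional[str] = None          # last recognized gesture, if any
    last_gesture_time: Optional[float] = None   # when that gesture was recognized

sample = AnalyzerOutput(position=(0.1, -0.2, 0.9),
                        orientation=(0.0, 0.1, 0.0),
                        velocity=(0.0, 0.0, 0.0),
                        acceleration=(0.0, 0.0, 0.0),
                        in_steady_state=True,
                        time_since_steady_state=0.0)
print(sample.in_steady_state)
```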

Additional inputs to the image analyzer may optionally be provided. Such optional inputs include, but are not limited to:
Set noise level (X, Y, or Z axis) (this is a reference tolerance used when analyzing jitter of the user's arm in the game)
Set sampling rate (the frequency at which camera frames are taken in and analyzed)
Set gearing (gear ratio)
Set mapping chain
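  A hedged sketch of how such optional settings might be grouped and handed to an analyzer is shown below; the class name, parameter names, and default values are invented for illustration.

```python
from dataclasses import dataclass
from typing import Tuple

@dataclass
class AnalyzerConfig:
    """Illustrative optional settings for an image analyzer."""
    noise_level_xyz: Tuple[float, float, float] = (0.02, 0.02, 0.02)  # per-axis jitter tolerance
    sampling_rate_hz: float = 120.0     # how often camera frames are taken in and analyzed
    gearing: float = 1.0                # scale between controller motion and in-game motion
    mapping_chain: Tuple[str, ...] = ("raw", "smoothed", "gesture")   # processing stages, in order

config = AnalyzerConfig(sampling_rate_hz=60.0, gearing=2.0)
print(config)
```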

  As described above, the ability to detect and track the movements of the controller 110 makes it possible to determine whether any predefined movements of the controller 110 have been performed. That is, specific movement patterns or gestures of the controller 110 can be mapped to input commands for the game or other simulation.

  With reference to FIG. 3B, illustrated is a method 320 used in providing input to a system in accordance with one embodiment of the present invention. The method 320 begins at step 322 where system controller position information is determined. This step can be performed using the methods and techniques described above.

  At step 324, the determined position information of the controller is compared with predetermined position information associated with commands. That is, any number of different movements, gestures, or manipulations of the controller can be mapped to various commands. This allows different movements, gestures, or manipulations of the controller to be mapped into the game model. For example, moving the controller up can be mapped to one command, moving the controller down can be mapped to another command, and moving the controller in any other direction can be mapped to other commands.

  Similarly, shaking the controller once can be mapped to one command, shaking the controller twice can be mapped to another command, and shaking the controller three, four, five times, and so on can likewise be mapped to commands. That is, various gestures can be defined based on shaking the controller a certain number of times, and still other gestures can be defined based on shaking the controller up and down a certain number of times. Other movements of the controller, such as twisting or rolling, can be mapped to further commands.

  In this way, various different trajectories of the game controller can be mapped onto gestures, which trigger commands in the game. Each command is mapped to a predetermined movement of the controller, and such predetermined movements of the controller are associated with predetermined position information. In this embodiment, the determined position information of the controller is compared with the predetermined position information to see whether a command should be triggered.

  As an example, the mapping of such gestures to game commands can be implemented as follows. The output of the image analyzer can be used to determine position and orientation information for the controller. The image analyzer can output various different IDs that indicate the position and orientation of the controller. For example, one ID may be output to indicate a steady state, another ID may be output to indicate shaking of the controller, and various other IDs may be output to indicate other orientations. Thus, such IDs can be used to output whether the controller is in a steady state or is moving. If the controller is in a steady state, the ID may indicate how long the controller has been in the steady state.

  The determined position and orientation information of the controller may be compared with predetermined position information associated with game input commands. If the determined position information matches the predetermined position information, the command is provided to the entertainment system. Various gestures, such as pushing the controller up or down, twisting it in a circle, moving it left or right, twisting it while moving it up or down, and rolling it left or right, can all be mapped to various commands.

  When a new command or gesture is recognized, the image analyzer may trigger an interrupt. Triggering such an interrupt may be used as part of the process of providing the command to the entertainment system. The system may optionally be configured so that zero-acceleration points along an axis, stopping points, and/or other events also trigger interrupts.

  When comparing the determined position and orientation information with the predetermined position information associated with input commands to see whether there is a match, it often happens that the match is not perfect. This is because, with the controller moving through free space, it may be difficult to precisely reproduce a predefined movement. Therefore, the predetermined position information associated with input commands may be defined in terms of ranges, tolerances, and/or thresholds that are considered close enough to the predetermined position information to activate the command. That is, commands may be defined in terms of thresholds or ranges. Thus, in determining whether any command or gesture has been recognized, the system may check whether the determined position and orientation information falls within the range of a gesture, and a defined command may have a threshold that is consulted in determining whether to invoke the command.
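  One minimal way to express this range/threshold idea is to store, for each command, a reference pose and a tolerance, and to provide the command only when the observed position and orientation fall within that tolerance. The command names, reference poses, and tolerances below are illustrative assumptions, not values from the described system.

```python
import math

# Hypothetical command table: each command has a reference pose and tolerances.
COMMANDS = {
    "raise_shield": {"position": (0.0, 0.5, 1.0), "yaw": 0.0,
                     "pos_tolerance": 0.15, "yaw_tolerance": 0.3},
    "swing_sword":  {"position": (0.4, 0.2, 0.8), "yaw": 1.2,
                     "pos_tolerance": 0.20, "yaw_tolerance": 0.4},
}

def match_command(position, yaw):
    """Return the first command whose reference pose the observation falls within."""
    for name, ref in COMMANDS.items():
        close_enough = math.dist(position, ref["position"]) <= ref["pos_tolerance"]
        if close_enough and abs(yaw - ref["yaw"]) <= ref["yaw_tolerance"]:
            return name
    return None   # no command recognized, so nothing is provided to the system

print(match_command((0.05, 0.45, 1.05), 0.1))   # within tolerance -> "raise_shield"
print(match_command((2.0, 2.0, 2.0), 0.0))      # out of range    -> None
```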

  Furthermore, when comparing the determined position and orientation information with the predetermined position information associated with input commands to see whether there is a match, a history of previous frames may be saved or mapped. For example, the frame buffer may be monitored, or the system may otherwise keep a record of the history of previous frames. Previous frames may be examined to determine whether any commands are met. Mapping the history of frames allows the telemetry of the controller at a specific time to provide the position and orientation used in determining whether a command is met.

  Finally, at step 326, if the determined position information of the controller matches the predetermined position information for a command, the command is provided to the system. Such a command can be used to cause an event to occur, or not occur, in a video game or other simulation.

  In another embodiment, movements of a game controller can be mapped to game commands, for example in a video game. In this method, which can be used in a video game or other simulation, position information of a controller being manipulated by a user is received. The position information is analyzed to determine whether a predetermined movement of the controller associated with a command has been performed. This analysis can be performed as described above. If the predetermined movement of the controller associated with a command has been performed, the command is executed by the game. Executing the command may produce a visual effect or the like on the video display on which the game is being displayed.

  While the discussion herein refers to the use of LEDs on controllers for games or other entertainment systems, it should be well understood that the teachings provided herein can be applied to detecting and tracking the movements of controllers for other types of systems, devices, consumer electronics, and the like. That is, the LEDs of the game controller described above can be used to perform remote-control functions for consumer electronics devices or any other devices. LEDs can be used on the controllers of many other types of systems and devices so that those controllers can be detected and tracked and their movements mapped to commands for those systems and devices. Examples of such other types of systems and devices include, but are not limited to, televisions, stereos, telephones, computers, home or office networks, and handheld computing or communication devices.

  Furthermore, the teachings described herein can be applied to general purpose remote controls that have the ability to control several or many different devices. That is, such a general purpose remote control may include the LEDs described herein, and the motion of the general purpose remote control body can be used as an input command to several or many different devices or systems.

  Furthermore, the game controller may have a universal remote control function. For example, the game controller may have a main body with a forward portion that faces a screen when progress of a game controlled by the game controller is displayed on the screen. At least one input device operable by the user may be assembled with the main body in order to register inputs from the user. A signal encoder may be included, as may an infrared signal transmitter operable to transmit infrared signals through the air using a signal generated by the signal encoder. The signal encoder may be programmable to encode the signal with a selected one of a plurality of signal codes for reception by an electronic device having an infrared receiver and a signal decoder operable with the selected signal code.

  In addition, battery-operated toys (including toys molded in the shape and style of branded games) can be formed with LEDs and used as user-manipulable bodies that are tracked in a sensed environment.

  In some embodiments, the image analyzer may recognize users, process audio-authenticated gestures, and the like. A user may be identified by the system's analyzer through a gesture, and a gesture may be specific to a user. Gestures may be recorded by the user and stored in a model. The recording process may optionally store the audio generated during the recording of a gesture. The sensed environment may be fed into a multi-channel analyzer and processed. The processor may refer to gesture models to determine and authenticate the identity of a user or an object, based on voice or acoustic patterns, with high accuracy and performance.

  According to embodiments of the present invention, the methods and techniques described herein may be implemented as part of the signal processing apparatus 400 shown in FIG. The apparatus 400 may include a processor 401 and memory 402 (eg, RAM, DRAM, ROM, etc.). Furthermore, when implementing parallel processing, the signal processing apparatus 400 may include a plurality of processors 401. The memory 402 may include data and code configured as described above.

Specifically, the memory 402 may include signal data 406. The memory 402 may also include calibration data 408. The calibration data 408 is data representing one or more inverse eigenmatrices C^-1 for one or more corresponding pre-calibrated listening zones obtained, for example, from calibration of a microphone array 422. As an example, the memory 402 may include eigenmatrices for eighteen 20-degree sectors surrounding the microphone array 422.

  The apparatus 400 may include well-known support functions 410, such as input/output (I/O) elements 411, a power supply (P/S) 412, a clock (CLK) 413, and a cache 414. The apparatus 400 may optionally include a mass storage device 415, such as a disk drive, CD-ROM drive, or tape drive, for storing programs and/or data. The apparatus 400 may also optionally include a display device 416 and a user interface unit 418 to facilitate interaction between the apparatus 400 and a user. The display device 416 may take the form of a cathode ray tube (CRT) or flat panel screen that displays text, numerals, graphical symbols, or images. The user interface 418 may be a keyboard, mouse, joystick, light pen, or other device. In addition, the user interface 418 may include a microphone, video camera, or other signal transducing device for directly capturing the signal to be analyzed. The processor 401, memory 402, and other components of the system 400 may exchange signals (e.g., code instructions and data) with one another via a system bus 420 as shown in FIG.

  The microphone array 422 may be coupled to the apparatus 400 through the I/O functions 411. The microphone array may include about 2 to 8 microphones, preferably about 4, with adjacent microphones separated by a distance of less than about 4 cm, preferably between about 1 and 2 cm. Preferably, the microphones in the array 422 are omnidirectional microphones. An optional image acquisition device 423 (e.g., a digital camera) may be coupled to the apparatus 400 through the I/O functions 411. One or more pointing actuators 425 mechanically coupled to the camera may exchange signals with the processor 401 via the I/O functions 411.

  As used herein, the term I / O generally refers to any program, operation or device that exchanges data between system 400 and peripheral devices. Any data transfer can be viewed as an output from one device and an input to another device. Peripheral devices include devices such as a writable CD-ROM that functions as an input / output device, in addition to devices dedicated to input such as a keyboard and mouse, devices dedicated to output such as a printer. The term “peripheral device” includes external devices such as a mouse, keyboard, printer, monitor, microphone, game controller, camera, external Zip drive or scanner, CD-ROM drive, CD-R drive or built-in modem. Also included are internal devices or other peripherals such as flash memory reader / writers, hard disks.

  In particular embodiments of the present invention, the apparatus 400 may be a video game unit that includes a joystick controller 430 coupled to the processor via the I/O functions 411, either by wire (e.g., a USB cable) or wirelessly. The joystick controller 430 may have analog joystick controls 431 and conventional buttons 433 that provide control signals commonly used during the playing of video games. Such video games may be implemented as processor-readable data and/or instructions that may be stored in the memory 402 or in other processor-readable media, such as media associated with the mass storage device 415.

  The joystick control 431 is normally configured to send a motion signal along the X axis when the control stick is moved left and right, and send a motion signal along the Y axis when the stick is moved back and forth (up and down). In a joystick configured for three-dimensional movement, a signal of movement along the Z axis is sent when the stick is twisted leftward (counterclockwise) or rightward (clockwise). These three axes X, Y, and Z are often called roll, pitch, and yaw, respectively, particularly with respect to aircraft.

  In addition to conventional features, the joystick controller 430 may include one or more inertial sensors 432. An inertial sensor can provide position and/or orientation information to the processor 401 via an inertial signal. The orientation information may include angular information such as the tilt, roll, or yaw of the joystick controller 430. As an example, the inertial sensor 432 may include any number of accelerometers, gyroscopes, tilt sensors, and/or combinations of these. In a preferred embodiment, the inertial sensor 432 includes a tilt sensor adapted to sense the orientation of the joystick controller with respect to the tilt and roll axes, a first accelerometer adapted to sense acceleration along the yaw axis, and a second accelerometer adapted to sense angular acceleration about the yaw axis. An accelerometer may be implemented, for example, as a MEMS device having a mass mounted by one or more springs, with sensors for sensing displacement of the mass in one or more directions. Signals from the sensors that depend on the displacement of the mass can be used to determine the acceleration of the joystick controller 430. Such techniques may be implemented by program code instructions 404 stored in the memory 402 and executable by the processor 401.

  As an example, an accelerometer suitable as the inertial sensor 432 may be a simple mass elastically coupled at three or four points to a frame, for example by springs. The pitch and roll axes lie in a plane that intersects the frame, which is mounted to the joystick controller 430. As the frame (and the joystick controller 430) rotates about the pitch and roll axes, the mass is displaced under the influence of gravity and the springs elongate or compress in a way that depends on the angle of pitch and/or roll. The displacement of the mass is sensed and converted into a signal that depends on the amount of pitch and/or roll. Angular acceleration about the yaw axis, or linear acceleration along the yaw axis, produces characteristic patterns of compression and/or elongation of the springs, or of motion of the mass, that can be sensed and converted into signals that depend on the amount of angular or linear acceleration. Such an accelerometer device can thus measure tilt, roll, angular acceleration about the yaw axis, and linear acceleration along the yaw axis by tracking the movement of the mass or the elongation and compression forces of the springs. There are many different ways to track the position of the mass and/or the forces exerted on it, including resistive strain gauge material, photonic sensors, magnetic sensors, Hall-effect devices, piezoelectric devices, capacitive sensors, and the like.

  In addition, the joystick controller 430 may include one or more light sources 434, such as light-emitting diodes (LEDs). The light sources 434 may be used to distinguish one controller from another. For example, this can be accomplished with one or more LEDs by flashing or holding an LED pattern code. As an example, five LEDs can be provided on the joystick controller 430 in a linear or two-dimensional pattern. While a linear array of LEDs is preferred, the LEDs may alternatively be arranged in a rectangular pattern or an arcuate pattern to facilitate determination of the image plane of the LED array when analyzing an image of the LED pattern acquired by the image acquisition device 423. Furthermore, the LED pattern codes may be used to determine the positioning of the joystick controller 430 during game play. For example, the LEDs can help distinguish the tilt, yaw, and roll of the controller. This detection pattern is useful for improving the user's feel in games such as aircraft flight games. The image acquisition device 423 can capture images containing the joystick controller 430 and the light sources 434. Analysis of such images can determine the position and/or orientation of the joystick controller, and such analysis may be implemented by program code instructions 404 stored in the memory 402 and executed by the processor 401. To facilitate the capture of images of the light sources 434 by the image acquisition device 423, the light sources 434 may be placed on two or more different sides of the joystick controller 430, for example on the front and on the back (shown in dotted lines). With this arrangement, the image acquisition device 423 can acquire images of the light sources 434 for different orientations of the joystick controller 430, depending on how the joystick controller 430 is being held by the user.

  In addition, the light sources 434 can provide telemetry signals to the processor 401, for example in the form of pulse code, amplitude modulation, or frequency modulation. Such telemetry signals can indicate which joystick buttons are being pressed and/or how hard those buttons are being pressed. Telemetry signals can be encoded into the optical signal by pulse coding, pulse width modulation, frequency modulation, or light intensity (amplitude) modulation. The processor 401 can decode the telemetry signals from the optical signal and execute game commands in response to the decoded telemetry signals. Telemetry signals may be decoded from analysis of images of the joystick controller 430 acquired by the image acquisition device 423. Alternatively, the device 401 may include a separate optical sensor dedicated to receiving telemetry signals from the light sources 434. See, for example, Richard L. Marks et al., US Patent Application No. 11/429,414, “USE OF COMPUTER IMAGE AND AUDIO PROCESSING IN DETERMINING AN INTENSITY AMOUNT WHEN INTERFACING WITH A COMPUTER PROGRAM” (attorney docket number SONYP052), the entirety of which is incorporated herein by reference. Furthermore, analysis of images containing the light sources 434 can be used for both telemetry and the determination of the position and/or orientation of the joystick controller 430. Such techniques may be implemented by program code instructions 404 stored in the memory 402 and executable by the processor 401. The processor 401 can use inertial signals from the inertial sensor 432 together with optical signals from the light sources 434 detected by the image acquisition device 423 and/or sound source location and characterization information obtained from acoustic signals detected by the microphone array 422 to deduce information on the position and/or orientation of the joystick controller 430 and/or of its user. For example, “acoustic radar” sound source location and characterization can be used with the microphone array 422 to track a moving voice while the movement of the joystick controller is tracked independently (through the inertial sensor 432 or through the light sources 434). Any number of different combinations and different modes of providing control signals to the processor 401 may be used in conjunction with embodiments of the present invention. Such techniques may be implemented by program code instructions 404 stored in the memory 402 and executable by the processor 401.

  Signals from the inertial sensor 432 may provide part of a tracking information input, and signals generated from the image acquisition device 423 by tracking the one or more light sources 434 may provide another part of the tracking information input. As an example, and without limitation, such “mixed mode” signals can be used in a football-type video game in which a quarterback fakes his head to the left and then throws the ball to the right. Specifically, a game player holding the controller 430 can turn his head to the left and make a sound while making a pitching motion, swinging the controller to the right as if the controller were the football. The microphone array 422, in conjunction with the “acoustic radar” program code, can track the user's voice. The image acquisition device 423 can track the movement of the user's head, or track other commands that do not require sound or the use of the controller. The sensor 432 can track the motion of the joystick controller (representing the football). The image acquisition device 423 may also track the light sources 434 on the controller 430. The user may release the “ball” when the amount and/or direction of acceleration of the joystick controller 430 reaches a certain value, or upon a key command triggered by pressing a button on the joystick controller 430.

  In certain embodiments of the present invention, the position of the joystick controller 430 can be determined using inertial signals, for example from an accelerometer or a gyroscope. Specifically, the acceleration signal from an accelerometer can be integrated once with respect to time to determine the change in velocity, and the velocity can then be integrated with respect to time to determine the change in position. If the values of the initial position and velocity at some time are known, the absolute position can be determined using these values together with the changes in velocity and position. Although position determination using an inertial sensor can be performed more quickly than using the image acquisition device 423 and the light sources 434, the inertial sensor 432 may be subject to a type of error known as “drift”, in which errors accumulate over time and a discrepancy D may arise between the position of the joystick 430 calculated from the inertial signals (shown in dotted lines) and the actual position of the joystick controller 430. Embodiments of the present invention allow several ways of dealing with such errors.
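  A bare-bones sketch of the double integration described above is given below for a single axis; it also shows why drift accumulates, since even a small constant bias in the measured acceleration grows quadratically in the computed position. The bias value, sample rate, and function name are illustrative assumptions.

```python
def integrate_position(accelerations, dt, v0=0.0, p0=0.0):
    """Integrate a 1-D acceleration signal twice to obtain position over time.

    accelerations: per-sample acceleration along one axis.
    dt: sample interval in seconds; v0 and p0: known initial velocity and position.
    """
    velocity, position = v0, p0
    positions = []
    for a in accelerations:
        velocity += a * dt          # first integration: change in velocity
        position += velocity * dt   # second integration: change in position
        positions.append(position)
    return positions

# A small constant sensor bias (0.01 m/s^2) in place of true zero acceleration
# produces a steadily growing position error (the "drift" discussed above).
dt = 1.0 / 120.0
biased = integrate_position([0.01] * 1200, dt)   # 10 seconds of biased samples
print(round(biased[-1], 3))                      # roughly 0.5 m of accumulated drift
```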

  For example, the drift may be cancelled manually by resetting the initial position of the joystick controller 430 to be equal to the currently calculated position. A user may use one or more of the buttons on the joystick controller 430 to trigger a command that resets the initial position. Alternatively, image-based drift compensation may be implemented by resetting the current position to a position obtained from an image acquired by the image acquisition device 423, used as a reference. Such image-based drift compensation may be performed manually, for example when the user activates one or more of the buttons on the joystick controller 430. Alternatively, image-based drift compensation may be performed automatically, for example at regular intervals of time or in response to game play. Such techniques may be implemented by program code instructions 404 stored in the memory 402 and executable by the processor 401.
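
  The sketch below illustrates, under assumed data types and an assumed reset interval, how the two drift-compensation options described above might be combined: the inertially integrated position is replaced by the camera-derived position either when the user presses a reset button or automatically on a fixed schedule.

```python
# Hedged sketch of drift compensation: reset the inertial estimate to the
# camera-derived position on a button press or at a fixed interval.
# auto_interval_s and the 1-D positions are assumptions for illustration.
class DriftCorrector:
    def __init__(self, auto_interval_s=2.0):
        self.auto_interval_s = auto_interval_s
        self.last_reset_t = 0.0

    def correct(self, t, inertial_pos, camera_pos, reset_button_pressed):
        """Return the position to report, discarding accumulated drift when reset."""
        if reset_button_pressed or (t - self.last_reset_t) >= self.auto_interval_s:
            self.last_reset_t = t
            return camera_pos        # drop the drifted inertial estimate
        return inertial_pos

corrector = DriftCorrector()
print(corrector.correct(0.5, inertial_pos=1.07, camera_pos=1.00, reset_button_pressed=False))  # 1.07
print(corrector.correct(2.5, inertial_pos=1.31, camera_pos=1.20, reset_button_pressed=False))  # 1.20 (auto reset)
```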

  In certain embodiments it may be desirable to compensate for spurious data in the inertial sensor signal. For example, the signal from the inertial sensor 432 may be oversampled, and a moving average may be computed from the oversampled signal to remove spurious data from the inertial sensor signal. In some situations it may be desirable to oversample the signal, reject high and/or low values from some subset of the data points, and compute the moving average from the remaining data points. Furthermore, other data sampling and manipulation techniques may be used to adjust the inertial sensor signal so as to remove or reduce the significance of spurious data. The choice of technique may depend on the nature of the signal, the computations to be performed with the signal, the nature of game play, or some combination of two or more of these. Such techniques may be implemented by program code instructions 404 stored in the memory 402 and executable by the processor 401.
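
  For illustration, a trimmed moving average of the kind described above is sketched here: each output sample is formed by taking a window of oversampled readings, discarding the highest and lowest values, and averaging the rest. The window size and trim count are assumptions, not values from the patent.

```python
# Sketch: oversampled inertial readings filtered with a trimmed moving average
# to suppress spurious spikes. window and trim are assumed parameters.
def trimmed_moving_average(samples, window=8, trim=1):
    out = []
    for start in range(0, len(samples) - window + 1):
        w = sorted(samples[start:start + window])
        kept = w[trim:len(w) - trim]        # drop the extreme high/low readings
        out.append(sum(kept) / len(kept))
    return out

noisy = [1.0, 1.1, 0.9, 9.0, 1.0, 1.1, 0.9, 1.0, 1.05]  # 9.0 is a spurious spike
print([round(v, 2) for v in trimmed_moving_average(noisy)])  # spike has little effect
```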

  The processor 401 may perform digital signal processing on the signal data 406 in response to the data 406 and the program code instructions of a program 404 stored in the memory 402, retrieved and executed by the processor module 401. Code portions of the program 404 may conform to any one of a number of different programming languages, such as assembly, C++, JAVA, or a number of other languages. The processor module 401 forms a general-purpose computer that becomes a special-purpose computer when executing programs such as the program code 404. Although the program code 404 is described herein as being implemented in software and executed upon a general-purpose computer, those skilled in the art will realize that the method of task management could alternatively be implemented using hardware such as an application specific integrated circuit (ASIC) or other hardware circuitry. As such, it should be understood that embodiments of the invention can be implemented, in whole or in part, in software, hardware, or some combination of both.

  In one embodiment, among others, the program code 404 may include a set of processor readable instructions that implement any one or more of the methods and techniques described herein, or some combination of two or more of such methods and techniques. For example, the program code 404 may be configured to implement the image analyzer described herein. Alternatively, the image analyzer described herein may be implemented in hardware.

  In the illustrated embodiment, the image analyzer function described above is shown as the image analyzer 450. The image analyzer 450 may receive its input from a camera, such as the image acquisition device 423 or the camera 112 (FIG. 1A). Thus, the output of the video camera 112 or the image acquisition device 423 may be coupled to the input of the image analyzer 450. The output of the image analyzer 450 may be provided to the system of the apparatus 400, which is thereby provided with the commands themselves, or with information as to whether a command or gesture has been recognized. The image analyzer 450 may be coupled to the rest of the apparatus 400 in many different ways; the illustrated connections are exemplary only. In another example, the image analyzer 450 may be coupled to the system bus 420, which allows it to receive input data from the image acquisition device 423 and to provide its output to the apparatus 400.

  The image analyzer 450 may optionally be included in the apparatus 400 or in the entertainment system or console 102, or the image analyzer 450 may be located remotely from these devices and systems. It should be appreciated that the image analyzer 450 may be implemented, in whole or in part, in software, hardware, or some combination of both. In a scenario where the image analyzer 450 is implemented in software, the block 450 represents the image analyzer function implemented in software.

  The program code 404 may typically include one or more instructions that direct the one or more processors to select a pre-calibrated listening zone at run time and to filter out sounds originating from sources outside the pre-calibrated listening zone. The pre-calibrated listening zone may include a listening zone that corresponds to the volume of focus or the field of view of the image acquisition device 423.

  The program code may include one or more instructions which, when executed, cause the apparatus 400 to select a pre-calibrated listening sector that contains a sound source. Such instructions may cause the apparatus to determine whether the sound source lies within an initial sector or on a particular side of the initial sector. If the sound source does not lie within the default sector, the instructions may, when executed, select a different sector on the particular side of the default sector. The different sector may be characterized by an attenuation of the input signal that is closest to an optimum value. These instructions may, when executed, calculate an attenuation of input signals from the microphone array 422 and the attenuation relative to an optimum value. The instructions may, when executed, cause the apparatus to determine an attenuation value of the input signals for one or more sectors and select the sector for which the attenuation is closest to the optimum value.
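
  Purely as a sketch of the selection logic described above, the example below picks, from a set of candidate sectors, the one whose measured input-signal attenuation is closest to an optimum value. The sector layout, attenuation figures, and optimum value are assumptions for illustration only.

```python
# Hypothetical sector-selection sketch: choose the listening sector whose
# measured attenuation of the microphone-array input is closest to an optimum.
def select_listening_sector(sector_attenuations, optimal_db):
    """sector_attenuations: {sector_id: measured attenuation in dB}."""
    return min(sector_attenuations,
               key=lambda s: abs(sector_attenuations[s] - optimal_db))

measured = {0: -3.0, 1: -9.5, 2: -12.2, 3: -20.1}   # assumed measurements
print(select_listening_sector(measured, optimal_db=-10.0))   # -> 1
```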

  The program code 404 may optionally include one or more instructions which, when executed, direct the processor to produce discrete time domain input signals x_m(t) from the microphones M_0 ... M_M, determine a listening sector, and use the listening sector in a semi-blind source separation to select finite impulse response filter coefficients that separate out the different sound sources from the input signals x_m(t). The program 404 may also include instructions to apply one or more fractional delays to selected input signals x_m(t) other than an input signal x_0(t) from a reference microphone M_0. Each fractional delay may be selected to optimize a signal-to-noise ratio of a discrete time domain output signal y(t) obtained from the microphone array. The fractional delays may be selected such that the signal from the reference microphone M_0 is first in time relative to the signals from the other microphones of the array. The program 404 may also include instructions to introduce a fractional time delay Δ into the output signal y(t) of the microphone array, i.e.:
  y(t+Δ) = x(t+Δ)·b_0 + x(t−1+Δ)·b_1 + x(t−2+Δ)·b_2 + ... + x(t−N+Δ)·b_N, where Δ is between zero and one.
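
  As an illustration of the fractional-delay form above, the sketch below evaluates y(t+Δ) for a toy signal, approximating samples at non-integer positions by linear interpolation. The interpolation method, the filter coefficients b_k, and the input values are assumptions; the patent does not fix these choices.

```python
# Sketch of y(t+Δ) = Σ_k b_k · x(t−k+Δ) with fractional positions approximated
# by linear interpolation. Coefficients and input data are assumed.
def frac_sample(x, idx):
    """Linearly interpolated value of x at a (possibly fractional) index."""
    lo = int(idx)
    hi = min(lo + 1, len(x) - 1)
    frac = idx - lo
    return x[lo] * (1.0 - frac) + x[hi] * frac

def fractional_delay_fir(x, b, t, delta):
    """y(t+Δ) = sum_k b[k] * x(t-k+Δ), with 0 <= Δ < 1; out-of-range terms skipped."""
    return sum(bk * frac_sample(x, t - k + delta)
               for k, bk in enumerate(b) if 0 <= t - k + delta <= len(x) - 1)

x = [0.0, 1.0, 0.0, -1.0, 0.0, 1.0, 0.0, -1.0]   # toy input signal
b = [0.5, 0.3, 0.2]                               # assumed filter coefficients
print(round(fractional_delay_fir(x, b, t=4, delta=0.25), 3))
```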

  The program code 404 may optionally include processor executable instructions including one or more instructions which, when executed, cause the image acquisition device 423 to monitor a field of view in front of the image acquisition device 423, identify one or more of the light sources 434 within the field of view, and detect a change in the light emitted from the light source(s) 434; an input command to the processor 401 is then triggered in response to detecting the change. Triggering actions on a game controller through the use of LEDs in conjunction with an image acquisition device is described, for example, in commonly-assigned U.S. Patent Application No. 10/759,782, filed January 16, 2004, to Richard L. Marks et al., entitled "METHOD AND APPARATUS FOR LIGHT INPUT DEVICE", which is incorporated herein by reference in its entirety.
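
  A minimal sketch of the change-detection step described above follows: the brightness of a tracked light source is compared between successive frames, and a command is triggered when the change exceeds a threshold. The threshold and the command name are hypothetical, chosen only for this example.

```python
# Illustrative LED-change detector: trigger a (hypothetical) command when the
# tracked light source's brightness changes sharply between frames.
class LightSourceMonitor:
    def __init__(self, threshold=0.3):
        self.threshold = threshold   # assumed brightness-change threshold
        self.previous = None

    def update(self, brightness):
        """Return a command name when a significant change is detected, else None."""
        triggered = None
        if self.previous is not None and abs(brightness - self.previous) > self.threshold:
            triggered = "BUTTON_EVENT"   # hypothetical command bound to this change
        self.previous = brightness
        return triggered

monitor = LightSourceMonitor()
for frame_brightness in [0.82, 0.80, 0.15, 0.16]:
    cmd = monitor.update(frame_brightness)
    if cmd:
        print("triggered:", cmd)   # fires once, on the 0.80 -> 0.15 drop
```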

  The program code 404 may optionally include processor executable instructions including one or more instructions which, when executed, use signals from the inertial sensor and signals generated from the image acquisition device by tracking the one or more light sources as inputs to a game system, as described above. The program code 404 may also optionally include processor executable instructions including one or more instructions which, when executed, compensate for drift in the inertial sensor 432.

  In addition, the program code 404 may optionally include processor executable instructions including one or more instructions which, when executed, adjust the gearing and mapping of controller manipulations to the game environment at run time. Such a feature allows a user to change the "gearing" of manipulations of the joystick controller 430 relative to game state. For example, a 45-degree rotation of the joystick controller 430 may be geared to a 45-degree rotation of a game object. However, this 1:1 gearing ratio may be modified so that an X-degree rotation (or tilt, yaw, or "manipulation") of the controller translates to a Y-degree rotation (or tilt, yaw, or "manipulation") of the game object. Gearing may be 1:1, 1:2, 1:X, or X:Y, where X and Y may take on arbitrary values. Additionally, the mapping of input channels to game control may be modified over time or instantly. Modifications may include changing the gesture trajectory models and modifying the location, scale, and thresholds of gestures. Such mapping may be programmed, random, tiered, or staggered to provide the user with a dynamic range of manipulations. Modification of the mapping, gearing, or ratios can be adjusted by the program code 404 according to game play or game state, through a user modifier button (such as a key pad) located on the game controller 430, or broadly in response to the input channel. The input channel may include, but is not limited to, elements of user audio, audio generated by the controller, tracking audio generated by the controller, controller button state, video camera output, accelerometer data, tilt, yaw, roll, position, acceleration, and any other data from sensors capable of tracking a user or a user manipulation of an object.
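
  The X:Y gearing idea above can be sketched very simply: a controller rotation of X degrees is converted into a game-object rotation of Y degrees through a configurable ratio. The ratio values in this example are illustrative assumptions.

```python
# Minimal gearing sketch: convert controller rotation into game rotation via an
# X:Y ratio. A 1:1 ratio reproduces the controller motion; other ratios scale it.
def apply_gearing(controller_degrees, controller_units=1.0, game_units=1.0):
    """Convert a controller rotation (degrees) into a game rotation using X:Y gearing."""
    return controller_degrees * (game_units / controller_units)

print(apply_gearing(45.0))                                      # 1:1 -> 45.0 degrees
print(apply_gearing(45.0, controller_units=1, game_units=2))    # 1:2 -> 90.0 degrees
print(apply_gearing(45.0, controller_units=3, game_units=1))    # 3:1 -> 15.0 degrees
```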

  In certain embodiments, the program code 404 may change the mapping or gearing over time from one scheme or ratio to another, in a predetermined time-dependent manner. Gearing and mapping changes can be applied to the game environment in various ways. In one example, a video game character may be controlled under one gearing scheme while the character is healthy, and as the character's health deteriorates the system may change the gearing of the controller commands so that the movements produced by the user's gesture commands are degraded. A video game character who becomes disoriented may, for example, force a change in the mapping of the input channel, as the user is required to adjust his input to regain control of the character under the new mapping. A mapping scheme that modifies the translation of the input channel to game commands may also change during game play. This translation may change in various ways in response to game state or in response to modifier commands issued under one or more elements of the input channel. Gearing and mapping may also be configured to influence the configuration and/or processing of one or more elements of the input channel.
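
  For illustration only, the state-dependent change described above might look like the sketch below, where the gearing ratio is chosen from the character's current health. The health breakpoints and ratios are assumptions invented for this example.

```python
# Hedged sketch of game-state-dependent gearing: lower health yields a smaller
# gearing, so the same controller motion produces a weaker in-game movement.
def gearing_for_health(health):
    """Return game-units-per-controller-unit gearing for a health value in [0, 1]."""
    if health > 0.66:
        return 1.0      # healthy: 1:1
    if health > 0.33:
        return 0.6      # weakened: motions damped
    return 0.3          # badly hurt: heavily damped

for h in (0.9, 0.5, 0.1):
    print(h, "->", 45.0 * gearing_for_health(h), "degrees of in-game rotation")
```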

  In addition, a speaker 436 may be mounted to the joystick controller 430. In "acoustic radar" embodiments in which the program code 404 locates and characterizes sounds detected with the microphone array 422, the speaker 436 may provide an audio signal that can be detected by the microphone array 422 and used by the program code 404 to track the position of the joystick controller 430. The speaker 436 may also be used to provide an additional "input channel" from the joystick controller 430 to the processor 401. Audio signals from the speaker 436 may be periodically pulsed to provide a beacon for the acoustic radar to track location. The audio signals (pulsed or otherwise) may be audible or ultrasonic. The acoustic radar may track the user manipulation of the joystick controller 430, and such manipulation tracking may include information about the position and orientation (e.g., pitch, roll, or yaw angle) of the joystick controller 430. The pulses may be triggered at an appropriate duty cycle, as one skilled in the art is capable of applying. Pulses may be initiated based on a control signal arbitrated from the system. The apparatus 400 (through the program code 404) may coordinate the dispatch of control signals among two or more joystick controllers 430 coupled to the processor 401 to assure that multiple controllers can be tracked.
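
  One simple way to picture the coordination of beacon pulses among several controllers is a time-slotted schedule, sketched below; the slot duration, controller count, and scheduling scheme are assumptions for illustration and are not prescribed by the patent.

```python
# Illustrative beacon coordination: give each controller a repeating time slot
# in which only it emits its acoustic pulse. slot_ms is an assumed value.
def beacon_schedule(num_controllers, slot_ms=50):
    """Return {controller_id: (offset_ms, period_ms)} for a simple time-slot schedule."""
    period = num_controllers * slot_ms
    return {cid: (cid * slot_ms, period) for cid in range(num_controllers)}

def should_pulse(controller_id, t_ms, schedule):
    offset, period = schedule[controller_id]
    return (t_ms - offset) % period < 1  # pulse at the start of this controller's slot

sched = beacon_schedule(2)
print(sched)                        # {0: (0, 100), 1: (50, 100)}
print(should_pulse(0, 100, sched))  # True  (controller 0's slot repeats every 100 ms)
print(should_pulse(1, 100, sched))  # False (controller 1 pulses at 50, 150, ...)
```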

  By way of example, embodiments of the present invention may be implemented on parallel processing systems. Such parallel processing systems typically include two or more processor elements that are configured to execute parts of a program in parallel using separate processors. By way of example, and without limitation, FIG. 5 illustrates a type of cell processor 500 according to an embodiment of the present invention. The cell processor 500 may be used as the processor 401 of FIG. 4. In the example depicted in FIG. 5, the cell processor 500 includes a main memory 502, a power processor element (PPE) 504, and a number of synergistic processor elements (SPEs) 506. In the example depicted in FIG. 5, the cell processor 500 includes a single PPE 504 and eight SPEs 506. In such a configuration, seven of the SPEs 506 may be used for parallel processing and one may be reserved as a back-up in case one of the other seven fails. A cell processor may alternatively include multiple groups of PPEs (PPE groups) and multiple groups of SPEs (SPE groups), in which case hardware resources can be shared between units within a group. However, the SPEs and PPEs must appear to software as independent elements. As such, embodiments of the present invention are not limited to use with the configuration shown in FIG. 5.

  The main memory 502 typically includes both general-purpose and nonvolatile storage, as well as special-purpose hardware registers and arrays used for functions such as system configuration, data-transfer synchronization, memory-mapped I/O, and the I/O subsystem. In embodiments of the present invention, a signal processing program 503 may be resident in the main memory 502. The signal processing program 503 may run on the PPE. The program 503 may be divided into multiple signal processing tasks that can be executed on the SPEs and/or the PPE.

  By way of example, the PPE 504 may be a 64-bit PowerPC Processor Unit (PPU) with associated caches L1 and L2. The PPE 504 is a general-purpose processing unit that can access system management resources (such as memory-protection tables, for example). Hardware resources may be mapped explicitly to a real address space as seen by the PPE, so the PPE can address any of these resources directly by using an appropriate effective address value. A primary function of the PPE 504 is the management and allocation of tasks for the SPEs 506 in the cell processor 500.

  Although only a single PPE is shown in FIG. 5, some cell processor implementations, such as the Cell Broadband Engine Architecture (CBEA), may include multiple PPEs organized into two or more PPE groups. These PPE groups may share access to the main memory 502. Furthermore, the cell processor 500 may include two or more SPE groups, which may likewise share access to the main memory 502. Such configurations are within the scope of the present invention.

  Each SPE 506 includes a synergistic processor unit (SPU) and its own local storage area LS. The local storage LS may include one or more separate areas of memory storage, each associated with a specific SPU. Each SPU may be configured to execute only instructions (including data load and data store operations) from within its own associated local storage domain. In such a configuration, data transfers between the local storage LS and elsewhere in the system 500 may be performed by issuing direct memory access (DMA) commands from the memory flow controller (MFC) to transfer data into or out of the local storage domain (of the individual SPE). The SPUs are less complex computational units than the PPE 504 in that they do not perform any system management functions. The SPUs generally have a single instruction, multiple data (SIMD) capability and typically process data and initiate any required data transfers (subject to access properties set up by the PPE) in order to perform their allocated tasks. The purpose of the SPUs is to enable applications that require a higher computational unit density and that can effectively use the provided instruction set. A significant number of SPEs in a system managed by the PPE 504 allows for cost-effective processing over a wide range of applications.

  Each SPE 506 may comprise a dedicated memory flow controller (MFC) with an associated memory management unit that holds and processes memory protection and access permission information. The MFC provides the primary method of data transfer, protection and synchronization between the cell processor main storage and the SPE local storage. The MFC command describes the transfer to be performed. A command for transferring data may be referred to as an MFC direct memory access (DMA) command (or MFC DMA command).

  Each MFC can typically support multiple DMA transfers at the same time and can maintain and process multiple MFC commands. Each MFC DMA data transfer command may involve both a local storage address (LSA) and an effective address (EA). The local storage address may directly address only the local storage area of its associated SPE. The effective address may have a more general application; for example, it may be able to reference main storage, including all the SPE local storage areas, if those areas are aliased into the real address space.

  To facilitate communication between the SPEs 506 and/or between the SPEs 506 and the PPE 504, the SPEs 506 and the PPE 504 may include signal notification registers that are tied to signaling events. The PPE 504 and the SPEs 506 may be coupled by a star topology in which the PPE 504 acts as a router to transmit messages to the SPEs 506. Alternatively, each SPE 506 and the PPE 504 may have a one-way signal notification register referred to as a mailbox. The mailbox can be used by an SPE 506 to host operating system (OS) synchronization.

  The cell processor 500 may include an input/output (I/O) function 508 through which the cell processor 500 can interface with peripheral devices, such as the microphone array 512 and the optional image acquisition device 513. In addition, an element interconnect bus 510 may connect the various components listed above. Each SPE and the PPE can access the bus 510 through a bus interface unit BIU. The cell processor 500 may also include two controllers typically found in a processor: a memory interface controller MIC that controls the flow of data between the bus 510 and the main memory 502, and a bus interface controller BIC that controls the flow of data between the I/O 508 and the bus 510. Although the requirements for the MIC, BIC, BIU, and bus 510 may vary widely for different implementations, those of skill in the art will be familiar with their functions and with circuits for implementing them.

  The cell processor 500 may include an internal interrupt controller IIC. The IIC component manages the priority of interrupts presented to the PPE. The IIC allows the cell processor 500 to handle interrupts from other components without using the main system interrupt controller. The IIC may be considered a second level controller. The main system interrupt controller may handle interrupts originating from outside the cell processor.

  In embodiments of the present invention, certain computations, such as the fractional delays described above, may be performed in parallel using the PPE 504 and/or one or more of the SPEs 506. Each fractional delay calculation may be run as one or more separate tasks that different SPEs 506 may take on as they become available.
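
  As a generic illustration only (not Cell-specific, and not the patent's implementation), the sketch below hands independent fractional-delay computations to whichever worker becomes available; the task function, delay values, and pool size are assumptions.

```python
# Generic sketch: farm out independent fractional-delay computations to a pool
# of workers. The stand-in task uses simple linear interpolation.
from concurrent.futures import ThreadPoolExecutor

def fractional_delay_task(signal, delta):
    """Toy stand-in for one fractional-delay computation (linear interpolation)."""
    return [signal[i] * (1 - delta) + signal[i + 1] * delta
            for i in range(len(signal) - 1)]

signal = [0.0, 1.0, 0.0, -1.0, 0.0]
deltas = [0.1, 0.25, 0.5, 0.75]   # assumed fractional delays to compute

with ThreadPoolExecutor(max_workers=4) as pool:
    results = list(pool.map(lambda d: fractional_delay_task(signal, d), deltas))

for d, r in zip(deltas, results):
    print(d, [round(v, 2) for v in r])
```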

  While the invention disclosed herein has been described by means of specific embodiments and applications thereof, numerous modifications and variations could be made thereto by those skilled in the art without departing from the scope of the invention as set forth in the claims.

Claims (10)

  1. A method used in a game, the method comprising the following steps executed by a processor:
    receiving position information of a controller being manipulated by a user, at least part of the position information being obtained from an inertial sensor in the controller and including a currently calculated position of the controller determined from the inertial sensor, the inertial sensor being subject to a drift error that causes a discrepancy between the currently calculated position of the controller determined from the inertial sensor and the actual position of the controller;
    compensating for the drift error of the inertial sensor by resetting, in response to the user activating one or more input devices on the controller, the currently calculated position of the controller in the position information to a position obtained from an image acquired by an image acquisition device arranged so as to be able to view the controller;
    analyzing the position information to determine whether a predetermined movement of the controller associated with a command has been performed; and
    executing the command when the predetermined movement of the controller associated with the command has been performed.
  2.   The method of claim 1, wherein the processor further executes the step of generating an interrupt to the game if a predetermined movement of the controller associated with the command is performed.
  3. The method according to claim 1 or 2, wherein the analyzing step includes a step of determining whether or not the position information of the controller indicates that the controller is within a specific range associated with the predetermined movement of the controller associated with the command.
  4. The method according to any one of claims 1 to 3, wherein the processor further executes a step of receiving, at a camera image plane, a projection of a geometric shape established on the controller.
  5.   The method of claim 4, wherein the processor further performs the step of analyzing motion and deformation in the projection of the geometric shape.
  6. A computer-readable recording medium having recorded thereon a program for causing a computer to execute the following steps:
    receiving position information of a controller being manipulated by a user, at least part of the position information being obtained from an inertial sensor in the controller and including a currently calculated position of the controller determined from the inertial sensor, the inertial sensor being subject to a drift error that causes a discrepancy between the currently calculated position of the controller determined from the inertial sensor and the actual position of the controller;
    compensating for the drift error of the inertial sensor by resetting, in response to the user activating one or more input devices on the controller, the currently calculated position of the controller in the position information to a position obtained from an image acquired by an image acquisition device arranged so as to be able to view the controller;
    analyzing the position information to determine whether a predetermined movement of the controller associated with a command has been performed; and
    executing the command when the predetermined movement of the controller associated with the command has been performed.
  7.   The recording medium according to claim 6, wherein the program further causes the computer to execute a step of generating an interrupt to the game when the predetermined movement of the controller associated with the command is performed.
  8. The recording medium according to claim 6 or 7, wherein the analyzing step includes a step of determining whether or not the position information of the controller indicates that the controller is within a specific range associated with the predetermined movement of the controller associated with the command.
  9. The recording medium according to any one of claims 6 to 8, wherein the program further causes the computer to execute a step of receiving, at a camera image plane, a projection of a geometric shape established on the controller.
  10.   The recording medium according to claim 9, wherein the program further causes the computer to execute a step of analyzing movement and deformation in the projection of the geometric shape.
JP2012057132A 2002-07-27 2012-03-14 A method for detecting and tracking user operations on the main body of the game controller and converting the movement into input and game commands Active JP5726793B2 (en)

Priority Applications (94)

Application Number Priority Date Filing Date Title
US11/382,259 US20070015559A1 (en) 2002-07-27 2006-01-10 Method and apparatus for use in determining lack of user activity in relation to a system
US11/382,258 US7782297B2 (en) 2002-07-27 2006-01-10 Method and apparatus for use in determining an activity level of a user in relation to a system
US11/382,251 US20060282873A1 (en) 2002-07-27 2006-01-10 Hand-held controller having detectable elements for tracking purposes
US11/429,047 US8233642B2 (en) 2003-08-27 2006-05-04 Methods and apparatuses for capturing an audio signal based on a location of the signal
US11/418,989 2006-05-04
US11/381,721 US8947347B2 (en) 2003-08-27 2006-05-04 Controlling actions in a video game unit
US11/418,988 2006-05-04
US11/381,728 2006-05-04
US11/429,414 US7627139B2 (en) 2002-07-27 2006-05-04 Computer image and audio processing of intensity and input devices for interfacing with a computer program
US11/381,721 2006-05-04
US11/381,729 2006-05-04
US11/381,729 US7809145B2 (en) 2006-05-04 2006-05-04 Ultra small microphone array
US11/381,727 2006-05-04
US11/418,988 US8160269B2 (en) 2003-08-27 2006-05-04 Methods and apparatuses for adjusting a listening area for capturing sounds
US11/381,725 US7783061B2 (en) 2003-08-27 2006-05-04 Methods and apparatus for the targeted sound detection
US11/381,725 2006-05-04
US11/429,047 2006-05-04
US11/418,989 US8139793B2 (en) 2003-08-27 2006-05-04 Methods and apparatus for capturing audio signals based on a visual image
US11/429,414 2006-05-04
US11/429,133 2006-05-04
US11/381,728 US7545926B2 (en) 2006-05-04 2006-05-04 Echo and noise cancellation
US11/429,133 US7760248B2 (en) 2002-07-27 2006-05-04 Selective sound source listening in conjunction with computer interactive processing
US11/381,724 2006-05-04
US11/381,727 US7697700B2 (en) 2006-05-04 2006-05-04 Noise removal for electronic device with far field microphone on console
US11/381,724 US8073157B2 (en) 2003-08-27 2006-05-04 Methods and apparatus for targeted sound detection and characterization
US79803106P true 2006-05-06 2006-05-06
US11/382,031 US7918733B2 (en) 2002-07-27 2006-05-06 Multi-input game control mixer
US60/798,031 2006-05-06
US11/382,031 2006-05-06
US11/382,038 US7352358B2 (en) 2002-07-27 2006-05-06 Method and system for applying gearing effects to acoustical tracking
US11/382,035 US8797260B2 (en) 2002-07-27 2006-05-06 Inertially trackable hand-held controller
US29/259,350 USD621836S1 (en) 2006-05-06 2006-05-06 Controller face with tracking sensors
US11/382,034 2006-05-06
US29/259,349 2006-05-06
US11/382,034 US20060256081A1 (en) 2002-07-27 2006-05-06 Scheme for detecting and tracking user manipulation of a game controller body
US11/382,035 2006-05-06
US11/382,036 US9474968B2 (en) 2002-07-27 2006-05-06 Method and system for applying gearing effects to visual tracking
US11/382,033 2006-05-06
US11/382,032 2006-05-06
US11/382,033 US8686939B2 (en) 2002-07-27 2006-05-06 System, method, and apparatus for three-dimensional input control
US11/382,036 2006-05-06
US29/259,350 2006-05-06
US11/382,038 2006-05-06
US11/382,037 2006-05-06
US11/382,032 US7850526B2 (en) 2002-07-27 2006-05-06 System for tracking user manipulations within an environment
US29/259,348 2006-05-06
US11/382,037 US8313380B2 (en) 2002-07-27 2006-05-06 Scheme for translating movements of a hand-held controller into inputs for a system
US29259349 2006-05-06
US29259348 2006-05-06
US11/382,040 2006-05-07
US11/382,041 US7352359B2 (en) 2002-07-27 2006-05-07 Method and system for applying gearing effects to inertial tracking
US11/382,043 2006-05-07
US11/382,043 US20060264260A1 (en) 2002-07-27 2006-05-07 Detectable and trackable hand-held controller
US11/382,039 2006-05-07
US11/382,040 US7391409B2 (en) 2002-07-27 2006-05-07 Method and system for applying gearing effects to multi-channel mixed input
US11/382,039 US9393487B2 (en) 2002-07-27 2006-05-07 Method for mapping movements of a hand-held controller to game commands
US11/382,041 2006-05-07
US29246759 2006-05-08
US11/382,250 2006-05-08
US29/246,759 2006-05-08
US29/246,766 2006-05-08
US29/246,743 USD571367S1 (en) 2006-05-08 2006-05-08 Video game controller
US29/246,744 2006-05-08
US11/382,250 US7854655B2 (en) 2002-07-27 2006-05-08 Obtaining input for controlling execution of a game program
US29/246,744 USD630211S1 (en) 2006-05-08 2006-05-08 Video game controller front face
US11/382,252 US10086282B2 (en) 2002-07-27 2006-05-08 Tracking device for use in obtaining information for controlling game program execution
US11/430,594 2006-05-08
US29246762 2006-05-08
US29/246,743 2006-05-08
US29/246,768 USD571806S1 (en) 2006-05-08 2006-05-08 Video game controller
US11/382,256 US7803050B2 (en) 2002-07-27 2006-05-08 Tracking device with sound emitter for use in obtaining information for controlling game program execution
US29/246,767 USD572254S1 (en) 2006-05-08 2006-05-08 Video game controller
US29/246,763 2006-05-08
US11/382,251 2006-05-08
US11/382,259 2006-05-08
US29/246,765 2006-05-08
US29/246,768 2006-05-08
US29/246,764 USD629000S1 (en) 2006-05-08 2006-05-08 Game interface device with optical port
US29246763 2006-05-08
US29246765 2006-05-08
US11/430,593 US20070261077A1 (en) 2006-05-08 2006-05-08 Using audio/visual environment to select ads on game platform
US11/430,594 US20070260517A1 (en) 2006-05-08 2006-05-08 Profile detection
US29/246,762 2006-05-08
US29246766 2006-05-08
US29/246,767 2006-05-08
US11/382,258 2006-05-08
US11/382,256 2006-05-08
US11/382,252 2006-05-08
US29/246,764 2006-05-08
US11/430,593 2006-05-08
US11/382,699 2006-05-10
US11/382,699 US20070265075A1 (en) 2006-05-10 2006-05-10 Attachable structure for use with hand-held controller having tracking ability
US11/624,637 US7737944B2 (en) 2002-07-27 2007-01-18 Method and system for adding a new player to a game in response to controller activity
US11/624,637 2007-01-18

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
JP2009509960 Division 2007-04-25

Publications (2)

Publication Number Publication Date
JP2012135643A JP2012135643A (en) 2012-07-19
JP5726793B2 true JP5726793B2 (en) 2015-06-03

Family

ID=46673570

Family Applications (5)

Application Number Title Priority Date Filing Date
JP2009509960A Active JP5301429B2 (en) 2002-07-27 2007-04-25 A method for detecting and tracking user operations on the main body of the game controller and converting the movement into input and game commands
JP2009509977A Pending JP2009535179A (en) 2002-07-27 2007-04-27 Determination of the lack of user operation, determination of the user's activity level, and / or methods and apparatus used when adding a new player to the system
JP2012057132A Active JP5726793B2 (en) 2002-07-27 2012-03-14 A method for detecting and tracking user operations on the main body of the game controller and converting the movement into input and game commands
JP2012057129A Pending JP2012135642A (en) 2002-07-27 2012-03-14 Scheme for detecting and tracking user manipulation of game controller body and for translating movement thereof into input and game command
JP2012120096A Active JP5726811B2 (en) 2002-07-27 2012-05-25 Method and apparatus for use in determining lack of user activity, determining user activity level, and / or adding a new player to the system

Family Applications Before (2)

Application Number Title Priority Date Filing Date
JP2009509960A Active JP5301429B2 (en) 2002-07-27 2007-04-25 A method for detecting and tracking user operations on the main body of the game controller and converting the movement into input and game commands
JP2009509977A Pending JP2009535179A (en) 2002-07-27 2007-04-27 Determination of the lack of user operation, determination of the user's activity level, and / or methods and apparatus used when adding a new player to the system

Family Applications After (2)

Application Number Title Priority Date Filing Date
JP2012057129A Pending JP2012135642A (en) 2002-07-27 2012-03-14 Scheme for detecting and tracking user manipulation of game controller body and for translating movement thereof into input and game command
JP2012120096A Active JP5726811B2 (en) 2002-07-27 2012-05-25 Method and apparatus for use in determining lack of user activity, determining user activity level, and / or adding a new player to the system

Country Status (2)

Country Link
JP (5) JP5301429B2 (en)
WO (3) WO2007130833A2 (en)

Families Citing this family (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8542907B2 (en) * 2007-12-17 2013-09-24 Sony Computer Entertainment America Llc Dynamic three-dimensional object mapping for user-defined control device
JP5334037B2 (en) * 2008-07-11 2013-11-06 インターナショナル・ビジネス・マシーンズ・コーポレーションInternational Business Machines Corporation Sound source position detection method and system
US8761434B2 (en) * 2008-12-17 2014-06-24 Sony Computer Entertainment Inc. Tracking system calibration by reconciling inertial data with computed acceleration of a tracked object in the three-dimensional coordinate system
GB2467951A (en) * 2009-02-20 2010-08-25 Sony Comp Entertainment Europe Detecting orientation of a controller from an image of the controller captured with a camera
US9058063B2 (en) * 2009-05-30 2015-06-16 Sony Computer Entertainment Inc. Tracking system calibration using object position and orientation
JP5463790B2 (en) * 2009-08-18 2014-04-09 ソニー株式会社 Operation input system, control device, handheld device, and operation input method
JP5514774B2 (en) 2011-07-13 2014-06-04 株式会社ソニー・コンピュータエンタテインメント Game device, game control method, game control program, and recording medium
KR20160019439A (en) * 2013-06-13 2016-02-19 바스프 에스이 Optical detector and method for manufacturing the same
KR20150123491A (en) * 2014-04-25 2015-11-04 삼성전자주식회사 Method and apparatus for controlling device in a home network system

Family Cites Families (26)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5214615A (en) * 1990-02-26 1993-05-25 Will Bauer Three-dimensional displacement of a body with computer interface
US5227985A (en) * 1991-08-19 1993-07-13 University Of Maryland Computer vision system for position monitoring in three dimensions using non-coplanar light sources attached to a monitored object
JP3218716B2 (en) * 1992-07-31 2001-10-15 ソニー株式会社 Input device, and input system
US5453758A (en) * 1992-07-31 1995-09-26 Sony Corporation Input apparatus
JP3907213B2 (en) * 1992-09-11 2007-04-18 伸壹 坪田 Game controller
US7358956B2 (en) 1998-09-14 2008-04-15 Microsoft Corporation Method for providing feedback responsive to sensing a physical presence proximate to a control of an electronic device
JP2000259340A (en) * 1999-03-12 2000-09-22 Sony Corp Device and method for input, input system, and distribution medium
JP3847058B2 (en) * 1999-10-04 2006-11-15 任天堂株式会社 Game system and game information storage medium used therefor
JP2002306846A (en) * 2001-04-12 2002-10-22 Saibuaasu:Kk Controller for game machine
JP2002320773A (en) * 2001-04-25 2002-11-05 Pacific Century Cyberworks Japan Co Ltd Game device, its control method, recording medium, program and cellular phone
JP2003078779A (en) * 2001-08-31 2003-03-14 Hitachi Ltd Multi remote controller and remote control system using the same
JP4028708B2 (en) * 2001-10-19 2007-12-26 株式会社コナミデジタルエンタテインメント Game apparatus and a game system
JP2003135851A (en) * 2001-10-31 2003-05-13 Konami Computer Entertainment Yokyo Inc Game device, method for controlling computer game system, and program
JP3824260B2 (en) 2001-11-13 2006-09-20 任天堂株式会社 Game system
JP4010533B2 (en) 2001-11-20 2007-11-21 任天堂株式会社 Game machines, electronic devices, and the power-saving mode management program
JP3470119B2 (en) * 2002-02-14 2003-11-25 コナミ株式会社 Controller, controller of attitude telemetry device and the video game device
US7227976B1 (en) * 2002-07-08 2007-06-05 Videomining Corporation Method and system for real-time facial image enhancement
US20040063502A1 (en) * 2002-09-24 2004-04-01 Intec, Inc. Power module
JP2006509548A (en) * 2002-12-10 2006-03-23 ノキア コーポレイション METHOD AND APPARATUS player electronic multi-player game to continue the game in the absence of
JP4906239B2 (en) * 2003-04-16 2012-03-28 株式会社ソニー・コンピュータエンタテインメント Communication device, game machine, and communication method
US7815507B2 (en) * 2004-06-18 2010-10-19 Igt Game machine user interface using a non-contact eye motion recognition device
JP2006099468A (en) * 2004-09-29 2006-04-13 Toshiba Corp Gesture input device, method, and program
US7620316B2 (en) * 2005-11-28 2009-11-17 Navisense Method and device for touchless control of a camera
US7834850B2 (en) * 2005-11-29 2010-11-16 Navisense Method and system for object control
JP4481280B2 (en) * 2006-08-30 2010-06-16 富士フイルム株式会社 Image processing apparatus, and image processing method
US8277316B2 (en) * 2006-09-14 2012-10-02 Nintendo Co., Ltd. Method and apparatus for using a common pointing input to control 3D viewpoint and object targeting

Also Published As

Publication number Publication date
WO2007130872A2 (en) 2007-11-15
WO2007130999A3 (en) 2008-12-18
JP5726811B2 (en) 2015-06-03
WO2007130872A3 (en) 2008-11-20
WO2007130833A3 (en) 2008-08-07
JP2009535175A (en) 2009-10-01
JP2009535179A (en) 2009-10-01
WO2007130833A2 (en) 2007-11-15
JP2012179420A (en) 2012-09-20
WO2007130999A2 (en) 2007-11-15
JP5301429B2 (en) 2013-09-25
JP2012135642A (en) 2012-07-19
JP2012135643A (en) 2012-07-19

Similar Documents

Publication Publication Date Title
US9411437B2 (en) Easily deployable interactive direct-pointing system and presentation control system and calibration method therefor
US9218058B2 (en) Wearable digital input device for multipoint free space data collection and analysis
JP3759152B2 (en) Data processing apparatus having a data input device and such a data input device for use with a data processing device
US5611731A (en) Video pinball machine controller having an optical accelerometer for detecting slide and tilt
US10065108B2 (en) Video game using dual motion sensing controllers
CN101484933B (en) Method and apparatus based on the one or more visual, the inertia of the mixed data and the auditory effect to apply to the input of the transmission
EP2359606B1 (en) Tracking system calibration with minimal user input
US9261968B2 (en) Methods and systems for dynamic calibration of movable game controllers
US8223147B1 (en) Method and system for vision-based interaction in a virtual environment
US9266026B2 (en) Method and apparatus for dynamically adjusting game or other simulation difficulty
EP1878013B1 (en) Video game control with joystick
US9339724B2 (en) Method and apparatus for using a common pointing input to control 3D viewpoint and object targeting
US8112371B1 (en) Systems and methods for generalized motion recognition
US9489053B2 (en) Skeletal control of three-dimensional virtual world
US9533220B2 (en) Game controller and game system
US9474968B2 (en) Method and system for applying gearing effects to visual tracking
EP1832967B1 (en) Coordinate calculating apparatus and coordinate calculating program
US20060233389A1 (en) Methods and apparatus for targeted sound detection and characterization
US10279254B2 (en) Controller having visually trackable object for interfacing with a gaming system
KR101169813B1 (en) Game system and storage medium having game program stored thereon
JP5675627B2 (en) Mobile device with gesture recognition
US9901828B2 (en) Method for an augmented reality character to maintain and exhibit awareness of an observer
ES2527047T3 (en) Video game controller and video game system
US20080001951A1 (en) System and method for providing affective characteristics to computer generated avatar during gameplay
US20120086630A1 (en) Using a portable gaming device to record or modify a game or application in real-time running on a home gaming system

Legal Events

Date Code Title Description
A131 Notification of reasons for refusal

Free format text: JAPANESE INTERMEDIATE CODE: A131

Effective date: 20131203

A601 Written request for extension of time

Free format text: JAPANESE INTERMEDIATE CODE: A601

Effective date: 20140303

A602 Written permission of extension of time

Free format text: JAPANESE INTERMEDIATE CODE: A602

Effective date: 20140306

A521 Written amendment

Free format text: JAPANESE INTERMEDIATE CODE: A523

Effective date: 20140403

TRDD Decision of grant or rejection written
A01 Written decision to grant a patent or to grant a registration (utility model)

Free format text: JAPANESE INTERMEDIATE CODE: A01

Effective date: 20150317

A61 First payment of annual fees (during grant procedure)

Free format text: JAPANESE INTERMEDIATE CODE: A61

Effective date: 20150401

R150 Certificate of patent or registration of utility model

Ref document number: 5726793

Country of ref document: JP

Free format text: JAPANESE INTERMEDIATE CODE: R150

R250 Receipt of annual fees

Free format text: JAPANESE INTERMEDIATE CODE: R250

R250 Receipt of annual fees

Free format text: JAPANESE INTERMEDIATE CODE: R250