CN110665220B - Game Controller - Google Patents

Game Controller

Info

Publication number
CN110665220B
CN110665220B CN201910768967.7A
Authority
CN
China
Prior art keywords
game
controller
user
video
main body
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201910768967.7A
Other languages
Chinese (zh)
Other versions
CN110665220A (en)
Inventor
R. Nakayama
E. Huang
N. Gary
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Sony Interactive Entertainment America LLC
Original Assignee
Sony Computer Entertainment America LLC
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from US13/842,975 external-priority patent/US9116555B2/en
Application filed by Sony Computer Entertainment America LLC filed Critical Sony Computer Entertainment America LLC
Priority to CN201910768967.7A priority Critical patent/CN110665220B/en
Publication of CN110665220A publication Critical patent/CN110665220A/en
Application granted granted Critical
Publication of CN110665220B publication Critical patent/CN110665220B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Abstract

The present invention provides a controller for interfacing with an interactive application, comprising: a housing defined by a main body, a first extension extending from a first end of the main body, and a second extension extending from a second end of the main body, the first and second extensions being for grasping by first and second hands, respectively, of a user; an input device positioned along a top surface of the body; a touch sensitive panel defined along the top surface of the body.

Description

Game controller
This application is a divisional of application No. 201410099053.3, filed March 17, 2014, and entitled "Game Controller".
Technical Field
The present invention relates to a controller for interfacing with an interactive program.
Background
The video game industry has seen many changes over the years. As computing power has expanded, video game developers have likewise created game software that takes advantage of these increases in computing power. To this end, video game developers have been coding games that incorporate sophisticated operations and mathematics to produce a very realistic game experience.
Example gaming platforms include the Sony PlayStation®2 (PS2) and the Sony PlayStation®3 (PS3), each of which is sold in the form of a game console. As is well known, a game console is designed to connect to a monitor (typically a television) and to enable user interaction through a handheld controller. An example of a handheld controller is the DUALSHOCK®3 wireless controller manufactured by Sony Computer Entertainment Inc.
Embodiments of the invention are produced in this context.
Summary of The Invention
Embodiments of the present invention provide a controller that interfaces with an interactive application (e.g., a video game). Several inventive embodiments of the present invention are described below.
In one embodiment, a controller for interfacing with an interactive application is provided, comprising: a housing defined by a main body, a first extension extending from a first end of the main body, and a second extension extending from a second end of the main body, the first and second extensions being for grasping by first and second hands, respectively, of a user; an input device positioned along a top surface of the body; a touch sensitive panel defined along the top surface of the body.
In one embodiment, the input device is selected from the group consisting of a joystick, a button, a trigger, and a steering wheel.
In one embodiment, the controller further comprises a tracking panel defined along a front side surface of the body; and a light defined in the body to illuminate the tracking panel.
In one embodiment, the tracking panel is defined by a translucent material.
In one embodiment, the controller further comprises one or more of an accelerometer, a gyroscope, or a magnetometer.
In one embodiment, the input device is a button configured to activate a sharing interface to share the recorded game to the user's social graph.
In one embodiment, sharing the recorded gameplay includes sharing one or more images or video clips.
In one embodiment, sharing the recorded gameplay includes streaming live video of the user's gameplay.
In another embodiment, a controller for interfacing with an interactive application includes: a housing defined by a main body, a first extension extending from a first end of the main body, and a second extension extending from a second end of the main body, the first and second extensions being for grasping by first and second hands, respectively, of a user; a button positioned along a top surface of the body, the button configured to activate a sharing interface to share the recorded game to a social graph of the user.
In one embodiment, sharing the recorded gameplay includes sharing one or more images or video clips.
In one embodiment, sharing the recorded gameplay includes streaming live video of the user's gameplay.
In one implementation, the controller also includes a touch sensitive panel defined along the top surface of the body.
In one embodiment, the controller further comprises an input device positioned along the top surface of the body.
In one embodiment, the input device is selected from the group consisting of a joystick, a button, a trigger, and a steering wheel.
In one embodiment, the controller further comprises a tracking panel defined along a front side surface of the body; and a light defined in the body to illuminate the tracking panel.
In one embodiment, the tracking panel is defined by a translucent material.
In one embodiment, the controller further comprises one or more of an accelerometer, a gyroscope, or a magnetometer.
In another embodiment, a controller for interfacing with an interactive application is provided, comprising: a housing defined by a main body, a first extension extending from a first end of the main body, and a second extension extending from a second end of the main body, the first and second extensions being for grasping by first and second hands, respectively, of a user; a touch sensitive panel defined along the top surface of the body; a button positioned along a top surface of the body, the button configured to activate a sharing interface to share the recorded game to a social graph of a user; a tracking panel defined along a front side surface of the body; and a light defined in the body to illuminate the tracking panel.
In one embodiment, the tracking panel is defined by a translucent material.
In one embodiment, sharing the recorded gameplay includes sharing one or more images or video clips.
In one embodiment, sharing the recorded gameplay includes streaming live video of the user's gameplay.
Other aspects of the invention will become apparent from the following detailed description, taken in conjunction with the accompanying drawings, illustrating by way of example the principles of the invention.
Brief Description of Drawings
The invention may be best understood by reference to the following description taken in conjunction with the accompanying drawings, in which:
fig. 1 illustrates a perspective view of a controller 10 for interfacing with an interactive program according to an embodiment of the present invention.
Fig. 2 illustrates a user's hand holding the controller 10 in the vicinity of the display 30 according to an embodiment of the present invention.
Fig. 3 illustrates the combined use of input devices on the controller 10 to control virtual perspectives in a three-dimensional virtual environment in accordance with an embodiment of the present invention.
Fig. 4A and 4B illustrate side and front views, respectively, of a controller 10 according to an embodiment of the present invention.
Fig. 5A illustrates a side view of the controller 10 according to an embodiment of the present invention.
Fig. 5B-5M illustrate various tracking features for tracking a panel according to embodiments of the invention.
Fig. 6 illustrates a perspective view of a controller 10 having a recess 56 defined along a front side of a body of the controller, according to an embodiment of the present invention.
Fig. 7 illustrates a bottom view of the controller 10 according to an embodiment of the present invention.
Fig. 8 illustrates a cross-sectional view of the controller 10, which shows the operation of the 3D control lever 22.
Fig. 9 illustrates a back perspective view of the controller 10 according to an embodiment of the present invention.
Fig. 10A illustrates another perspective view of the controller 10 according to an embodiment of the present invention.
Fig. 10B illustrates the controller 10 without a 3D joystick according to an embodiment of the present invention.
Fig. 11A illustrates a top view of a controller device according to an embodiment of the present invention.
Fig. 11B illustrates a perspective view of a controller device according to an embodiment of the present invention.
FIG. 12 illustrates hardware and user interfaces that may be used to provide interaction with a video game according to one embodiment of the invention.
FIG. 13 illustrates additional hardware that may be used to process instructions according to one embodiment of the invention.
Fig. 14 is an example illustration of scenes A through E and respective users A through E interacting with a game client 1102 that is connected to a server process via the internet, according to one embodiment of the present invention.
Fig. 15 illustrates an embodiment of an information service provider architecture.
Detailed Description
The following embodiments describe methods and apparatus for interfacing with interactive programs.
In the following description, numerous specific details are set forth in order to provide a thorough understanding of the present invention. It will be apparent, however, to one skilled in the art that the present invention may be practiced without some or all of these specific details. In other instances, well-known process operations have not been described in detail in order not to unnecessarily obscure the present invention.
Fig. 1 illustrates a perspective view of a controller 10 for interfacing with an interactive program according to an embodiment of the present invention. The controller 10 includes a main body 12 and extension portions 14a and 14b. The extension portions 14a and 14b are configured to be held by the left and right hands, respectively, of the user and thus function as handles or handle portions to enable the user to securely grasp the controller. Various input devices, such as buttons 16, levers 18, and a steering wheel 20, are included on the top surface of the body 12. The top of the 3D control lever 22, which extends through the body of the controller from top to bottom, is also shown and described in more detail below. A speaker 24 is provided for playing sound, which provides feedback to the user.
Further, the controller 10 includes a touch panel 26 defined along the back side of the body that faces the user when the controller is held in a conventional position. The touch panel 26 is oriented in a generally vertical manner and is between the extensions 14a and 14b such that a user holding the controller through the extensions can easily use the touch panel with the thumb of either hand. Touch panel 26 utilizes touch sensitive technology (e.g., resistive, capacitive, etc.) to detect touch gestures. In the illustrated embodiment, the touch panel 26 also has a slight outward curvature from bottom to top that provides a tactile sensation by virtue of its shape, enabling a user to easily determine the approximate vertical position of his thumb on the touch panel based solely on the sensation.
In another embodiment, the controller 10 may include one or more microphones for capturing sound from the interactive environment. In some implementations, the microphones may be arranged as a microphone array; in one implementation, they constitute a linear array. When three or more microphones are included in the microphone array, the location of a sound source relative to the microphones may be determined based on analysis of the audio data captured from the array. More specifically, a sound source may be localized relative to the microphone array based on the relative timing of its sound as captured by each microphone of the array. In combination with the known position and orientation of the controller (e.g., as determined based on the sensors and tracking methods described elsewhere herein), and by extension the known position and orientation of the microphone array, the position of the sound source within the interactive environment can then be determined. Further, the captured sound may be processed to exclude sounds that do not originate from a certain region of the interactive environment.
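The arrival-time analysis described above can be sketched with a minimal far-field model: for a pair of microphones, the arrival-time difference determines the angle between the source direction and the array axis. This is an illustrative sketch only; the function name, the spacing, and the speed-of-sound constant are assumptions, not part of the disclosure.

```python
import math

SPEED_OF_SOUND = 343.0  # m/s, approximate value at room temperature

def bearing_from_delay(mic_spacing, delay):
    """Estimate the far-field bearing of a sound source from the
    arrival-time difference between two microphones.

    mic_spacing: distance between the two microphones, in meters.
    delay: arrival time at mic B minus arrival time at mic A, in seconds.
    Returns the angle in degrees between the source direction and the
    array axis (0 degrees = directly along the axis toward mic A).
    """
    # The path-length difference is c * delay; clamp for numerical safety.
    cos_theta = max(-1.0, min(1.0, SPEED_OF_SOUND * delay / mic_spacing))
    return math.degrees(math.acos(cos_theta))
```

With three or more microphones, bearings from multiple pairs can be intersected to localize the source, and combined with the controller's tracked pose to place the source within the interactive environment.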
In one embodiment, the touch panel 26 may be coupled with a display screen to provide a touch screen interface as part of the controller. The touch screen interface may be controlled by the interactive application to display various images in accordance with the interactive application. For example, the touch screen may display an image demarcating regions of the touch screen that correspond to various functions of the interactive application. As another example, the touch screen may display a line indicating a gesture that the user may perform; the user may trace the line shown on the touch screen in order to perform the indicated gesture. As yet another example, the touch screen may be configured to display the user's gesture or gestures by providing a visual trace line where the user touches and slides/moves his finger along the touch screen. In one embodiment, the touch screen is configured to display the most recent gesture or gestures that have been performed on the touch screen. For example, an old gesture may be removed from the display when a new gesture is detected, or the trace line may be configured to fade from the display over time, or to be removed from the display after some preset time has elapsed.
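The fading trace-line behavior can be sketched as a buffer of timestamped touch points that are dropped once a preset lifetime has elapsed. This is a hypothetical sketch; the class and parameter names are illustrative.

```python
import time

class GestureTrace:
    """Keep recent touch points and drop them after a preset lifetime,
    so an on-screen trace line disappears over time as described."""

    def __init__(self, lifetime=2.0, clock=time.monotonic):
        self.lifetime = lifetime   # seconds a point remains visible
        self.clock = clock         # injectable clock for testing
        self.points = []           # list of (timestamp, x, y)

    def add(self, x, y):
        """Record a new touch point at the current time."""
        self.points.append((self.clock(), x, y))

    def visible(self):
        """Return only the points younger than the lifetime, pruning the rest."""
        now = self.clock()
        self.points = [p for p in self.points if now - p[0] < self.lifetime]
        return [(x, y) for _, x, y in self.points]
```

A renderer would call `visible()` each frame and draw the returned points as the trace line, so old gestures vanish without any explicit erase step.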
The placement of one or more microphones on the controller may be advantageous over alternative microphone placements (e.g., near the display, or on a separate device) because the controller is held by the user and is thus in close proximity to the user. In addition, where there are multiple users each operating a controller, each user's controller is close to that user, which helps reduce crosstalk and thereby facilitates better recognition of sound from a particular user. Furthermore, when multiple controllers are utilized, audio data captured from the multiple controllers may be analyzed in combination, along with the locations and orientations of the controllers, enabling the location of a sound source to be determined with a higher level of accuracy.
Fig. 2 illustrates a user's hand holding the controller 10 in the vicinity of the display 30 according to an embodiment of the present invention. As can be seen, the vertical orientation of the touch panel 26 is such that the touch panel is generally parallel to the display screen 30 when the user is holding the controller in a normal manner and facing the display screen. In this manner, the touch panel 26 may be utilized as an intuitive input mechanism to provide directional input in a plane parallel to the display screen 30 (x-y plane). For example, the touch panel 26 may be used to control a cursor or cross-hair on a screen, or scroll vertically and horizontally, or provide other types of input along the x-y plane.
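Mapping a drag on the touch panel to cursor movement in the plane of the display screen might look like the following sketch; the sensitivity value and screen dimensions are illustrative assumptions.

```python
def move_cursor(cursor, touch_delta, sensitivity=2.0, screen=(1920, 1080)):
    """Map a touch-panel drag (dx, dy) to cursor motion in the x-y plane
    of the display, clamping the result to the screen bounds.

    cursor: current (x, y) position in pixels.
    touch_delta: finger movement on the panel since the last sample.
    """
    x = min(max(cursor[0] + touch_delta[0] * sensitivity, 0), screen[0] - 1)
    y = min(max(cursor[1] + touch_delta[1] * sensitivity, 0), screen[1] - 1)
    return (x, y)
```

Because the panel is roughly parallel to the display, a rightward swipe maps directly to rightward cursor motion, which is what makes this input intuitive.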
Fig. 3 illustrates the combined use of input devices on the controller 10 to control a virtual perspective in a three-dimensional virtual environment, in accordance with an embodiment of the present invention. As shown, the controller 10 is operated in the vicinity of the display 30. In some prior-art systems, a controller having two vertically oriented joysticks is utilized to control the movement and orientation of a virtual perspective associated with a character or other object in a video game. Typically, one joystick controls the x-axis translation (side-to-side movement) and z-axis translation (forward-and-backward movement) of the perspective, while the other joystick controls the pitch (vertical rotation) and yaw (horizontal rotation) of the perspective. However, the use of a vertically oriented joystick to control pitch is non-intuitive, particularly because the forward/backward joystick input must be translated into pitch motion, which is a substantially vertically oriented motion. In fact, it is not uncommon for such systems to provide both a "normal" setting, in which forward and backward movements of the joystick correspond to positive and negative changes in pitch, respectively, and an exactly opposite configuration (sometimes referred to as an airplane-style configuration), in which forward and backward movements of the joystick correspond to negative and positive changes in pitch, respectively.
In contrast to the non-intuitive control scheme just described, the touch panel 26 and the joystick 18 of the presently disclosed controller 10 may be utilized in a more intuitive manner to control the virtual perspective. In one embodiment, the joystick 18 is used to control x-axis and z-axis translation, while the touch panel 26 is used to control pitch and yaw. As the touch panel 26 is defined along a side of the body of the controller and vertically oriented so as to be generally parallel to the plane of the display 30, its use to control pitch and yaw is intuitive for the user. In the illustrated embodiment, the right joystick 18 is shown for ease of description only and is not limiting. In embodiments of the controller 10 having left and right joysticks, one or both of the joysticks may be configured to control x-axis and z-axis translation.
In still other implementations, it will be appreciated that the touch panel 26 and the joystick 18 may be combined in other ways to provide control of the virtual perspective. For example, the touch panel 26 may be used to control pitch and x-axis translation, while the joystick 18 is used to control yaw and z-axis translation. Furthermore, although reference has been made to control of a virtual perspective, the presently described control schemes may be applied to control the movement, orientation, or position of any type of character, vehicle, weapon, or other object in a virtual environment.
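One way to sketch the control scheme described above, with the joystick driving x/z translation and the touch panel driving pitch and yaw, is shown below. All names, speed constants, and the pitch clamp are illustrative assumptions.

```python
from dataclasses import dataclass

@dataclass
class Camera:
    x: float = 0.0
    z: float = 0.0
    pitch: float = 0.0  # degrees, vertical rotation
    yaw: float = 0.0    # degrees, horizontal rotation

def update_camera(cam, stick, touch, move_speed=5.0, look_speed=90.0, dt=1/60):
    """Advance the virtual perspective by one frame.

    stick: (sx, sz) joystick deflection in [-1, 1] -> x/z translation.
    touch: (tx, ty) normalized touch-panel swipe -> yaw/pitch rotation.
    """
    cam.x += stick[0] * move_speed * dt
    cam.z += stick[1] * move_speed * dt
    cam.yaw += touch[0] * look_speed * dt
    cam.pitch += touch[1] * look_speed * dt
    cam.pitch = max(-89.0, min(89.0, cam.pitch))  # avoid flipping past vertical
    return cam
```

Swapping which axes each device controls (e.g., touch for pitch and x-translation) is a one-line change in this mapping, mirroring the alternative combinations described above.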
Fig. 4A and 4B illustrate side and front views, respectively, of a controller 10 according to an embodiment of the present invention. As shown, the controller 10 includes a tracking panel 52 defined along the front side of the body opposite the back side on which the touch panel is located. Tracking panel 52 is illuminated and may be visually tracked in accordance with image recognition techniques to determine the position and orientation of controller 10. Tracking panel 52 may be contoured to provide a wide angle of visibility. For example, in the illustrated embodiment, the tracking panel 52 includes an upwardly facing top 54a and a downwardly facing bottom 54b, the combination of which provides a wide angle of visibility to enable the controller 10 to be visually tracked in a wide variety of locations. The tracking panel 52 may be defined by a translucent material and internally illuminated by a light source, such as one or more LEDs.
Fig. 5A illustrates a side view of the controller 10 according to an embodiment of the present invention. As can be seen, the tracking panel 52 is defined so as to also serve as a support for the front of the controller, thus preventing the bottom trigger 50 from being accidentally depressed.
It will be appreciated that in various implementations, the tracking panel 52 may have any of a variety of tracking features of various shapes, sizes, or forms. Some examples of these shapes and forms are provided with reference to figs. 5B-5M. In one implementation, illustrated in fig. 5B, the tracking panel defines a single rectangular shape. In other embodiments, the tracking panel may have any other shape, such as circular, oval, triangular, square, polygonal, and so forth. In another embodiment, the tracking panel may include a plurality of rectangular shapes, as shown in fig. 5C. It will be appreciated that the multiple rectangular shapes may be illuminated in the same color or in different colors. Although eight rectangular shapes are shown in fig. 5C, any number of rectangular shapes may be present. In the embodiment of fig. 5D, the tracking panel is defined by a plurality of circles. In the embodiment of fig. 5E, the tracking panel is defined by a single oval shape. In the embodiment of fig. 5F, the tracking panel includes a plurality of shapes, including a square, a triangle, a circle, and an "X", all of which are horizontally aligned. It will be appreciated that any combination of shapes may be utilized in embodiments of the present invention.
When multiple tracking features (e.g., multiple shapes, which may be the same shape or different shapes) are utilized, they may be individually illuminated to facilitate identifying a particular controller when multiple controllers are present. For example, a first controller may be controlled to illuminate a particular one or combination of the shapes, a second controller may be controlled to illuminate a different particular one or combination of the shapes, and so on. In this way, each of the plurality of controllers may be identified and resolved from one another based on analysis of the captured image of the tracking feature, as each controller is configured to illuminate a unique one or combination of shapes that exist as tracking features on the controller.
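Assigning each controller a unique combination of illuminated shapes, and resolving a controller from the shapes observed in a captured image, can be sketched as follows. The shape names and the assignment strategy are illustrative assumptions.

```python
from itertools import combinations

SHAPES = ["square", "triangle", "circle", "x"]

def assign_patterns(num_controllers):
    """Give each controller a unique subset of tracking shapes to
    illuminate, so captured images can tell controllers apart."""
    patterns = []
    for r in range(1, len(SHAPES) + 1):
        patterns.extend(frozenset(c) for c in combinations(SHAPES, r))
        if len(patterns) >= num_controllers:
            break
    if len(patterns) < num_controllers:
        raise ValueError("not enough distinct patterns for that many controllers")
    return dict(enumerate(patterns[:num_controllers]))

def identify(observed_shapes, assignments):
    """Match the set of lit shapes seen in a captured frame back to a
    controller id; returns None when no assignment matches."""
    observed = frozenset(observed_shapes)
    for controller_id, pattern in assignments.items():
        if pattern == observed:
            return controller_id
    return None
```

With four shapes there are 15 non-empty subsets, so up to 15 controllers could in principle be distinguished by shape combination alone, before color is even considered.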
In the embodiment of fig. 5G, the tracking panel includes a plurality of vertically oriented rectangles positioned in a horizontal array. In the embodiment of fig. 5H, the tracking panel includes a plurality of horizontally oriented rectangles arranged in a vertical array. Fig. 5I illustrates an embodiment in which the tracking panel includes letters. It will be appreciated that the tracking panel may include any letters, numbers, symbols, or other characters, according to various embodiments of the invention. While the illustrated embodiment includes multiple shapes that are separate from one another, it will be appreciated that in other embodiments, such as the embodiment shown in fig. 5J, there may be multiple shapes disposed adjacent one another. In the embodiment illustrated in fig. 5J, the tracking panel defines a plurality of rectangles arranged adjacent to each other with no space between adjacent rectangles.
It will also be appreciated that the tracking panel may be defined to have various three-dimensional shapes. For example, fig. 5K illustrates a cross-sectional view of a tracking panel including concave features for tracking. Fig. 5L illustrates a cross-sectional view of a tracking panel including convex features for tracking. The foregoing examples of features that may be defined as part of a tracking panel are provided by way of example only and are not limiting. Those skilled in the art will appreciate that in various other implementations, the tracking panel may include features of any shape, size, or form.
Fig. 5M illustrates a tracking panel defined by a matrix of pixels, each of which may be individually illuminated. The pattern illuminated by the pixels of the tracking panel may have any of a variety of designs, and different controllers may be configured to display different patterns when multiple controllers are in operation.
Fig. 6 illustrates a perspective view of a controller 10 having a recess 56 defined along the front side of the controller body, according to an embodiment of the present invention. The recess may be illuminated and utilized for visual tracking of the controller. In one embodiment, the recess is differentially illuminated: for example, from top to bottom or from side to side, with smooth or abrupt transitions, with different colors or different brightness/darkness, or combinations thereof. Variations in the size, shape, or orientation of the differentially illuminated recess as it appears in captured images of the controller may be detected and analyzed to determine the position and orientation of the controller relative to the image capture device. In various embodiments, the particular shape of the recess 56 may vary both in its surface shape (the outline the recess defines at the controller surface) and in the shape of the recessed portion. For example, the surface shape may be circular, rectangular or bar-shaped as shown, polygonal, etc., and the recessed portion may be semicircular, semi-oval, angular, faceted, etc. Additionally, in some implementations, more than one recess may be defined on the controller.
Fig. 7 illustrates a bottom view of the controller 10 according to an embodiment of the present invention. The bottom of the 3D control lever 22 is visible protruding from the recess 60.
Fig. 8 illustrates a cross-sectional view of the controller 10, showing the operation of the 3D control lever 22. As mentioned, the 3D control lever 22 extends through the main body of the controller from its top surface to its bottom surface. The 3D control lever 22 includes a top plate 70a and a bottom plate 70b for contact by the user's fingers. In one embodiment, the lever is mounted at its center and pivots about this central mounting; thus, horizontal movement of the top plate 70a results in movement of the bottom plate 70b in the opposite direction, and vice versa. Furthermore, in one embodiment, the 3D control lever 22 may translate up and down in a vertical manner. In one embodiment, the 3D control lever 22 may also translate in a horizontal direction, moving the entire lever horizontally.
In one embodiment, this free movement of the 3D control lever 22 (including pivoting about its central position, as well as vertical and horizontal translational movement) is accomplished by means of a floating mount. For example, in one embodiment, a central portion of the lever is mounted in a compliant material 72 that allows the lever to "float," thereby facilitating such movement. In one embodiment, the lever includes a flange 71 to facilitate secure mounting within the compliant material 72. The compliant material may be any type of resilient material that allows the lever 22 to be moved by the user but returns the lever to a normal centered orientation when not acted upon by the user. A sensor 74 detects movement of the lever 22, and a sensor data analyzer 76 analyzes the raw data from the sensor to determine the orientation and/or movement of the lever 22.
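The sensor data analyzer's interpretation of lever deflection can be sketched as follows: readings inside a small deadzone map to the centered rest position of the compliant mount, and because the lever pivots about its center, the bottom plate's direction is opposite the top plate's. This is a hypothetical sketch; the deadzone value and output format are assumptions.

```python
def analyze_lever(raw_top, deadzone=0.05):
    """Interpret a raw top-plate deflection reading in [-1, 1].

    Readings inside the deadzone are treated as the lever resting in the
    centered orientation enforced by the compliant mount. Because the
    lever pivots about its central mounting, the bottom plate moves in
    the direction opposite the top plate.
    """
    if abs(raw_top) < deadzone:
        return {"tilt": 0.0, "bottom_direction": 0.0}
    sign = 1.0 if raw_top > 0 else -1.0
    # Rescale so output ramps smoothly from 0 at the deadzone edge to 1.
    tilt = sign * (abs(raw_top) - deadzone) / (1.0 - deadzone)
    return {"tilt": tilt, "bottom_direction": -sign}
```

The deadzone-and-rescale step is a common way to keep a spring-centered input from jittering around zero while preserving the full output range.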
Fig. 9 illustrates a back perspective view of the controller 10 according to an embodiment of the present invention.
Fig. 10A illustrates another perspective view of the controller 10 according to an embodiment of the present invention. Fig. 10B illustrates the controller 10 without a 3D joystick according to an embodiment of the present invention.
It will be appreciated that in various embodiments the controller may include any of a variety of additional features, including but not limited to haptic feedback mechanisms such as a vibration mechanism, various data and power connectors such as a USB connector, and various inertial sensors such as accelerometers, gyroscopes, and magnetometers. Additional details regarding features that may be included in the controller may be found with reference to U.S. patent application No. 12/259,181, entitled "Determining Location and Movement of Ball-Attached Controller," filed October 27, 2008, and U.S. patent application No. 11/382,036, entitled "Method and System for Applying Gearing Effects to Visual Tracking," filed May 6, 2006, the disclosures of which are incorporated herein by reference.
Fig. 11A illustrates a top view of a controller device 100 according to an embodiment of the present invention. The controller 100 includes various buttons 102 that may be configured for various purposes, and a direction button 104 for providing a direction input. A left joystick 108 and a right joystick 110 are provided. The system buttons 112 may perform or otherwise provide access to various system functions, such as exiting a video game, pausing or closing a game console, shutting down the controller device 100, and so forth.
Speakers 114 are provided to allow audio output at the controller, which can enrich the user's gameplay by allowing some audio to be presented through the controller rather than together with the remainder of the video game's audio output. For example, ambient sound from the video game's virtual environment may be presented through the normal audio mechanism (e.g., as part of the audio-visual output of the video game), while audio from a communication (e.g., a telephone or radio communication) is presented specifically through the controller speaker 114. When multiple players participate in a game, the audio for a particular player may be routed to the controller operated by that player. In this way, players in a multiplayer game can each receive audio specific to them, and easily discern that such audio applies to them, even while participating in the game in the same local gaming environment.
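Splitting audio between the main (television) mix and per-player controller speakers can be sketched as follows; the event format and player identifiers are illustrative assumptions.

```python
def route_audio(events, players):
    """Split audio events between the main mix and per-player controller
    speakers: ambient sounds go to the main output, while events tagged
    with a player id (e.g., an in-game radio call) go only to that
    player's controller speaker."""
    main_mix = []
    controller_mix = {p: [] for p in players}
    for event in events:
        target = event.get("player")
        if target in controller_mix:
            controller_mix[target].append(event["sound"])
        else:
            main_mix.append(event["sound"])
    return main_mix, controller_mix
```

In a multiplayer session, each player hears only the sounds routed to their own controller, which is what makes the audio easy to attribute even in a shared room.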
The controller device 100 includes a touch-sensitive panel 116 to facilitate touch-based input. An options button 118 may be configured to provide access to various options, which may be specific to a game console, a cloud gaming platform, a particular video game, or some other context. A share button 120 may provide access to a sharing interface for sharing the user's gameplay to a social network, such as sharing a screenshot or video clip of the gameplay, or initiating live streaming of the user's current gameplay. In one embodiment, the share button provides access to a cached video of the user's most recent gameplay, from which the user may select a portion or a screen capture to share to a social network, such as a gaming social network or another social network.
Fig. 11B illustrates a perspective view of the controller device 100 according to an embodiment of the present invention. As shown, the controller 100 includes trigger buttons 130, 132, 134, and 136, which provide additional inputs for the game. A light bar 138 is defined on the front side of the controller 100 to facilitate identification and tracking of the controller 100. The light bar 138 may be illuminated to have a particular color and may be identified from a captured image of the gaming environment. It will be appreciated that the position and orientation of the controller 100 may be determined by tracking the light bar 138.
In one embodiment, a method for storing gameplay is contemplated. The storing may be performed by the operating system of the game console in response to a user request, which may take the form of a standard file operation with respect to a data set related to the desired game. The request may be transmitted from an application associated with the game. The stored content may include, for example, video content, audio content, and/or still visual content, including wallpaper, themes, code "extension" content, or any other type of content related to the game. It is contemplated that such content may be user-generated or developer-generated, free or paid, full or trial, and/or sold or rented.
A portion of the gameplay may be cached, i.e., temporarily stored. For example, the previous 15 seconds of action within the game may be temporarily stored, as further described herein. The term "portion" as used herein may correspond to any division of a game into any relevant grouping of single or multiple bits or bytes of data. For example, a "portion" of a game may correspond to a level, chapter, scene, act, character, background, structure, route, action, song, theme, duration, size, file, portion thereof, and combinations thereof. Further, portions of the gameplay may include screenshots or video captures of a prescribed duration.
In one embodiment, portions of the game may be stored locally in temporary or permanent memory on the game console. Alternatively or additionally, portions of the game may be transmitted over a network for remote storage. For example, portions of the game may be transmitted over a wireless or wired network to another computing device, another game console, or a remote server. Such remote servers may include social media servers.
Optionally, portions of the gameplay falling outside a particular interval of interest (e.g., a particular duration, level, chapter, route, etc.) may be removed from the cache. This removal may be accomplished using standard file operations of the operating system.
Portions of the game may be displayed on any number of display devices that have access to the stored game. For example, the stored games may be displayed on a television connected to a game console from which the games were captured. In another example, the stored game may be displayed on a computer to which the stored game is transmitted. The stored games may be displayed alone or in conjunction with other information, such as on a social media website.
In one embodiment, portions of the gameplay are displayed by another game console associated with a user other than the user caching or capturing the gameplay. According to this embodiment, a portion of the gameplay may show, from the perspective of a first user, a ball being thrown from the first user to a second user. That portion may then be transmitted to the game console of the second user, who may then view the gameplay from the perspective of the first user. The second user may also have a stored portion showing, from the second user's perspective, the ball being thrown by the first user and caught by the second user. In this case, the second user may view the gameplay from both the first user's perspective and the second user's perspective. Further, the portion stored by the second user may be transmitted to the first user's game console so that the first user may likewise view the gameplay from both perspectives. The present embodiment is applicable to any number of users having any number of perspectives, and thus the gameplay may be viewed from any number of different perspectives.
With respect to the storage, transmission, and/or display of a portion of a game as described herein, it is contemplated that a portion of a game may be stored, transmitted, and displayed as image or video data. However, in another embodiment, a portion of the game may be stored and transmitted as telemetry or metadata representing the image or video data, and may be recreated into an image or video by a game console or other device prior to display.
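Deterministic re-simulation is what makes the telemetry/metadata variant workable: rather than storing rendered frames, the console stores only the inputs and recreates identical frames on demand before display. A minimal sketch using a toy deterministic game (the function and state here are illustrative assumptions, not the patent's format):

```python
# Toy deterministic "game": the player's position advances by each input step.
def simulate(inputs):
    pos, frames = 0, []
    for step in inputs:
        pos += step
        frames.append(pos)   # one rendered "frame" of state per input
    return frames

# Telemetry is just the input stream -- far smaller than image or video data.
inputs = [1, 2, -1, 3]
frames_original = simulate(inputs)   # what the capturing console rendered
frames_replayed = simulate(inputs)   # recreated later from telemetry alone
```

Because the simulation is deterministic, the replayed frames match the original rendering exactly, so only the compact input log needs to be stored and transmitted.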
In some embodiments, a portion of the gameplay has a predetermined relationship to the gameplay currently being executed. For example, a portion may correspond to a certain amount of gameplay preceding the current moment (e.g., the previous 10 seconds of gameplay). In another embodiment, a first portion of the gameplay has a predetermined relationship to a second portion. For example, a first portion may correspond to an amount of gameplay preceding receipt of a request to capture a second portion (e.g., the 10 seconds of gameplay prior to selecting the capture button). In each of these embodiments, the amount of gameplay buffered prior to the current or requested gameplay may be configured and adjusted by the user according to their particular preferences.
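The pre-capture behavior described here can be sketched as a rolling buffer that always holds the most recent N seconds of gameplay frames; a capture request simply copies whatever the buffer currently contains. A minimal Python illustration (the class, frame strings, and frame rate are assumptions for illustration only):

```python
from collections import deque

class GameplayBuffer:
    """Rolling buffer holding the most recent N seconds of gameplay frames."""

    def __init__(self, seconds=10, fps=60):
        # deque with maxlen evicts the oldest frame automatically
        self.frames = deque(maxlen=seconds * fps)

    def record(self, frame):
        # Called once per rendered frame during gameplay.
        self.frames.append(frame)

    def capture(self):
        # On a capture/share request, the buffered frames cover the
        # moments *before* the button was pressed.
        return list(self.frames)

buf = GameplayBuffer(seconds=2, fps=3)    # tiny buffer for illustration
for n in range(10):                       # simulate 10 frames of gameplay
    buf.record(f"frame-{n}")
clip = buf.capture()                      # only the last 2 s x 3 fps = 6 frames remain
```

Adjusting the user's preferred pre-capture duration amounts to changing the `seconds` parameter when the buffer is created.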
In other embodiments, the buffer is "intelligent" or "flexible," capturing gameplay according to variables other than a fixed time window. In one such embodiment, a first portion of the gameplay has a predetermined relationship to a game-related event. For example, a first portion may be buffered so as to include statistical anomalies, such as reaching a high score, collecting a large number of points in a short period of time, pressing a large number of buttons on the controller, and other rare events. Such statistical anomalies may generally be determined by comparing gameplay metrics against average metrics for a particular game or for all games. Such average metrics may be stored locally or remotely for comparison. For example, a game console may track the global high score of a particular game and buffer gameplay in which the user approaches and surpasses that high score. In another example, a remote server may track the global high score of a particular game and communicate this information to a game console, which buffers gameplay in which the user approaches and surpasses the high score.
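The high-score trigger amounts to a simple comparison between a live gameplay metric and a tracked reference metric. A hedged sketch (the 90% "approaching" margin and the variable names are illustrative assumptions, not values from the patent):

```python
def should_capture(current_score, reference_high, margin=0.9):
    """Trigger buffering when the player approaches or surpasses a high score.

    `reference_high` would come from locally tracked metrics or be supplied
    by a remote server; `margin` defines how close counts as "approaching".
    """
    return current_score >= margin * reference_high

GLOBAL_HIGH = 100_000
scores = [20_000, 85_000, 92_000, 101_000]   # simulated running score
flagged = [s for s in scores if should_capture(s, GLOBAL_HIGH)]
# 92,000 (approaching) and 101,000 (surpassing) would trigger buffering.
```

The same pattern generalizes to other statistical anomalies by swapping in a different metric and reference value.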
In another example, a portion of the gameplay may be buffered so as to include achievements, such as awards obtained or other landmarks reached. Such awards or landmarks may commemorate any goal or gaming achievement, such as attaining a certain number of points, reaching a certain level, etc. For example, gameplay may be buffered so as to include the award given for reaching level 10, for attaining 100,000 points, and so forth.
Similarly, in addition to the actual attainment of an award or occurrence of a statistical anomaly, progress toward the event may be buffered for inclusion in a portion of the gameplay. For example, a screenshot may be taken at each of levels 1 through 10, creating an album commemorating the receipt of the award for reaching level 10. As another example, video of a user's first through fifth winning matches may be captured, where five wins earns an award.
Thus, according to embodiments of the present invention, at least a portion of the gameplay being executed may be kept in a running buffer at all times. In other words, when a request to share a portion of the gameplay is received, a portion of the preceding gameplay has already been captured, so the shared portion can include footage from before the request. For example, if a request to share is received after a user crosses the finish line of a racing game, the buffered gameplay may include footage of the user crossing the finish line. In other words, the user is able to capture moments that occurred before the share request was made.
It should be appreciated that a user may share gameplay (e.g., a selected screenshot, video, or live gameplay stream) with one or more specifically selected friends, with their entire social graph, or with any user of a social network. The social network may be one related to the platform on which the video game runs, or a third-party social network that exists independent of the video game and its platform. The social network may be accessed through an API defined to allow interfacing with it. Users with whom gameplay is shared may receive a notification informing them of the shared gameplay. Such notifications may take the form of postings to a social news feed, private messages through the social network, in-game notifications, e-mails, chat notifications, and the like. Sharing gameplay with a social network may entail making the gameplay available to subsets of the social network's users who may or may not be part of the sharing user's social graph. For example, for a given video game, any user of the social network who also owns the video game may be granted access to gameplay shared from that video game. Such shared gameplay may be accessed through an online forum, chat room, or other online channel that is open only to players of the video game. In one embodiment, a video game may have a dedicated page or site on a social network, and shared gameplay may be open to users accessing that page or site. Of course, it will be appreciated that, from the perspective of the sharing user, options may be provided allowing users to explicitly tailor with whom, and in what forums, their gameplay will be shared.
While the various interfaces for sharing may be accessed by pressing a dedicated button (e.g., the controller's share button), it will be appreciated that in other embodiments some or all of these interfaces may be bypassed when sharing gameplay with the user's social graph. For example, in one embodiment, the controller's share button may be configured to capture a screenshot of the user's gameplay when pressed. The captured screenshot may then be automatically uploaded and shared to the user's social graph.
In another embodiment, pressing the share button on the controller initiates recording of gameplay video. When the share button is pressed a second time, recording ceases and the video clip may be uploaded and shared to the user's social graph. In one embodiment, the uploading and sharing of the video clip to the user's social graph occurs automatically after the recording operation completes. However, in another embodiment, when the share button is pressed a second time to stop recording, an interface is presented that allows the user to customize various options (e.g., trim the video, select a representative screenshot for the video), determine the particular users with whom to share the video, add text or a title, etc. After customization by the user, the video may be shared with others or made available for viewing.
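The press-to-start, press-again-to-stop behavior is a small toggle state machine. A minimal sketch (the class, frame representation, and `shared` queue are assumptions for illustration, not the patent's implementation):

```python
class ShareButton:
    """Toggle-style share button: first press starts recording, second stops."""

    def __init__(self):
        self.recording = False
        self._clip = None
        self.shared = []          # clips handed off for upload/customization

    def press(self):
        if not self.recording:    # first press: begin recording
            self.recording = True
            self._clip = []
        else:                     # second press: stop, then queue for sharing
            self.recording = False
            self.shared.append(self._clip)
            self._clip = None

    def frame(self, f):
        # Called once per rendered frame; frames are kept only while recording.
        if self.recording:
            self._clip.append(f)

btn = ShareButton()
btn.frame("a")                    # not recording yet: frame is dropped
btn.press()                       # start recording
btn.frame("b"); btn.frame("c")
btn.press()                       # stop: clip ["b", "c"] queued for sharing
```

In the automatic-upload embodiment, the second press would hand the clip straight to an uploader; in the interactive embodiment, it would open the customization interface first.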
In one embodiment, the share button on the controller may be configured to share a gameplay video of predetermined duration to the social network. For example, the user may specify that when the share button is pressed, the previous 10 seconds of gameplay video will be shared to the user's social graph. In another embodiment, it may be specified that when the share button is pressed, the next 10 seconds of gameplay video will be recorded and shared to the social graph. It should be appreciated that options for trimming videos and performing other types of customization may be applied to the recorded gameplay video. Furthermore, gameplay video of predetermined duration recorded after the button is triggered may be combined with buffered gameplay video as already described.
In yet another embodiment, the share button on the controller device may be configured to begin live video streaming of the user's active gameplay. Access to the live video stream may be restricted to members of the user's social graph only, or defined for other smaller or larger groups of users, such as a particular subset of the user's social graph, all users who own or have access to the same video game, any user of the gaming platform, etc.
FIG. 12 illustrates hardware and user interfaces that may be used to provide interaction with a video game according to one embodiment of the invention. FIG. 12 schematically illustrates the overall system architecture of a Sony® PlayStation 3® entertainment device, a console that may be compatible for interfacing a control device with a computer program executing on a base computing device, in accordance with an embodiment of the present invention. A system unit 700 is provided, with various peripheral devices connectable to the system unit 700. The system unit 700 includes: a Cell processor 728; a Rambus® dynamic random access memory (XDRAM) unit 726; a Reality Synthesizer graphics unit 730 having a dedicated video random access memory (VRAM) unit 732; and an I/O bridge 734. The system unit 700 also comprises a Blu-ray® Disk BD-ROM® optical disk reader 740 for reading from a disk 740a, and a removable slot-in hard disk drive (HDD) 736, accessible through the I/O bridge 734. Optionally, the system unit 700 also comprises a memory card reader 738 for reading compact flash memory cards, memory stick cards and the like, which is similarly accessible through the I/O bridge 734.
The I/O bridge 734 also connects to six Universal Serial Bus (USB) 2.0 ports 724; a gigabit Ethernet port 722; an IEEE 802.11b/g wireless network (Wi-Fi) port 720; and a Bluetooth® wireless link port 718 capable of supporting up to seven Bluetooth connections.
In operation, I/O bridge 734 handles all wireless, USB, and Ethernet data, including data from one or more game controllers 702-703. For example, when a user is playing a game, I/O bridge 734 receives data from game controllers 702-703 via a Bluetooth link and directs it to Cell processor 728, where Cell processor 728 updates the current state of the game accordingly.
In addition to game controllers 702-703, the wireless, USB, and Ethernet ports provide connectivity for other peripheral devices, such as a remote control 704, a keyboard 706, a mouse 708, a portable entertainment device 710 (e.g., a Sony PlayStation Portable® entertainment device), a video camera 712 (e.g., an EyeToy® video camera), a microphone headset 714, and a microphone 715. Such peripheral devices may therefore in principle be connected to the system unit 700 wirelessly; for example, the portable entertainment device 710 may communicate via a Wi-Fi ad-hoc connection, while the microphone headset 714 may communicate via a Bluetooth link.
The provision of these interfaces means that the PlayStation 3 device is also potentially compatible with other peripheral devices, such as digital video recorders (DVRs), set-top boxes, digital cameras, portable media players, Voice over IP telephones, mobile telephones, printers, and scanners.
In addition, a legacy memory card reader 716 may be connected to the system unit via a USB port 724, enabling the reading of memory cards 748 of the kind used by PlayStation® or PlayStation 2® devices.
The game controllers 702-703 are operable to communicate wirelessly with the system unit 700 via a Bluetooth link, or to be connected to a USB port, which also provides power by which to charge the batteries of the game controllers 702-703. Game controllers 702-703 may also include memory, a processor, a memory card reader, permanent memory such as flash memory, light emitters such as an illuminated spherical section, LEDs, or infrared lights, a microphone and speaker for ultrasonic communication, an acoustic chamber, a digital camera, an internal clock, a recognizable shape such as a spherical section facing the game console, and wireless communication using protocols such as Bluetooth® and WiFi™, etc.
Game controller 702 is a controller designed for two-handed use, and game controller 703 is a single-hand controller with an attachment. In addition to one or more analog joysticks and conventional control buttons, the game controller is susceptible to three-dimensional location determination. Consequently, gestures and movements by the user of the game controller may be translated as inputs to a game in addition to, or instead of, conventional button or joystick commands. Optionally, other wirelessly enabled peripheral devices, such as the PlayStation™ Portable device, may be used as a controller. In the case of the PlayStation™ Portable device, additional game or control information (e.g., control instructions or number of lives) may be provided on the screen of the device. Other alternative or supplementary control devices may also be used, such as a dance mat (not shown), a light gun (not shown), a steering wheel and pedals (not shown), or bespoke controllers, such as a single or several large buttons for a rapid-response quiz game (also not shown).
The remote control 704 is also operable to communicate wirelessly with the system unit 700 via a Bluetooth link. The remote control 704 comprises controls suitable for the operation of the Blu-ray™ Disk BD-ROM reader 740 and for the navigation of disk content.
The Blu-ray™ Disk BD-ROM reader 740 is operable to read CD-ROMs compatible with the PlayStation and PlayStation 2 devices, in addition to conventional pre-recorded and recordable CDs and so-called Super Audio CDs. The reader 740 is also operable to read DVD-ROMs compatible with the PlayStation 2 and PlayStation 3 devices, in addition to conventional pre-recorded and recordable DVDs. The reader 740 is further operable to read BD-ROMs compatible with the PlayStation 3 device, as well as conventional pre-recorded and recordable Blu-ray disks.
The system unit 700 is operable to supply audio and video, either generated or decoded by the PlayStation 3 device via the Reality Synthesizer graphics unit 730, through audio and video connectors to a display and sound output device 742, such as a monitor or television set having a display 744 and one or more loudspeakers 746. The audio connectors 750 may include conventional analog and digital outputs, while the video connectors 752 may variously include component video, S-video, composite video, and one or more High Definition Multimedia Interface (HDMI) outputs. Video output may thus be in formats such as PAL or NTSC, or in 720p, 1080i, or 1080p high definition.
Audio processing (generation, decoding, and so on) is performed by the Cell processor 728. The PlayStation 3 device's operating system supports Dolby® 5.1 surround sound, DTS® cinema surround, and the decoding of 7.1 surround sound from Blu-ray® disks.
In this embodiment, the camera 712 comprises a single charge-coupled device (CCD), an LED indicator, and hardware-based real-time data compression and encoding apparatus, so that compressed video data may be transmitted in an appropriate format, such as an intra-image-based MPEG (Motion Picture Experts Group) standard, for decoding by the system unit 700. The camera's LED indicator is arranged to illuminate in response to appropriate control data from the system unit 700, for example to signify adverse lighting conditions. Embodiments of the camera 712 may variously connect to the system unit 700 via a USB, Bluetooth, or Wi-Fi communication port. Embodiments of the camera may include one or more associated microphones and may also be capable of transmitting audio data. In embodiments of the camera, the CCD may have a resolution suitable for high-definition video capture. In use, images captured by the camera may, for example, be incorporated within a game or interpreted as game control inputs. In another embodiment, the camera is an infrared camera suitable for detecting infrared light.
Typically, for successful data communication with a peripheral device (e.g., a camera or remote control) via one of the communication ports of the system unit 700, suitable software, such as a device driver, should be provided. Device driver technology is well known and will therefore not be described in detail herein, only to point out that a person skilled in the art will appreciate that a device driver or similar software interface may be required in the described embodiments.
FIG. 13 illustrates additional hardware that may be used to process instructions according to one embodiment of the invention. The Cell processor 728 has an architecture comprising four basic components: external input and output structures comprising a memory controller 860 and a dual bus interface controller 870A, 870B; a main processor referred to as the Power Processing Element 850; eight co-processors referred to as Synergistic Processing Elements (SPEs) 810A-H; and a circular data bus connecting the above components, referred to as the Element Interconnect Bus 880. The total floating-point performance of the Cell processor is 218 GFLOPS, compared with the 6.2 GFLOPS of the Emotion Engine of the PlayStation 2 device.
The Power Processing Element (PPE) 850 is based on a two-way simultaneous multithreading Power 570 compliant PowerPC core (PPU) 855 running with an internal clock of 3.2 GHz. It comprises a 512 kB level 2 (L2) cache and a 32 kB level 1 (L1) cache. The PPE 850 is capable of eight single-position operations per clock cycle, translating to 25.6 GFLOPS at 3.2 GHz. The primary role of the PPE 850 is to act as a controller for the Synergistic Processing Elements 810A-H, which handle most of the computational workload. In operation, the PPE 850 maintains a job queue, scheduling jobs for the Synergistic Processing Elements 810A-H and monitoring their progress. Consequently, each Synergistic Processing Element 810A-H runs a kernel whose role is to fetch a job, execute it, and synchronize with the PPE 850.
Each Synergistic Processing Element (SPE) 810A-H comprises a respective Synergistic Processing Unit (SPU) 820A-H and a respective Memory Flow Controller (MFC) 840A-H, which in turn comprises a respective Dynamic Memory Access Controller (DMAC) 842A-H, a respective Memory Management Unit (MMU) 844A-H, and a bus interface (not shown). Each SPU 820A-H is a RISC processor clocked at 3.2 GHz and comprising 256 kB of local RAM 830A-H, expandable in principle to 4 GB. Each SPE gives a theoretical single-precision performance of 25.6 GFLOPS. An SPU can operate on four single-precision floating-point numbers, four 32-bit numbers, eight 16-bit integers, or sixteen 8-bit integers in a single clock cycle. In the same clock cycle it can also perform a memory operation. The SPUs 820A-H do not directly access the system memory XDRAM 726; 64-bit addresses formed by the SPUs 820A-H are passed to the MFCs 840A-H, which instruct their DMA controllers 842A-H to access memory via the Element Interconnect Bus 880 and the memory controller 860.
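The 25.6 GFLOPS figures quoted for the PPE and for each SPE both follow from the 3.2 GHz clock. A quick arithmetic check (treating each SPU single-precision operation as a fused multiply-add, i.e., two floating-point operations, is our assumption to make the stated figure come out; the text does not say so explicitly):

```python
clock_hz = 3.2e9

# PPE: eight single-position operations per clock cycle.
ppe_gflops = 8 * clock_hz / 1e9            # -> 25.6

# SPU: 4 single-precision floats per cycle; counting each as a
# multiply-add (2 FLOPs) gives 8 FLOPs per cycle.
spe_gflops = 4 * 2 * clock_hz / 1e9        # -> 25.6
```

Note that 8 SPEs at 25.6 GFLOPS each, plus the PPE, exceeds the 218 GFLOPS headline figure quoted above; that total is evidently not a simple sum of the theoretical peaks.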
The Element Interconnect Bus (EIB) 880 is a logically circular communication bus internal to the Cell processor 728, connecting the above processor elements, namely the PPE 850, the memory controller 860, the dual bus interfaces 870A, B, and the eight SPEs 810A-H, for a total of 12 participants. The participants can simultaneously read from and write to the bus at a rate of 8 bytes per clock cycle. As noted previously, each SPE 810A-H comprises a DMAC 842A-H for scheduling longer read or write sequences. The EIB comprises four channels, two each in clockwise and anti-clockwise directions. Consequently, for twelve participants, the longest step-wise data flow between any two participants is six steps in the appropriate direction. The theoretical peak instantaneous EIB bandwidth for 12 slots is therefore 96 bytes per clock, in the event of full utilization through arbitration between participants. This equates to a theoretical peak bandwidth of 307.2 GB/s (gigabytes per second) at a clock rate of 3.2 GHz.
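The EIB's quoted peak figures follow directly from the parameters given above, as this short check shows:

```python
clock_hz = 3.2e9
participants = 12           # PPE + memory controller + 2 bus interfaces + 8 SPEs
bytes_per_clock_each = 8    # each participant reads/writes 8 bytes per cycle

peak_bytes_per_clock = participants * bytes_per_clock_each   # -> 96 B/clock
peak_gb_per_s = peak_bytes_per_clock * clock_hz / 1e9        # -> 307.2 GB/s
```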
Memory controller 860 includes an XDRAM interface 862 developed by Rambus corporation. The memory controller interfaces with Rambus XDRAM 726 at a theoretical peak bandwidth of 25.6 GB/s.
The dual bus interfaces 870A, B comprise a Rambus FlexIO® system interface 872A, B. Each interface is organized into 12 channels, each 8 bits wide, with five paths being inbound and seven outbound. This provides a theoretical peak bandwidth of 62.4 GB/s (36.4 GB/s outbound, 26 GB/s inbound) between the Cell processor and the I/O bridge 734 via controller 870A, and between the Cell processor and the Reality Synthesizer graphics unit 730 via controller 870B.
Data sent by the Cell processor 728 to the Reality Synthesizer graphics unit 730 will typically comprise display lists, being sequences of commands to draw vertices, apply textures to polygons, specify lighting conditions, and so on.
FIG. 14 is an exemplary illustration of scenes A through E and respective users A through E interacting with game clients 1102 that are connected to server processing via the Internet, according to one embodiment of the present invention. A game client is a device that allows users to connect to server applications and processing via the Internet. The game client allows users to access and play online entertainment content such as, but not limited to, games, movies, music, and photos. Additionally, the game client can provide access to online communications applications such as VOIP, text chat protocols, and e-mail.
A user interacts with the game client via a controller. In some embodiments the controller is a game-client-specific controller, while in other embodiments the controller can be a keyboard and mouse combination. In one embodiment, the game client is a standalone device capable of outputting audio and video signals to create a multimedia environment through a monitor/television and associated audio equipment. For example, the game client can be, but is not limited to, a thin client, an internal PCI-Express card, an external PCI-Express device, an ExpressCard device, an internal, external, or wireless USB device, or a FireWire device, etc. In other embodiments, the game client is integrated with a television or other multimedia device such as a DVR, Blu-ray player, DVD player, or multi-channel receiver.
In scene A of FIG. 14, user A interacts with a client application displayed on a monitor 1104A using a controller 1106A paired with game client 1102A. Similarly, in scene B, user B interacts with a client application displayed on monitor 1104B using a controller 1106B paired with game client 1102B. Scene C illustrates a view from behind user C as user C gazes at a monitor displaying a game and a buddy list from game client 1102C. While FIG. 14 shows a single server processing module, in one embodiment there are multiple server processing modules located throughout the world. Each server processing module includes sub-modules for user session control, sharing/communication logic, user geo-location, and load balance processing services. Furthermore, a server processing module includes network processing and distributed storage.
When a game client 1102 connects to a server processing module, user session control may be used to authenticate the user. An authenticated user can have associated virtualized distributed storage and virtualized network processing. Example items that can be stored as part of a user's virtualized distributed storage include purchased media such as, but not limited to, games, videos, and music. Additionally, distributed storage can be used to save game state for multiple games, customized settings for individual games, and general settings for the game client. In one embodiment, the user geo-location module of the server processing is used to determine the geographic location of users and their respective game clients. A user's geographic location can be used by both the sharing/communication logic and the load balance processing service to optimize performance based on the geographic location and processing demands of the multiple server processing modules. Virtualizing either or both network processing and network storage allows processing tasks from game clients to be dynamically shifted to underutilized server processing modules. Thus, load balancing can be used to minimize latency associated with both recall from storage and data transmission between server processing modules and game clients.
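A simple way to picture this geo-location-aware load balancing is as a cost function over candidate server processing modules that penalizes both distance and current load. This is a hedged, one-dimensional sketch; the cost model, module records, and names are illustrative assumptions, not the patent's algorithm:

```python
def pick_module(user_pos, modules):
    """Choose the server processing module with the lowest expected latency.

    `user_pos` and each module's `pos` are 1-D stand-ins for geographic
    location; distance is inflated by the module's current load (0..1).
    """
    def cost(m):
        return abs(m["pos"] - user_pos) * (1.0 + m["load"])
    return min(modules, key=cost)

modules = [
    {"name": "us-west", "pos": 0, "load": 0.9},   # close but nearly saturated
    {"name": "us-east", "pos": 3, "load": 0.1},   # farther but under-utilized
]
chosen = pick_module(2, modules)   # the under-utilized module wins here
```

In practice the same idea would be driven by measured round-trip latency and live utilization metrics rather than a static table.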
The server processing module has instances of server application A and server application B. The server processing module is able to support multiple server applications, as indicated by server application X1 and server application X2. In one embodiment, server processing is based on a cluster computing architecture that allows multiple processors within a cluster to process server applications. In another embodiment, a different type of multi-computer processing scheme is applied to process the server applications. This allows server processing to be scaled in order to accommodate a larger number of game clients executing multiple client applications and corresponding server applications. Alternatively, server processing can be scaled to accommodate increased computing demands necessitated by more demanding graphics processing, game or video compression, or application complexity. In one embodiment, the server processing module performs the majority of the processing via the server application. This allows relatively expensive components, such as graphics processors, RAM, and general-purpose processors, to be centrally located, reducing the cost of the game client. Processed server application data is sent back to the corresponding game client via the Internet for display on a monitor.
Scene C illustrates an exemplary application that can be executed by the game client and server processing module. For example, in one embodiment game client 1102C allows user C to create and view a buddy list 1120 that includes user A, user B, user D, and user E. As shown, in scene C, user C is able to see either real-time images or avatars of the respective users on monitor 1104C. Server processing executes the respective applications of game client 1102C and of the respective game clients 1102 of users A, B, D, and E. Because the server processing is aware of the application being executed by game client B, the buddy list for user A can indicate which game user B is playing. Further still, in one embodiment, user A can view actual in-game video directly from user B. This is enabled by merely sending processed server application data for user B to game client A in addition to game client B.
In addition to being able to view video from friends, the communication application may allow real-time communication between friends. As applied in the previous example, this allows user a to provide encouragement or clues while watching user B's real-time video. In one embodiment, two-way real-time voice communication is established through a client/server application. In another embodiment, the client/server application supports text chat. In yet another embodiment, the client/server application converts the speech to text for display on the buddy's screen.
Scene D and scene E illustrate respective users D and E interacting with game consoles 1110D and 1110E, respectively. Each game console is connected to the server processing module, illustrating a network in which the server processing module coordinates gameplay for both game consoles and game clients.
FIG. 15 illustrates an embodiment of an Information Service Provider architecture. Information Service Provider (ISP) 1370 delivers a multitude of information services to users 1382 who are geographically dispersed and connected via network 1386. An ISP can deliver just one type of service, such as stock price updates, or a variety of services such as broadcast media, news, sports, gaming, etc. Additionally, the services offered by each ISP are dynamic; that is, services can be added or taken away at any point in time. Thus, the ISP providing a particular type of service to a particular individual can change over time. For example, a user may be served by an ISP in near proximity to the user while at home, and by a different ISP when the user travels to a different city. The home ISP will transfer the required information and data to the new ISP, such that the user information "follows" the user to the new city, making the data closer to the user and easier to access. In another embodiment, a master-server relationship may be established between a master ISP, which manages the information for the user, and a server ISP that interfaces directly with the user under control of the master ISP. In other embodiments, the data is transferred from one ISP to another as the client moves around the world, so that the ISP in the better position to serve the user is the one that delivers these services.
ISP 1370 comprises an Application Service Provider (ASP) 1372 that provides computer-based services to customers through a network. The software provided using the ASP model is sometimes also referred to as on-demand software or software as a service (SaaS). A simple way to provide access to a particular application, such as customer relationship management, is through the use of standard protocols, such as HTTP. The application software resides in the provider's system and is accessed by the user through a web browser using HTML, through dedicated client software provided by the provider, or other remote interface (e.g., a thin client).
Services delivered over a wide geographical area often use cloud computing. Cloud computing is a style of computing in which dynamically scalable and often virtualized resources are provided as a service over the Internet. Users do not need to be experts in the technology infrastructure of the "cloud" that supports them. Cloud computing can be divided into different services, such as Infrastructure as a Service (IaaS), Platform as a Service (PaaS), and Software as a Service (SaaS). Cloud computing services often provide common business applications online that are accessed from a web browser, while the software and data are stored on the servers. The term "cloud" is used as a metaphor for the Internet, based on how the Internet is depicted in computer network diagrams, and is an abstraction of the complex infrastructure it conceals.
Further, ISP 1370 includes a Game Processing Server (GPS) 1374, which is used by game clients to play single-player and multiplayer video games. Most video games played over the Internet operate via a connection to a game server. Typically, games use a dedicated server application that collects data from players and distributes it to the other players. This is more efficient and effective than a peer-to-peer arrangement, but it requires a separate server to host the server application. In another embodiment, the GPS establishes communication between the players, and their respective game-playing devices exchange information without relying on the centralized GPS.
A dedicated GPS is a server that operates independently of the client. Such servers usually run on dedicated hardware located in data centers, providing more bandwidth and dedicated processing power. Dedicated servers are the preferred method of hosting game servers for most PC-based multiplayer games. Massively multiplayer online games run on dedicated servers, usually hosted by the software company that owns the game title, allowing it to control and update content.
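The dedicated-server model above — the server collects each player's data and distributes it to the other players — can be sketched in a few lines. This is an illustrative in-memory simulation; a real game processing server would use network sockets and a game-specific protocol.

```python
# Minimal sketch of the dedicated game-server model: updates submitted
# by one player are relayed to every other connected player, rather
# than players exchanging data peer-to-peer. Names are hypothetical.

class GameServer:
    def __init__(self):
        self.players = {}           # player_id -> inbox of relayed updates

    def connect(self, player_id):
        self.players[player_id] = []

    def submit(self, player_id, update):
        # Distribute this player's update to all other connected players.
        for pid, inbox in self.players.items():
            if pid != player_id:
                inbox.append((player_id, update))


server = GameServer()
for p in ("alice", "bob", "carol"):
    server.connect(p)

server.submit("alice", {"x": 10, "y": 4})
```

Each of the other players receives Alice's update, while Alice's own inbox stays empty, which is exactly the collect-and-distribute role the paragraph attributes to the server application.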
A Broadcast Processing Server (BPS) 1376 distributes audio or video signals to an audience. Broadcasting to a very narrow range of audience is sometimes called narrowcasting. The final leg of broadcast distribution is how the signal gets to the listener or viewer: it may come over the air, as with a radio or television station, to an antenna and receiver, or it may come through cable TV or cable radio (or "wireless cable") via the station or directly from a network. The Internet may also bring radio or television to the audience, especially with multicasting, which allows the signal and bandwidth to be shared. Historically, broadcasts have been delimited by geographic region, such as national or regional broadcasts. However, with the proliferation of the fast Internet, broadcasts are no longer defined by geography, and content can reach almost any country in the world.
A Storage Service Provider (SSP) 1378 provides computer storage space and related management services. SSPs also offer periodic backup and archiving. Because storage is offered as a service, users can subscribe to more storage as needed. Another major advantage is that SSPs include backup services, so users do not lose all their data if their computers' hard drives fail. Further, multiple SSPs can hold full or partial copies of the user data, allowing users to access data efficiently regardless of where the user is located or which device is being used. For example, a user can access personal documents on a home computer, or via a mobile phone while the user is on the move.
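The backup-plus-replicas idea above can be sketched as a write to a primary SSP followed by replication to others, so the data survives the loss of any single copy. The classes and the replication helper below are illustrative assumptions, not part of the patent.

```python
# Sketch of the storage-service idea: data is backed up to multiple SSPs,
# so a failed drive (or lost primary copy) does not lose the user's data,
# and the nearest replica can serve it. Names are hypothetical.

class StorageProvider:
    def __init__(self, region):
        self.region = region
        self.objects = {}           # key -> stored blob

    def put(self, key, blob):
        self.objects[key] = blob

    def get(self, key):
        return self.objects.get(key)


def backup(primary, replicas, key, blob):
    # Write to the primary SSP, then replicate to the other SSPs.
    primary.put(key, blob)
    for r in replicas:
        r.put(key, blob)


home_ssp = StorageProvider("home")
mobile_ssp = StorageProvider("mobile")
backup(home_ssp, [mobile_ssp], "doc.txt", b"personal document")

# Simulate losing the home copy; the replica still serves the data.
home_ssp.objects.clear()
```

After the simulated failure, the document is still retrievable from the replica, which is the resilience property the paragraph highlights.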
The communications provider 1380 provides connectivity to the users. One kind of communications provider is an Internet Service Provider (ISP), which offers access to the Internet. The ISP connects its customers using a data transmission technology appropriate for delivering Internet Protocol datagrams, such as dial-up, DSL, cable modem, wireless, or dedicated high-speed interconnects. The communications provider may also provide messaging services, such as e-mail, instant messaging, and SMS texting. Another kind of communications provider is a Network Service Provider (NSP), which sells bandwidth or network access by providing direct backbone access to the Internet. Network service providers may consist of telecommunications companies, data carriers, wireless communications providers, Internet service providers, cable television operators offering high-speed Internet access, etc.
Data exchange 1388 interconnects the several modules within ISP 1370 and connects these modules to users 1382 via network 1386. Data exchange 1388 may cover a small area where all the modules of ISP 1370 are in close proximity, or it may cover a large geographic area when the different modules are geographically dispersed. For example, data exchange 1388 may include a fast Gigabit Ethernet (or faster) within a cabinet of a data center, or an intercontinental virtual local area network (VLAN).
The user 1382 accesses the remote services with a client device 1384, which includes at least a CPU, a display, and I/O. The client device can be a PC, a mobile phone, a netbook, a PDA, etc. In one embodiment, ISP 1370 recognizes the type of device used by the client and adjusts the communication method employed. In other cases, the client device accesses ISP 1370 using a standard communication method, such as HTML.
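The device-aware delivery mentioned above — recognizing the client's device type and adjusting the communication method — can be sketched as a simple lookup with a standard fallback. The device names and delivery methods in the mapping are illustrative assumptions, not from the patent.

```python
# Sketch of device-aware delivery: the ISP inspects the client's reported
# device type and selects a communication method, falling back to the
# standard method for unrecognized clients. The mapping is hypothetical.

def choose_delivery(device_type):
    methods = {
        "pc": "full HTML over HTTP",
        "netbook": "full HTML over HTTP",
        "mobile_phone": "lightweight mobile markup",
        "pda": "lightweight mobile markup",
    }
    # Unrecognized devices get the standard communication method.
    return methods.get(device_type, "standard HTML")


print(choose_delivery("mobile_phone"))   # lightweight mobile markup
print(choose_delivery("smart_tv"))       # standard HTML
```

A production ISP would key this decision off a richer client handshake (capabilities, bandwidth, screen size) rather than a single device-type string, but the fallback-to-standard structure is the same.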
Embodiments of the present invention may be used with a variety of computer system configurations, including hand-held devices, microprocessor systems, microprocessor-based or programmable consumer electronics, minicomputers, mainframe computers, and the like. The invention may also be practiced in distributed computing environments where tasks are performed by remote processing devices that are linked through a wired or wireless network.
In view of the above, it should be appreciated that the present invention can employ various computer-implemented operations involving data stored in computer systems. The operations are those requiring physical manipulations of physical quantities. Any of the operations described herein that form part of the invention are useful machine operations. The invention also relates to an apparatus or device for performing these operations. The apparatus may be specially constructed for the required purposes, or it may be a general-purpose computer selectively activated or configured by a computer program stored in the computer. In particular, various general-purpose machines may be used with computer programs written in accordance with the teachings herein, or it may be more convenient to construct a more specialized apparatus to perform the required operations.
The present invention may also be embodied as computer readable code on a computer readable medium. The computer readable medium is any data storage device that can store data which can thereafter be read by a computer system. Examples of the computer readable medium include hard drives, Network Attached Storage (NAS), read-only memory, random-access memory, CD-ROMs, CD-Rs, CD-RWs, magnetic tapes, and other optical and non-optical data storage devices. The computer readable medium can include computer readable tangible media distributed over a network-coupled computer system, so that the computer readable code is stored and executed in a distributed fashion.
Although the method operations are described in a particular order, it should be understood that other housekeeping operations may be performed between operations, or operations may be adjusted so that they occur at slightly different times, or may be distributed within a system that allows processing operations to occur at time intervals related to the processing, so long as the processing of the overlapping operations is performed in a desired manner.
Although the foregoing invention has been described in some detail for purposes of clarity of understanding, it will be apparent that certain changes and modifications may be practiced within the scope of the appended claims. The present embodiments are, therefore, to be considered as illustrative and not restrictive, and the invention is not to be limited to the details given herein, but may be modified within the scope and equivalents of the appended claims.

Claims (17)

1. A controller for interfacing with an interactive application, comprising:
a housing defined by a main body, a first extension extending from a first end of the main body, and a second extension extending from a second end of the main body, the first and second extensions being for grasping by first and second hands, respectively, of a user;
a button positioned along a top surface of the main body, the button configured to access a cached portion of a game and to activate a sharing interface for sharing the cached portion of the game, wherein sharing the game includes streaming live video of the user's gameplay.
2. The controller of claim 1, further comprising: a second input device selected from the group consisting of a touch sensitive panel, a joystick, a button, a trigger, and a steering wheel.
3. The controller of claim 1, further comprising:
a tracking panel defined along a front side surface of the body; and
a light defined in the body to illuminate the tracking panel.
4. The controller of claim 3, wherein the tracking panel is defined by a translucent material.
5. The controller of claim 1, further comprising one or more of an accelerometer, a gyroscope, or a magnetometer.
6. The controller of claim 1, wherein the game is shared to a social media website of the user.
7. The controller of claim 6, wherein the shared game comprises one or more of a shared image or video clip.
8. A controller for interfacing with an interactive application, comprising:
a housing defined by a main body, a first extension extending from a first end of the main body, and a second extension extending from a second end of the main body, the first and second extensions being for grasping by first and second hands, respectively, of a user;
a button positioned along a top surface of the main body, the button configured to access a cached portion of a game and to activate a sharing interface for sharing a recording of the cached portion to a social media website of the user, wherein sharing the game includes streaming live video of the user's gameplay.
9. The controller of claim 8, wherein the shared game comprises one or more of a shared image or video clip.
10. The controller of claim 8, further comprising an input device positioned along a top surface of the body.
11. The controller of claim 10, wherein the input device is selected from the group consisting of a touch sensitive panel, a joystick, a button, a trigger, and a steering wheel.
12. The controller of claim 8, further comprising:
a tracking panel defined along a front side surface of the body; and
a light defined in the body to illuminate the tracking panel.
13. The controller of claim 12, wherein the tracking panel is defined by a translucent material.
14. The controller of claim 8, further comprising one or more of an accelerometer, a gyroscope, or a magnetometer.
15. A controller for interfacing with an interactive application, comprising:
a housing defined by a main body, a first extension extending from a first end of the main body, and a second extension extending from a second end of the main body, the first and second extensions being for grasping by first and second hands, respectively, of a user;
a touch sensitive panel defined along a top surface of the body;
a button positioned along the top surface of the body, the button configured to activate a sharing interface to share a game, wherein sharing the game includes streaming live video of the user's gameplay;
a tracking panel defined along a front side surface of the body; and
a light defined in the body to illuminate the tracking panel, the tracking panel being visually trackable using image recognition techniques to determine the position and orientation of the controller.
16. The controller of claim 15, wherein the tracking panel is defined by a translucent material.
17. The controller of claim 15, wherein the shared game comprises one or more of a shared image or video clip.
CN201910768967.7A 2013-03-15 2014-03-17 Game Controller Active CN110665220B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910768967.7A CN110665220B (en) 2013-03-15 2014-03-17 Game Controller

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US13/842,975 2013-03-15
US13/842,975 US9116555B2 (en) 2011-11-23 2013-03-15 Gaming controller
CN201410099053.3A CN104043245B (en) 2013-03-15 2014-03-17 Game console
CN201910768967.7A CN110665220B (en) 2013-03-15 2014-03-17 Game Controller

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
CN201410099053.3A Division CN104043245B (en) 2013-03-15 2014-03-17 Game console

Publications (2)

Publication Number Publication Date
CN110665220A CN110665220A (en) 2020-01-10
CN110665220B true CN110665220B (en) 2023-10-20

Family

ID=51496948

Family Applications (2)

Application Number Title Priority Date Filing Date
CN201410099053.3A Active CN104043245B (en) 2013-03-15 2014-03-17 Game console
CN201910768967.7A Active CN110665220B (en) 2013-03-15 2014-03-17 Game Controller

Family Applications Before (1)

Application Number Title Priority Date Filing Date
CN201410099053.3A Active CN104043245B (en) 2013-03-15 2014-03-17 Game console

Country Status (2)

Country Link
CN (2) CN104043245B (en)
TW (2) TWI594791B (en)

Families Citing this family (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2016045018A1 (en) * 2014-09-24 2016-03-31 深圳市大疆创新科技有限公司 Remote controller, handle structure thereof, and control method for uav
US10185396B2 (en) 2014-11-12 2019-01-22 Immersion Corporation Haptic trigger modification system
US9724601B2 (en) 2015-06-12 2017-08-08 Nintendo Co., Ltd. Game controller
JP6083884B2 (en) * 2015-06-12 2017-02-22 任天堂株式会社 Support device, charging device, and operation system
US10427036B2 (en) * 2015-09-24 2019-10-01 Ironburg Inventions Limited Games controller
CN109036046A (en) * 2018-09-05 2018-12-18 南京阿波罗机器人科技有限公司 A kind of STEM touch screen programmable electronic building blocks controller
US10471345B1 (en) * 2019-02-08 2019-11-12 Arkade, Inc. Pedal system for gaming apparatus
JP2021159417A (en) * 2020-03-31 2021-10-11 株式会社ソニー・インタラクティブエンタテインメント Input device
TWI804027B (en) * 2021-02-04 2023-06-01 仁寶電腦工業股份有限公司 Game console
CN114949845A (en) * 2022-05-31 2022-08-30 广州市品众电子科技有限公司 Direction adjusting method of game handle

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1173825A (en) * 1995-10-09 1998-02-18 任天堂株式会社 Game machine and game machine systm using the same
CN102149436A (en) * 2008-05-30 2011-08-10 美国索尼电脑娱乐有限责任公司 Determination of controller three-dimensional location using image analysis and ultrasonic communication
CN102441276A (en) * 2010-10-12 2012-05-09 索尼计算机娱乐公司 Using a portable gaming device to record or modify a game or application in real-time running on a home gaming system
CN102968183A (en) * 2011-12-20 2013-03-13 微软公司 Content system with auxiliary touch controller

Family Cites Families (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5558339A (en) * 1994-05-05 1996-09-24 Perlman; Stephen G. Network architecture to support recording and playback of real-time video games
JP4691268B2 (en) * 2001-05-02 2011-06-01 任天堂株式会社 Game system and game program
JP2003187242A (en) * 2001-12-17 2003-07-04 Sumitomo Electric Ind Ltd System and method for inputting data
JP4606150B2 (en) * 2004-12-16 2011-01-05 株式会社バンダイナムコゲームス PROGRAM, INFORMATION STORAGE MEDIUM, AND GAME DEVICE
US20080139301A1 (en) * 2006-12-11 2008-06-12 Ole-Ivar Holthe System and method for sharing gaming experiences
JP5285234B2 (en) * 2007-04-24 2013-09-11 任天堂株式会社 Game system, information processing system
US8961313B2 (en) * 2009-05-29 2015-02-24 Sony Computer Entertainment America Llc Multi-positional three-dimensional controller
US8556721B2 (en) * 2009-11-16 2013-10-15 Steelseries Aps Apparatus and method for managing peripheral device communications
US9440144B2 (en) * 2011-04-21 2016-09-13 Sony Interactive Entertainment Inc. User identified to a controller
US9345966B2 (en) * 2012-03-13 2016-05-24 Sony Interactive Entertainment America Llc Sharing recorded gameplay to a social graph


Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
"Rumor: PS4 to go on sale within the year" (传PS4年内面市);Zhengogo;《https://pad.mydrivers.com/1/254/254088.htm》;20130202;page 1 *
"Sony reveals dedicated PS4 controller that can share game videos and screenshots" (索尼公布PS4专用手柄 可分享游戏视频截图);17173;《http://game.17173.com/content/2013-02-21/20130221082056717.shtml》;20130221;page 1 *
17173. "Sony reveals dedicated PS4 controller that can share game videos and screenshots" (索尼公布PS4专用手柄 可分享游戏视频截图). 《http://game.17173.com/content/2013-02-21/20130221082056717.shtml》. 2013, *

Also Published As

Publication number Publication date
TWI565504B (en) 2017-01-11
TW201701931A (en) 2017-01-16
CN104043245B (en) 2019-09-13
TW201501758A (en) 2015-01-16
CN110665220A (en) 2020-01-10
TWI594791B (en) 2017-08-11
CN104043245A (en) 2014-09-17

Similar Documents

Publication Publication Date Title
US10610778B2 (en) Gaming controller
CN110665220B (en) Game Controller
US8870654B2 (en) Gaming controller
US9744452B2 (en) Remote control of a first user's gameplay by a second user
TWI564062B (en) Remote control of a first user's gameplay by a second user
CN104363970B (en) Multi-image interactive game device
JP6158220B2 (en) Directional input for video games
US11771981B2 (en) Sharing buffered gameplay in response to an input request

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant