MX2014000231A - Remote control of a first user's gameplay by a second user. - Google Patents

Remote control of a first user's gameplay by a second user.

Info

Publication number
MX2014000231A
Authority
MX
Mexico
Prior art keywords
game
user
mechanics
video
game mechanics
Prior art date
Application number
MX2014000231A
Other languages
Spanish (es)
Other versions
MX353111B (en)
Inventor
David Perry
Kelvin Yong
Victor Octav Suba Miura
Philippe Dias
Original Assignee
Sony Comp Entertainment
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from US13/831,190 external-priority patent/US9364743B2/en
Priority claimed from US13/831,178 external-priority patent/US9352226B2/en
Priority claimed from US13/839,382 external-priority patent/US9345966B2/en
Application filed by Sony Comp Entertainment
Publication of MX2014000231A
Publication of MX353111B

Classifications

    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/30 Interconnection arrangements between game servers and game devices; Interconnection arrangements between game devices; Interconnection arrangements between game servers
    • A63F13/35 Details of game servers
    • A63F13/45 Controlling the progress of the video game
    • A63F13/47 Controlling the progress of the video game involving branching, e.g. choosing one of several possible scenarios at a given point in time
    • A63F13/49 Saving the game status; Pausing or ending the game
    • A63F13/497 Partially or entirely replaying previous game actions
    • A63F13/50 Controlling the output signals based on the game progress
    • A63F13/53 Controlling the output signals involving additional visual information provided to the game scene, e.g. by overlay to simulate a head-up display [HUD] or displaying a laser sight in a shooting game
    • A63F13/533 Involving additional visual information for prompting the player, e.g. by displaying a game menu
    • A63F13/60 Generating or modifying game content before or while executing the game program, e.g. authoring tools specially adapted for game development or game-integrated level editor
    • A63F13/67 Generating or modifying game content adaptively or by learning from player actions, e.g. skill level adjustment or by storing successful combat sequences for re-use
    • A63F13/69 Generating or modifying game content by enabling or updating specific game elements, e.g. unlocking hidden features, items, levels or versions
    • A63F13/70 Game security or game management aspects
    • A63F13/73 Authorising game programs or game devices, e.g. checking authenticity
    • A63F13/79 Game security or game management aspects involving player-related data, e.g. identities, accounts, preferences or play histories
    • A63F13/795 Involving player-related data for finding other players; for building a team; for providing a buddy list
    • A63F13/85 Providing additional services to players
    • A63F13/87 Communicating with other players during game play, e.g. by e-mail or chat
    • A63F13/88 Mini-games executed independently while main games are being loaded
    • A63F2300/00 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/50 Features characterized by details of game servers
    • A63F2300/55 Details of game data or player data management
    • A63F2300/5526 Game data structure
    • A63F2300/554 Game data structure by saving game or status data
    • A63F2300/57 Details of game services offered to the player
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/47 End-user applications
    • H04N21/478 Supplemental services, e.g. displaying phone caller identification, shopping application
    • H04N21/4781 Games
    • H04N21/4788 Supplemental services communicating with other users, e.g. chatting

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Computer Security & Cryptography (AREA)
  • Signal Processing (AREA)
  • Business, Economics & Management (AREA)
  • General Business, Economics & Management (AREA)
  • Optics & Photonics (AREA)
  • Physics & Mathematics (AREA)
  • Software Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • User Interface Of Digital Computer (AREA)
  • Information Transfer Between Computers (AREA)
  • Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)
  • Pinball Game Machines (AREA)
  • Display Devices Of Pinball Game Machines (AREA)
  • Processing Or Creating Images (AREA)
  • Details Of Television Systems (AREA)
  • Selective Calling Equipment (AREA)

Abstract

A method for providing remote control of a user's gameplay is provided. A live video feed of a first user's gameplay is presented to a remote second user. A request to transition control of the first user's gameplay to the second user is processed. Control of the first user's gameplay by the second user is initiated.

Description

REMOTE CONTROL OF A FIRST USER'S GAMEPLAY BY A SECOND USER FIELD OF THE INVENTION The present invention relates to methods and systems for the automatic generation of suggested mini-games based on recorded gameplay, generation of multi-part mini-games for cloud gaming based on recorded gameplay, sharing of recorded gameplay to a social profile, and remote control of a first user's gameplay by a second user.
BACKGROUND OF THE INVENTION The video game industry has seen many changes over the years. As computing power has expanded, video game developers have likewise created gaming software that takes advantage of these increases in computing power. To this end, video game developers have been coding games that incorporate sophisticated operations and mathematics to produce a very realistic gaming experience.
Examples of gaming platforms include the Sony Playstation®, Sony Playstation2® (PS2), and Sony Playstation3® (PS3), each of which is sold in the form of a game console. As is well known, the game console is designed to connect to a monitor (usually a television) and allows interaction with the user through handheld controllers. The game console is designed with specialized processing hardware, including a CPU, a graphics synthesizer for processing intensive graphics operations, a vector unit for performing geometric transformations, and other supporting hardware, firmware, and software. The game console is additionally designed with an optical disc tray for receiving compact discs for local gameplay through the game console. Online play is also possible, where a user can play interactively against or with other users over the Internet. As game complexity continues to intrigue players, game and hardware makers have continued to innovate to enable additional interactivity and computer programs.
A growing trend in the computer gaming industry is to develop games that increase the interaction between the user and the gaming system. One way to achieve a richer interactive experience is to use wireless game controllers whose movement is tracked by the gaming system in order to track the player's movements and use these movements as inputs for the game. Generally speaking, gesture input refers to having an electronic device such as a computer system, video game console, or smart device react to some gesture made by the player and captured by the electronic device.
Another growing trend in the industry involves the development of cloud-based gaming systems. Such systems may include a remote processing server that executes a gaming application and communicates with a local thin client that can be configured to receive input from users and render video on a screen.
It is in this context that embodiments of the invention arise.
BRIEF DESCRIPTION OF THE INVENTION Embodiments of the present invention provide methods and systems for the automatic generation of suggested mini-games based on recorded gameplay, generation of multi-part mini-games for cloud gaming based on recorded gameplay, sharing recorded gameplay to a social profile, and remote control of a first user's gameplay by a second user. It should be appreciated that the present invention can be implemented in numerous ways, such as a process, an apparatus, a system, a device, or a method on a computer-readable medium. Several inventive embodiments of the present invention are described below.
In one embodiment, a method for generating a playable limited version of a video game is provided, including the following method operations: recording a user's gameplay of a full version of the video game; analyzing the user's recorded gameplay to determine a region of interest; defining boundaries within a gameplay context of the video game based on the determined region of interest; and generating the limited version of the video game based on the defined boundaries; wherein the method is executed by a processor.
In another embodiment, a method for generating a playable limited version of a video game is provided, including the following method operations: recording a user's gameplay of a full version of the video game; analyzing the user's recorded gameplay to determine one or more regions of interest; presenting each of the regions of interest for selection; receiving a selection input indicating a selected region of interest; for the selected region of interest, defining boundaries within a gameplay context of the video game based on the selected region of interest; and generating the limited version of the video game based on the defined boundaries; wherein the method is executed by a processor.
In another embodiment, a method for generating a playable limited version of a video game is provided, including the following method operations: recording a user's gameplay of a full version of the video game, wherein recording the user's gameplay includes recording one or more of user input data or game state data; analyzing the user's recorded gameplay to determine one or more regions of interest, wherein each region of interest is identified automatically based on correspondence to one or more thresholds; presenting each of the regions of interest for selection; receiving a selection input indicating a selected region of interest; for the selected region of interest, defining boundaries within a gameplay context of the video game based on the selected region of interest; generating the limited version of the video game based on the defined boundaries; wherein the method is executed by a processor.
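By way of illustration only, the threshold-based identification of regions of interest described above might be sketched in Python as follows; the metric names (score, health), the threshold values, and the RegionOfInterest structure are assumptions made for the example and are not part of the described embodiments.

    from dataclasses import dataclass

    @dataclass
    class RegionOfInterest:
        start_time: float   # seconds into the recorded gameplay
        end_time: float
        reason: str         # which threshold was exceeded

    def find_regions_of_interest(samples, score_rate_threshold=50.0,
                                 damage_rate_threshold=20.0, window=10.0):
        """Scan recorded game-state samples and mark windows in which a
        tracked metric changes faster than a threshold.

        `samples` is a list of (timestamp, game_state) pairs, where
        game_state is a dict of recorded values such as 'score' or 'health'.
        """
        regions = []
        for (t0, s0), (t1, s1) in zip(samples, samples[1:]):
            dt = max(t1 - t0, 1e-6)
            score_rate = (s1.get("score", 0) - s0.get("score", 0)) / dt
            damage_rate = (s0.get("health", 0) - s1.get("health", 0)) / dt
            if score_rate >= score_rate_threshold:
                regions.append(RegionOfInterest(t0, t0 + window, "score gain"))
            elif damage_rate >= damage_rate_threshold:
                regions.append(RegionOfInterest(t0, t0 + window, "heavy damage"))
        return regions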
In one embodiment, a method is provided for generating a playable limited version of a video game, including the following method operations: recording a user's gameplay of a full version of the video game; determining a plurality of user-defined portions of the user's recorded gameplay; for each user-defined portion, defining boundaries within a gameplay context of the video game based on the user-defined portion, and generating a playable portion of the video game based on the defined boundaries; arranging each of the playable portions of the video game in a series to define the limited version of the video game; wherein the method is executed by a processor.
In another embodiment, a tangible computer-readable medium having program instructions embodied therein for generating a playable limited version of a video game is provided, including the following: program instructions for recording a user's gameplay of a full version of the video game; program instructions for determining a plurality of user-defined portions of the user's recorded gameplay; program instructions for, for each user-defined portion, defining boundaries within a gameplay context of the video game based on the user-defined portion, and generating a playable portion of the video game based on the defined boundaries; and program instructions for arranging each of the playable portions of the video game in a series to define the limited version of the video game.
In another embodiment, a system is provided, including the following: at least one server computing device having logic for generating a playable limited version of a video game, including: logic for recording a user's gameplay of a full version of the video game; logic for determining a plurality of user-defined portions of the user's recorded gameplay; logic for, for each user-defined portion, defining boundaries within a gameplay context of the video game based on the user-defined portion, and generating a playable portion of the video game based on the defined boundaries; and logic for arranging each of the playable portions of the video game in a series to define the limited version of the video game.
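As an illustrative sketch of arranging playable portions in a series, the following Python outline shows one possible data arrangement; the Boundary, PlayablePortion and LimitedVersion structures and their fields are assumptions made for the example, not a definitive implementation.

    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class Boundary:
        # A boundary constrains the playable portion to a slice of the game,
        # e.g. a scene plus the game state needed to start it.
        scene_id: str
        start_state: dict
        end_condition: str

    @dataclass
    class PlayablePortion:
        title: str
        boundary: Boundary

    @dataclass
    class LimitedVersion:
        game_title: str
        portions: List[PlayablePortion] = field(default_factory=list)

    def build_limited_version(game_title, user_defined_portions):
        """Arrange one playable portion per user-defined portion, in order,
        to define a multi-part limited version of the video game."""
        version = LimitedVersion(game_title)
        for idx, portion in enumerate(user_defined_portions):
            boundary = Boundary(scene_id=portion["scene"],
                                start_state=portion["state_at_start"],
                                end_condition=portion["end_condition"])
            version.portions.append(
                PlayablePortion(title=f"Part {idx + 1}", boundary=boundary))
        return version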
In one embodiment, a method is provided for sharing recorded gameplay to a social profile, including the following method operations: recording video of a user's gameplay during an active state of a gameplay session; receiving a command to initiate a share operation during the active state of the gameplay session; in response to receiving the command, entering a paused state of the gameplay session and presenting a sharing interface; processing input received via the sharing interface to determine a user-defined selection of the recorded video; sharing the user-defined selection of the recorded video to a social profile of the user; and resuming the active state of the gameplay session; wherein the method is executed by a processor.
In another embodiment, a method is provided for sharing recorded gameplay to a social profile, including the following method operations: recording video of a user's gameplay during an active state of a gameplay session; receiving a command to initiate a share operation during the active state of the gameplay session; in response to receiving the command, determining a user-defined selection of the recorded video; and sharing the user-defined selection of the recorded video to a social profile of the user; wherein the method is executed by a processor.
In another embodiment, a non-transitory computer-readable medium is provided having program instructions defined thereon for sharing recorded gameplay to a social profile. The program instructions include: program instructions for recording video of a user's gameplay during an active state of a gameplay session; program instructions for receiving a command to initiate a share operation during the active state of the gameplay session; program instructions for, in response to receiving the command, determining a user-defined selection of the recorded video; and program instructions for sharing the user-defined selection of the recorded video to a social profile of the user.
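A minimal sketch of the pause/share/resume flow described above might look as follows, assuming hypothetical session, sharing-interface and social-profile objects with the methods shown; none of these interfaces are specified by the described embodiments.

    import enum

    class SessionState(enum.Enum):
        ACTIVE = "active"
        PAUSED = "paused"

    def handle_share_command(session, social_profile):
        """Pause the gameplay session, let the user pick a clip from the
        recorded video via a sharing interface, post it to the social
        profile, then resume the session (illustrative only)."""
        session.state = SessionState.PAUSED
        start_s, end_s = session.sharing_interface.get_user_selection()
        clip = session.recorded_video.extract(start_s, end_s)
        social_profile.post(clip, caption=session.game_title)
        session.state = SessionState.ACTIVE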
In one embodiment, a method for providing remote control of a user's gameplay is provided, the method including the following method operations: presenting a live video feed of a first user's gameplay to a second user; processing a request to transition control of the first user's gameplay to the second user; and initiating control of the first user's gameplay by the second user; wherein the method is executed by at least one processor.
In another embodiment, a method for providing multiplayer gameplay is provided, including the following method operations: presenting a live video feed of a first user's gameplay session to a remote second user; processing a request for the second user to join the first user's gameplay session; and initiating gameplay by the second user in the first user's gameplay session; wherein the method is executed by at least one processor.
In another embodiment, a non-transitory computer-readable medium having program instructions embodied therein for providing remote control of a user's gameplay is provided, the program instructions including: program instructions for presenting a live video feed of a first user's gameplay to a remote second user; program instructions for processing a request to transition control of the first user's gameplay to the second user; and program instructions for initiating control of the first user's gameplay by the second user.
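For illustration, one possible way to sketch the transition of control might be as follows; the approval step and the object interfaces (request_control, approves, notify_all) are assumptions made for the example and are not drawn from the described embodiments.

    def transfer_control(session, first_user, second_user, live_feed):
        """Sketch of transferring control of the first user's gameplay to a
        remote second user who has been watching the live video feed."""
        request = second_user.request_control(session.id)
        if first_user.approves(request):                  # consent step (assumed)
            session.controller = second_user              # route inputs from user 2
            live_feed.add_viewer(first_user)              # user 1 keeps watching
            session.notify_all("Control transferred to " + second_user.name)
        return session.controller is second_user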
Other aspects of the invention will be apparent from the following detailed description, taken in conjunction with the accompanying drawings, illustrating by way of example the principles of the invention.
BRIEF DESCRIPTION OF THE DRAWINGS The invention can be better understood with reference to the following description taken in combination with the accompanying drawings in which: Fig. 1A illustrates a user playing a video game based on the cloud, in accordance with the embodiments of the invention.
Fig. 1B illustrates multiple users in multiple locations engaged in gameplay of cloud-based video games.
Figure 2A illustrates a system for cloud gaming, in accordance with one embodiment of the invention.
Figure 2B conceptually illustrates the accumulation of game titles through several generations of game consoles, in accordance with the embodiments of the invention.
Figure 3 illustrates a method for providing game demos to a user, in accordance with one embodiment of the invention.
Fig. 4A illustrates the hierarchical organization of several portions of a video game, in accordance with one embodiment of the invention.
Figure 4B illustrates an interface for selecting a portion of a gameplay timeline for generating a mini-game or game part, in accordance with one embodiment of the invention.
Figure 4C illustrates an interface for selecting a portion of a gameplay timeline for generating a mini-game, in accordance with one embodiment of the invention.
Figure 5 illustrates a series of screenshots demonstrating a method for generating a mini-game from an existing cloud-based video game, in accordance with one embodiment of the invention.
Figure 6 illustrates a system for generating game part code, in accordance with one embodiment of the invention.
Figure 7A illustrates the modification of a virtual space for the purpose of generating a game part of a video game, in accordance with one embodiment of the invention.
Figure 7B illustrates the modification of a scene graph for purposes of generating a game part, in accordance with embodiments of the invention.
Figure 8 illustrates a method for generating a game part, in accordance with an embodiment of the invention.
Figure 9A illustrates an interface for searching game parts associated with various game titles, in accordance with one embodiment of the invention.
Figure 9B illustrates a game part information page, in accordance with one embodiment of the invention.
Figure 10 illustrates a view of a user's account information, including live views of friends in a cloud-based social gaming network, in accordance with one embodiment of the invention.
Figure 11 illustrates a method for presenting live gameplay feeds from friends of the current user, in accordance with one embodiment of the invention.
Figure 12 illustrates a system that includes a game system in the cloud and a social network, in accordance with one embodiment of the invention.
Figure 13 is a graph illustrating several game state variables over time, in accordance with the embodiments of the invention.
Figure 14A illustrates a method for generating a game part for a linear type video game, in accordance with an embodiment of the invention.
Figure 14B illustrates a method for generating a game part for an open-world type video game, in accordance with one embodiment of the invention.
Figure 14C illustrates a method for generating a game part for a sports video game, in accordance with one embodiment of the invention.
Figure 15 conceptually illustrates the formation of a multi-part game part, in accordance with one embodiment of the invention.
Figure 16 conceptually illustrates the generation of a multi-part mini-game, in accordance with one embodiment of the invention.
Figure 17 illustrates an interface for sharing gameplay, in accordance with one embodiment of the invention.
Figure 18 illustrates an interface 1800 for selecting a portion of recorded gameplay video for sharing, in accordance with one embodiment of the invention.
Figure 19 illustrates an interface 1900 for viewing a live video stream of a user's gameplay, in accordance with one embodiment of the invention.
Figure 20 illustrates hardware and user interfaces that can be used to provide interactivity with a video game, in accordance with one embodiment of the present invention.
Figure 21 illustrates additional hardware that can be used to process instructions, in accordance with one embodiment of the present invention.
Figure 22 is an exemplary illustration of scene A through scene E with respective user A through user E interacting with game clients 1102 that are connected to server processing via the Internet, in accordance with one embodiment of the present invention.
Figure 23 illustrates an embodiment of an Information Service Provider architecture.
DETAILED DESCRIPTION OF THE INVENTION The following embodiments describe methods and apparatuses for the automatic generation of suggested mini-games based on recorded gameplay, generation of multi-part mini-games for cloud gaming based on recorded gameplay, sharing recorded gameplay to a social profile, and remote control of a first user's gameplay by a second user.
It will be obvious, however, to one skilled in the art, that the present invention may be practiced without some or all of these specific details. In other instances, well-known process operations have not been described in detail so as not to unnecessarily obscure the present invention.
Fig. 1A illustrates a user playing a cloud-based video game, in accordance with embodiments of the invention. As shown, a user U1 plays a cloud-based video game that is displayed on a screen 100. A cloud-based video game is a video game that runs primarily on a remote server. A server, in one embodiment, can include individual servers or servers running in a virtualized data center, where many servers can be virtualized to provide the requested processing. In the illustrated embodiment, cloud game server(s) 104 execute the video game that is presented on the screen 100. A client 101 is located at the user's location to receive and process inputs and communicate these to the cloud game servers 104, and also to receive audio and video data from the cloud game servers 104. The client 101 and the cloud game servers 104 communicate over a network 102, such as the Internet. In other embodiments, the client can be any device, whether portable or not, whether wireless or not, so long as the client can communicate with a network and provide access to a screen for rendering gameplay and allow a user to provide inputs for interactivity. In one embodiment, the client is a thin client. However, in other embodiments, the client may be a general purpose computer, a special purpose computer, a game console, a personal computer, a laptop, a tablet computer, a mobile computing device, a portable gaming device, a cellular phone, a set-top box, a streaming media interface/device, a smart television or networked display, or any other computing device capable of being configured to fulfill the functionality of a client as defined herein. In one embodiment, the cloud game server is configured to detect the type of client device which is being used by the user, and to provide a cloud gaming experience appropriate to the user's client device. For example, image settings, audio settings and other types of settings can be optimized for the user's client device.
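A minimal, illustrative sketch of such a thin client loop follows; the wire format, the helper objects for the controller and display, and the single-socket transport are assumptions made for the example, not a description of an actual client implementation.

    import socket, json

    def thin_client_loop(server_addr, controller, display):
        """Forward controller inputs to the cloud game server and play back
        the audio/video data it returns (illustrative only)."""
        sock = socket.create_connection(server_addr)
        try:
            while True:
                inputs = controller.poll()                     # e.g. buttons, sticks
                sock.sendall(json.dumps(inputs).encode() + b"\n")
                frame = sock.recv(65536)                       # compressed A/V chunk
                if not frame:
                    break
                display.render(frame)
        finally:
            sock.close()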
In various embodiments, the degree of processing performed by the client may vary with respect to input and output processing. However, broadly speaking, the state of the video game is substantially maintained and executed on the cloud game servers 104, with the client operating primarily to receive and communicate user inputs, and to receive video/audio data for playback. The client 101 can be a single device that is connected to the screen 100 and provides video data for playback on the screen 100. In other embodiments, the client can be integrated into the screen 100. In one embodiment, the screen 100 is a networked display providing an operating system platform for applications or "apps" utilizing the network connectivity of the screen. In such an embodiment, the client can be defined by an application executed on the platform provided by the screen's operating system.
Fig. 1B illustrates multiple users in multiple locations engaged in gameplay of cloud-based video games. The user U1 is shown in a first location interacting with a video game displayed on the screen 100. The users U2 and U3 are shown in a second location interacting with a video game displayed on a screen 106. A user U4 is shown in a third location playing a video game displayed on a screen 108. The users U5, U6 and U7 are shown in a fourth location interacting with a video game displayed on a screen 110.
In each of the first, second, third and fourth locations, at least one computing device is provided to process the input of the various users and to render a cloud-based video game on their respective screens. It should be appreciated that the computing device can be integrated into a screen, or it can be a stand-alone device such as a personal computer, set-top box, game console, or any other type of device having at least one processor and memory for processing and storing data. The computing device can execute or define a client, as described above. The computing devices are networked, and communicate over a network, such as the Internet 102, with the cloud game servers 104.
The cloud game servers 104 execute the various video games which are played by the users, defining a game state of a given video game from moment to moment, and sending video data (including image data and audio data) to a computing device at a particular location. The computing device at a given location processes the input of the user(s) playing the video game, and transmits the input data to the cloud game server, which in turn processes the input data to affect the game state of the video game. It should be appreciated that cloud-based gaming facilitates multi-player gaming among players located at different locations by providing for execution of the video game on a remote server that is accessible by all players over a network. In this manner, execution of the video game does not depend on any single player's hardware or network connectivity, although such will affect the user experience for that given player.
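For illustration, a single server-side update iteration of the kind described might be sketched as follows, assuming hypothetical game, encoder and input objects; the fixed 60 Hz step and per-client views are assumptions for the example.

    def server_tick(game, client_inputs, encoder):
        """Apply each client's input data to the authoritative game state,
        advance the game one frame, then encode and return video for each
        client location (illustrative only)."""
        for client_id, inputs in client_inputs.items():
            game.apply_inputs(client_id, inputs)
        game.advance(1 / 60.0)                 # update game state for one frame
        return {client_id: encoder.encode(game.render_view(client_id))
                for client_id in client_inputs}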
Figure 2A illustrates a system for cloud gaming, in accordance with one embodiment of the invention. As shown, a user 200 operates a controller 202 to provide input to a cloud-based video game. The controller 202 may include any of various types of input devices, such as buttons, a joystick, a touchpad, and motion detection hardware such as accelerometers, magnetometers and gyroscopes. In one embodiment, the controller 202 may include an illuminated object that can be tracked to determine the location of the controller 202. The controller 202 may communicate wirelessly with a thin game client 204. The client 204 communicates over a network 208 with a cloud gaming service 210. The client 204 processes data from the controller 202 to generate input data that is communicated to a video game executed by the cloud gaming service 210. Additionally, the client 204 receives video data from the cloud gaming service 210 for playback on the screen 206. In one embodiment, the client 204 may process the received video data to provide a video stream in a format compatible with the screen 206. In one embodiment, the client 204 may include a camera to track a controller device or an object located on the controller device. As noted, the object can be illuminated to further facilitate tracking based on analysis of the image frames captured by the camera.
The cloud gaming service 210 includes resources for providing an environment in which a video game can be executed. Broadly speaking, the resources can include various types of computer server hardware, including processors, storage devices, and networking equipment, which can be utilized to facilitate execution of a video game application. In the illustrated embodiment, a video game library 212 includes various game titles. Each game title defines executable code as well as associated data and asset libraries which are used to instantiate a video game. A host computer 214 may be a single computing device that defines a platform for instantiating virtual machines 216. In another embodiment, the host 214 may itself be a virtualized resource platform. In other words, the host computer 214 can run on one or more server computing devices, managing the allocation and usage of the resources defined by the server computing devices, while presenting a unified platform on which the virtual machines 216 can be instantiated.
Each virtual machine 216 defines a resource environment which can support an operating system, on which a video game application 218 can run. In one embodiment, a virtual machine can be configured to emulate the hardware resource environment of a game console, with an operating system associated with the game console running in the virtual machine to support execution of game titles which were developed for that game console. In another embodiment, the operating system can be configured to emulate a native operating system environment of a game console, though the underlying virtual machine may or may not be configured to emulate the game console hardware. In another embodiment, an emulator application runs on top of the operating system of a virtual machine, the emulator being configured to emulate the native operating system environment of a game console so as to support video games designed for that game console. It should be appreciated that a variety of legacy and current game consoles can be emulated in a cloud-based gaming system. In this manner, a user can access game titles from different game consoles via the cloud gaming system.
When the user 200 requests to play a specific video game title, the video game title is retrieved from the library 212. If a compatible virtual machine has not already been instantiated or is not available for use, then a new compatible virtual machine is instantiated on the host computer 214. The retrieved video game title is then executed as an application 218 on the newly instantiated or available virtual machine 216. In one embodiment, this may involve determining the appropriate platform for the video game title (for example, which game console or operating system the game requires in order to run) and assigning the video game title to an appropriate virtual machine for execution, for example, one having an emulator application capable of handling execution of the video game title. The executing video game communicates with the game client 204 to provide an interactive gaming experience for the user 200. More specifically, the video game application 218 receives input data from the client 204 over the network 208. The application 218 processes the input data to update the game state of the executing application. As the game state changes, the application 218 outputs video data that is sent to the client 204 for playback on the screen 206. Additionally, the application 218 may also output feedback data to the client 204 which is used to provide an additional feedback mechanism to the user. By way of example, the controller 202 may include a haptic vibration feedback mechanism that can be activated based on the feedback data from the video game application.
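As an illustrative sketch only, the title launch flow described above might be outlined as follows; the library, host and virtual-machine interfaces shown are assumptions made for the example.

    def launch_title(title_id, library, host, clients):
        """Fetch the requested title from the game library, find or
        instantiate a compatible virtual machine, and start the game
        application (illustrative only)."""
        title = library.get(title_id)
        vm = host.find_available_vm(platform=title.platform)
        if vm is None:
            vm = host.instantiate_vm(platform=title.platform)   # may use an emulator
        app = vm.run(title.executable, assets=title.assets)
        for client in clients:
            app.attach_client(client)       # inputs in, audio/video + feedback out
        return app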
In one embodiment, the cloud gaming system is configured to detect the type of client device associated with the user, and also the type of controller available to the user for providing input to the cloud-based video game. For example, in one embodiment, when a user logs in to the cloud gaming system, an option can be presented to designate the type of client device with which they are accessing the cloud gaming system. In one embodiment, a series of client device options is presented from which the user may select one corresponding to their client device. The user may also be presented with an option to designate the type of controller device they will use to play a video game. In one embodiment, a series of controller options may be presented to the user, from which the user may select to designate a type of controller corresponding to their controller hardware. In other embodiments, the cloud gaming system can be configured to automatically detect the type of client device and/or the type of controller device.
For example, at login, the client device may send information to the cloud game server identifying itself as well as a connected controller device (e.g., in response to a request from the cloud game server). Based on this information, the cloud game server can determine an appropriate video game output configuration and input parameter configuration to provide a gaming experience optimized for the user's client device and controller device. In one embodiment, a lookup table is used to determine the video game configuration and the input parameter configuration based on a detected client device and a detected controller device.
It should be appreciated that a given video game may be developed for a specific platform and a specific associated controller device. However, when such a game is made available via a cloud gaming system as presented herein, the user may be accessing the video game with a different controller device. For example, a game may have been developed for a game console and its associated controller, while the user may be accessing a cloud-based version of the game from a personal computer using a keyboard and mouse. In such a scenario, the input parameter configuration can define a mapping from inputs which can be generated by the user's available controller device (in this case, a keyboard and mouse) to inputs which are acceptable for the execution of the video game.
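By way of illustration, such a lookup table and an input parameter mapping might be sketched as follows; the device names, configuration fields and key bindings are assumptions chosen for the example and are not specified by the described embodiments.

    # Hypothetical lookup table keyed by (client device, controller device).
    DEVICE_CONFIG = {
        ("tablet", "touchscreen"): {"resolution": "720p", "input_map": "touch_overlay"},
        ("pc", "keyboard_mouse"):  {"resolution": "1080p", "input_map": "kbm_to_gamepad"},
        ("smart_tv", "gamepad"):   {"resolution": "1080p", "input_map": "native"},
    }

    # Example keyboard/mouse-to-gamepad mapping of the kind the input parameter
    # configuration might define for a game developed for a console controller.
    KBM_TO_GAMEPAD = {
        "w": "left_stick_up", "a": "left_stick_left",
        "s": "left_stick_down", "d": "left_stick_right",
        "space": "button_x", "mouse_left": "button_r2",
    }

    def configure_session(client_device, controller_device):
        """Return a video/input configuration for the detected devices."""
        return DEVICE_CONFIG.get((client_device, controller_device),
                                 {"resolution": "720p", "input_map": "native"})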
In another example, a user may access the cloud gaming system via a tablet computing device, a touchscreen smartphone, or another touchscreen-driven device. In this case, the client device and the controller device are integrated together in the same device, with inputs being provided by way of detected touchscreen inputs/gestures. For such a device, the input parameter configuration may define particular touchscreen inputs corresponding to game inputs for the video game. For example, buttons, a directional pad, or other types of input elements can be displayed or overlaid during running of the video game to indicate locations on the touchscreen that the user can touch to generate a game input. Gestures such as swipes in particular directions or specific touch motions can also be detected as game inputs. In one embodiment, a tutorial can be provided to the user indicating how to provide input via the touchscreen for gameplay, for example prior to beginning gameplay of the video game, so as to acclimate the user to the operation of the controls on the touchscreen.
In some embodiments, the client device serves as the connection point for a controller device. That is, the controller device communicates via a wired or wireless connection with the client device to transmit inputs from the controller device to the client device. The client device may in turn process these inputs and then transmit input data to the cloud game server via a network (e.g., accessed via a local networking device such as a router). However, in other embodiments, the controller can itself be a networked device, with the ability to communicate inputs directly via the network to the cloud game server, without being required to first communicate such inputs through the client device. For example, the controller might connect to a local networking device (such as the aforementioned router) to send data to and receive data from the cloud game server. Thus, while the client device may still be required to receive video output from the cloud-based video game and render it on a local screen, input latency can be reduced by allowing the controller to send inputs directly over the network to the cloud game server, bypassing the client device.
In one embodiment, a networked controller and the client device can be configured to send certain types of inputs directly from the controller to the cloud game server, and other types of inputs via the client device. For example, inputs whose detection does not depend on any additional hardware or processing apart from the controller itself can be sent directly from the controller to the cloud game server via the network, bypassing the client device. Such inputs may include button inputs, joystick inputs, embedded motion detection inputs (e.g., accelerometer, magnetometer, gyroscope), etc. However, inputs that utilize additional hardware or require processing by the client device can be sent by the client device to the cloud game server. These might include captured video or audio from the game environment that may be processed by the client device before being sent to the cloud game server. Additionally, inputs from motion detection hardware of the controller might be processed by the client device in conjunction with captured video to detect the position and motion of the controller, which would subsequently be communicated by the client device to the cloud game server. It should be appreciated that the controller device in accordance with various embodiments may also receive data (e.g., feedback data) from the client device or directly from the cloud game server.
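An illustrative sketch of this input routing might look as follows, assuming a simple event dictionary and hypothetical server and client-device objects; the input type names are taken from the examples above.

    DIRECT_INPUT_TYPES = {"button", "joystick", "accelerometer", "magnetometer", "gyroscope"}

    def route_input(event, cloud_server, client_device):
        """Send inputs that need no extra processing straight from the
        networked controller to the cloud game server; send inputs that
        depend on client-side hardware or processing (e.g. captured video
        used for position tracking) through the client device."""
        if event["type"] in DIRECT_INPUT_TYPES:
            cloud_server.send(event)                    # lower latency path
        else:
            processed = client_device.process(event)    # e.g. fuse camera + motion data
            cloud_server.send(processed)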
Figure 2B conceptually illustrates the accumulation of game titles across several generations of game consoles, in accordance with embodiments of the invention. In the video game industry, video games are developed for specific video game consoles. Over time, a library of game titles accumulates for a specific game console. For example, in the illustrated diagram, a first generation console 220 has a collection of game titles 228 which have been developed for it. A second generation console 222 has associated therewith a collection of game titles 230 which have been developed for it. And a third generation console 224 is also shown, having a collection of game titles 232 developed for it. In other embodiments, there may be a collection of game titles 234 which have been specifically developed as cloud-based games for use in conjunction with a client 226. Additionally, other types of games such as Internet games can be developed and grouped for distribution via a cloud gaming system, as described herein. It will be appreciated that game titles from different generations of game consoles can be collected and consolidated in the cloud game library 212. As shown, the library 212 includes a first generation console library 236 which includes game titles that have been developed for the first generation console. In a similar manner, the library 212 also includes a second generation console library 238 and a third generation console library 240 which contain video games that have been developed for the second and third generation consoles, respectively. Games which have been developed for the client 226 and other types of games such as Internet games can also be included in the game library 212. As can be seen, a large number of game titles across several generations of video game consoles can be collected and made available via a cloud game library. As described, each of these games can be executed in a virtual machine that emulates the operating system environment associated with a given game console for which a game was developed. In this manner, users accessing the cloud-based gaming system are able to easily access and play games from across many different consoles, as well as games from other contexts such as Internet games and games which have been developed specifically for use with the cloud-based gaming system.
Figure 3 illustrates a method for providing game demos to a user, in accordance with one embodiment of the invention. An interface 300 is shown providing tiles or icons 302, 304, 306, 308, 310 and 312 of various game titles that are available for demo. Each icon can be configured to provide an image indicative of the game it represents. In one embodiment, when the user navigates to a given icon, the icon may be activated to display an animation or video clip that is representative of the video game or which otherwise provides additional information to the user about the content of the video game. In the illustrated embodiment, a user has navigated to the icon 302, which is therefore highlighted and displays an active animation showing a scene from the video game. When a user selects an icon, a video preview may be displayed in accordance with method operation 314. If the user proceeds to select the game at method operation 316, then at method operation 318, the game code for the demo is activated. At method operation 320, the game demo is made instantly available for gameplay by the user. It should be appreciated that because the game demo is cloud-based, it can be made available instantaneously from a pre-loaded instantiation of the game demo on the cloud-based system. When the game demo is activated, the pre-loaded instantiation of the game demo is triggered to execute. At method operation 322, it is determined whether the user wishes to continue gameplay of the game demo or to play an additional demo of the video game, if one is available. If so, then the method continues with continued execution of the game demo or executes a new demo of the video game. If not, then at method operation 324, the user is presented with an option to purchase a mini version of the video game. Additionally, at method operation 326, the user may be presented with an option to purchase a full version of the video game. If the user chooses to purchase a mini version or a full version of the game, then at method operation 328 that version of the game is incorporated into the user's account. It should be appreciated that because games are made available on a cloud-based gaming system, incorporation of a game into a user's account may simply entail associating access privileges for the specific version of the game with the user's account. Furthermore, once purchased, the game can be made available nearly instantaneously from the cloud-based system for gameplay by the user, especially when the game can be pre-instantiated on the cloud-based system.
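For illustration only, the demo flow of method operations 314 through 328 might be sketched as follows; the user, icon and cloud objects and their methods are assumptions made for the example.

    def run_demo_flow(user, icon, cloud):
        """Sketch of the demo flow described above: preview, instant launch
        of a pre-loaded demo, then purchase options (illustrative only)."""
        icon.play_video_preview()                         # operation 314
        if not user.selects_game():                       # operation 316
            return
        demo = cloud.activate_preloaded_demo(icon.title)  # operations 318-320
        while user.wants_more_demo_play():                # operation 322
            demo.play()
        if user.buys("mini") or user.buys("full"):        # operations 324-326
            cloud.grant_access(user.account, icon.title)  # operation 328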
Fig. 4A illustrates the hierarchical organization of various portions of a video game, in accordance with one embodiment of the invention. By way of example, a video game may be organized into various sections 400. In the illustrated embodiment, these may include a setup section, an introduction section, multiple level sections, and an ending section. A given level can be further divided into various scenes. As shown, the level 3 section is divided into various scenes 402. As the user plays through a given scene, a gameplay timeline of the scene can be recorded, including recorded video of the user's gameplay as well as recorded input data and game states of the executing game application. In the illustrated embodiment, the gameplay timeline 404 is shown as representative of the user's gameplay of scene 2 of level 3 of the game. In accordance with embodiments of the invention, a user may select portions of their recorded gameplay from which to generate a mini-game or game part. For example, in the illustrated embodiment the gameplay timeline 404 has a start time T0 and an end time Tz, and a portion of the gameplay timeline from a time Tx to a time Ty has been selected from which to generate a mini-game.
Figure 4B illustrates an interface for selecting a portion of a gameplay timeline for generating a mini-game or game part, in accordance with one embodiment of the invention. In the illustrated embodiment, the interface 412 is presented on a touch-sensitive screen 411 of a device 410. In one embodiment, the device 410 is a tablet computing device. The interface 412 includes a selectable gameplay timeline 414. In an expanded view 430 of the gameplay timeline 414, it can be seen that in one embodiment the gameplay timeline 414 is represented as a film strip with adjustable markers 416 and 418. The marker 416 designates the start point of the selection along the gameplay timeline 414, while the marker 418 designates the end point of the selection along the gameplay timeline. Additionally, a marker 419 can be positioned within the portion of the gameplay timeline 414 that has been bounded by the start marker 416 and the end marker 418. For ease of use and to provide the user with a visual understanding of which portion of their gameplay is selected, a start frame 420 corresponding to the point along the gameplay timeline at which the marker 416 has been positioned may be displayed. The start frame 420 is an image from the recorded gameplay video corresponding to the time at which the start marker 416 is positioned. Similarly, a representative end frame 424 may be shown corresponding to the point along the gameplay timeline at which the marker 418 has been positioned. In a similar manner, the end frame 424 is an image from the recorded gameplay video corresponding to the time at which the end marker 418 is positioned. Additionally, a representative frame 422 can be displayed corresponding to the position of the marker 419 along the gameplay timeline. The representative frame 422 can be used as a representative image for the mini-game that is created based on the selected portion of the gameplay timeline. It should be appreciated that although in the illustrated embodiment a touchscreen interface is provided and described, in other embodiments various other types of input may be used to select the start and end points that define a portion of the gameplay for creation of a mini-game. For example, input can be provided via a game controller, a keyboard, via gesture input, voice input, and in accordance with other types of input devices and mechanisms that allow selection of a portion of the gameplay together with selection of a representative image frame from the recorded gameplay video.
In some embodiments the selection markers may not be continuously adjustable along the gameplay timeline, but instead may be configured to snap to predefined time points along the gameplay timeline. For example, predefined time points can be defined to correspond to specific events that occur in the gameplay timeline. The specific events of a given gameplay timeline to which predefined time points will be assigned can be determined based on analysis of the user's gameplay, and will depend on the specific architecture of the video game's gameplay. In one embodiment, the predefined time points can be assigned based on the geographic location of a character within a virtual world of the video game. For example, predefined time points can be assigned to specific times at which a character moves from one geographic location to another geographic location, for example movement from one scene location to another scene location, movement from one city to another city, entering a structure, entering a room within a structure, entering a different type of environment, or any other kind of geographic transition of importance. In another embodiment, the predefined time points can be assigned based on the development of a user-controlled character or entity in the video game. For example, predefined time points can be assigned when a character or entity controlled by the user completes a task, acquires a skill, acquires an object, passes a level or otherwise completes a portion of the video game, or performs or achieves any other kind of significant activity in the video game.
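A minimal sketch of snapping a marker to the nearest predefined time point follows; the nearest-point rule and the example event times are assumptions made for illustration.

    def snap_marker(requested_time, predefined_points):
        """Snap a selection marker to the nearest predefined time point,
        e.g. times at which the character changed location, completed a
        task, or passed a level (illustrative only)."""
        return min(predefined_points, key=lambda p: abs(p - requested_time))

    # Example: event times (in seconds) extracted from the recorded gameplay.
    points = [12.0, 47.5, 90.0, 132.5]
    print(snap_marker(60.0, points))   # -> 47.5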
Figure 4C illustrates an interface for selecting a portion of a gameplay timeline for generating a mini-game, in accordance with one embodiment of the invention. A gameplay timeline 414 graphically illustrates a timeline along which the user can set the start and end markers 416 and 418 to designate a selection of gameplay from which to generate a mini-game. A start frame 410 corresponding to the position of the start marker 416 is shown, as well as an end frame 424 corresponding to the position of the end marker 418. It will be appreciated that the frames can be identified from the recorded video of the user's gameplay. A number of candidate frames 440, 442, 444 and 446 are presented, from which the user may select one to be used as a representative frame for the mini-game. The candidate frames can be determined according to a variety of methods. For example, candidate frames may be presented at divided intervals of the selected portion of the gameplay timeline. The intervals can be equal, such that the candidate frames are equally spaced along the gameplay timeline, or the intervals may be unequal, such that some frames are closer together than others along the gameplay timeline. In one embodiment, a higher density of candidate frames is generated from earlier portions of the selected portion of the gameplay timeline compared to later portions of the selected portion of the gameplay timeline. In one embodiment, a higher density of candidate frames is generated from both the earlier and later regions of the selected portion of the gameplay timeline, while a lower density of candidate frames is generated from the central portion of the selected portion of the gameplay timeline.
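By way of illustration, one possible sampling scheme that yields a higher density of candidate frames near the start and end of the selected portion might be sketched as follows; the 20% edge spans and frame counts are assumptions made for the example.

    def candidate_frame_times(start, end, n_edge=3, n_middle=2):
        """Pick candidate-frame timestamps inside the selected portion, with
        a higher density near its start and end than in the middle, as one
        of the density schemes described above (illustrative only)."""
        duration = end - start
        edge_span = 0.2 * duration                     # first and last 20%
        head = [start + edge_span * (i + 1) / (n_edge + 1) for i in range(n_edge)]
        tail = [end - edge_span * (i + 1) / (n_edge + 1) for i in range(n_edge)]
        middle_span = duration - 2 * edge_span
        middle = [start + edge_span + middle_span * (i + 1) / (n_middle + 1)
                  for i in range(n_middle)]
        return sorted(head + middle + tail)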
After a user has selected a portion of game mechanics from which to create a mini-game, embodiments of the present invention provide systems and methods for creating a playable mini-game based on the selected portion of the game mechanics. More specifically, the mini-game allows another user to play substantially the same portion of the video game that the original user played, and possibly under substantially the same conditions and parameters. In this sense, the mini-game is more than just a video replay of the original user's game mechanics (although a replay video clip of the original user's game mechanics can be presented in combination with the mini-game); it is a playable portion of the same video game that has been designated based on the user's selection from his own game mechanics. Thus a secondary user can have a game mechanics experience substantially similar to that of the original user.
Figure 5 illustrates a series of screenshots demonstrating a method for generating a mini-game from an existing cloud-based video game, in accordance with one embodiment of the invention. Capture screen 500 shows the video game mechanics of a user. In the illustrated embodiment, the user plays level 3 of a video game. In capture screen 502, the user has completed level 3 of the video game. Upon completion of the level, the user is provided with an option to generate a game part or mini-game based on the user's game mechanics of that level. When the user chooses to generate a game part, then on capture screen 504 the user is presented with an interface to select a starting point for the game part from the user's recorded game mechanics. As described, a representation of a game mechanics timeline can be displayed with an adjustable slider which the user can move to designate a starting point for the game part. After the user has defined the starting point, then on capture screen 506 the user is presented with the interface configured to allow selection of an end point for the game part. Again, an adjustable slider is moved by the user to designate the end point along the game mechanics timeline represented in the interface. After the start and end points of the user's game mechanics timeline are designated, the game part or mini-game is generated by the system, as described elsewhere herein. In capture screen 508, the user can be presented with additional options, such as an option to share the newly created game part with other users or otherwise inform other users of the newly generated game part, an option to generate another game part from the same game mechanics timeline, an option to continue the game mechanics of the current video game, an option to view the user's existing game parts, etc. If the user chooses to view his existing game parts, then capture screen 510 shows an interface displaying the user's previously created game parts. In the illustrated embodiment, the user can select one of the previously created game parts and view statistics and information related to that game part, as shown in capture screen 512. The information and statistics related to a given game part may include any of the following: a title of the game part, the video game from which the game part was originally created, the date on which the game part was created, the number of times the game part has been played by other users, a completion percentage indicating an average percentage of the game part that is completed by the users who play the game part, comments left by other users, etc.
Although in the presently described embodiment an option is presented to the user to generate a game part after completion of a level of the video game, it should be appreciated that in other embodiments the user can generate a game part at any other time during or outside of the game mechanics of the video game, provided that recorded game mechanics of the video game exist from which a selection by the user can be defined for a game part. For example, in one embodiment an interface can be presented that provides access to various recorded game mechanics from various video games that are associated with the user. The user can select the game mechanics of a specific video game and generate a game part by selecting the portion of the recorded game mechanics from which to generate the game part, in accordance with the embodiments described herein.
Figure 6 illustrates a system for generating game part code, in accordance with one embodiment of the invention. The terms "game part" and "mini-game" are used interchangeably herein to define a discrete, playable portion of a video game that is generated based on the user's selection of existing recorded game mechanics. In the illustrated embodiment, the user game mechanics 600 conceptually represents a user interacting with a full version of a video game. The main game code 602 is executed to define the full version of the video game played by the user. When the video game is played, it generates various types of game mechanics output, including video data, game state data, and user input data. These can be recorded to define the user's recorded game mechanics. In the illustrated embodiment, an image stream 604 conceptually represents the video data output by the video game. The game state data 606 and the user input data 608 are also shown. The game state data 606 includes data defining the game state of the executing video game from moment to moment during the game mechanics. The game state data may include the values of any variables which define the execution state of the video game. The user input data is generated by capturing user-initiated actions that occur during interactivity with the video game, as provided via the activation of inputs on controller devices, detection of sensor data (e.g., motion sensors), captured audio input, and the like.
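Conceptually, the recorded game mechanics described above interleaves three streams: video frames, game state snapshots, and user input events. A minimal, purely illustrative sketch of such a record follows; the class and field names are assumptions, not the patent's data format.

```python
from dataclasses import dataclass, field

@dataclass
class InputEvent:
    time: float
    source: str        # e.g., "button_x", "joystick_left", "motion_sensor"
    value: float       # e.g., 1.0 for pressed, stick deflection, sensor reading

@dataclass
class GameplayRecord:
    video_frames: list = field(default_factory=list)   # (time, reference to encoded frame)
    game_states: list = field(default_factory=list)    # (time, {variable: value}) snapshots
    inputs: list = field(default_factory=list)         # InputEvent instances

    def record_tick(self, time, frame_ref, state_vars, events):
        self.video_frames.append((time, frame_ref))
        self.game_states.append((time, dict(state_vars)))
        self.inputs.extend(events)

rec = GameplayRecord()
rec.record_tick(0.033, "frame_0001.h264", {"pos_x": 1.0, "health": 100},
                [InputEvent(0.033, "button_x", 1.0)])
print(len(rec.video_frames), len(rec.game_states), len(rec.inputs))
```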
As described, a user interface can graphically represent the user's recorded game mechanics to facilitate selection by the user of a portion of the game mechanics from which to generate a game part. In the illustrated embodiment, the user has defined a selection 610 of his recorded game mechanics. This selection of the user's game mechanics is used by a game part generator 612 to generate the game part code 622, which defines a limited game based on the selected portion of the user's game mechanics. The game part generator 612 includes a game state analyzer 614 which analyzes the game state of the recorded selection 610. Based on the game state analysis of the recorded selection, a game break point processor determines appropriate break points to define the start and end of the game part. Break points can be defined based on geography, time, task or objective fulfillment, scene boundaries (physical or temporal), or any other aspect of a video game according to which the game mechanics of the video game can be segmented to generate a game part. A brief description of some illustrative embodiments will serve to highlight certain possibilities for determining break points.
For example, some video games involve control of a symbol that can be moved from one geographical scene or location to another scene or location. The selected portion of the user's game mechanics may be determined to have been generated from game mechanics in a particular scene. In such an embodiment, the boundaries of that particular scene may define the geographical break points for the game part, selecting the scene to the exclusion of other scenes, which may involve the exclusion of adjacent scenes as well as non-adjacent scenes or scenes otherwise less related or unrelated to the particular scene. It will be appreciated that the selection of recorded game mechanics 610 can entail game mechanics from multiple scenes, in which case the game break point processor 616 can be configured to define the break points according to the boundaries of the multiple scenes which are used for the selection of recorded game mechanics.
It should be noted that a scene may be of a geographical and temporal nature. That is, the scene can define not only a geographical region within a virtual space defined by the video game, but can also be configured to exist for a certain time or at a particular chronological point within the larger context of the video game. Such a scene may have defined objectives or goals that are to be achieved by the player. Thus, game break points can be defined based on chronology or other temporal aspects as defined by the video game.
In addition, a given scene may have associated objects or features which are presented as part of the scene during the game mechanics. These objects or features can be analyzed to define additional break points according to their inclusion. For example, the objects in the scene may be taken from a subset of an asset library, in which case that subset of the asset library can be defined for the game part by the game break point processor 616, to the exclusion of other objects in the asset library which are not used in the scene associated with the selection of recorded game mechanics. It should be understood that objects and features can be dynamic elements of a given scene, with associated mechanisms defining their change in response to events that occur in the video game. For example, an object can have a damage modeling module that determines and adjusts the appearance of the object when it is damaged (for example, when it is hit by a weapon). Or a feature could be a vehicle that becomes available during the scene, with the vehicle having associated logic which defines its appearance during the game mechanics as well as its operation and response to user input. Such logic or damage modeling can also define game break points for the generation of the game part.
Various aspects of a video game which define or are otherwise used by a selected portion of the video game can form the basis for defining a game break point. The examples described herein are provided by way of example only and not by way of limitation. It should be appreciated that in other embodiments, other aspects of a video game may form the basis for defining break points to generate a game part.
In one embodiment, a video game can be organized into several scenes which must be completed in a linear manner, such that a subsequent scene cannot be attempted until its preceding scene has first been completed. Each scene may include a number of objectives or goals, some of which may be required to complete the scene, and some of which may be optional. Objectives can include navigating from a starting location to a predefined final location within the scene, surviving for a predefined period of time, destroying a predefined number of enemies, acquiring a certain number of points, defeating a particular enemy, or some other activity which can define an objective within the game. A scene can have several predefined termination points, points where the user, once having reached the termination point, is able to return to that point if the user becomes unable to continue the game mechanics for some reason (for example, the user leaves the game, the user's game symbol dies or runs out of health or lives, the user's vehicle crashes, etc.). At the predefined termination points, a video game can be configured to automatically save the user's progress, or to present an option for the user to save his progress.
In one embodiment, the game break point processor 616 is configured to define game break points at predefined termination points. In one embodiment, this is achieved by finding the termination points closest to the selected start and end points of the user's recorded game mechanics selection, and using these closest termination points to define the game break points for the game part. In another embodiment, the closest termination point occurring before the selected start point of the recorded game mechanics selection is used to define an initial break point, while the closest termination point occurring after the selected end point of the recorded game mechanics selection is used to define a final break point for the creation of the game part. In yet another embodiment, if a termination point lies within a predefined radius (e.g., either before or after) of either of the start and end points of the user's recorded game mechanics selection, then this termination point is used to define a corresponding starting or ending game break point for the game part; whereas if no termination point lies within the predefined radius, then a game break point is defined that most closely matches the user's selected start or end point for the recorded selection. In other embodiments, the predefined radius for the start and end points may differ for the purposes of determining whether an existing termination point is used to define a game break point.
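The radius-based variant described above can be expressed as choosing, for each end of the user's selection, the nearest predefined termination point when one lies within a predefined radius, and otherwise falling back to the user's selected time. The following sketch is an illustrative reading of that behavior; the radius value and checkpoint times are hypothetical.

```python
def pick_break_point(selected_time: float, termination_points: list[float], radius: float) -> float:
    """Return a game break point for one end of the user's selection.

    If a predefined termination point lies within `radius` seconds of the
    selected time, snap to the nearest such point; otherwise keep the
    user's selected time as the break point.
    """
    nearby = [t for t in termination_points if abs(t - selected_time) <= radius]
    if nearby:
        return min(nearby, key=lambda t: abs(t - selected_time))
    return selected_time

termination_points = [0.0, 30.0, 75.0, 120.0]   # e.g., auto-save / checkpoint times
start_break = pick_break_point(28.0, termination_points, radius=5.0)   # -> 30.0
end_break = pick_break_point(50.0, termination_points, radius=5.0)     # -> 50.0 (no point nearby)
print(start_break, end_break)
```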
As discussed, the game break point processor 616 determines appropriate break points applicable to various aspects of the video game based on analysis of the user's recorded game mechanics selection. The break points defined by the processor 616 serve to define the limited scope of the game part that will be produced based on the user's recorded game mechanics selection. In one embodiment, an overlay processor 617 is provided to generate overlays that can contribute to an enhanced user experience when playing the game part generated by the game part generator 612. For example, in one embodiment the overlay processor 617 defines pregame part data which defines video, game mechanics, or additional information that can be provided as an introduction to the game part before the actual game mechanics of the game part. An example of pregame part data is an introductory video which can provide context to a user who initiates the game mechanics of the game part. In another embodiment, pregame part data can define introductory game mechanics for the game part, which can provide a user with an opportunity to learn skills that may be useful or required to play the game part. In another embodiment, the pregame part data can define a series of one or more informative screens or images which provide information about the game part to the user. Such information may include the configuration of the controller, background story information, objectives or goals, maps, or any other type of information about the game part which may be useful to the user or otherwise improve the user experience of playing the game part.
The overlay processor 617 can also be configured to define postgame part data. In some embodiments, the postgame part data may define video or images to be displayed after completion of the game mechanics of the game part. For example, a congratulations video can be shown after a user completes the game part. Such a video can be customized based on the game mechanics of the user of the game part, for example by displaying information or images that are based on the user's game mechanics. In one embodiment, the postgame part data may define a playback mechanism to replay recorded portions of the game mechanics of the game part user after completion. In another embodiment, the postgame part data may be configured to display statistics about the game mechanics of the users of the game part, and may indicate a comparison of a user's game mechanics to that of other users or that of the original creator of the game part. In still other embodiments, the postgame part data can define additional interactive elements to be presented to the user upon completion of the game part. These may include options to purchase part or all of the video game on which the game part is based, redirection options toward additional sources of information relating to the video game, etc.
In some embodiments, the overlay processor 617 can be configured to define elements which are superimposed on the game part. These may include elements that can be customized by a user playing the game part, such as personalization of symbols, objects, properties, and other types of customization options. In other embodiments, the overlay processor 617 can be configured to define simplified elements for a game part to reduce the complexity of the game part code and the amount of resources required to execute the game part. As an example, many video games include artificial intelligence (AI) entities such as symbols, vehicles, enemies, etc. These AI entities may be governed in the full video game by artificial intelligence models that define the reaction and activity of the AI entities based on events that occur in the video game. However, in the context of a game part which is limited in scope, it may be acceptable to simply define the activity of an AI entity through a coded definition or simplified extrapolations, rather than fully modeling the activity of the AI entity as might be the case in the full video game.
For example, if in the recorded game mechanics selection of the complete video game a given AI symbol moves in a certain way according to its AI model, and that movement is unlikely to change in the game part, then it may be more efficient to define an approximation of the AI symbol's movement for the game part. Such an approach would not require the full model to be included as part of the game part code, yet could provide the user playing the game part with an experience, with respect to the AI symbol, substantially similar to that of the original user's game mechanics from which the game part was generated. The resource savings realized through approximation of the activity of AI entities can be even more significant when multiple AI entities are present and interacting in the user's recorded game mechanics selection. This is because each of the AI entities may have AI models that depend on the output of the other AI entities. However, once the video game mechanics have been recorded, the activity of each of those AI entities is known and can therefore be reproduced in the game part through simplified mechanisms such as direct coding of its control variables and approximation of its activity.
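One way to realize the simplification described above is to replay the AI entity's recorded trajectory in the game part instead of executing its full AI model. The sketch below is a minimal illustration of that idea under assumed data structures; the class name and recorded track are hypothetical.

```python
import bisect

class ReplayedAIEntity:
    """Approximates an AI entity by interpolating its recorded positions,
    rather than running the full AI model of the complete video game."""

    def __init__(self, recorded_track):
        # recorded_track: list of (time, (x, y, z)) captured from the original game mechanics
        self.track = sorted(recorded_track)
        self.times = [t for t, _ in self.track]

    def position_at(self, time):
        i = bisect.bisect_right(self.times, time)
        if i == 0:
            return self.track[0][1]
        if i == len(self.track):
            return self.track[-1][1]
        (t0, p0), (t1, p1) = self.track[i - 1], self.track[i]
        a = (time - t0) / (t1 - t0)
        return tuple(c0 + a * (c1 - c0) for c0, c1 in zip(p0, p1))

enemy = ReplayedAIEntity([(0.0, (0.0, 0.0, 0.0)), (2.0, (4.0, 0.0, 0.0)), (5.0, (4.0, 3.0, 0.0))])
print(enemy.position_at(1.0))   # -> (2.0, 0.0, 0.0)
print(enemy.position_at(3.5))   # -> (4.0, 1.5, 0.0)
```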
With continued reference to Figure 6, a game configuration state processor 618 is provided to define an initial state of the game part. Based on the operation of the game state analyzer 614, the game break point processor 616, and the game configuration state processor 618, a code assembly manager 620 assembles several code portions to define the game part code 622. When the game part code 622 is executed, the user's game mechanics 624 provides input to define the execution state of the game part code, which produces game mechanics output including video data and feedback data to render the game part to the user. The video data may include pregame part video overlay 622, game part video 624 which is the video resulting from the game mechanics of the game part, and postgame part video overlay 626.
It should be appreciated that in one embodiment, the game part code 622 is completely self-contained, including all the portions of code which are required to execute the game part. However, in other embodiments, the game part code 622 may incorporate references or pointers to portions of code existing in the main game code of the complete video game. In addition, the game part code 622 may reference or use existing resources in the resource libraries of the main game code of the complete video game. In still other embodiments, new resource libraries can be generated for the game part code.
Figure 7A illustrates the modification of a virtual space for the purpose of generating a game part of a video game, in accordance with one embodiment of the invention. Map 700 represents a scene or a geographical portion of a video game. As shown, the map 700 illustrates a region 702 and several paths 704, 706, 708 and 710. In the recorded game mechanics of the video game, a user symbol 712 moves from the region 702 along the path 710. Based on this recorded movement and other analysis of the game mechanics of the video game, it can be determined that the additional paths 704, 706 and 708 are not necessary for the generation of the game part. Those paths may represent incorrect choices compared to the path 710, may lead to areas that are not relevant to the game part, or may detract from a player's ability to play the game part and follow a game trajectory similar to that of the original user. Furthermore, if the areas to which the paths 704, 706 and 708 lead are not supported in the game part, then the inclusion of such paths could cause confusion among players, or at least make for a poor user experience. Therefore, in a modified map 720, the paths 704, 706 and 708 are made unavailable for game mechanics in the game part, while the path 710, as well as the region 702, remain unchanged. Thus, when a user plays the game part that incorporates the topography defined by the map 720, he will experience a virtual space where the paths 704, 706 and 708 are not available to travel. The user will then be more likely to travel path 710 as the original user did, experiencing similar game mechanics.
It will be appreciated that the portion of a virtual space defined for a game part or mini-game can be defined by boundaries which are determined based on the user's recorded game mechanics. The boundaries define a sub-region of the larger virtual space, and include a subset of the features which are available in the larger virtual space. In some embodiments, virtual space boundaries can be determined by determining the locations in the virtual space visited in the user's game mechanics, and then determining the predefined boundaries associated with the virtual space that are closest to those locations and arranged to encompass them. For example, a user's game mechanics may define a path traveled by the user's video game symbol. This path can be analyzed and, based on the location of the path in the virtual space, a set of predefined boundaries can be selected to define a region encompassing the path. In some embodiments, the predefined boundaries can be defined by specific features which inherently delimit portions of the virtual space, for example doors, windows, walls, rooms, hallways, fences, roads, intersections, etc.
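The boundary determination described above can be illustrated as computing, from the locations visited in the recorded game mechanics, a bounding sub-region of the virtual space and then disabling features that lie outside it. This is a simplified sketch with hypothetical names and two-dimensional coordinates; an actual virtual space would use richer geometry and predefined boundary features.

```python
def bounding_region(visited, margin=1.0):
    """Axis-aligned bounds around all locations visited in the recorded game mechanics."""
    xs, ys = zip(*visited)
    return (min(xs) - margin, min(ys) - margin, max(xs) + margin, max(ys) + margin)

def prune_paths(paths, region):
    """Keep only paths whose points all fall inside the region; other paths
    are made unavailable in the game part (compare paths 704/706/708 vs. 710)."""
    x0, y0, x1, y1 = region
    inside = lambda p: x0 <= p[0] <= x1 and y0 <= p[1] <= y1
    return {name: pts for name, pts in paths.items() if all(inside(p) for p in pts)}

visited = [(0, 0), (2, 1), (5, 1), (8, 2)]        # symbol positions from the recording
paths = {"704": [(0, 9), (3, 9)], "710": [(1, 1), (7, 2)]}
region = bounding_region(visited)
print(prune_paths(paths, region))                  # only path "710" remains available
```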
Figure 7B illustrates the modification of a scene graph for purposes of generating a game part, in accordance with embodiments of the invention. A scene graph 730 conceptually illustrates the organization of several scenes A to G of a video game. It should be appreciated that the scenes as described may be geographical and/or temporal in nature, and each may represent a playable portion of a video game, such as a stage, a level, a section, a location, or any other organizational unit within the video game according to which a player can progress from one scene to another. In the scene graph 730, several nodes are shown representing scenes A to G. As shown, a player can progress from scene A to scene B, and from scene B to either of scenes D or E. The player can also progress from scene A to scene C, and from scene C to either of scenes F or G. The scene graph 730 illustrates the scene organization of the entire video game. However, for purposes of creating a game part, not all available scenes may be required. Thus, by way of example, a scene graph 732 illustrates the organization of scenes for a game part. As shown, the scene graph 732 includes scenes A, B, C, and F, but not the remaining scenes which were included in the scene graph 730 of the full video game. Thus, a user can progress from scene A to either of scenes B or C, and from scene C to scene F. However, the other scenes of the full video game scene graph 730 are not available for game mechanics in the game part. As described, systems in accordance with embodiments of the invention can be configured to limit the inclusion of scenes when a game part is generated. In this way, the game part does not include scenes which are not required for the limited context of its intended game mechanics and purpose.
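The reduced scene graph 732 can be thought of as the sub-graph of the full scene graph that retains only the scenes needed for the game part and the transitions between them. A minimal sketch of such pruning follows; the graph representation is an assumption made for illustration.

```python
def prune_scene_graph(graph, keep):
    """Return a sub-graph containing only the scenes in `keep` and the
    transitions between them (e.g., {A, B, C, F} out of scenes A..G)."""
    return {scene: [nxt for nxt in nexts if nxt in keep]
            for scene, nexts in graph.items() if scene in keep}

full_graph = {            # scene graph 730 of the complete video game
    "A": ["B", "C"],
    "B": ["D", "E"],
    "C": ["F", "G"],
    "D": [], "E": [], "F": [], "G": [],
}
part_graph = prune_scene_graph(full_graph, keep={"A", "B", "C", "F"})
print(part_graph)   # {'A': ['B', 'C'], 'B': [], 'C': ['F'], 'F': []}
```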
Figure 8 illustrates a method for generating a game part, in accordance with one embodiment of the invention. In method operation 800, a user's game mechanics of a video game is recorded, including video recording of the user's game mechanics and video game data such as input data and game state data of the video game. In method operation 802, a user interface is presented for selecting from the recorded game mechanics to generate a game part. The interface defines mechanisms to receive user input defining the start and end points within the recorded game mechanics. For example, an interface for reviewing the recorded game mechanics video can be provided to allow the user to define the start and end points within the game mechanics video based on navigation or playback of the game mechanics video. In method operation 804, the user-defined selection of the game mechanics video is received. In method operation 806, the video game break points are identified based on the received user-defined selection of the game mechanics video. In one embodiment, there are predefined break points for the video game. Based on the user-defined selection of the game mechanics video, the game break points which are closest to the start and end points of the user-defined selection can be chosen as the game break points for the game part to be generated. In method operation 808, the game code is defined for the game part selection as defined by the identified break points. In method operation 810, the game configuration state is defined for the game part selection. In one embodiment, the game configuration state is based on a game state which existed during the game mechanics of the user's recorded game mechanics. In method operation 812, the game part code is generated, and in method operation 814, the game part code is stored in a game part library and associated with the user's account.
In one embodiment, the method may include method operation 816, wherein suggested game part video selections are generated based on the recorded game mechanics video and user data. The suggested selections of the user's game mechanics can be determined based on analysis of the user's recorded game mechanics. For example, portions of game mechanics where a high level of activity (e.g., an activity level exceeding a predefined threshold) is detected can be suggested as possible game mechanics video selections. In method operation 818, the user interface mentioned above for game part selection may present the suggested game part video selections to the user. In one embodiment, a representative screenshot of each suggested game part video selection can be presented to the user. In a further method operation, a selection by the user of one of the suggested game part video selections is received. Based on the user's selection, the game part code can be generated and stored as described above.
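The suggested selections described in operation 816 could be produced by scanning an activity measure over the recording and keeping spans where it stays above a threshold. The sketch below is one plausible reading of that step; the activity metric, threshold, and minimum span length are assumptions.

```python
def suggest_selections(activity, threshold, min_len=3):
    """Return (start_index, end_index) spans where activity stays above threshold.

    `activity` is a per-second (or per-frame) activity measure derived from the
    recorded game mechanics, e.g., input frequency or game state change rate.
    """
    spans, start = [], None
    for i, a in enumerate(activity + [0.0]):          # sentinel closes a trailing span
        if a >= threshold and start is None:
            start = i
        elif a < threshold and start is not None:
            if i - start >= min_len:
                spans.append((start, i))
            start = None
    return spans

activity = [0.1, 0.2, 0.9, 1.1, 1.4, 0.8, 0.2, 0.1, 1.2, 1.3, 1.1, 0.3]
print(suggest_selections(activity, threshold=0.7))    # -> [(2, 6), (8, 11)]
```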
Figure 9A illustrates an interface for searching game parts associated with various game titles, in accordance with one embodiment of the invention. In the illustrated embodiment, the interface is organized into a series of tabs, including tabs 900, 902, and 904 which, when selected, provide access to pages corresponding to different game titles. In the illustrated embodiment, the tab 900 is currently selected, such that the page presented provides information about the game parts which have been created for a particular game title T1. A game part listing 904 lists the various game parts which have been created from the game title T1. In one embodiment, the game part listing 904 also identifies the user who created each game part (for example, by displaying a username of the user who created the game part). In one embodiment, the game part listing 904 may be searched or navigated to highlight different ones of the game parts listed in the game part listing 904. In the illustrated embodiment, a game part A created by a user A is currently highlighted, which causes details 906 related to the game part A to be displayed. The details 906 may include various items of information related to the game part, such as the level or stage from which the game part was generated, a game detail which provides more specific information about the location from which the game part was created, comments by the user who created the game part, comments from other users, the number of times the game part has been played, or any other information about the game part which can be provided when the game part listing is highlighted.
Figure 9B illustrates a game part information page, in accordance with one embodiment of the invention. The information page is for the game part A discussed with reference to Figure 9A and can be reached when the user selects the listing of game part A as shown in Figure 9A. With continued reference to Figure 9B, the game part information page provides various types of information related to the game part. In addition to bibliographic information about the game part (for example, title, username of the user who created the game part, creation date, etc.), a video 910 of the game part can be displayed. In one embodiment, the video 910 is the recorded game mechanics video of the original user who created the game part. In another embodiment, the video 910 could be recorded game mechanics video of other users, such as a user who achieved the highest score in the game part, or a user who most recently played the game part. In one embodiment, the video 910 could be a live feed of a user who is currently playing the game part. In other embodiments, a representative image of the game part could be shown instead of a video.
The game part information page may also include a selectable button 912 to initiate the game mechanics of the game part. The game part information page may also include a details section 914, which may present various details and statistical information about the game part, such as the number of players, the average completion rate, etc. The game part information page may also include a comment section 916 presenting comments left by users. A sorting button 918 can be provided to select among several options for sorting comments (for example, chronological order, reverse chronological order, most popular, by ratings, by relation to the current user (for example, comments by the user's friends in a social graph are prioritized), etc.).
Figure 10 illustrates a view of user account information, including live views of friends in a cloud social gaming network, in accordance with one embodiment of the invention. A library section 1002 displays several game titles in the user's library. These can be game titles which the user has purchased or otherwise acquired. The display of a game title may include representative graphics as well as title information. It will be appreciated that the games can be full version game titles, but they can also be mini-games or otherwise limited versions, each of which can be a portion of a full version game title or be limited in some capacity compared to the full version game title. A friends list 1004 lists the current user's friends in a social network associated with the cloud gaming system (that is, other users in the social profile of the current user). The social network can be a social network that is specific to the cloud gaming system, or it can be a third-party social network that exists apart from the cloud gaming system, with which the cloud gaming system communicates to obtain information about the user's social profile. The friends list 1004 may include additional information about the user's friends, such as the games which each friend owns, a friend's online status (e.g., offline, inactive, etc.), the friend's last login and its duration, the last game played by the friend, etc.
In one embodiment, a live active users section 1006 provides live views of the game mechanics of friends who are currently online and may be playing a video game. In one embodiment, each user has an option to define whether or not his live game mechanics is available for live viewing by other users. In such an embodiment, live views are presented only for those users who have designated the option to allow their live game mechanics to be viewable by other users. In the illustrated embodiment, the live active users section 1006 includes a live view 1008 of the current game mechanics of a friend A, as well as a live view 1010 of the current game mechanics of a friend B. In one embodiment, the current user can browse or search the friends of the friends list 1004 and/or browse the live views which are available in the live active users section 1006. In one embodiment, a live view can be highlighted when the current user browses to it, and can be rendered in a different way from other live views. For example, live views can be displayed in an unsaturated color scheme by default, but displayed in a fully saturated color scheme when highlighted or selected. Live views can also be displayed at a lower resolution, frame rate, or size by default, but when selected can be displayed at a higher resolution, frame rate, or size. In this way, bandwidth can be allocated to a specific live view based on the user's selection, to present the live view that the user is interested in watching more faithfully than other live views that may be active simultaneously. It should be appreciated that a live view can display not just game mechanics of a specific video game, but also other activity of a friend in the cloud gaming system, such as navigating through menus or other types of activity related to his gaming in the cloud.
In one embodiment, live views are available only for those users who are currently actively engaged in the game mechanics of a video game. In other words, live views of a given user are not available when that user is offline, or is online but not actively engaged in the game mechanics of a video game. Thus, when the user is performing another, non-game-mechanics activity in the cloud gaming system (for example, navigating a graphical interface of the cloud gaming system while logged in), such activity is not made available in a live feed for others to watch. In another embodiment, the live view can include all the activity of a user who is logged into the cloud gaming system, including both the user's game mechanics as well as other non-game-mechanics activity.
In one embodiment, the live view of a given user can be filtered so as not to expose potentially sensitive or personal information to be viewed by other users. For example, the cloud gaming system can support a conversation function. Because some users may want their conversations to remain private, an option to exclude conversation logs can be provided when presenting a live view. It should be appreciated that a conversation function can be used during game mechanics activity as well as during non-game-mechanics activity, and can be filtered from the live views in either or both of these circumstances. In another embodiment, aspects of a video game can be filtered from a live view. For example, a user may wish to keep a certain user-defined configuration secret, as it may confer an advantage on that user during the game mechanics. Thus, an option can be provided for configuration-related activity not to be displayed as part of the live feed (for example, when a user accesses the video game's configuration interface). In another embodiment, the live view can be configured to avoid showing personal information (for example, not showing when a user accesses a personal information page, enters payment information, enters a password, etc.).
In one embodiment, the interface provides an option for the main user to request to join the game session of a secondary user who is currently online. For example, the main user can watch the live game mechanics feed of the secondary user and wish to join the secondary user's game mechanics. In one embodiment, activating the option sends a request to the secondary user notifying the secondary user that the main user wants to join his session. If the secondary user accepts the request, then a multiplayer mode of the video game is initiated, facilitating multiplayer game mechanics by the first and second users. In another embodiment, two or more secondary users may already be engaged in multiplayer game mechanics. In such an embodiment, the main user can send a request to join the multiplayer game mechanics. Upon acceptance of the request by one of the secondary users (for example, a designated host of the game mechanics session), the main user is able to join the multiplayer session of the video game. It should be appreciated that the above-mentioned interface showing the live game mechanics feeds of secondary users facilitates the main user joining the game mechanics of secondary users after being able to see their game mechanics.
In one embodiment, the option to request to join the game mechanics of the secondary user is predicated upon a determination of an ownership state of the video game by the main user. If the main user does not own the video game, then the option may not be presented, whereas if the main user owns the same video game as the secondary user, then the option to request to join the game mechanics of the secondary user is made available as part of the interface. In one embodiment, when it is determined that the main user is not an owner of the video game, the main user can still join the game mechanics of the secondary user in a multiplayer mode, but in a limited capacity, such as being limited in terms of the duration of the game mechanics, available scenes/levels/stages, customization options, abilities, skills, weapons, symbols, vehicles, or any other aspect of the video game that may be limited. In one embodiment, after the game mechanics of the limited version of the video game, the main user is provided with an option to purchase the full video game. In another embodiment, the main user is provided with an option to purchase an additional portion of the video game.
With continued reference to Figure 10, the displayed information may also include game history information 1012 for the user's friends. The game history information 1012 can provide information about a given friend's game mechanics history, such as the most recent games played, the duration of game mechanics sessions, statistics related to game mechanics, etc.
It will be appreciated that many methods and configurations for presenting a cloud gaming interface are possible in accordance with various embodiments of the invention. In one such embodiment, a method for displaying a current game state of users of a cloud gaming system is provided, including the following method operations: presenting a cloud gaming interface to a main user; determining one or more secondary users who are friends of the main user; determining a current state of each of the secondary users; presenting a live feed of a current game session of an online secondary user in the main user's cloud gaming interface, wherein presenting the live feed includes providing an option for the main user to join the current game session of the online secondary user; in response to receiving a request activating the option for the main user to join the current game session of the online secondary user, initiating a multiplayer mode of the current game session of the online secondary user, the multiplayer mode providing for game mechanics by the main user in the current game session of the online secondary user; wherein initiating the multiplayer mode includes determining an ownership state of the main user with respect to a video game defining the current game session of the online secondary user; wherein when it is determined that the main user owns the video game, the multiplayer mode provides a full version of the video game for the game mechanics; and wherein when it is determined that the main user does not own the video game, the multiplayer mode provides a limited version of the video game for the game mechanics.
In one embodiment, the limited version of the video game defines a reduction, compared to the full version of the video game, in one or more of: available levels, available scenes, available features, a time limit, a virtual space, a campaign duration, a number of lives, or a number of plays.
In one embodiment, the method further includes a method operation of presenting, when the video game is not owned by the main user, an option for the main user to purchase at least a portion of the video game.
In one embodiment, the live feed of the online secondary user is presented at a first resolution; and selection of the presented live feed activates presentation of the live feed of the online secondary user at a second resolution higher than the first resolution.
In one embodiment, the live feed of the online secondary user is presented in an unsaturated color mode; and selection of the presented live feed activates presentation of the live feed of the online secondary user in a saturated color mode.
In one embodiment, presenting the cloud gaming interface includes presenting a library of game titles associated with each of the secondary users.
In one embodiment, determining the one or more secondary users includes accessing a social profile associated with the main user. In one embodiment, accessing the social profile includes accessing an API of a social network.
In one embodiment, presenting the main user's cloud gaming interface includes presenting a list of each of the secondary users in an order of priority, the order of priority being based on one or more of current status, recency of login, or common ownership of games with the main user.
Figure 11 illustrates a method for presenting live game mechanics feeds from friends of the current user, in accordance with one embodiment of the invention. In method operation 1100, a current user logs into a cloud gaming system. In method operation 1102, the friends of the current user are identified from a social profile of the current user. As noted, the social profile of the current user can be from a social network that is specific to the cloud gaming system, or from a social network that exists apart from the cloud gaming system. In method operation 1104, the library information of the user's friends is retrieved. The library information of the user's friends can identify game titles in the libraries of the user's friends. In method operation 1106, the current states of the current user's friends are identified. If the current state of a given friend is online, then in method operation 1108 it is determined whether the given friend is currently engaged in game mechanics of a video game in the cloud-based system. If so, then in method operation 1112, a live game mechanics feed is obtained for that user. In method operation 1110, the friends of the current user are prioritized for display based on various factors or preferences. In method operation 1114, the current user's friends and their status information and live game mechanics feeds are presented in the order of priority. In one embodiment, friends of the current user who are online can be prioritized over friends who are currently offline. In one embodiment, the current user's friends who are currently engaged in active game mechanics can be prioritized over friends who are not currently engaged in active game mechanics. In one embodiment, friends who have live game mechanics feeds available can be prioritized over other friends. In one embodiment, friends can be prioritized based on the recency with which they have logged into the cloud gaming system. In another embodiment, friends can be prioritized based on common ownership of video games. The above examples of friend prioritization are provided merely by way of example and not by way of limitation. It will be appreciated by those skilled in the art that in other embodiments, the friends of the current user can be prioritized and presented in an order of priority based on any other relevant factor.
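The prioritization described above can be implemented as a composite sort key over each friend's status. The ordering below follows the factors listed in this embodiment, but the exact key structure, field names, and weights are illustrative assumptions.

```python
def prioritize_friends(friends):
    """Order friends for display: live feed available first, then actively playing,
    then online, then most recent login, then common game ownership."""
    return sorted(
        friends,
        key=lambda f: (
            not f.get("live_feed", False),       # False sorts before True, so feeds come first
            not f.get("in_gameplay", False),
            not f.get("online", False),
            -f.get("last_login_ts", 0),
            -f.get("common_titles", 0),
        ),
    )

friends = [
    {"name": "A", "online": True, "in_gameplay": False, "live_feed": False, "last_login_ts": 100, "common_titles": 2},
    {"name": "B", "online": True, "in_gameplay": True, "live_feed": True, "last_login_ts": 90, "common_titles": 1},
    {"name": "C", "online": False, "in_gameplay": False, "live_feed": False, "last_login_ts": 120, "common_titles": 5},
]
print([f["name"] for f in prioritize_friends(friends)])   # -> ['B', 'A', 'C']
```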
Figure 12 illustrates a system that includes a cloud gaming system and a social network, in accordance with one embodiment of the invention. A cloud gaming system 1200 provides access to its cloud-based games. The cloud gaming system includes a game library 1202 which contains several game titles that can be played by users. The user data 1204 contains various types of data which are associated with the accounts of users, such as game titles which are owned by a user, and any saved game mechanics of the user. In the illustrated embodiment, several game mechanics sessions are shown conceptually, including a session A, a session B, and a session C. Session A defines the game mechanics of a user A, who watches the game mechanics of session A on a screen 1208. The game mechanics of session A are displayed as view 1210 on the user A screen 1208. Similarly, session B defines the game mechanics of a user B, which is displayed on the user B device 1212 as seen in view 1214. In the illustrated embodiment, the view 1214 of the user B session shows an interface including live views from other users, including a live view of session A and a live view of session C. As the game mechanics of sessions A and C proceed, game mechanics video output from sessions A and C can be transmitted via the session of user B, to be displayed in the user B view 1214. The game mechanics video can be processed for transmission via the session of user B, for example to decrease its resolution, size, frame rate, color saturation, etc., in order to conserve bandwidth.
With continued reference to Figure 12, a social network 1216 is also shown. The social network 1216 includes user data 1218, which includes data such as users' social profiles, posts, images, videos, bibliographic information, etc. Apps 1220 can run on the platform of the social network. A graphical user interface (GUI) 1222 defines an interface for interacting with the social network. An API 1224 facilitates access to the social network. A notification module 1226 handles the notification of social network users according to their preferences. As noted, the user B view 1214 includes live feeds from other users' sessions. In one embodiment, the friends of user B were determined by accessing the API 1224 of the social network 1216 to determine the members of the social profile of user B. These members were cross-referenced against users of the cloud gaming system to provide live game mechanics feeds from friends of user B, including feeds from sessions A and C.
In one embodiment, user A chooses to share from his game mechanics session A to his social profile. The session of user A communicates via the API 1224 to activate the notification module 1226 of the social network 1216 to send an appropriate notification to the friends in the social profile of user A. When a user of the social network 1230 who is in the social profile of user A accesses the social network via a device 1228, he can see a message or post from user A about user A's session. If so configured, a user can receive notifications such as an email indicating that user A has shared something on the social network. It should be appreciated that user A can share various types of video game related activity, such as achievements in a video game, invitations to play a video game, comments about a video game, an invitation to watch or play a game part that user A has created, a video clip of user A's game mechanics, etc.
Figure 13 is a graph illustrating several game state variables over time, in accordance with embodiments of the invention. It should be appreciated that in various embodiments, there may be many different types of game state variables that will be particular to specific video games. Those shown and described with reference to the illustrated embodiment are provided merely by way of example and not by way of limitation. Game state variables can include values which are defined by the running video game as well as values which are defined by user input. In the illustrated embodiment, position variables are shown indicating the X, Y, and Z positions of an object in a virtual space of a video game, such as a symbol or a vehicle. Camera angle variables indicate the direction of a virtual camera or virtual view in the video game. In one embodiment, the camera angle is defined and measured by an azimuth component (e.g., along the horizontal plane) measured relative to a reference azimuth, and a tilt component measured relative to a reference tilt (for example, relative to the vertical). Action variables, such as the illustrated action variables A and B, indicate the initiation and sustaining of various actions within the video game. It should be appreciated that the actions for a given video game will be specific to the context of the video game. As an example, actions could include the initiation of specific maneuvers, the application of skills, the activation of modification mechanisms that modify an existing action such as increasing its level of intensity or frequency, or any other type of action or activity that can be activated by user input during the course of the video game. With continued reference to Figure 13, a weapon variable indicates the activation of a video game weapon. A health variable indicates a health level of, for example, a user symbol in the video game. Button variables indicate the state of buttons on a controller device, for example whether a button is in a depressed state or in a released state. Joystick variables in the illustrated embodiment indicate a magnitude of movement of a joystick relative to a neutral position. The game state variables described above with reference to the illustrated embodiment are merely exemplary, and it will be recognized by those skilled in the art that many other types of game state variables can be tracked over time.
Referring again to the embodiment of Figure 6, in one embodiment the game state analyzer 614 may be configured to analyze the game state variables of the user's recorded game mechanics. Based on the analysis of the user's recorded game mechanics, several regions of interest of the user's game mechanics can be defined and presented to the user as possible selections from which to generate a game part. For example, a region of game mechanics characterized by high levels of activity for certain game state variables may define a suggested selection of the user's recorded game mechanics. It should be appreciated that the activity level for a given game state variable may be based on several factors such as a level of intensity, an activation frequency, a sustain duration, etc. In some embodiments, the analysis of the game state variables may involve searching for regions of game mechanics where the activity levels of two or more different game state variables are correlated in a predefined manner, for example where the two or more variables have high activity levels simultaneously. A high activity level can be determined based on a predefined threshold.
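The correlated-activity search described above can be illustrated by flagging times at which two or more game state variables simultaneously exceed their activity thresholds. The variable names, sample values, and thresholds below are hypothetical and chosen only to make the idea concrete.

```python
def correlated_regions(series, thresholds, min_vars=2):
    """Return time indices where at least `min_vars` game state variables
    are simultaneously above their activity thresholds.

    `series` maps a variable name -> list of activity values sampled over time.
    """
    length = min(len(v) for v in series.values())
    hits = []
    for t in range(length):
        active = sum(1 for name, values in series.items() if values[t] >= thresholds[name])
        if active >= min_vars:
            hits.append(t)
    return hits

series = {
    "weapon_fire": [0, 0, 3, 5, 4, 0, 0],
    "button_presses": [1, 1, 6, 7, 2, 1, 1],
    "joystick_motion": [0.1, 0.2, 0.9, 0.8, 0.5, 0.2, 0.1],
}
thresholds = {"weapon_fire": 2, "button_presses": 5, "joystick_motion": 0.7}
print(correlated_regions(series, thresholds))   # -> [2, 3]
```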
In various embodiments, a region of interest of the user's recorded game mechanics may be determined automatically based on threshold detection of any one or more of the following: one or more user inputs, user input speed, frequency of input, repetitions of input types, occurrences of input patterns, combinations of inputs (for example, combined keys), motion vectors, pressure exerted on a controller, or excitement of the user (for example, detected based on captured image or audio data of the user).
Figure 14A illustrates a method for generating a game part for a linear type video game, in accordance with one embodiment of the invention. Broadly speaking, a linear video game is one for which the player's progress through the video game follows a linear course, where in order to progress through the video game the player must complete an earlier objective before progressing toward a later objective. Thus, all players must complete the same objectives in the same order to progress through the video game. The objectives in a linear video game can be linked to both geographical locations and temporal points within the context of a space-time storyline of the video game. In method operation 1400, a scene of a video game is identified. The scene can be spatial and temporal in nature, and may have several objectives defined therein. In method operation 1402, a spatial or temporal length of the scene is defined. The length of the scene can be defined according to user input, and can also be defined based on objectives which are linked to the scene. In method operation 1404, the initial properties of objects, symbols, or any other items within the scene to which properties can be assigned are defined. In method operation 1406, a game part is generated for the identified length of the scene having the initial properties as defined above.
Figure 14B illustrates a method for generating a game part for an open-world type video game, in accordance with one embodiment of the invention. An open-world video game can be characterized as one in which the user is free to pursue any number of objectives of his own choosing. Video games of the open-world type typically also allow the user to navigate one or more virtual spaces at will. To progress to different levels of the video game or to complete the video game, a certain set of objectives may need to be completed; however, users may be free to complete these objectives in different orders. In method operation 1410, a location is identified within a virtual world of the video game. In method operation 1412, a limited portion of the virtual world is defined. The limited portion of the virtual world can be defined based on the user input defining a selection of the recorded game mechanics as described herein. For example, a user's game mechanics in an open-world video game may range over a wide variety of locations during the course of the game mechanics. However, for the purpose of generating a game part, the user may select a portion of the game mechanics that occurs within a limited geographical region of the virtual world of the video game. This limited geographical region can be determined based on analysis of the user's selected portion of his recorded game mechanics, for example by tracking a location of a symbol controlled by the user within the virtual world and defining limits which include all the locations at which the symbol was found. In method operation 1414, the initial properties of various objects, symbols, vehicles, or any other items found within the previously defined limited geographical region of the user's recorded game mechanics selection are defined. In method operation 1416, the game part is generated based on the limited open-world geographical region and the properties defined above.

Figure 14C illustrates a method for generating a game part for a sports video game, in accordance with one embodiment of the invention. The game part can be generated based on a user-defined selection of the user's recorded game mechanics. In method operation 1420, a setting is identified based on the selection of the user's recorded game mechanics. By way of example, the setting may define a location of a sports event, such as a court, stadium, track, or any other venue configuration in which a game mechanics event may occur. In method operation 1422, a period of time is defined based on the selection of the user's recorded game mechanics. The time period defines a temporal portion for which the game part will be generated, and can be defined based on the user's selection of the recorded game mechanics. The time period of the sports video game may determine certain aspects of the game part, such as the inclusion of special rules or activities that occur at certain periods of time during a sport. In method operation 1424, the players of the game part are determined based on the players included in the selection of the user's recorded game mechanics. Players can include one or more symbols which have been defined by the user, as well as artificial intelligence (AI) symbols, which were controlled by the AI symbol control logic of the running video game at the time of the game mechanics.
As described elsewhere herein, the actions of AI symbols can be approximated in some instances, while in some embodiments the control logic of the AI symbol is defined for the game part with parameters as they were defined in the user's selection of recorded game mechanics. In method operation 1426, scene attributes are defined for the venue setting of the sports video game part. For example, these may include attributes such as weather, track or stadium conditions, and other attributes of the sports setting. In method operation 1428, the game part for the video game is generated based on the parameters mentioned above.
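Pulling the operations of Figure 14C together, a sports game part might be assembled from the user's selection roughly as sketched below; the selection object and its methods are hypothetical placeholders introduced only for this example, not part of any actual implementation:

    def generate_sports_game_part(selection):
        # selection: a user-selected slice of recorded game mechanics (hypothetical object).
        setting = selection.venue()                      # operation 1420: court, stadium, track, ...
        time_period = (selection.start, selection.end)   # operation 1422: temporal portion to recreate
        players = selection.participants()               # operation 1424: user symbols plus AI symbols
        scene_attributes = {                             # operation 1426: weather, surface conditions, ...
            "weather": selection.weather(),
            "surface": selection.surface_conditions(),
        }
        # Operation 1428: generate the game part from the parameters gathered above.
        return {"setting": setting, "time_period": time_period,
                "players": players, "scene_attributes": scene_attributes}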
Figure 15 conceptually illustrates the formation of a multi-part mini-game, in accordance with one embodiment of the invention. A video game can be organized into several levels or stages. In the illustrated embodiment, a user's game mechanics across multiple levels of a video game is shown. Specifically, a game mechanics timeline 1500 represents the user's game mechanics for a first level of the video game, while a game mechanics timeline 1502 and a game mechanics timeline 1506 represent the user's game mechanics for the second and third levels of the video game, respectively. In accordance with one embodiment of the invention, a user can concatenate multiple game parts of the video game to form a larger multi-part mini-game. In the illustrated embodiment, a selection 1502 of the user's game mechanics timeline 1500 is used to define a first game part 1514. A selection 1504 of the user's game mechanics timeline 1502 is used to define a second game part 1518. And a selection 1508 of the user's game mechanics timeline 1506 is used to define a third game part 1522. The first, second, and third game parts are arranged in sequential order to define the mini-game 1510. In this way, the mini-game 1510 includes game parts from each of the first, second, and third levels of the video game, and a user who plays mini-game 1510 is able to experience limited portions of multiple levels of the video game by playing through the mini-game 1510. This can be useful in providing a more compelling preview or demonstration of a video game than conventional game demos in which a user can play only a portion of a single level or stage of the video game. The experience is analogous to that of a movie preview, which typically provides clips of different portions of the same movie, rather than just a single clip. In accordance with embodiments of the invention, users can enjoy multi-portion game demonstrations which provide a better sense of the scope of the complete video game, and which can present the game mechanics in a continuous manner from one portion to the next.
In one embodiment, a user may insert additional material, such as a user-defined video, message, images, or any other type of content, before or after a game part. In the illustrated embodiment, an introduction 1512 is provided to introduce a player to mini-game 1510, and perhaps also to introduce the player to the first game part 1514. Additionally, a message 1516 is inserted between the first game part 1514 and the second game part 1518, while another message 1520 is inserted between the second game part 1518 and the third game part 1522. In one embodiment, the inserted material may include recorded game mechanics video from the original user's game mechanics from which the game part was generated. If the recorded game mechanics video is displayed before the game part is played, the player starting the game part can better understand the game part and its objectives before beginning play; if it is shown after the player's own game mechanics of the game part, the player can understand how his game mechanics for the game part compare with those of the original user.
Figure 16 conceptually illustrates the generation of a multi-part mini-game, in accordance with one embodiment of the invention. In the illustrated embodiment, a recorded game mechanics timeline of user 1600 is shown for a game A, along with a recorded game mechanics timeline of user 1604 for a game B, and a recorded game mechanics timeline of user 1608 for a game C. In one embodiment, the user is able to generate a multi-part mini-game based on game parts from different video games. In the illustrated embodiment, a selection 1602 of the recorded game mechanics of user 1600 is used to define a game part 1616, while a selection 1606 of the recorded game mechanics of user 1604 is used to define a game part 1620, and a selection 1610 of the recorded game mechanics of user 1608 is used to define a game part 1624. The game parts 1616, 1620, and 1624 are arranged sequentially to define the mini-game 1612. Optionally, additional material may be inserted before or after a game part. In the illustrated embodiment, an insert 1614 is placed before the game part 1616, while an interlude 1618 is defined between the game part 1616 and the game part 1620, and another interlude 1622 is defined between the game part 1620 and the game part 1624.
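A multi-part mini-game such as mini-game 1612 can be thought of as an ordered sequence that interleaves game parts with optional inserted material (introductions, messages, interludes). The following is a minimal sketch of that arrangement; the function and the string identifiers are illustrative only:

    def build_mini_game(parts, intro=None, interludes=None):
        # Arrange game parts sequentially, inserting optional material before the
        # first part and between consecutive parts.
        interludes = interludes or {}
        sequence = []
        if intro is not None:
            sequence.append(("intro", intro))
        for i, part in enumerate(parts):
            sequence.append(("game_part", part))
            if i in interludes:                    # e.g. a message or a recorded video
                sequence.append(("interlude", interludes[i]))
        return sequence

    # Example mirroring Figure 16: three parts taken from games A, B, and C.
    mini_game_1612 = build_mini_game(
        parts=["part_1616", "part_1620", "part_1624"],
        intro="insert_1614",
        interludes={0: "interlude_1618", 1: "interlude_1622"},
    )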
In the illustrated embodiment, it will be appreciated that the game parts are taken from different video games. This allows the user great flexibility to mix together game parts from across different game titles, genres, and even generations of console platforms. Merely by way of example, a user can create a mini-game that has game parts from each of several game titles in a single video game series. In this way, a player of the mini-game is able to experience and appreciate the evolution of the video game series in a continuous game mechanics experience.
The embodiments of the invention have generally been described with reference to cloud-based gaming systems. However, it should be appreciated by those skilled in the art that similar concepts and principles as described herein may be applied to traditional console-based video game systems, possibly in combination with cloud-based gaming systems. For example, a user can play a console-based video game and have the input data of the user's game mechanics and the game state metadata recorded during game mechanics. Based on the input data of the user's game mechanics and the game state metadata, the actual game mechanics output of the video game may be regenerated at a later time. Therefore, the recorded input data and game state metadata can be used in combination with the video game code to provide an interface for selecting a portion of the user's game mechanics from which to generate a mini-game, as described. The mini-game code can be generated on the console, uploaded to a cloud system, and made available for download by other users. In another embodiment, the mini-game code is generated by the cloud system after receiving the selected portion of the user's game mechanics input data and game state metadata. The cloud system processes the selected portion of the user's game mechanics input data and game state metadata to generate the mini-game code based on the video game code stored in the cloud system. Once generated, the mini-game can be made available for cloud-based gaming, where execution of the mini-game occurs in the cloud-based system, but it can also be made available for download to traditional console-based systems for execution on the console, to facilitate console-based game mechanics of the mini-game. In this way, mini-games can be created and played both by users of console-based video game systems and by users of cloud-based video game systems.
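Under the console-plus-cloud arrangement described above, the console might simply upload the selected portion of input data and game state metadata and let the cloud system assemble the mini-game code against its stored copy of the video game. The following rough sketch only illustrates the idea; the endpoint URL and payload fields are invented for the example and do not describe any real service:

    import json
    import urllib.request

    def upload_selection_for_mini_game(input_data, game_state_metadata, title_id,
                                       cloud_url="https://cloud.example.com/mini-games"):
        # Send the selected portion of recorded game mechanics to the cloud system,
        # which generates the mini-game code from its stored video game code.
        payload = json.dumps({
            "title_id": title_id,
            "input_data": input_data,                     # recorded controller inputs
            "game_state_metadata": game_state_metadata,   # recorded game state
        }).encode("utf-8")
        request = urllib.request.Request(cloud_url, data=payload,
                                         headers={"Content-Type": "application/json"})
        with urllib.request.urlopen(request) as response:
            return json.load(response)                    # e.g. an identifier for the generated mini-game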
The embodiments of the invention have generally been described with reference to mini-games or user-defined playable video game portions. However, it will be appreciated by those skilled in the art that many of the principles illustrated herein also apply readily to the generation and sharing of recorded game mechanics, including sharing video of recorded game mechanics, screen captures, and live streaming of active game mechanics. In some embodiments, providing access to a mini-game (for example, in response to receiving a notification) may include a recorded video presentation of the original user's game mechanics which formed the basis for the mini-game. In still other embodiments, methods, systems, and interfaces are contemplated to facilitate sharing of the user's game mechanics to the user's social profile.
In one embodiment, a method for storing game mechanics is contemplated. Game mechanics can be executed by the operating system of a game console in response to a user request, which can come in the form of a standard file operation with respect to a set of data associated with the desired game mechanics. The request can be transmitted via an application associated with a game. The game mechanics may comprise, for example, video content, audio content, and/or static visual content, including wallpaper, themes, "add-on" code content, or any other type of content associated with a game. It is contemplated that such content may be user-generated or developer-provided, free or paid, complete or trial, and/or for sale or for rent.
A portion of the game mechanics can be buffered, that is, stored temporarily. For example, the previous 15 seconds, the previously completed level, or the previous action within the game mechanics may be stored temporarily, as described further herein. The term "portion" as used herein may correspond to any part of the game mechanics that is divisible into any related or arbitrary groups of individual or multiple bits or bytes of data. For example, "portions" of game mechanics may correspond to levels, chapters, scenes, acts, symbols, backgrounds, textures, courses, actions, songs, themes, durations, sizes, files, parts thereof, and combinations of the same. In addition, portions of the game mechanics may comprise screenshots or set durations of video capture.
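Buffering "the previous 15 seconds" of game mechanics is naturally implemented as a ring buffer that continually discards the oldest frames. A minimal sketch, assuming a fixed frame rate and opaque frame objects:

    from collections import deque

    FRAME_RATE = 60          # assumed frames per second
    BUFFER_SECONDS = 15      # amount of recent game mechanics kept

    class GameplayBuffer:
        # Keeps only the most recent BUFFER_SECONDS of game mechanics frames.
        def __init__(self):
            self.frames = deque(maxlen=FRAME_RATE * BUFFER_SECONDS)

        def push(self, frame):
            self.frames.append(frame)    # the oldest frame is dropped automatically

        def snapshot(self):
            return list(self.frames)     # the portion currently available for sharing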
In one embodiment, portions of the game mechanics can be stored locally on the game console in any temporary or permanent storage. Alternatively or additionally, portions of the game mechanics can be transmitted over a network and stored remotely. For example, portions of the game mechanics may be transmitted over a wired or wireless network to another computing device, to another game console, or to a remote server. Such remote servers can include social media servers.

Optionally, portions of the game mechanics not retrieved from the temporary memory, or portions of the game mechanics outside of a particular game interval (e.g., a particular duration, level, chapter, course, etc.), may be removed from the temporary memory. This removal process can be completed using standard file operations in the operating system.

Portions of the game mechanics can be displayed on any number of display devices that have access to the stored game mechanics. For example, the stored game mechanics can be displayed on a television connected to the game console from which the game mechanics were captured. In another example, the stored game mechanics can be displayed on a computer to which the stored game mechanics were transmitted. The stored game mechanics can be displayed alone or in combination with other information, such as on a social media website.

In one embodiment, portions of the game mechanics are displayed by another game console, associated with a user other than the user who buffered or captured the game mechanics.
In accordance with this embodiment, portions of the game mechanics may show a ball being thrown from a first user to a second user, from the point of view of the first user. The portions of the game mechanics can then be transmitted to the game console of the second user. Thus, the second user can observe the game mechanics from the point of view of the first user. The second user may also have stored portions of the game mechanics showing the ball being thrown by the first user and caught by the second user, from the point of view of the second user. In this embodiment the second user can review the game mechanics from both the point of view of the first user and the point of view of the second user. Still further, the portions of the game mechanics stored by the second user can be transmitted to the game console of the first user, such that the first user can review the game mechanics from two points of view. This embodiment can apply to any number of users having any number of views, so the game mechanics can be reviewed from any number of different perspectives.

With respect to storing, transmitting, and/or displaying portions of the game mechanics as described herein, it is contemplated that the game mechanics portions may be stored, transmitted, and displayed as video or image data. In another embodiment, however, portions of the game mechanics can be stored and transmitted as telemetry or metadata representative of the image or video data, and can be re-created as video images by a game console or other device before display.

In some embodiments, the portion of the game mechanics has a predetermined relationship to the game mechanics being executed. For example, the portion of the game mechanics may correspond to a certain amount of game mechanics before the game mechanics currently being executed, such as the previous 10 seconds of game mechanics. In another embodiment, a first portion of the game mechanics has a predetermined relationship to a second portion of the game mechanics. For example, the first portion of the game mechanics may correspond to a certain amount of game mechanics before receipt of a request to capture a second portion of the game mechanics, such as the 10 seconds of game mechanics before the selection of a capture button. In each of these embodiments, the amount of game mechanics buffered before the current game mechanics or the requested game mechanics can be configured and adjusted by the user in accordance with his particular preferences.
In other embodiments, the temporary buffer is "intelligent" or "elastic", such that it captures the game mechanics according to variables other than a fixed time estimate. In one embodiment, the first portion of the game mechanics has a predetermined relationship to an event related to the game mechanics. For example, the first portion of the game mechanics can be buffered to include a statistical anomaly, such as a high score being reached, the scoring of a large number of points in a short period of time, multiple button selections on a controller, and other rare events. Such statistical anomalies can be determined by comparing metrics from the game mechanics to average metrics for a particular game or scene, or for all games generally. Such average metrics can be stored locally or remotely for comparison. For example, the game console can track global high scores for a particular game, and buffer the game mechanics in which a user approaches and surpasses that high score. In another example, a remote server can track global high scores for a particular game and can communicate that information to the game console, which buffers the game mechanics in which the user approaches and surpasses that high score.
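One way to make the buffer "intelligent" along these lines is to compare a live metric against a stored average (kept locally or fetched from a remote server) and keep the buffered footage when the metric deviates beyond a threshold. A hedged sketch, with invented metric names:

    def is_statistical_anomaly(points_in_window, window_seconds,
                               average_points_per_second, factor=3.0):
        # Flags game mechanics in which the scoring rate far exceeds the typical
        # rate, e.g. a large number of points in a short period of time.
        rate = points_in_window / window_seconds
        return rate > factor * average_points_per_second

    def maybe_capture(buffer, points_in_window, window_seconds, average_rate):
        # buffer is assumed to expose a snapshot() of recently buffered frames.
        if is_statistical_anomaly(points_in_window, window_seconds, average_rate):
            return buffer.snapshot()     # keep this portion of game mechanics
        return None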
In another example, the game mechanics portion can be buffered to include an achievement, such as a trophy being earned or another important milestone being reached. Such trophies or milestones commemorate any goal or game achievement, such as a certain number of points achieved, a certain level reached, and the like. For example, game mechanics can be stored to include the awarding of a trophy for reaching level 10, for reaching 100,000 points, etc.

Similarly, progress toward an event, in addition to the actual award of the trophy or the statistical anomaly itself, can be stored for inclusion in the portion of the game mechanics. For example, a screen capture can be taken at each of levels one through 10, creating a photo album to commemorate the receipt of a trophy for reaching level 10. In another example, video can be captured of the user winning a race for the first through fifth times, where a trophy is awarded for five wins.

Thus, in accordance with embodiments of the invention, at least a portion of the executed game mechanics can always be kept in a running buffer. In other words, when a request to share a portion of the game mechanics is received, a portion of the previous game mechanics may already have been captured, so as to include previous footage. For example, if a request to share the game mechanics is received after a user crosses the finish line in a racing game, the buffered game mechanics may include footage of the user crossing the finish line. In other words, a user is able to capture the moments that occur before the request to share the game mechanics is made.
Figure 17 illustrates an interface for sharing game mechanics, in accordance with one embodiment of the invention. The interface 1700 as shown includes several selection icons to facilitate sharing with a user's friends, for example members of a user's social graph. In one embodiment, the interface 1700 can be accessed via a dedicated button on a controller device. When the button is pressed during game mechanics, the interface 1700 can be presented to allow the user to share their game mechanics.
The icon 1702 can be selected to begin uploading a screenshot of the user's game mechanics. In one embodiment, region 1704 of icon 1702 is populated with a screen capture representative of the user's recent game mechanics, thus providing a small-scale preview of the screen capture that can be shared. In one embodiment, selection of icon 1702 can provide access to an additional screen capture selection interface which allows the user to select a particular screen capture from the user's recorded game mechanics to share to the user's social profile. This can take the form of a navigable game mechanics video timeline, which can be traversed or navigated to identify a particular time point within the game mechanics and its corresponding screen capture.

The icon 1706 can be selected to begin uploading a video of the user's game mechanics. Region 1708 of icon 1706 can be configured to display a video clip representative of the user's recent game mechanics, for example the last 5 seconds of the user's game mechanics, a recent achievement, etc., thus displaying a small-scale preview of a video clip of the user's game mechanics which can be shared with others.

The icon 1710 can be selected to initiate live video streaming of the user's active game mechanics. In one embodiment, selection of the icon 1710 will activate resumption of the user's game mechanics while initiating the live video streaming of the user's game mechanics. In another embodiment, selection of the icon 1710 provides access to an interface for determining settings for broadcasting the video stream, such as to whom the video stream is shared, whether to include a video feed from a local image capture device intended to show the user himself during game mechanics, whether to allow comments, etc.
It will be appreciated that the user can share the game mechanics (e.g., a selected screen capture, video, or live gameplay stream) with one or more specifically selected friends, with their full social profile, or with any user of the social network. The social network can be a social gaming network associated with the platform on which the video game runs, or a third-party social network that exists separately from the video game or its platform. The social network can be accessed through a defined API to allow interaction with the social network. Users with whom the game mechanics have been shared can receive a notification informing them of the shared game mechanics. Such a notification may take the form of a post to a social news feed, a private message through the social network, an in-game notification, an email, a chat notification, etc. Sharing game mechanics to the social network may involve making the game mechanics available to other subsets of social network users who may or may not be part of the sharing user's social profile. For example, for a given video game, game mechanics can be shared or made available to any user of the social network who also owns the video game and is therefore granted access to the shared game mechanics of the video game. Such shared game mechanics can be accessed through online forums, virtual chat rooms, or other online channels that are available only to players of the video game. In one embodiment, a video game can have a dedicated page or site on the social network. The shared game mechanics can be made available to users who access the video game's page or site. Of course, it will be appreciated that, from the perspective of the sharing user, options can be provided to allow the user to specify and customize to whom and to what forums their game mechanics will be shared.
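The sharing-and-notification flow just described might look roughly like the sketch below; the social_client object, its post/members_of/notify calls, and the audience values are assumptions made purely for illustration:

    def share_game_mechanics(social_client, user_id, media, audience="profile",
                             selected_friends=None):
        # Publish a screen capture, video clip, or live-stream link to the user's
        # social profile, then notify the chosen audience.
        post_id = social_client.post(user_id=user_id, media=media, audience=audience)
        if audience == "friends" and selected_friends:
            recipients = selected_friends
        else:
            recipients = social_client.members_of(user_id)
        for member in recipients:
            # The notification could equally be an in-game notification, a chat
            # message, a news-feed post, or an email.
            social_client.notify(member, kind="shared_game_mechanics", post=post_id)
        return post_id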
Figure 18 illustrates an interface 1800 for selecting a portion of recorded game mechanics video for sharing, in accordance with one embodiment of the invention. Interface 1800 includes a preview region 1802 which repeatedly plays a preview of a currently selected portion of video from the user's game mechanics. Screen captures 1804, 1806, 1808, 1810, and 1812 are placed adjacent to each other in chronological order to define a timeline of screen captures indicating the video content of the game mechanics. The various screen captures 1804, 1806, 1808, 1810, and 1812 can be image frames extracted at regular intervals from the game mechanics video. The screen capture timeline can be moved to the right or left to show additional frames preceding or following those currently displayed. Markers 1814 and 1816 indicate the start and end points for a currently selected video clip (or segment or portion). The currently selected video clip can be repeatedly played in preview region 1802, as noted.
In one embodiment, the buttons 1818 and 1820 may be selected to reduce or increase, respectively, the duration of the selected video portion. A preview option 1822 can be selected to activate playback of a full-screen preview of the currently selected video clip. A trimming option 1824 can be selected to access additional video trimming features.
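In effect, markers 1814 and 1816 and buttons 1818/1820 define a trim operation over the recorded video. A minimal sketch, assuming the video is held as a list of frames at a known frame rate:

    def trim_clip(frames, start_seconds, end_seconds, frame_rate=60):
        # Keep only the frames between the start and end markers.
        start_index = int(start_seconds * frame_rate)
        end_index = int(end_seconds * frame_rate)
        return frames[start_index:end_index]

    def adjust_duration(start_seconds, end_seconds, delta_seconds):
        # Buttons 1818/1820: shrink or grow the selected portion by delta_seconds,
        # never letting the end point move before the start point.
        return start_seconds, max(start_seconds, end_seconds + delta_seconds)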
Figure 19 illustrates an interface 1900 for viewing a live video stream of a user's game mechanics, in accordance with one embodiment of the invention. At reference 1902, the name and/or alias of the user who is streaming his game mechanics is shown. At reference 1904, the number of users who are currently watching the live video stream is shown. The live video stream of the user's game mechanics is displayed in the video display region 1906. Additionally, a live user video 1908 may be included, showing live video of the actual user whose game mechanics are being streamed. A command option 1910 can be selected to allow the observing user to issue commands that affect the video game, or even to take over control of the game mechanics of the streaming user. It will be noted that issuing in-game commands or remotely controlling the user's game mechanics may require permission, either predefined or requested at the time of the game mechanics, from the playing user to allow the observing user to perform such actions.

An option 1912 allows the observer to join the game mechanics of the streaming user. And an option 1914 allows the observing user to purchase the video game in progress.

In addition, the interface can include a comments section 1916 which displays the comments of the users watching the live video stream, indicating the time of each comment. A comment entry field 1918 is provided for the observer to enter the text of a comment to be published. And the send button 1920 is pressed to upload the comment for viewing by the user streaming live and by others watching the live video stream.
Although the embodiments of the invention have been described with reference to accessing various sharing interfaces from a dedicated push button, it will be appreciated that in other embodiments, some or all of these interfaces may not be required to facilitate sharing the game mechanics to a user's social profile. For example, in one embodiment, a controller button can be configured to capture a screenshot of the user's game mechanics when pressed. The captured screenshot can then be automatically uploaded and shared to the user's social profile.

In another embodiment, pressing a specific button on the controller initiates video recording of the game mechanics. When the specific button is pressed a second time, the video recording of the game mechanics is stopped, and the video clip can be uploaded and shared to the user's social profile. In one embodiment, the uploading and sharing of the video clip to the user's social profile may occur automatically after completion of the video recording operation. However, in another embodiment, when the specific button is pressed a second time to stop recording, an interface is presented to allow the user to customize various options, such as trimming the video, selecting a representative screen capture for the video, determining the specific users with whom the video is shared, incorporating a header or title, etc. After personalization by the user, the video can be shared with others or otherwise made available for viewing.

In one embodiment, a specific button on the controller can be configured to share a predefined duration of game mechanics video to a social network. For example, a user can specify that when the button is pressed, the previous 10 seconds of game mechanics video will be shared to the user's social profile. In another embodiment, it can be specified that when the button is pressed, the next 10 seconds of game mechanics video will be recorded and shared to the social profile. It will be appreciated that the options for trimming the video and performing other types of customization can be applied to the recorded game mechanics video. In addition, the recorded game mechanics video of a predefined duration after the button is activated can be combined with game mechanics video previously stored in the temporary memory, as described.
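The two button behaviors, sharing the previous N seconds from the buffer versus recording the next N seconds, could be dispatched as sketched below; the buffer and recorder objects and the 60 fps assumption are placeholders:

    def on_share_button(mode, buffer, recorder, duration_seconds=10, frame_rate=60):
        # mode == "previous": share footage already held in the temporary memory.
        # mode == "next": begin recording upcoming footage for later sharing.
        if mode == "previous":
            return buffer.snapshot()[-duration_seconds * frame_rate:]
        if mode == "next":
            recorder.start(duration_seconds)   # clip is shared once recording completes
            return None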
In yet another embodiment, a specific button on the controller device may be configured to initiate live video streaming of the user's active game mechanics. A live video stream can be predefined to be made available only to members of the user's social profile, or to other larger or smaller user groups, such as a specific subset of the user's social profile, all users who own or otherwise have access to the same video game, any user of the game platform, etc.
Automatic generation of suggested mini-games for cloud gaming based on recorded game mechanics

In one embodiment, a method for generating a playable limited version of a video game is provided, including the following method operations: recording a user's game mechanics of a full version of the video game; analyzing the user's recorded game mechanics to determine a region of interest; defining limits within a game mechanics context of the video game based on the determined region of interest; and generating the limited version of the video game based on the defined limits; wherein the method is executed by a processor.
In one embodiment, the recording of the game mechanics of the user includes recording one or more of the user's input data or game state data. In one embodiment, analyzing the recorded game mechanics of the user includes determining activity levels of user input data or game state data, the region of interest being a region having activity levels that exceed a predefined threshold.
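Determining a region of interest from activity levels can be as simple as counting input events per unit time and keeping the spans whose rate exceeds a predefined threshold. An illustrative sketch, with an assumed event format:

    def regions_of_interest(input_events, window=5.0, threshold=8.0):
        # input_events: list of (timestamp_seconds, button) tuples from the recording.
        # Returns (start, end) windows whose input rate exceeds the threshold.
        if not input_events:
            return []
        end_time = input_events[-1][0]
        regions, t = [], 0.0
        while t < end_time:
            count = sum(1 for ts, _ in input_events if t <= ts < t + window)
            if count / window > threshold:
                regions.append((t, t + window))
            t += window
        return regions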
In one embodiment, defining the limits within a game mechanics context of the video game includes defining a spatial boundary within a virtual space of the video game. In one embodiment, the spatial boundary within the virtual space of the video game defines a portion of the virtual space smaller than an entirety of the virtual space, the portion of the virtual space having a subset of the features of the virtual space.

In another embodiment, defining the limits within a game mechanics context of the video game includes defining a temporal limit within a temporal context of the video game.

In one embodiment, defining the limits includes identifying a nearest starting point or a nearest ending point of one or more of a stage, level, or scene.

In one embodiment, analyzing the user's recorded game mechanics includes determining game state configurations of the user based on the user's recorded game mechanics; and generating the limited version of the video game includes defining the limited version of the video game to have initial game state configurations based on the determined game state configurations of the user.

In one embodiment, the region of interest is automatically identified based on correspondence to one or more thresholds. In one embodiment, at least one of the thresholds is associated with one or more of user inputs, a user input speed, a user input frequency, repetitions of a user input, an input pattern, sharing of game mechanics with other users, publication of comments associated with the game mechanics on a social network, or popularity of portions of the video game based on levels of social network sharing.
In another embodiment, a method for generating a playable limited version of a video game is provided, including the following method operations: recording a user's game mechanics of a full version of the video game; analyzing the user's recorded game mechanics to determine one or more regions of interest; presenting each of the regions of interest for selection; receiving a selection input indicating a selected region of interest; for the selected region of interest, defining limits within a game mechanics context of the video game based on the selected region of interest; and generating the limited version of the video game based on the defined limits; wherein the method is executed by a processor.

In one embodiment, the recording of the user's game mechanics includes recording one or more of the user's input data or game state data. In one embodiment, analyzing the user's recorded game mechanics includes determining activity levels of the user input data or game state data, each region of interest being a region having activity levels that exceed a predefined threshold.

In one embodiment, analyzing the user's recorded game mechanics includes determining game state configurations of the user based on the user's recorded game mechanics; and generating the limited version of the video game includes defining the limited version of the video game to have initial game state configurations based on the determined game state configurations of the user.

In one embodiment, the region of interest is automatically identified based on correspondence to one or more thresholds.

In another embodiment, a method is provided for generating a playable limited version of a video game, including the following method operations: recording a user's game mechanics of a full version of the video game, wherein recording the user's game mechanics includes recording one or more of user input data or game state data; analyzing the user's recorded game mechanics to determine one or more regions of interest, wherein each region of interest is automatically identified based on correspondence to one or more thresholds; presenting each of the regions of interest for selection; receiving a selection input indicating a selected region of interest; for the selected region of interest, defining limits within a game mechanics context of the video game based on the selected region of interest; and generating the limited version of the video game based on the defined limits; wherein the method is executed by a processor.
In one embodiment, defining the limits within a game mechanics context of the video game includes defining a spatial boundary within a virtual space of the video game; the spatial boundary within the virtual space of the video game defines a portion of the virtual space smaller than an entirety of the virtual space, the portion of the virtual space having a subset of the features of the virtual space.
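One illustrative way to realize such a spatial boundary, assuming the recorded positions of the user-controlled symbol are available as (x, y) samples, is to take the bounding box of all visited locations:

    def spatial_boundary(positions, margin=10.0):
        # positions: (x, y) locations visited by the user-controlled symbol.
        # Returns a rectangular boundary enclosing a portion of the virtual space
        # smaller than its entirety.
        xs = [x for x, _ in positions]
        ys = [y for _, y in positions]
        return {"min_x": min(xs) - margin, "max_x": max(xs) + margin,
                "min_y": min(ys) - margin, "max_y": max(ys) + margin}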
In another embodiment, defining the limits within a game mechanics context of the video game includes defining a temporal limit within a temporal context of the video game.

In one embodiment, defining the limits includes identifying a nearest starting point or a nearest ending point of one or more of a stage, level, or scene.

In one embodiment, at least one of the thresholds is associated with one or more of user inputs, a user input speed, a user input frequency, repetitions of a user input, an input pattern, sharing of game mechanics with other users, publication of comments associated with the game mechanics on a social network, or popularity of portions of the video game based on levels of social network sharing.
Automatic generation of a multi-part mini-game for cloud gaming based on recorded game mechanics

In one embodiment, a method is provided for generating a playable limited version of a video game, including the following method operations: recording a user's game mechanics of a full version of the video game; determining a plurality of user-defined portions of the user's recorded game mechanics; for each user-defined portion, defining limits within a game mechanics context of the video game based on the user-defined portion, and generating a playable portion of the video game based on the defined limits; and arranging each of the playable portions of the video game in a series to define the limited version of the video game; wherein the method is executed by a processor.
In one embodiment, determining each user-defined portion of the user's recorded game mechanics includes receiving a user-defined starting point and a user-defined ending point within the user's recorded game mechanics, and determining the user-defined portion based on the received user-defined starting point and user-defined ending point.

In one embodiment, the recording of the user's game mechanics includes recording one or more of the user's input data or game state data. In one embodiment, generating the playable portion of the video game includes analyzing the game state data to identify code elements, and assembling the code elements to define executable code defining the playable portion of the video game.

In one embodiment, defining the limits within a game mechanics context of the video game includes defining a spatial boundary within a virtual space of the video game. In one embodiment, the spatial boundary within the virtual space of the video game defines a portion of the virtual space smaller than an entirety of the virtual space, the portion of the virtual space having a subset of the features of the virtual space.

In another embodiment, defining the limits within a game mechanics context of the video game includes defining a temporal limit within a temporal context of the video game.

In one embodiment, the method also includes, for each user-defined portion, analyzing the user's recorded game mechanics to determine game state configurations of the user; and wherein generating the playable portion of the video game includes defining the playable portion of the video game to have initial game state configurations based on the determined game state configurations of the user.

In one embodiment, defining the limits includes identifying a nearest starting point or a nearest ending point of one or more of a stage, level, or scene.

In one embodiment, the method also includes recording a user-defined video; wherein arranging the playable portions of the video game includes arranging the user-defined video at a location within the series preceding or following one of the playable portions of the video game. In one embodiment, the user-defined video includes video of at least one of the user-defined portions of the user's recorded game mechanics.
In another embodiment, a tangible computer-readable medium having program instructions incorporated therein is provided for generating a playable limited version of a video game, including the following: program instructions for recording a user's game mechanics of a full version of the video game; program instructions for determining a plurality of user-defined portions of the user's recorded game mechanics; program instructions for, for each user-defined portion, defining limits within a game mechanics context of the video game based on the user-defined portion, and generating a playable portion of the video game based on the defined limits; and program instructions for arranging each of the playable portions of the video game in a series to define the limited version of the video game.

In one embodiment, determining each user-defined portion of the user's recorded game mechanics includes receiving a user-defined starting point and a user-defined ending point within the user's recorded game mechanics, and determining the user-defined portion based on the received user-defined starting point and user-defined ending point.

In one embodiment, the recording of the user's game mechanics includes recording one or more of the user's input data or game state data. In one embodiment, generating the playable portion of the video game includes analyzing the game state data to identify code elements, and assembling the code elements to define executable code defining the playable portion of the video game.

In one embodiment, the tangible computer-readable medium further includes program instructions for, for each user-defined portion, analyzing the user's recorded game mechanics to determine game state configurations of the user; and wherein generating the playable portion of the video game includes defining the playable portion of the video game to have initial game state configurations based on the determined game state configurations of the user.

In another embodiment, a system is provided, which includes the following: at least one server computing device, the at least one server computing device having logic for generating a playable limited version of a video game, including logic for recording a user's game mechanics of a full version of the video game; logic for determining a plurality of user-defined portions of the user's recorded game mechanics; logic for, for each user-defined portion, defining limits within a game mechanics context of the video game based on the user-defined portion, and generating a playable portion of the video game based on the defined limits; and logic for arranging each of the playable portions of the video game in a series to define the limited version of the video game.

In one embodiment, determining each user-defined portion of the user's recorded game mechanics includes receiving a user-defined starting point and a user-defined ending point within the user's recorded game mechanics, and determining the user-defined portion based on the received user-defined starting point and user-defined ending point.

In one embodiment, the recording of the user's game mechanics includes recording one or more of the user's input data or game state data. In one embodiment, generating the playable portion of the video game includes analyzing the game state data to identify code elements, and assembling the code elements to define executable code defining the playable portion of the video game.

In one embodiment, the logic further includes logic for, for each user-defined portion, analyzing the user's recorded game mechanics to determine game state configurations of the user; and wherein generating the playable portion of the video game includes defining the playable portion of the video game to have initial game state configurations based on the determined game state configurations of the user.
Sharing recorded game mechanics to a social profile

In one embodiment, a method is provided for sharing recorded game mechanics to a social profile, including the following method operations: recording video of a user's game mechanics during an active state of a game mechanics session; receiving a command to initiate a sharing operation during the active state of the game mechanics session; in response to receiving the command, entering a paused state of the game mechanics session and presenting a sharing interface; processing input received via the sharing interface to determine a user-defined selection of the recorded video; sharing the user-defined selection of the recorded video to a social profile of the user; and resuming the active state of the game mechanics session; wherein the method is executed by a processor.
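The pause/share/resume flow of this method can be summarized as a small sequence of states; the session, interface, and social_client objects in the sketch below are hypothetical placeholders:

    def handle_share_command(session, sharing_interface, social_client, user_id):
        # Pause the active game mechanics session, let the user pick a clip from the
        # buffered video, share it to the social profile, then resume play.
        session.pause()                                            # leave the active state
        selection = sharing_interface.pick_clip(session.buffered_video())
        if selection is not None:
            social_client.post(user_id=user_id, media=selection)
        session.resume()                                           # return to the active state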
In one embodiment, recording video of the user's game mechanics includes storing the video in a temporary memory during the active state of the game mechanics session.

In one embodiment, presenting the sharing interface includes retrieving and presenting one or more portions of the video from the temporary memory.

In one embodiment, the sharing interface includes a duration selector for defining a duration of the user-defined selection.

In one embodiment, sharing the selection includes uploading the selection for availability on a social network service that defines the user's social profile.

In one embodiment, uploading the selection includes compressing the selection and uploading the compressed selection.

In one embodiment, sharing the selection includes generating a notification to a member of the user's social profile.

In one embodiment, the notification is defined by one or more of a private message, a chat message, an in-game notification, a post to a social news feed, or an email.

In one embodiment, sharing the selection includes presenting the selection on a profile page of the user.

In one embodiment, receiving the command to initiate the sharing operation is defined by a button press on a controller device.
In another embodiment, a method is provided for sharing recorded game mechanics to a social profile, including the following method operations: recording video of a user's game mechanics during an active state of a game mechanics session; receiving a command to initiate a sharing operation during the active state of the game mechanics session; in response to receiving the command, determining a user-defined selection of the recorded video; and sharing the user-defined selection of the recorded video to a social profile of the user; wherein the method is executed by a processor.

In one embodiment, recording video of the user's game mechanics includes storing the video in a temporary memory during the active state of the game mechanics session.

In one embodiment, determining the user-defined selection of the recorded video includes retrieving and presenting the video from the temporary memory, and processing user input identifying a portion of the video.

In one embodiment, sharing the selection includes uploading the selection for availability on a social network service that defines the user's social profile.

In one embodiment, receiving the command to initiate the sharing operation is defined by a button press on a controller device.

In another embodiment, a non-transitory computer-readable medium is provided having program instructions defined thereon for sharing recorded game mechanics to a social profile. The program instructions include: program instructions for recording video of a user's game mechanics during an active state of a game mechanics session; program instructions for receiving a command to initiate a sharing operation during the active state of the game mechanics session; program instructions for, in response to receiving the command, determining a user-defined selection of the recorded video; and program instructions for sharing the user-defined selection of the recorded video to a social profile of the user.

In one embodiment, recording video of the user's game mechanics includes storing the video in a temporary memory during the active state of the game mechanics session.

In one embodiment, determining the user-defined selection of the recorded video includes retrieving and presenting the video from the temporary memory, and processing user input identifying a portion of the video.

In one embodiment, sharing the selection includes uploading the selection for availability on a social network service that defines the user's social profile.

In one embodiment, receiving the command to initiate the sharing operation is defined by a button press on a controller device.
Remote control of a first user's game mechanics by a second user

In one embodiment, a method for providing remote control of a user's game mechanics is provided, the method including the following method operations: presenting a live video feed of a first user's game mechanics to a second user; processing a request to transition control of the game mechanics of the first user to the second user; and initiating control of the game mechanics of the first user by the second user; wherein the method is executed by at least one processor.
In one embodiment, initiating control of the game mechanics of the first user by the second user includes deactivating control of the game mechanics of the first user by a first controller device associated with the first user, and activating control of the game mechanics of the first user by a second controller device associated with the second user.
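The control handoff described here amounts to rerouting controller input into the shared game mechanics session, subject to the first user's permission. A hypothetical sketch:

    def transfer_control(session, first_controller, second_controller, first_user_grants):
        # Deactivate control by the first user's controller device and activate
        # control by the second user's controller device, if permission is given.
        if not first_user_grants:
            return False
        session.detach_input(first_controller)    # first user's inputs no longer applied
        session.attach_input(second_controller)   # second user's inputs now define the game mechanics
        return True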
In one embodiment, control of the game mechanics of the first user by the second controller device includes receiving input commands from the second controller device and applying the input commands to define the game mechanics of the first user.

In one embodiment, the live video feed is presented through a social interface to the second user, the social interface providing access to a social profile of the second user, the first user being defined as a member of the social profile of the second user.

In one embodiment, the social interface includes a comment interface for posting comments during the game mechanics of the first user.

In one embodiment, processing the request to transition control includes receiving an acknowledgment from the first user to allow control of the game mechanics of the first user by the second user.

In one embodiment, presenting the live video feed of the game mechanics of the first user to the second user includes presenting the live video feed in a non-full-screen format; and initiating control of the game mechanics of the first user by the second user includes activating presentation of the live video feed in a full-screen format.
In another embodiment, a method for providing multiplayer game mechanics is provided, including the following method operations: presenting a live video feed of a game mechanics session of a first user to a second, remote user; processing a request for the second user to join the game mechanics session of the first user; and initiating game mechanics by the second user in the game mechanics session of the first user; wherein the method is executed by at least one processor.

In one embodiment, initiating game mechanics by the second user in the game mechanics session of the first user includes initiating a multiplayer mode of a video game.

In one embodiment, the live video feed is presented through a social interface to the second user, the social interface providing access to a social profile of the second user, the first user being defined as a member of the social profile of the second user.

In one embodiment, the social interface includes a comment interface for publishing comments during the game mechanics of the first user.

In one embodiment, processing the request to join the game mechanics session of the first user includes receiving an acknowledgment from the first user to allow game mechanics by the second user in the game mechanics session of the first user.

In one embodiment, presenting the live video feed of the game mechanics of the first user to the second user includes presenting the live video feed in a non-full-screen format; and initiating game mechanics by the second user in the game mechanics session of the first user includes activating presentation of the live video feed in a full-screen format.
In another embodiment, a non-transitory computer-readable medium having program instructions integrated thereon is provided for providing remote control of a user's game mechanics, the program instructions including: program instructions for presenting a live video feed of game mechanics of a first user to a second, remote user; program instructions for processing a request to transition control of the game mechanics of the first user to the second user; and program instructions for initiating control of the game mechanics of the first user by the second user.

In one embodiment, initiating control of the game mechanics of the first user by the second user includes deactivating control of the game mechanics of the first user by a first controller device associated with the first user, and activating control of the game mechanics of the first user by a second controller device associated with the second user.

In one embodiment, control of the game mechanics of the first user by the second controller device includes receiving input commands from the second controller device and applying the input commands to define the game mechanics of the first user.

In one embodiment, the live video feed is presented through a social interface to the second user, the social interface providing access to a social profile of the second user, the first user being defined as a member of the social profile of the second user.

In one embodiment, the social interface includes a comment interface for publishing comments during the game mechanics of the first user.

In one embodiment, processing the request to transition control includes receiving an acknowledgment from the first user to allow control of the game mechanics of the first user by the second user.

In one embodiment, presenting the live video feed of the game mechanics of the first user to the second user includes presenting the live video feed in a non-full-screen format; and initiating control of the game mechanics of the first user by the second user includes activating presentation of the live video feed in a full-screen format.
Figure 20 illustrates hardware and user interfaces that can be used to provide interactivity with a video game, in accordance with one embodiment of the present invention. Figure 20 schematically illustrates the overall system architecture of the Sony® Playstation 3® entertainment device, a console that may be compatible for interfacing a control device with a computer program executing on a base computing device in accordance with embodiments of the present invention. A system unit 2000 is provided, with various peripheral devices connectable to the system unit 2000. The system unit 2000 comprises: a Cell processor 2028; a Rambus® dynamic random access memory (XDRAM) unit 2026; a Reality Synthesizer graphics unit 2030 with a dedicated video random access memory (VRAM) unit 2032; and an I/O bridge 2034. The system unit 2000 also comprises a Blu Ray® Disk BD-ROM® optical disk reader 2040 for reading from a disk 2040a, and a removable slot-in hard disk drive (HDD) 2036, accessible through the I/O bridge 2034. Optionally, the system unit 2000 also comprises a memory card reader 2038 for reading compact flash memory cards, Memory Stick® memory cards and the like, which is similarly accessible through the I/O bridge 2034.

The I/O bridge 2034 also connects to six Universal Serial Bus (USB) 2.0 ports 2024, a gigabit Ethernet port 2022, an IEEE 802.11b/g wireless network (Wi-Fi) port 2020, and a Bluetooth® wireless link port 2018 capable of supporting up to seven Bluetooth connections.

In operation, the I/O bridge 2034 handles all wireless, USB and Ethernet data, including data from one or more game controllers 2002-2003. For example, when a user is playing a game, the I/O bridge 2034 receives data from the game controller 2002-2003 via a Bluetooth link and directs it to the Cell processor 2028, which updates the current state of the game accordingly. The wireless, USB and Ethernet ports also provide connectivity for other peripheral devices in addition to game controllers 2002-2003, such as: a remote control 2004; a keyboard 2006; a mouse 2008; a portable entertainment device 2010 such as a Sony Playstation Portable® entertainment device; a video camera such as an EyeToy® video camera 2012; a microphone headset 2014; and a microphone 2015. Such peripheral devices can therefore be connected to the system unit 2000 wirelessly; for example, the portable entertainment device 2010 can communicate via an ad-hoc Wi-Fi connection, while the microphone headset 2014 can communicate via a Bluetooth link.

The provision of these interfaces means that the Playstation 3 device is also potentially compatible with other peripheral devices such as digital video recorders (DVRs), set-top boxes, digital cameras, portable media players, Voice over IP phones, mobile phones, printers and scanners.

In addition, a legacy memory card reader 2016 can be connected to the system unit via a USB port 2024, allowing the reading of memory cards 2048 of the type used by the Playstation® or Playstation 2® devices.
The game controllers 2002-2003 are operable to communicate wirelessly with the system unit 2000 via the Bluetooth link, or to be connected to a USB port, which also provides power by which to charge the battery of the game controllers 2002-2003. Game controllers 2002-2003 can also include memory, a processor, a memory card reader, permanent memory such as flash memory, light emitters such as an illuminated spherical section, LEDs, or infrared lights, a microphone and speakers for ultrasound communications, an acoustic chamber, a digital camera, an internal clock, a recognizable shape such as a spherical section facing the game console, and wireless communications using protocols such as Bluetooth®, WiFi™, etc.

The game controller 2002 is a controller designed to be used with two hands, and the game controller 2003 is a single-hand controller with an attachment. In addition to one or more analog joysticks and conventional control buttons, the game controller is susceptible to three-dimensional location determination. Consequently, gestures and movements by the user of the game controller can be translated as inputs to a game, in addition to or instead of conventional button or joystick commands. Optionally, other wirelessly enabled peripheral devices such as the Playstation™ Portable device can be used as a controller. In the case of the Playstation™ Portable device, additional game or control information (for example, control instructions or number of lives) can be provided on the device's screen. Other alternative or complementary control devices may also be used, such as a dance mat (not shown), a light gun (not shown), a steering wheel and pedals (not shown), or custom controllers, such as a single or several large buttons for a rapid-response quiz game (also not shown).
The remote control 2004 is also operable to communicate wirelessly with the system unit 2000 via a Bluetooth link. He Remote control 704 includes controls suitable for the operation of the Blu Ray ™ Disk BD-ROM 2040 reader and for the navigation of the disc contents.
The Blu RayTM Disk BD-ROM 2040 is operable to read the CD-ROMs compatible with Playstation and PlayStation 2 devices, in addition to conventional rewritable and pre-recorded CDs, and the so-called Super Audio CDs. The 2040 reader is also operable to read DVD-ROMs compatible with Playstation 2 and PlayStation 3 devices, in addition to conventional recordable and pre-recorded DVDs. The 2040 reader is also operable to read BD-ROMs compatible with the PlayStation 3 device, as well as conventional recordable and pre-recorded Blu-Ray Discs.
System 2000 unit is operable to provide audio and video, generated or decoded by the Playstation 3 device via the graphics unit of Reality Synthesizer 2030, through audio and video connectors to a sound output and display device 2042 such as a monitor or television having a 2044 screen and one or more 2046 speakers. The 2050 audio connectors may include conventional digital and analog outputs, while the 2052 video connectors may include various video, S-video, composite video, and one or more High Definition Multimedia Interface (HDMI) outputs. ). As a result, the video output is in formats such as PAL or NTSC or in 720o, 1080i or 1080p high definition.
Audio processing (generation, decoding and so on) is performed by the Cell processor 2028. The operating system of the Playstation 3 device supports Dolby® 5.1 surround sound, Dolby® Theatre Surround (DTS), and the decoding of 7.1 surround sound from Blu-Ray® discs.
In the present embodiment, the video camera 2012 comprises a single charge-coupled device (CCD), an LED indicator, and hardware-based real-time data compression and encoding apparatus such that compressed video data can be transmitted in an appropriate format, such as an intra-image based MPEG (Motion Picture Experts Group) standard, for decoding by the system unit. The camera LED indicator is arranged to illuminate in response to appropriate control data from the system unit 2000, for example to signify adverse lighting conditions. Embodiments of the video camera 2012 can variously connect to the system unit 2000 via a USB, Bluetooth or Wi-Fi communication port. Embodiments of the video camera may include one or more associated microphones and may also be capable of transmitting audio data. In embodiments of the video camera, the CCD may have a resolution suitable for high-definition video capture. In use, images captured by the video camera may, for example, be incorporated within a game or interpreted as game control inputs. In another embodiment, the camera is an infrared camera suitable for detecting infrared light.
In general, in order for successful data communication to occur with a peripheral device such as a video camera or remote control via one of the communication ports of the system unit 2000, an appropriate piece of software such as a device driver should be provided. Device driver technology is well known and will not be described in detail here, except to say that the skilled person will be aware that a device driver or similar software interface may be required in the described embodiment.
Figure 21 illustrates additional hardware that can be used to process instructions, in accordance with one embodiment of the present invention. The Cell processor 2028 has an architecture comprising four basic components: external input and output structures comprising a memory controller 2160 and a dual bus interface controller 2170A, B; a main processor referred to as the Power Processing Element 2150; eight co-processors referred to as Synergistic Processing Elements (SPEs) 2110A-H; and a circular data bus connecting the above components, referred to as the Element Interconnect Bus 2180. The total floating-point performance of the Cell processor is 218 GFLOPS, compared with the 6.2 GFLOPs of the Emotion Engine of the Playstation 2 device.
The Power Processing Element (PPE) 2150 is based on a two-way simultaneous multithreading Power 570 compliant PowerPC core (PPU) 855 running with an internal clock of 3.2 GHz. It comprises a 512 kB level 2 (L2) cache and a 32 kB level 1 (L1) cache. The PPE 2150 is capable of eight single-precision operations per clock cycle, translating to 25.6 GFLOPs at 3.2 GHz. The primary role of the PPE 2150 is to act as a controller for the Synergistic Processing Elements 2110A-H, which handle most of the computational workload. In operation, the PPE 2150 maintains a job queue, scheduling jobs for the Synergistic Processing Elements 2110A-H and monitoring their progress. Consequently, each Synergistic Processing Element 2110A-H runs a kernel whose role is to fetch a job, execute it and synchronize with the PPE 2150.
Each Synergistic Processing Element (SPE) 2110A-H comprises a respective Synergistic Processing Unit (SPU) 2120A-H, and a respective Memory Flow Controller (MFC) 2140A-H comprising in turn a respective Dynamic Memory Access Controller (DMAC) 2142A-H, a respective Memory Management Unit (MMU) 2144A-H and a bus interface (not shown). Each SPU 2120A-H is a RISC processor clocked at 3.2 GHz and comprising 256 kB of local RAM 2130A-H, expandable in principle to 4 GB. Each SPE gives a theoretical 25.6 GFLOPS of single-precision performance. An SPU can operate on 4 single-precision floating-point elements, 4 32-bit numbers, 8 16-bit integers, or 16 8-bit integers in a single clock cycle. In the same clock cycle it can also perform a memory operation. The SPU 2120A-H does not directly access the system memory XDRAM 2026; the 64-bit addresses formed by the SPU 2120A-H are passed to the MFC 2140A-H, which instructs its DMA controller 2142A-H to access memory via the Element Interconnect Bus 2180 and the memory controller 2160.
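As a quick consistency check (arithmetic added here for the reader, not part of the original disclosure), both 25.6 GFLOPs figures quoted above follow from the 3.2 GHz clock: eight floating-point operations per cycle for the PPE, and, assuming each of the four single-precision SPU lanes completes one fused multiply-add (two floating-point operations) per cycle, the same figure per SPE:

\[
8 \times 3.2\ \text{GHz} = 25.6\ \text{GFLOPs (PPE)}, \qquad 4 \times 2 \times 3.2\ \text{GHz} = 25.6\ \text{GFLOPs (SPE)}.
\]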
The Element Interconnect Bus (EIB) 2180 is a logically circular communication bus internal to the Cell processor 2028 which connects the above processor elements, namely the PPE 2150, the memory controller 2160, the dual bus interface 2170A, B and the 8 SPEs 2110A-H, a total of 12 participants. Participants can simultaneously read and write to the bus at a rate of 8 bytes per clock cycle. As noted above, each SPE 2110A-H comprises a DMAC 2142A-H for scheduling longer read or write sequences. The EIB comprises four channels, two each in clockwise and anti-clockwise directions. Consequently, for twelve participants, the longest step-wise data flow between any two participants is six steps in the appropriate direction. The theoretical instantaneous peak EIB bandwidth for 12 slots is therefore 96 B per clock, in the event of full utilization through arbitration among the participants. This equates to a theoretical peak bandwidth of 307.2 GB/s (gigabytes per second) at a clock rate of 3.2 GHz.
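The peak EIB figure quoted above is simply the per-clock capacity multiplied by the clock rate; checking the arithmetic:

\[
12\ \text{participants} \times 8\ \text{B/clock} = 96\ \text{B/clock}, \qquad 96\ \text{B} \times 3.2 \times 10^{9}\ \text{clocks/s} = 307.2\ \text{GB/s}.
\]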
The memory controller 2160 comprises an XDRAM interface 2162, developed by Rambus Incorporated. The memory controller interfaces with the Rambus XDRAM 2026 with a theoretical peak bandwidth of 25.6 GB/s.
The dual bus interface 2170A, B comprises a Rambus FlexIO® system interface 2172A, B. The interface is organized into 12 channels, each 8 bits wide, with five paths being inbound and seven outbound. This provides a theoretical peak bandwidth of 62.4 GB/s (36.4 GB/s outbound, 26 GB/s inbound) between the Cell processor and the I/O bridge 2034 via controller 2170A, and the Reality Synthesizer graphics unit 2030 via controller 2170B.
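For completeness, the outbound/inbound split follows from dividing the total evenly across the twelve 8-bit lanes (a derivation from the figures quoted above, not additional disclosure):

\[
\frac{62.4\ \text{GB/s}}{12\ \text{lanes}} = 5.2\ \text{GB/s per lane}, \qquad 7 \times 5.2 = 36.4\ \text{GB/s outbound}, \quad 5 \times 5.2 = 26\ \text{GB/s inbound}.
\]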
The data sent by the Cell processor 2028 to the Reality Synthesizer graphics unit 2030 will typically comprise display lists, being a sequence of commands to draw vertices, apply textures to polygons, define lighting conditions, and so on.
Figure 22 is an exemplary illustration of scene A through scene E with the respective user A through user E interacting with game clients 2202 that are connected to server processing via the Internet, in accordance with an embodiment of the present invention. A game client is a device that allows users to connect to server applications and processing via the internet. The game client allows users to access and play online entertainment content such as, but not limited to, games, movies, music and photos. Additionally, the game client can provide access to online communications applications such as VOIP, text chat protocols, and email.
A user interacts with the game client via a controller. In some embodiments, the controller is a game-specific controller, while in other embodiments, the controller may be a keyboard and mouse combination. In one embodiment, the game client is a standalone device capable of outputting audio and video signals to create a multimedia environment through a monitor/television and associated audio equipment. For example, the game client can be, but is not limited to, a thin client, an internal PCI-Express card, an external PCI-Express device, an ExpressCard device, an internal, external, or wireless USB device, or a Firewire device, etc. In other embodiments, the game client is integrated with a television or other multimedia device such as a DVR, Blu-Ray player, DVD player or multi-channel receiver.
Within scene A of Figure 22, user A interacts with a client application displayed on a monitor 2204A using a controller 2206A paired with game client 2202A. Similarly, within scene B, user B interacts with another client application displayed on monitor 2204B using a controller 2206B paired with game client 2202B. Scene C illustrates a view from behind user C as he views a monitor displaying a game and friend list from game client 2202C. Although Figure 22 shows a single server processing module, in one embodiment there are multiple server processing modules throughout the world. Each server processing module includes sub-modules for user session control, sharing/communication logic, user geo-location, and load-balance processing service. Furthermore, a server processing module includes network processing and distributed storage.
When a game client 2202 connects to a server processing module, user session control may be used to authenticate the user. An authenticated user can have associated virtualized distributed storage and virtualized network processing. Example items that can be stored as part of a user's virtualized distributed storage include purchased media such as, but not limited to, games, videos and music, etc. Additionally, distributed storage can be used to save game state for multiple games, custom settings for individual games, and general settings for the game client. In one embodiment, the user geo-location module of the server processing is used to determine the geographic location of a user and their respective game client. The user's geographic location can be used by both the sharing/communication logic and the load-balance processing service to optimize performance based on geographic location and the processing demands of multiple server processing modules. Virtualizing either or both network processing and network storage allows game client processing tasks to be dynamically shifted to underutilized server processing module(s). Thus, load balancing can be used to minimize latency associated both with recall from storage and with data transmission between server processing modules and game clients.
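As an informal illustration of the routing decision just described (not the patented implementation; the module fields, the distance metric and the load threshold are assumptions introduced for this example), a server processing module might be selected for a game client roughly as follows:

```python
from dataclasses import dataclass
import math

@dataclass
class ServerModule:
    name: str
    lat: float
    lon: float
    load: float          # 0.0 (idle) .. 1.0 (saturated)

def distance_km(lat1, lon1, lat2, lon2):
    # Rough great-circle (haversine) distance; precise routing is out of scope here.
    r = 6371.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def pick_module(modules, user_lat, user_lon, max_load=0.85):
    # Prefer the nearest module that still has headroom; otherwise fall back to
    # the least-loaded module so work shifts to under-utilized servers.
    usable = [m for m in modules if m.load < max_load]
    if usable:
        return min(usable, key=lambda m: distance_km(m.lat, m.lon, user_lat, user_lon))
    return min(modules, key=lambda m: m.load)

# Usage: a client near Tokyo is routed to the closest module that is not overloaded.
modules = [ServerModule("us-west", 37.4, -122.1, 0.30),
           ServerModule("eu-west", 53.3, -6.3, 0.20),
           ServerModule("ap-ne", 35.7, 139.7, 0.95)]
print(pick_module(modules, 35.68, 139.69).name)   # -> "us-west"
```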
The server processing module has instances of server application A and server application B. The server processing module is able to support multiple server applications, as indicated by server application X1 and server application X2. In one embodiment, server processing is based on a cluster computing architecture that allows multiple processors within a cluster to process the server applications. In another embodiment, a different type of multi-computer processing scheme is applied to process the server applications. This allows server processing to be scaled in order to accommodate a larger number of game clients executing multiple client applications and corresponding server applications. Alternatively, server processing can be scaled to accommodate increased computing demands necessitated by more demanding graphics processing or game, video compression, or application complexity. In one embodiment, the server processing module performs the majority of the processing via the server application. This allows relatively expensive components such as graphics processors, RAM, and general processors to be centrally located and reduces the cost of the game client. Processed server application data is sent back to the corresponding game client via the internet to be displayed on a monitor.
Scene C illustrates an exemplary application that can be executed by the game client and server processing module. For example, in one embodiment game client 2202C allows user C to create and view a friend list 2220 that includes user A, user B, user D and user E. As shown in scene C, user C is able to see either real-time images or avatars of the respective users on monitor 2204C. Server processing executes the respective applications of game client 2202C and of the respective game clients 2202 of user A, user B, user D and user E. Because the server processing is aware of the applications being executed by game client B, the friend list for user A can indicate which game user B is playing. Further still, in one embodiment, user A can view actual in-game video directly from user B. This is enabled by merely sending the processed server application data for user B to game client A in addition to game client B.
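A minimal sketch of that fan-out, assuming simple queue-backed client streams (the class and function names are illustrative, not taken from the disclosure):

```python
import queue

class GameClientStream:
    """Queue-backed stand-in for the video stream delivered to one game client."""
    def __init__(self, user):
        self.user = user
        self.frames = queue.Queue()

    def push(self, frame):
        self.frames.put(frame)

def send_processed_frame(frame, player_client, observers):
    # Processed server application data for the player goes to the player's own
    # client and, in addition, to each friend currently observing the session.
    player_client.push(frame)
    for observer in observers:
        observer.push(frame)

# Usage: user A watches user B's gameplay in real time.
client_b = GameClientStream("user B")
client_a = GameClientStream("user A")
send_processed_frame(b"<compressed video frame>", client_b, observers=[client_a])
print(client_a.frames.qsize())   # -> 1
```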
In addition to being able to watch video from friends, the communication application can allow real-time communications between friends.
As applied to the previous example, this allows user A to provide encouragement or hints while watching real-time video from user B. In one embodiment, two-way real-time voice communication is established through a client/server application. In another embodiment, a client/server application enables real-time text chat. In yet another embodiment, a client/server application converts speech to text for display on a friend's screen.
Scene D and scene E illustrate the respective user D and user E interacting with game consoles 2210D and 2210E respectively. Each game console 2210D and 2210E is connected to the server processing module and illustrates a network where the server processing modules coordinate the game mechanics for both game consoles and game clients.
Figure 23 illustrates an embodiment of an Information Service Provider architecture. Information Service Provider (ISP) 2370 delivers a multitude of information services to users 2382 who are geographically dispersed and connected via network 2386. An ISP can deliver just one type of service, such as stock price updates, or a variety of services such as broadcast media, news, sports, games, etc. Additionally, the services offered by each ISP are dynamic; that is, services can be added or taken away at any point in time. Thus, the ISP providing a particular type of service to a particular individual can change over time. For example, a user can be served by an ISP in close proximity to the user while the user is in their local city, and by a different ISP when the user travels to a different city. The local-city ISP will transfer the required information and data to the new ISP, such that the user information "follows" the user to the new city, making the data closer to the user and easier to access. In another embodiment, a master-server relationship can be established between a master ISP, which manages the information for the user, and a server ISP that interfaces directly with the user under control from the master ISP. In another embodiment, the data is transferred from one ISP to another ISP as the client moves around the world, so that the ISP in the better position to serve the user is the one that delivers these services.
ISP 2370 includes Application Service Provider (ASP) 2372, which provides computer-based services to customers over a network. Software offered using an ASP model is also sometimes called on-demand software or software as a service (SaaS). A simple way to provide access to a particular application program (such as customer relationship management) is by using a standard protocol such as HTTP. The application software resides on the vendor's system and is accessed by users through a web browser using HTML, through special-purpose client software provided by the vendor, or through another remote interface such as a thin client.
Services delivered over a wide geographical area frequently use cloud computing. Cloud computing is a style of computing in which dynamically scalable and often virtualized resources are provided as a service over the Internet. Users do not need to be experts in the technology infrastructure of the "cloud" that supports them. Cloud computing can be divided into different services, such as Infrastructure as a Service (IaaS), Platform as a Service (PaaS), and Software as a Service (SaaS). Cloud computing services frequently provide common business applications online that are accessed from a web browser, while the software and data are stored on the servers. The term cloud is used as a metaphor for the Internet, based on how the Internet is depicted in computer network diagrams, and is an abstraction for the complex infrastructure it conceals.
Further, ISP 2370 includes a Game Processing Server (GPS) 2374 which is used by game clients to play single and multiplayer video games. Most video games played over the Internet operate via a connection to a game server. Typically, games use a dedicated server application that collects data from players and distributes it to other players. This is more efficient and effective than a peer-to-peer arrangement, but it requires a separate server to host the server application. In another embodiment, the GPS establishes communication between the players, and their respective game-playing devices exchange information without relying on the centralized GPS.
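A toy sketch of the dedicated-server pattern described above, in which one server application collects each player's data and redistributes everyone else's to them (the function and field names are assumptions for this example, not a defined protocol):

```python
def relay_player_updates(updates_by_player):
    """Collect each player's latest data and return, for every player,
    the combined data of all the other players."""
    return {
        player: {other: data for other, data in updates_by_player.items() if other != player}
        for player in updates_by_player
    }

# Usage: two players; each receives only the other's update.
print(relay_player_updates({"player1": {"x": 3, "y": 1}, "player2": {"x": 7, "y": 4}}))
```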
Dedicated GPSs are servers which run independently of the client. Such servers are usually run on dedicated hardware located in data centers, providing more bandwidth and dedicated processing power. Dedicated servers are the preferred method of hosting game servers for most PC-based multiplayer games. Massively multiplayer online games run on dedicated servers usually hosted by the software company that owns the game title, allowing them to control and update content.
Broadcast Processing Server (BPS) 2376 distributes audio or video signals to an audience. Broadcasting to a very narrow range of audience is sometimes called narrowcasting. The final leg of broadcast distribution is how the signal gets to the listener or viewer, and it may come over the air as with a radio station or TV station to an antenna and receiver, or may come through cable TV or cable radio (or "wireless cable") via the station or directly from a network. The Internet may also bring either radio or TV to the recipient, especially with multicasting, allowing the signal and bandwidth to be shared. Historically, broadcasts have been delimited by a geographic region, such as national broadcasts or regional broadcasts. However, with the proliferation of fast internet, broadcasts are not defined by geographies, as the content can reach almost any country in the world.
Storage Service Provider (SSP) 2378 provides computer storage space and related management services. SSPs also offer periodic backup and archiving. By offering storage as a service, users can order more storage as required. Another major advantage is that SSPs include backup services and users will not lose all their data if their computer disk drives fail. Further, a plurality of SSPs can have total or partial copies of the user data, allowing access to the data in an efficient way independently of where the user is located or of the device being used to access the data. For example, a user can access personal files on a home computer, as well as on a mobile phone while the user is on the move.
Communications Provider 2380 provides connectivity to users. One kind of Communications Provider is an Internet Service Provider (ISP) which offers access to the Internet. The ISP connects its customers using a data transmission technology appropriate for delivering Internet Protocol datagrams, such as dial-up, DSL, cable modem, wireless or dedicated high-speed interconnects. The Communications Provider can also provide messaging services, such as e-mail, instant messaging, and SMS texting. Another type of Communications Provider is the Network Service Provider (NSP), which sells bandwidth or network access by providing direct backbone access to the Internet. Network service providers may consist of telecommunications companies, data carriers, wireless communications providers, Internet service providers, cable television operators offering high-speed Internet access, etc.
Data Exchange 2388 interconnects the several modules inside ISP 2370 and connects these modules to users 2382 via network 2386. Data Exchange 2388 can cover a small area where all the modules of ISP 2370 are in close proximity, or can cover a large geographic area when the different modules are geographically dispersed. For example, Data Exchange 2388 can include a fast Gigabit Ethernet (or faster) within a cabinet of a data center, or an intercontinental virtual local area network (VLAN).
Users 2382 access the remote services with client device 2384, which includes at least a CPU, a display and I/O. The client device can be a PC, a mobile phone, an ultraportable device, a PDA, etc. In one embodiment, ISP 2370 recognizes the type of device used by the client and adjusts the communication method employed.
The embodiments of the present invention can be practiced with various configurations of the computer system including portable devices, microprocessor systems, programmable or microprocessor-based consumer electronics, microcomputers, supercomputers and the like. The invention can also be practiced in distributed computing environments where tasks are performed by remote processing devices that are linked through a wireless or wire-based network.
With the foregoing embodiments in mind, it should be understood that the invention can employ various computer-implemented operations involving data stored in computer systems. These operations are those requiring physical manipulation of physical quantities. Any of the operations described herein that form part of the invention are useful machine operations. The invention also relates to a device or an apparatus for performing these operations. The apparatus may be specially constructed for the required purpose, or the apparatus may be a general-purpose computer selectively activated or configured by a computer program stored in the computer. In particular, various general-purpose machines can be used with computer programs written in accordance with the teachings herein, or it may be more convenient to construct a more specialized apparatus to perform the required operations.
The invention can also be embodied as computer readable code on a computer readable medium. The computer readable medium is any data storage device that can store data, which can thereafter be read by a computer system. Examples of the computer readable medium include hard drives, network attached storage (NAS), read-only memory, random-access memory, CD-ROMs, CD-Rs, CD-RWs, magnetic tapes, and other optical and non-optical data storage devices. The computer readable medium can include computer readable tangible media distributed over a network-coupled computer system so that the computer readable code is stored and executed in a distributed fashion.
Although the method operations were described in a specific order, it should be understood that other housekeeping operations may be performed in between operations, or operations may be adjusted so that they occur at slightly different times, or may be distributed in a system which allows the occurrence of the processing operations at various intervals associated with the processing, as long as the processing of the overlay operations is performed in the desired way.
Although the above invention has been described in some detail for purposes of clarity of understanding, it will be apparent that certain changes and modifications may be made within the scope of the appended claims. Accordingly, the present embodiments are to be considered as illustrative and not restrictive, and the invention is not to be limited to the details given herein, but may be modified within the scope and equivalents of the appended claims.

Claims (13)

NOVELTY OF THE INVENTION CLAIMS
1. A method for providing remote control of a user's game mechanics, comprising: presenting a live video feed of a first user's game mechanics to a second remote user; processing a request to transition control of the first user's game mechanics to the second user; and initiating control of the first user's game mechanics by the second user; wherein the method is executed by at least one processor.
2. The method according to claim 1, further characterized in that initiating control of the game mechanics of the first user by the second user includes deactivating control of the game mechanics of the first user by a first controller device associated with the first user, and activating control of the game mechanics of the first user by a second controller device associated with the second user.
3. The method according to claim 2, further characterized in that the control of the game mechanics of the first user by the second controller device includes receiving the input commands from the second controller device and applying the input commands to define the game mechanics of the first user.
4. The method according to claim 1, further characterized in that the live video feed is presented through a social interface to the second user, the social interface provides access to a social profile of the second user, the first user being defined as a member of the social profile of the second user.
5. The method according to claim 4, further characterized in that the social interface includes a comment interface for publishing the comments during the game mechanics of the first user.
6. The method according to claim 1, further characterized in that processing the request for transition control includes receiving an acknowledgment from the first user to allow control of the game mechanics of the first user by the second user.
7. The method according to claim 1, further characterized in that presenting the live video feed of the game mechanics of the first user to the second user includes presenting the live video feed in a non-full-screen format; and wherein initiating the control of the game mechanics of the first user by the second user includes activating presentation of the live video feed in a full-screen format.
8. A method for providing multi-player game mechanics, comprising: presenting a live video feed of a first user's game mechanics session to a second remote user; processing a request for the second user to join the first user's game mechanics session; and initiating game mechanics by the second user in the game mechanics session of the first user; wherein the method is executed by at least one processor.
9. The method according to claim 8, further characterized in that starting the game mechanics by the second user in the first user game mechanics session includes starting a multiplayer mode of a video game.
10. The method according to claim 8, further characterized in that the live video feed is presented through a social interface to the second user, the social interface provides access to a social profile of the second user, the first user being defined as a member of the social profile of the second user.
11. The method according to claim 10, further characterized in that the social interface includes a comment interface for publishing the comments during the game mechanics of the first user.
12. The method according to claim 8, further characterized in that processing the request to join the first user game mechanics session includes receiving an acknowledgment from the first user to allow game mechanics by the second user in the first user game mechanics session.
13. The method according to claim 8, further characterized in that presenting the live video feed of the first user game mechanics to the second user includes presenting the live video feed in a non-full-screen format; and wherein initiating the game mechanics by the second user in the first user game mechanics session includes activating presentation of the live video feed in a full-screen format.
MX2014000231A 2012-12-21 2014-01-07 Remote control of a first user's gameplay by a second user. MX353111B (en)

Applications Claiming Priority (6)

Application Number Priority Date Filing Date Title
US201261745281P 2012-12-21 2012-12-21
US201261745290P 2012-12-21 2012-12-21
US13/831,190 US9364743B2 (en) 2012-12-21 2013-03-14 Generation of a multi-part mini-game for cloud-gaming based on recorded gameplay
US13/831,178 US9352226B2 (en) 2012-12-21 2013-03-14 Automatic generation of suggested mini-games for cloud-gaming based on recorded gameplay
US13/839,382 US9345966B2 (en) 2012-03-13 2013-03-15 Sharing recorded gameplay to a social graph
US13/839,486 US9242176B2 (en) 2012-12-21 2013-03-15 Remote control of a first user's gameplay by a second user

Publications (2)

Publication Number Publication Date
MX2014000231A true MX2014000231A (en) 2014-11-03
MX353111B MX353111B (en) 2017-12-20

Family

ID=50979420

Family Applications (3)

Application Number Title Priority Date Filing Date
MX2014000231A MX353111B (en) 2012-12-21 2014-01-07 Remote control of a first user's gameplay by a second user.
MX2014000229A MX353112B (en) 2012-12-21 2014-01-07 Sharing recorded gameplay to a social graph.
MX2014000227A MX356707B (en) 2012-12-21 2014-01-07 Automatic generation of suggested mini-games for cloud-gaming based on recorded gameplay.

Family Applications After (2)

Application Number Title Priority Date Filing Date
MX2014000229A MX353112B (en) 2012-12-21 2014-01-07 Sharing recorded gameplay to a social graph.
MX2014000227A MX356707B (en) 2012-12-21 2014-01-07 Automatic generation of suggested mini-games for cloud-gaming based on recorded gameplay.

Country Status (7)

Country Link
JP (4) JP6196147B2 (en)
KR (7) KR20210094149A (en)
BR (1) BR102013033136B1 (en)
MX (3) MX353111B (en)
RU (1) RU2605840C2 (en)
TW (3) TWI573619B (en)
WO (1) WO2014100770A2 (en)

Families Citing this family (29)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6612019B2 (en) 2014-08-19 2019-11-27 株式会社ソニー・インタラクティブエンタテインメント Information processing apparatus, control data transmission method, and information processing system
US9919208B2 (en) * 2014-12-11 2018-03-20 Immersion Corporation Video gameplay haptics
JP5770918B1 (en) * 2014-12-18 2015-08-26 株式会社Cygames Information processing program and information processing method
US10207185B2 (en) * 2015-03-07 2019-02-19 Sony Interactive Entertainment America Llc Using connection quality history to optimize user experience
JP6893392B2 (en) * 2015-08-04 2021-06-23 任天堂株式会社 Game system, game device, control program and game control method
US10315108B2 (en) * 2015-08-19 2019-06-11 Sony Interactive Entertainment America Llc Local application quick start with cloud transitioning
US10744407B2 (en) * 2015-09-08 2020-08-18 Sony Interactive Entertainment LLC Dynamic network storage for cloud console server
EP3341098B1 (en) * 2015-09-30 2024-03-13 Sony Interactive Entertainment America LLC Multi-user demo streaming service for cloud gaming
CN105610868B (en) * 2016-03-03 2019-08-06 腾讯科技(深圳)有限公司 A kind of method of information exchange, equipment and system
US10238965B2 (en) * 2016-04-28 2019-03-26 Sony Interactive Entertainment America Llc Cloud gaming device handover
US10625156B2 (en) * 2016-06-30 2020-04-21 Sony Interactive Entertainment LLC Method and system for sharing video game content
GB2570305A (en) * 2018-01-18 2019-07-24 Sony Interactive Entertainment Europe Ltd User analysis system and method
GB2571306A (en) * 2018-02-23 2019-08-28 Sony Interactive Entertainment Europe Ltd Video recording and playback systems and methods
TWI716706B (en) * 2018-03-01 2021-01-21 致伸科技股份有限公司 Ai-assisted operating system
KR102551254B1 (en) * 2018-04-06 2023-07-04 주식회사 엔씨소프트 Method and computer program for providing a service of sharing a game
KR102319298B1 (en) * 2018-09-21 2021-10-29 주식회사 엔씨소프트 System, sever and method for contrllling game character
WO2020129861A1 (en) * 2018-12-21 2020-06-25 株式会社ソニー・インタラクティブエンタテインメント Information processing device for presenting preview screen
JP7222722B2 (en) * 2019-01-17 2023-02-15 株式会社ソニー・インタラクティブエンタテインメント Information processing system, information processing method and computer program
KR102329749B1 (en) * 2019-08-05 2021-11-22 주식회사 엔씨소프트 Sever, system and method for control of game character
US11344799B2 (en) 2019-10-01 2022-05-31 Sony Interactive Entertainment Inc. Scene change hint and client bandwidth used at encoder for handling video frames after a scene change in cloud gaming applications
CN111147885B (en) * 2020-01-03 2021-04-02 北京字节跳动网络技术有限公司 Live broadcast room interaction method and device, readable medium and electronic equipment
CN111494965B (en) * 2020-04-15 2021-09-14 腾讯科技(深圳)有限公司 Information processing method, device, equipment and storage medium
CN111603764B (en) * 2020-05-28 2021-05-28 腾讯科技(深圳)有限公司 Cloud game processing method and equipment
US20220008824A1 (en) * 2020-07-13 2022-01-13 Nvidia Corporation Game generation using one or more neural networks
CN111917768B (en) * 2020-07-30 2022-05-13 腾讯科技(深圳)有限公司 Virtual scene processing method and device and computer readable storage medium
KR102606834B1 (en) * 2021-05-10 2023-11-29 주식회사 엔씨소프트 System and method for providing game service
TWI790801B (en) * 2021-11-01 2023-01-21 宏碁股份有限公司 Remote game executing method and remote game executing system
JP7216314B1 (en) 2022-02-17 2023-02-01 株式会社Mixi Program, information processing device, and information processing method
WO2023157618A1 (en) * 2022-02-18 2023-08-24 優太 竹田 Replay system, processing method, and replay program

Family Cites Families (36)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6280323B1 (en) * 1996-11-21 2001-08-28 Konami Co., Ltd. Device, method and storage medium for displaying penalty kick match cursors in a video soccer game
US6699127B1 (en) * 2000-06-20 2004-03-02 Nintendo Of America Inc. Real-time replay system for video game
RU2236702C2 (en) * 2002-09-02 2004-09-20 Савин Вадим Георгиевич Method for computer game
JP3534343B2 (en) * 2002-09-12 2004-06-07 株式会社コナミコンピュータエンタテインメントジャパン GAME PROGRAM AND GAME DEVICE
JP3703800B2 (en) * 2002-11-15 2005-10-05 株式会社スクウェア・エニックス Communication game system
US8930561B2 (en) * 2003-09-15 2015-01-06 Sony Computer Entertainment America Llc Addition of supplemental multimedia content and interactive capability at the client
JP4494882B2 (en) * 2004-06-29 2010-06-30 株式会社バンダイナムコゲームス Program, game device, display control device, server, and information storage medium
KR100682455B1 (en) * 2005-03-17 2007-02-15 엔에이치엔(주) Game scrap system, game scrap method, and computer readable recording medium recording program for implementing the method
JP5001445B2 (en) * 2005-05-06 2012-08-15 任天堂株式会社 Communication game system
CA2633895A1 (en) * 2005-12-27 2007-07-05 Massive Incorporated Streaming media casts, such as in a video game or mobile device environment
US20070173325A1 (en) * 2006-01-20 2007-07-26 Microsoft Corporation Join in-progress on-line game session
US20070294089A1 (en) * 2006-06-02 2007-12-20 Garbow Zachary A Gameplay Recording and Marketplace
JP5068080B2 (en) * 2007-01-09 2012-11-07 株式会社バンダイナムコゲームス GAME DEVICE, PROGRAM, AND INFORMATION STORAGE MEDIUM
KR20090000260A (en) * 2007-02-09 2009-01-07 주식회사 엠게임 Beta game providing system and its method for gradually opening to the gamers to arouse gamer's interest
JP4203524B2 (en) * 2007-02-14 2009-01-07 株式会社コナミデジタルエンタテインメント GAME SYSTEM AND GAME DEVICE INCLUDING THE SAME
JP5203646B2 (en) * 2007-07-17 2013-06-05 株式会社ソニー・コンピュータエンタテインメント GAME GUIDING SYSTEM, GAME GUIDING DEVICE, GAME SERVER, GAME GUIDING METHOD, PROGRAM, AND INFORMATION STORAGE MEDIUM
US8235817B2 (en) 2009-02-12 2012-08-07 Sony Computer Entertainment America Llc Object based observation
US8515253B2 (en) * 2008-02-15 2013-08-20 Sony Computer Entertainment America Llc System and method for automated creation of video game highlights
JP2010178010A (en) * 2009-01-29 2010-08-12 Funai Electric Co Ltd Moving image editor
JP2010220089A (en) * 2009-03-18 2010-09-30 Sony Corp Digest reproducing apparatus, digest reproducing method and program
US8292742B2 (en) * 2009-03-23 2012-10-23 Utah State University Systems and methods for simulation and regeneration of a virtual environment
US9898675B2 (en) * 2009-05-01 2018-02-20 Microsoft Technology Licensing, Llc User movement tracking feedback to improve tracking
JP5449859B2 (en) * 2009-05-18 2014-03-19 任天堂株式会社 GAME PROGRAM, GAME DEVICE, AND GAME SYSTEM
JP2010284473A (en) * 2009-06-11 2010-12-24 Gonichi Sasaki Game player evaluation system
JP5417111B2 (en) * 2009-10-01 2014-02-12 株式会社コナミデジタルエンタテインメント GAME SYSTEM, GAME SYSTEM CONTROL METHOD, AND PROGRAM
JP5193242B2 (en) * 2010-03-11 2013-05-08 株式会社コナミデジタルエンタテインメント Game system, game part execution support method, and program
KR101139498B1 (en) * 2010-04-20 2012-05-02 주식회사 넥슨코리아 Watching system and method thereof of other user game play picture in online game
JP2010201180A (en) * 2010-04-30 2010-09-16 Nintendo Co Ltd Game apparatus and game program
JP2012038042A (en) * 2010-08-05 2012-02-23 Sony Computer Entertainment Inc Game device
JP5991649B2 (en) * 2010-08-05 2016-09-14 株式会社ソニー・インタラクティブエンタテインメント Game device
JP5271319B2 (en) * 2010-08-12 2013-08-21 株式会社コナミデジタルエンタテインメント GAME SYSTEM AND PLAY CONTENT BROWSE CONTROL METHOD
JP5542020B2 (en) * 2010-09-22 2014-07-09 株式会社ソニー・コンピュータエンタテインメント Information processing system, information processing method, program, and information storage medium
JP5740972B2 (en) * 2010-09-30 2015-07-01 ソニー株式会社 Information processing apparatus and information processing method
KR101269411B1 (en) * 2010-12-31 2013-05-30 (주)네오위즈게임즈 Game server, method, and recording medium for providing sport game including chatting service
US9345966B2 (en) * 2012-03-13 2016-05-24 Sony Interactive Entertainment America Llc Sharing recorded gameplay to a social graph
US10406429B2 (en) * 2012-08-29 2019-09-10 Sony Interactive Entertainment, LLC User-based mini-game generation and distribution

Also Published As

Publication number Publication date
JP2020099729A (en) 2020-07-02
WO2014100770A8 (en) 2014-07-24
TW201438796A (en) 2014-10-16
KR20190090077A (en) 2019-07-31
BR102013033136B1 (en) 2021-10-26
MX356707B (en) 2018-06-11
TWI573619B (en) 2017-03-11
KR20200083658A (en) 2020-07-08
KR20210094149A (en) 2021-07-28
RU2013156817A (en) 2015-06-27
JP2014121610A (en) 2014-07-03
KR20200083659A (en) 2020-07-08
KR102292820B1 (en) 2021-08-25
KR101742662B1 (en) 2017-06-01
RU2605840C2 (en) 2016-12-27
TWI564062B (en) 2017-01-01
JP6196147B2 (en) 2017-09-13
WO2014100770A2 (en) 2014-06-26
KR20170061196A (en) 2017-06-02
JP7461174B2 (en) 2024-04-03
MX2014000229A (en) 2014-11-03
MX353111B (en) 2017-12-20
JP2019018076A (en) 2019-02-07
JP2017200631A (en) 2017-11-09
KR20150099829A (en) 2015-09-01
TW201440856A (en) 2014-11-01
JP6434583B2 (en) 2018-12-05
KR20190057448A (en) 2019-05-28
MX353112B (en) 2017-12-20
TW201440857A (en) 2014-11-01
MX2014000227A (en) 2014-11-03
BR102013033136A2 (en) 2015-10-20
TWI559965B (en) 2016-12-01
WO2014100770A3 (en) 2014-08-21

Similar Documents

Publication Publication Date Title
JP7461174B2 (en) Minigames accessed via shared interface
US11014012B2 (en) Sharing gameplay in cloud gaming environments
US10016677B2 (en) Remote control of a first user's session by a second user
EP2745893B1 (en) Automatic generation of suggested mini-games for cloud-gaming based on recorded gameplay
US11565187B2 (en) Method for sharing a portion of gameplay of a video game
US11406906B2 (en) Network connected controller for direct to cloud gaming
US20140179427A1 (en) Generation of a mult-part mini-game for cloud-gaming based on recorded gameplay

Legal Events

Date Code Title Description
FG Grant or registration