US20080268961A1 - Method of creating video in a virtual world and method of distributing and using same - Google Patents

Info

Publication number
US20080268961A1
Authority
US
United States
Prior art keywords
game
video
data
gameplay
software
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US12/112,975
Inventor
Michael Brook
Ken Mok
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Individual
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Individual filed Critical Individual
Priority to US12/112,975
Publication of US20080268961A1
Legal status: Abandoned


Classifications

    • A63F 13/63: Generating or modifying game content before or while executing the game program, by the player, e.g. authoring using a level editor
    • A63F 13/355: Game servers performing operations on behalf of clients with restricted processing capabilities, e.g. transforming a changing game scene into an MPEG stream for transmission to a mobile phone or a thin client
    • A63F 13/497: Partially or entirely replaying previous game actions
    • A63F 13/5252: Changing parameters of virtual cameras, using two or more virtual cameras concurrently or sequentially, e.g. automatically switching between fixed virtual cameras when a character changes room or displaying a rear-mirror view in a car-driving game
    • H04N 21/254: Management at an additional data server, e.g. shopping server, rights management server
    • H04N 21/2743: Video hosting of uploaded data from client
    • H04N 21/4781: Games, as supplemental services on client devices
    • H04N 21/854: Content authoring
    • A63F 2300/538: Details of game-server data processing for performing operations on behalf of the game client, e.g. rendering
    • A63F 2300/577: Details of game services offered to the player for watching a game played by other players
    • A63F 2300/6018: Game content authored by the player, e.g. level editor, or created by the game device at runtime, e.g. a level created from music data on a CD
    • A63F 2300/6669: Rendering three-dimensional images, changing the position of the virtual camera using a plurality of virtual cameras concurrently or sequentially, e.g. automatically switching between fixed virtual cameras when a character changes rooms

Definitions

  • the present invention relates to computer software and more specifically to a method of creating video in a virtual world and a method of distributing and using same.
  • the present invention generally describes a novel method of creating a video presentation of an interactive video game (video of other virtual worlds may also be created) using data that has been created and stored as the interactive video game is played.
  • the present invention provides a method of creating a video record of virtual worlds and a method of distributing and using same.
  • the preferred embodiment of the present invention provides numerous benefits over the prior art.
  • the present invention generally provides a means by which a stand-alone or “add-in” software tool may be employed in two parts to re-create from stored data a three dimensional gameplay experience (or other rendition of a three-dimensional world) after it has occurred.
  • the experience or sequence of events may then be stored in a video format after-the-fact, reproducing the event faithfully, but allowing for much more thoughtful pre-production, post-production and point of view, lighting, and sound placement.
  • the present invention generally works by means of a two software application process.
  • the invention could be implemented using more or fewer software applications, but in the preferred embodiment only two are utilized.
  • the first is a software application that is either built into a video game (or other, similar) software or provided as a stand-alone application.
  • This first software application is designed to “grab” or “log” the data that is created as a video game is played.
  • This application is designed to log (or store) all of the data created. For example, in a first person shooter (FPS) game, the application would log shooter positions, weapons used, the locations of every player at each moment (typically x, y, z coordinates), the number of shots fired by each player, each player's name or “nickname” in the game, the angle of firing and location, the location in which other players are hit by “bullets” or “lasers” and various other data.
  • FPS: first person shooter
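The per-event logging described above can be sketched as follows. The `GameplayLogger` class and its field names are illustrative assumptions for this sketch, not part of the patent:

```python
import json
import time

class GameplayLogger:
    """Illustrative logger that records gameplay events as they occur."""

    def __init__(self):
        self.events = []

    def log_event(self, player, event_type, **data):
        # Each record carries a timestamp, the player's name or nickname,
        # the event type, and arbitrary event data (position, weapon, angle, ...).
        self.events.append({
            "t": time.time(),
            "player": player,
            "type": event_type,
            **data,
        })

    def save(self, path):
        # Persist the whole event log in a reproducible text format.
        with open(path, "w") as f:
            json.dump(self.events, f)

# Example: logging a player's movement and a shot fired in an FPS match
log = GameplayLogger()
log.log_event("player1", "move", pos=(10.0, 2.0, -3.5))
log.log_event("player1", "shot", weapon="laser",
              origin=(10.0, 2.0, -3.5), angle=(0.0, 90.0))
```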
  • the second application is designed in conjunction with the game software in the preferred embodiment, though in alternative embodiments it may stand alone as well; its purpose is to recreate the event exactly as it originally occurred during gameplay.
  • This application also provides the secondary functions of allowing a user to see the event at whatever speed desired, to place a virtual camera (or multiple virtual cameras) throughout the recreated gameplay sequence, and to similarly place virtual sound receivers or virtual lighting effects in various places throughout the gameplay sequence.
  • This software is capable of moving through the gameplay sequence both in time and space, so as to provide the best possible recreation of the event for “filming” and to provide a subsequent “director” of the filming with the most resources to recreate and edit the event in a compelling manner.
  • the first benefit is the ability to drastically improve rendering quality.
  • Most video capture software captures only the quality of the image displayed on the player's screen (and only from the player's perspective).
  • many gamers intentionally degrade the quality of the graphics displayed by the game.
  • they also lower the capture resolution so that the video capture software does not needlessly hinder the gameplay experience by spending CPU cycles transcoding video in real time.
  • the graphics may be “cranked up” to the maximum levels of resolution and graphical quality.
  • the video may be captured, of those high-resolution graphics, at a very high resolution as well.
  • a second benefit of the present invention is to allow a “director” of the video file or virtual film creation to review the sequence and to select, much like a director would for a television or movie sequence, the best angles and virtual camera locations (POV).
  • the “director” composing the subsequent film creation may be any user of the software, from a game player to a game software manufacturer to a third party media creator or an interactive entertainment competition organizer.
  • the ability to direct allows the most compelling or “best angles” for exciting moments in the game to be selected by the director, resulting in substantially better quality and more compelling video files being created.
  • An additional benefit is that the director of a video file or film creation based upon the gameplay sequence may also create multiple virtual cameras for use in capturing different angles and various times. Cutting between angles and scenes is an excellent way for a director to make lengthy gameplay sequences more exciting to watch and to provide the best angles for multiple different events or actions.
  • the present invention provides the only current method whereby virtual camera locations and virtual camera angles (including multiple virtual cameras) may be used to record an event, other than scripting an event prior to recording or capture. Scripting is rarely desirable in gaming competitions or other similar situations.
  • a multiplicity of “points of view” using virtual cameras and virtual “microphones”
  • FIG. 1 is a block diagram of the components and data flow in a preferred embodiment of the present invention
  • FIG. 2 is a block diagram of an alternative embodiment
  • FIG. 3 illustrates the placement of several virtual cameras and a virtual microphone in a gameplay sequence
  • FIG. 4 is a flowchart of the steps involved in the video-creation process
  • FIG. 5 is a flowchart of the steps involved in the method of creating and distributing a video file.
  • FIG. 6 is a flowchart of an alternative embodiment of the method of creating and distributing a video file.
  • Director or “user” as used throughout the specification refers to any one of a number of individuals or companies who act to select “locations” at which to place virtual cameras, microphones and to provide any motion to either, during the course of recording a gameplay sequence.
  • the director or user may be an individual who has taken part in a gameplay sequence, a third party game software manufacturer or developer, a media source or a third-party game competition organizer.
  • Gameplay sequence as used herein generally refers to the parsing of any saved gameplay data with a game software application to create (or re-create) a series of events over the course of time, within a game software application. “Gameplay sequence” is also intended to apply to the use of any data to create (or re-create) any events or activity within a virtual “world” generated by a computer. Gameplay sequence is not intended to be limited to video games only, as it may be applicable to any virtual world that is created by a computer in conjunction with suitable software and hardware.
  • FIG. 1 the components and data flow of a preferred embodiment of the present invention are shown in block diagram form.
  • the present invention generally relates to a method used to create video files or a “film” of a virtual world. This figure shows one of the ways in which the video files are created.
  • a consumer computer 10 may be a conventional desktop or laptop computer. As can be seen, this consumer computer 10 has at least one software application installed, in this instance a game software application 12 .
  • the game software application is preferably a video game of some type.
  • the game software 12 may take the form of an online multiplayer game, a first person shooter (FPS), a massively multiplayer online (MMO) game, a real-time strategy (RTS) game or virtually any other form of video game. It may also take the form of any other virtual world environment capable of creation by a computer. In general, both multiplayer as well as single player games will be recorded or filmed using this method, it being understood that by “filming”, digital recording is intended.
  • the game software 12 contains a datacasting API (Application Programming Interface) 14 .
  • the datacasting API 14 in the embodiment depicted in this figure is an “add on” or a “tool” which has been integrated into the game software 12 itself such that the game software 12 is capable, inherently, of utilizing the datacasting functionality.
  • the datacasting API 14 is the component part of the game which is capable of parsing and translating gameplay sequence data created by the game software 12 .
  • the datacasting API 14 would “log” data created by the game software representing player locations, guns or other weapons picked up or used, the locations and angles of bullets, projectiles or lasers fired or other weapons used, the results of any successful and unsuccessful attacks, player movement, player names, player statistics, animation data and virtually every piece of data used or manipulated by the game software 12 in carrying on the gameplay sequence (hereinafter referred to as “gameplay data”).
  • this datacasting API 14 may be, instead, a part of the game server software (not pictured).
  • the datacasting API begins logging information pertaining to that player.
  • the datacasting API “logs” or otherwise parses gameplay sequence data which may then be provided to data capture software 16 .
  • the datacasting API 14 and the data capture software 16 capture data at a rate of 30 frames per second (FPS). This rate matches standard NTSC television (approximately 30 FPS) and exceeds the motion picture standard (24 FPS); at 30 FPS, the human eye is virtually incapable of seeing any “chop” in the video subsequently created from this gameplay data.
  • FPS: frames per second
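A fixed-rate capture loop at the 30 FPS rate described above might look like the following sketch; the `capture_state` callback is a hypothetical stand-in for whatever the datacasting API exposes:

```python
import time

CAPTURE_RATE = 30                  # snapshots per second
FRAME_DT = 1.0 / CAPTURE_RATE

def capture_loop(capture_state, duration):
    """Call capture_state() at a fixed 30 Hz rate for `duration` seconds."""
    n_frames = round(duration * CAPTURE_RATE)
    snapshots = []
    start = time.monotonic()
    for i in range(n_frames):
        snapshots.append(capture_state())
        # sleep until the next scheduled tick to hold a steady rate,
        # scheduling against the start time so errors do not accumulate
        sleep_for = start + (i + 1) * FRAME_DT - time.monotonic()
        if sleep_for > 0:
            time.sleep(sleep_for)
    return snapshots
```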
  • the data capture software 16 is software which may be resident on the consumer computer 10 , resident at a remote server location or resident at a third party site or computer and is used to capture all of the gameplay data.
  • the data capture software 16 creates a game sequence data file that may be used to reproduce the video game gameplay sequence at a later time.
  • the data capture software 16 creates a detailed log of the entire gameplay sequence for storage.
  • a state is a term of art used in computer science to refer to the status of the entire program in memory at a given time. It may also refer to the current setting of every (or a portion of every) variable currently defined and in use by the program as it runs. The total data pertaining to the on-going and running software program is the “state” of that program.
  • the present invention takes a “snapshot” of the “state” of all variables, the memory registers and values in memory locations allocated to the game program at a rate of 30 times per second.
  • the game state includes any animation state, game data state, sound replay state, player location and action state, and various other states.
  • This data is then written, in a reproducible format such as text, XML or directly as a RAM dump, to a state log for use in rendering later using a rendering computer 18 .
  • the gameplay sequence is re-created, exactly, from the moment that the game state data was captured, including all avatar and character action.
  • the state data may be created or stored in a compressed format using differential analysis to record only the data that has changed from the last moment at which a game state was saved. This may analyze the data, quickly, and determine what elements have changed from moment to moment, then only save those elements in the new game state. For example, from moment to moment a player's nickname would not change, but the player's location may be changing constantly. Therefore, the player's name need not be logged at every instance. However, data pertaining to movement and action would be saved often for accuracy. This may result in a significantly smaller overall file size of the resulting saved gameplay data.
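The differential approach described above can be sketched as a simple delta encoder. The `diff_state` and `apply_delta` names are illustrative, and the sketch assumes keys are never removed between snapshots:

```python
def diff_state(prev, curr):
    """Return only the fields of `curr` that differ from `prev`."""
    if prev is None:
        return dict(curr)            # first snapshot: save everything
    return {k: v for k, v in curr.items() if prev.get(k) != v}

def apply_delta(base, delta):
    """Rebuild a full state from the previous state plus a delta."""
    state = dict(base or {})
    state.update(delta)
    return state

# Example: the nickname never changes, so it is saved only in the
# first snapshot, while the constantly changing position is saved
# in every delta.
s0 = {"name": "player1", "pos": (0, 0, 0)}
s1 = {"name": "player1", "pos": (1, 0, 0)}
d1 = diff_state(s0, s1)              # only 'pos' appears in the delta
assert apply_delta(s0, d1) == s1
```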
  • the next component is the rendering computer 18 .
  • the rendering computer 18 serves at least two separate and distinct rendering functions; it is, notably, more than a computer that simply renders video from data.
  • the first function is to accept a data file produced by the data capture software 16 and to “re-create” a gameplay sequence based upon that data.
  • Its second, and distinctly separate, function is to record the re-created gameplay sequence according to a user's direction. Prior art methods perform only the second function, and only in part: they record the event solely as it happens (rather than a re-created gameplay sequence) and from a single perspective. This is the prior art that the present invention overcomes.
  • the rendering computer 18 (which may, depending upon the implementation, be the same as the consumer computer 10 , but is pictured separately for ease of explanation) is capable of recreating the gameplay sequence as if it is being played again. This is accomplished by data rendering software 20 .
  • the data rendering software 20 is software designed to interpret the gameplay data (such as player location, name, weapons used and the like) and to re-create the gameplay sequence based upon that data.
  • the data rendering software 20 is virtually always used in conjunction with replay game software 22 .
  • the replay game software 22 is the same or is closely related to the game software 12 , but may include higher resolution graphics and/or advertising audio and graphics, for example.
  • the data rendering software 20 feeds the data pertaining to the gameplay sequence that was created by the data capture software 16 back into the replay game software 22 on the rendering computer 18 .
  • the replay game software 22 under the direction of the data rendering software 20 interprets the data and presents the gameplay sequence, in its entirety, subject to the user's direction, exactly as it occurred.
  • the data rendering results in a three-dimensional world through which the data rendering software 20 allows a user to move and view from literally any location in the game world or map. This is not a single-viewpoint, locked-camera follow of one or more players; it is an exact re-creation of the gameplay sequence, built from the complete gameplay data created as the gameplay sequence was first completed.
  • the user may use the rendering computer 18 and the data rendering software 20 to select a multiplicity of locations, within the game world or gameplay sequence to place one or more “virtual” cameras and to record video from one time to another at any one of those cameras.
  • the “virtual” microphone may be placed at any point in the gameplay space or at a multiplicity of locations if so desired. This process may be more readily understood when described with reference to FIG. 3 below.
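The placement of virtual cameras and microphones in the recreated game world can be sketched as simple data structures; the class and field names here are illustrative assumptions, not the patent's terminology:

```python
from dataclasses import dataclass

@dataclass
class VirtualCamera:
    """A camera placed anywhere in the recreated game world."""
    position: tuple       # (x, y, z) in world coordinates
    look_at: tuple        # point in the world the camera faces
    t_start: float        # start of the recorded interval (seconds)
    t_end: float          # end of the recorded interval (seconds)

@dataclass
class VirtualMicrophone:
    """A sound receiver placed anywhere in the recreated game world."""
    position: tuple

# Several cameras and a microphone placed for one gameplay sequence;
# each camera records its own interval of the replay, and the replay
# can be re-run for each placement since it is driven by stored data.
cameras = [
    VirtualCamera(position=(0, 50, 0), look_at=(0, 0, 0),
                  t_start=0.0, t_end=12.0),   # overhead wide shot
    VirtualCamera(position=(30, 2, 5), look_at=(10, 0, 5),
                  t_start=4.0, t_end=9.0),    # close-up on the action
]
mic = VirtualMicrophone(position=(12, 1, 5))
```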
  • Video rendering software 24 is used in connection with the data rendering software 20 and the replay game software 22 to appropriately create video files of the sequence from each location.
  • The output from the video rendering software 24 is a video file 26 .
  • this video file 26 may in fact be a multiplicity of video files, should a multiplicity of virtual cameras be used.
  • Video file 26 is intended to represent the collective output of this process, including any audio files or audio/video files created. It is also intended to represent the possibility that analog audio and/or video files may be created using this functionality.
  • the video file 26 may be immediately shared via a video distribution channel 32 .
  • This video distribution channel 32 may be a professional video game website, a television show or compilation video of video game highlights. It may be a television network or an internet video sharing site.
  • the video file 26 is ready to be shared, broadly, the moment it has been rendered by the video rendering software 24 .
  • a user may, optionally, choose to use a video editing computer 28 equipped with video editing software 30 to combine the various video files created by the video rendering software 24 into a single, composite video.
  • the optional nature of this choice is indicated by dotted connecting lines.
  • the user may choose to use the video editing software 30 to insert graphics, video effects, transitions between cuts or to integrate multiple angles from which the video was created into a single video file for sharing.
  • this process is typically considered “post production”. It is also known as “editing.” This process allows a user to select the “best” shot, given the series of angles from which video was created. Even further, if the user is dissatisfied with the angles he or she has chosen or the outcome of one or more video files, the user may go back to the gameplay sequence data and use the data rendering software 20 , the replay game software 22 and the video rendering software 24 to create a new angle or new portion of video for the resultant video file 26 or for subsequent distribution.
  • One of the advances over the prior art is that the user may return, directly, to the source of the video in order to create additional angles, movements, pans, or data points from which video may be created for subsequent inclusion in a post-production, final cut video program to be created. Once the final cut is created using the video editing software 30 , the video may be distributed via the video distribution channel 32 .
  • the first component is a consumer computer 34 . This may be the same consumer computer found in FIG. 1 (as element 10 ).
  • the next component is game software 36 . It is notable, however, that in this case, the game software 36 is separate from a datacasting API 38 unlike the combination shown in FIG. 1 . This is the primary difference between this embodiment and the embodiment depicted in FIG. 1 .
  • the datacasting API 38 in this embodiment is an application separate from the game software 36 . This means that the datacasting API 38 is designed to “listen” to the game software 36 as it works. The datacasting API 38 still creates a log of all relevant information happening within the game, which it passes to data capture software 40 . The capture of the gameplay data takes place using whatever means are available; in some instances, a “plug-in” to a game will be used in connection with the datacasting API 38 when the datacasting API is not integrated into the game software 36 .
  • the datacasting API 38 will employ “listening” techniques to gather the information from the game software 36 .
  • the datacasting API 38 while not a part of the game software 36 , (which will be described further below) is still capable of gathering all relevant gameplay data and passing it on to data capture software 40 .
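One way to realize the "listening" arrangement above is an observer pattern, in which the game publishes events on a bus the external datacasting component subscribes to; all names in this sketch are illustrative assumptions:

```python
class GameEvents:
    """Minimal event bus the game software publishes gameplay events to."""

    def __init__(self):
        self._listeners = []

    def subscribe(self, fn):
        self._listeners.append(fn)

    def emit(self, event):
        # Deliver the event to every subscribed listener.
        for fn in self._listeners:
            fn(event)

class DatacastingListener:
    """External datacasting component that logs everything it hears,
    without being part of the game software itself."""

    def __init__(self, bus):
        self.log = []
        bus.subscribe(self.log.append)

# The game emits events as it runs; the listener records them passively.
bus = GameEvents()
listener = DatacastingListener(bus)
bus.emit({"type": "goal", "player": "player2"})
bus.emit({"type": "move", "player": "player1", "pos": (3, 0)})
```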
  • a rendering computer 42 corresponding to the rendering computer 18 in FIG. 1 , performs the same two-part function. First, it renders the game using the gameplay data and data rendering software 44 and replay game software 46 , then it renders a video file from that gameplay data using video rendering software 48 , according to user direction.
  • the resultant video file 50 (which may, in fact, be a multiplicity of audio and/or video files) may be edited according to user wishes or may be immediately distributed via a video distribution channel 56 .
  • the video editing may take place on a video editing computer 52 using video editing software 54 .
  • FIG. 3 a simple example gameplay scenario is shown in order to better explain the process shown in FIGS. 1 and 2 .
  • FIG. 3 is intended to represent a soccer game field 58 within a video game.
  • This soccer game field 58 is shown from a “top-down” perspective as a two-dimensional field.
  • it is intended to represent a three-dimensional “game world” that may be filmed using the method of this invention.
  • the depicted field 58 includes only two players. Player one 60 and player two 62 are both shown on the field 58 . This entire display is intended to represent the “gameplay area” or “gameplay space” and a simple gameplay sequence which may be recreated and filmed using the method of this invention.
  • all of the data pertaining to player two 62 , player one 60 , the ball 64 , the goal 66 and the field 58 is stored thirty times per second.
  • the datacasting API 14 (See FIG. 1 ) captures all of this data and provides it to the data capture software 16 .
  • the data capture software 16 may input its data into the rendering computer 18 for use of the data rendering software 20 .
  • This software allows for the placement of a multiplicity of “virtual cameras” around the gameplay field.
  • The data rendering software 20 places these virtual cameras around the field 58.
  • The first is a fixed camera 68, used to “film” the ball and its movement from a single angle for a period of time.
  • A multiplicity of microphones may also be placed throughout the game field 58.
  • Each of these microphones may record time-stamped, position-based audio. If the microphone is “near” the stands of the match, for example, the voices of the “crowd” in the stands may be louder than if the microphone is further away.
  • A microphone closer to the ball 64 may record the sound of the ball being kicked more loudly and accurately, so as to indicate a close proximity to the kick.
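The distance-dependent loudness described here is commonly modeled as inverse-distance attenuation. A minimal sketch (hypothetical function, not the patent's software):

```python
import math

def mic_gain(mic_pos, source_pos, ref_dist=1.0):
    """Inverse-distance gain: full volume at or inside ref_dist, quieter beyond."""
    dist = math.hypot(source_pos[0] - mic_pos[0], source_pos[1] - mic_pos[1])
    return ref_dist / max(dist, ref_dist)

# A microphone 2 units from the kick records it louder than one 20 units away.
near = mic_gain((0.0, 0.0), (2.0, 0.0))   # gain 0.5
far = mic_gain((0.0, 0.0), (20.0, 0.0))   # gain 0.05
```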
  • A third camera 74 is placed directly behind player two 62.
  • This third camera 74, similar to the “player-following” cameras typically employed in video games, may record the player's movements precisely and watch as player two 62 watches the ball 64 (and later ball 64′) go into the net.
  • The cameras (and microphones) may or may not “simultaneously” record the event as it is re-created by the data rendering software 20.
  • The user may review and replay the re-created event as many times as necessary in order to get an appropriate or desired shot of a particular portion (or all) of a gameplay sequence, because the re-creation is based upon stored data, not upon events as they occur.
  • Time-stamping of the video files is used to synchronize the rendered video and audio files for later editing.
  • In this way, the perfect shot and perfect “pre-production” (actually occurring after the event but, in effect, planning the best shots to use, and in that sense “pre-production”) may be achieved.
  • The cameras may be placed numerous times and in different locations. If the shot is not quite right, the user may alter the camera location only slightly. If the shot is not acceptable, the user may alter it completely or simply not use any video created from that “virtual” camera in the final video that is created.
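Because each shot is computed from stored positions rather than live play, a fixed virtual camera can be re-aimed on every pass over the data. A sketch of the geometry (hypothetical names), computing the pan angle that keeps a recorded ball track centered:

```python
import math

def camera_pan_angles(cam_pos, ball_track):
    """Pan angle (degrees) toward each stored ball position, one per frame."""
    return [
        math.degrees(math.atan2(by - cam_pos[1], bx - cam_pos[0]))
        for bx, by in ball_track
    ]

track = [(10.0, 0.0), (10.0, 10.0), (0.0, 10.0)]  # ball positions from replay data
angles = camera_pan_angles((0.0, 0.0), track)      # camera fixed at the origin
```

Moving the camera is then just a change to `cam_pos` followed by another pass over the same stored track, which is the re-shoot-without-replaying property the method relies on.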
  • Video of two-dimensional games, three-dimensional games and blends of the two may be created using this method.
  • Complex games involving any number of players, online or connected by means of a local area network or connected to a large-scale server or multiple servers may be filmed using the methodology described herein.
  • In some embodiments, the game server itself may be equipped with the ability to capture video game data (via the data capture software 16), such that it records data locally and videos may subsequently be created from that data at remote locations or locally.
  • The flowchart of FIG. 4 includes the steps used to capture game data, re-create a gameplay sequence and subsequently create video from that gameplay data and re-created gameplay sequence.
  • A first step 76 is to play a game.
  • This first step 76 involves the playing of a game, including the creation of gameplay data.
  • The gameplay data is gathered by the datacasting API 14 and captured by the data capture software 16 (see FIG. 1).
  • In this step, one or more players play a game and thereby generate data pertaining to the gameplay sequence.
  • This data may include: player location within the game world, player weapons, player or computer-player actions, items used, injuries or losses sustained, movement of player characters or units around the game world and virtually any and all data generated or used to generate a gameplay sequence (as further described above, “gameplay data”).
  • A next step 78 requires that the gameplay data be captured.
  • In this step, a piece of software “listens” to the game as the gameplay sequence moves forward. This is done by the datacasting API 14, either incorporated into the game or as a stand-alone module.
  • The data capture software 16 captures the data and stores it for later use in re-creating the gameplay sequence.
  • The gameplay data that is captured may be stored in virtually any location.
  • In one embodiment, the data is stored on the user's computer.
  • For example, the data may be stored on the consumer computer 10.
  • Alternatively, the data may be stored remotely, for example on a multiplayer hosting server or on a media outlet's web server.
  • The data may be stored in any location and may be stored for a set period of time or indefinitely.
  • In another embodiment, the gameplay data may be saved remotely.
  • A user may take part in a gameplay sequence, continue playing for several minutes or hours, then request, after the fact, that the remote server provide him or her with the gameplay data created in the course of that gameplay sequence from a first time to a second time. The user may then be provided with that gameplay data, either as a downloadable file or through access on a remote server.
  • This type of configuration would allow a user (or other party), not knowing in advance that a particular sequence would be meaningful, to create, after the fact, an excellent video of the meaningful gameplay sequence.
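Retrieving the gameplay data “from a first time to a second time” amounts to filtering the time-stamped log. A sketch under assumed names and frame layout (not the patent's actual storage format):

```python
def extract_sequence(log, t_start, t_end):
    """Return the time-stamped frames recorded between t_start and t_end."""
    return [frame for frame in log if t_start <= frame["t"] <= t_end]

# Ten seconds of capture at 30 frames per second; clip out seconds 2 through 4.
log = [{"t": i / 30.0, "frame": i} for i in range(300)]
clip = extract_sequence(log, 2.0, 4.0)
```

A remote server could run the same filter before handing the result to the user as a downloadable file.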
  • A next step 80 is to provide the gameplay data to data rendering software.
  • The data rendering software 20 (described with reference to FIGS. 1 and 2) can use the gameplay data to re-create the gameplay sequence exactly as it originally happened.
  • Next, the user or director of the process can move on to a next step 82, performing video preproduction. This is the process wherein the user or director may select the various locations for virtual cameras within the gameplay sequence, the places for microphones, any movement of cameras or microphones and the like.
  • Notably, this step 82 of preproduction does not actually occur before the gameplay sequence has completed. Instead, it occurs during the re-creation of the gameplay sequence, but prior to the video rendering. In the sense that various shots may be selected, microphones placed and thought given to the ways in which to film action sequences, it is a pre-production step 82. However, in the sense that it occurs after the gameplay sequence has completed or partially completed, it is not truly pre-production. It is pre-video-rendering production, which provides virtually all of the benefits of true pre-production, not previously available in the video creation process for interactive video games (and similar computer-created environments).
  • The step 80 of providing the gameplay data to the data rendering software may be carried out by any party: the user or game player, a gameplay host, a game manufacturer or developer, a media outlet or a third-party video game competition organizer.
  • The gameplay data may be readily available to the public on a website or available only to the gameplay host server or user.
  • The data rendering software 20, in conjunction with the game software 22, may also be used to place advertisements, avatars or sounds not present in the original gameplay sequence in various locations throughout the shot or shots to be taken. So, while in the in-game world a player never experiences an advertisement for a particular product, the advertisement may be added, after the fact, to the re-created game world for use in recording the video and in advertising, for example if the video is to be played on a television network or displayed online.
  • The types of advertisements or avatars that may be placed are virtually limitless.
  • An object already present in the game world may be replaced, upon a subsequent rendering (at the user's request or automatically), by a different object, such as a billboard or recognizable product. This acts similarly to a “product placement” in a television or movie sequence.
  • An object need not be present or “visible” within the game world in order to be placed during a subsequent rendering of a gameplay sequence.
  • For example, a particular boss may be known to be on the “progression” list as a player or group of players progresses through the game. It is highly likely, therefore, that a video of an encounter against that boss may be rendered at some point. The game developer may, therefore, create one or more “invisible” objects within the area immediately surrounding that boss.
  • The “invisible” objects may be replaced (automatically or manually) with any number of other objects or avatars. They may be replaced with advertisements of various types, or even with in-game avatars, for example of commentators provided by a third-party renderer (or the users themselves). This allows for “play-by-play”-like functionality, “present” in the re-creation of the gameplay sequence, without it being intrusively present during the actual gameplay sequence as a user is playing.
  • Any object or “invisible” object may also contain position-based audio. So, as a user moves a camera closer to (or as a user approaches) a visible or invisible object, a sound or series of sounds may become louder in any subsequently rendered video based upon the gameplay data, as if the user were actually closer to the object. For example, as a soccer ball moves closer to or further from the goal, or one player moves closer to or further from another, during a gameplay sequence, a sound associated with either may grow louder or softer, building suspense or anticipation of a gameplay event within the gameplay sequence.
  • In-game avatars of commentators may also be an example of an “invisible” object that, when rendered, actually appears in the subsequent video.
  • A commentator may follow a particularly good player, for example, as an invisible object. This may take place as the game is ongoing or after the gameplay sequence has ended. However, in order not to distract the player, the commentator may remain invisible, while data pertaining to that commentator, the avatar's location and positional audio is saved. In a subsequent rendering, an avatar of the commentator and the associated positional audio may be rendered, if desired, into the re-created gameplay sequence as if the commentator had been there throughout the gameplay sequence.
  • The user may also insert “instant replay”-like functionality into the video being captured.
  • The video may slow down to re-create a portion of the re-created sequence so as to better show a particular event or action. This may be scripted into the re-creation during the preproduction step 82.
  • A portion of the re-creation may also be slowed down so as to appear to be in “slow motion”, or sped up so as to appear in “double time”, as it is filmed. This may be used to create dramatic effects or to speed through lengthy portions of gameplay in the resulting video.
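Slow motion and “double time” can be expressed as a retiming of the stored frames: time stamps inside the chosen segment are stretched or compressed, and later frames shift to keep playback continuous. A sketch under an assumed frame format:

```python
def remap_speed(frames, t0, t1, factor):
    """Retime [t0, t1]: factor 0.5 gives slow motion, factor 2.0 double time."""
    shift = (t1 - t0) * (1.0 / factor - 1.0)  # how much the segment grows or shrinks
    out = []
    for f in frames:
        t = f["t"]
        if t < t0:
            new_t = t                          # before the segment: unchanged
        elif t <= t1:
            new_t = t0 + (t - t0) / factor     # inside: stretched or compressed
        else:
            new_t = t + shift                  # after: shifted to stay continuous
        out.append({**f, "t": new_t})
    return out

frames = [{"t": float(i)} for i in range(6)]   # stored frames at t = 0..5
slow = remap_speed(frames, 1.0, 3.0, 0.5)      # play t = 1..3 at half speed
```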
  • A user or director (as defined above) is given complete control over the level of detail or the resolution of the gameplay.
  • During gameplay, individual players often turn down the resolution settings and “high quality” graphics features in order to aid performance.
  • For the re-creation, by contrast, the director or user may turn the graphics settings to the maximum level.
  • The user may also add voice-over effects in this pre-production stage.
  • There may be “announcers” added in or, in the case of a game manufacturer, voice-over regarding the plot of the game or the status of the event. These voice-overs may be synchronized with the re-creation of the gameplay sequence or may be added to the video created later.
  • A director or user may also add “overlays” to the video, no matter from what angle within the gameplay sequence it is taken.
  • These overlays may include the score of a sports game being played or, in the case of a video game competition, a leader board.
  • These overlays may also contain advertisements or the “station identification” labels common during televised sports games.
  • Overlays may re-create portions of the game data (such as player scores, timers or other relevant information) or may be manually set and edited by users.
  • The overlays may further contain graphics or “picture in picture” functionality showing other locations or angles in the gameplay sequence or the actions of other players simultaneously. Similarly, transitions between multiple perspectives may be automatically inserted by means of the data rendering software 20.
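An overlay built from re-created game data is essentially another pass over state the replay already carries. A sketch of a score-and-clock overlay (the field names are illustrative assumptions):

```python
def compose_overlay(game_state):
    """Build overlay text lines (score line and match clock) from game data."""
    minutes, seconds = divmod(int(game_state["clock_s"]), 60)
    score = (f'{game_state["home"]} {game_state["home_score"]} - '
             f'{game_state["away_score"]} {game_state["away"]}')
    return [score, f"{minutes:02d}:{seconds:02d}"]

lines = compose_overlay(
    {"home": "Reds", "away": "Blues", "home_score": 2, "away_score": 1, "clock_s": 754}
)
```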
  • The next step 84 is to create the video.
  • In this step, the re-creation is set to “run” such that it may be filmed from the various camera locations.
  • The gameplay sequence runs exactly as it did originally, and video files are created for each of the camera locations, simultaneously or sequentially. These video files are stored and saved in a format suitable for subsequent editing, transmission or storage.
  • A next step 86 is to perform video postproduction.
  • In this step, the user or director selects pieces of the video files to splice together (typically with software) in order to create a single video (or a few videos) of the gameplay sequence.
  • Alternatively, this process may be automated.
  • The video rendering software 24 is capable of some small-scale editing, either automatically or subject to user input, such that a video may be created. For more complex video editing, a stand-alone or separate application may be used for postproduction.
  • A final step 88 is to distribute the video.
  • A user may place the video on a video file-sharing website, on his or her own website or on a game-related website, or make it available for purchase or review by some other means.
  • If the user of the software is a game manufacturer or software company that made the game, they may make the video available on their website as a marketing tool.
  • If the user of the software is a television network, a professional gamer group or an online game-related site, the video may be placed in a “highlights reel” or broadcast by any number of means to end users.
  • The gameplay data generated may also be used to quickly provide commentary and/or add “preproduction” elements, as described above, prior to the broadcast of the gameplay sequence. This “almost real-time” broadcast, delaying the gameplay sequence by even a few seconds, allows a broadcaster the opportunity to add commentary, to insert advertisements, to select which “cameras” to use for broadcast and to perform a multiplicity of other “pre-production” tasks, as previously described.
  • The game state captured by the data capture software 16 may also be used in other unique ways. Because the game state data is captured at a rate of 30 frames per second, each of these “frames” may be used as a save file. A user may then “pick up” the game from that point onward as a save file. In effect, the gameplay data may be used as a series of rapidly created game save files.
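Treating each captured frame as a save file reduces “resume from any moment” to an index lookup into the log. A sketch (hypothetical layout), assuming the 30 Hz capture rate described above:

```python
def save_state_at(log, t, rate_hz=30):
    """Return the captured frame nearest time t, usable as a save file."""
    index = min(max(int(round(t * rate_hz)), 0), len(log) - 1)
    return log[index]

log = [{"frame": i, "t": i / 30.0} for i in range(90)]  # 3 seconds of game states
resume = save_state_at(log, 1.5)                        # resume mid-sequence
```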
  • A user may resume playing, taking control of any avatar or player who was previously present during the original gameplay sequence.
  • A suitable replacement for each of the player-controlled avatars may be found or, alternatively, a computer using artificial intelligence may take the place of any non-player characters in the game's saved state.
  • A single user may play from a particular point in the gameplay data forward, where each of the other players in the multiplayer sequence simply mimics the actions in the previously recorded gameplay state or, alternatively, is replaced with computer artificial intelligence.
  • This gameplay data may be “created” and subsequently distributed for later play.
  • For example, a television company televising a particular football game may create gameplay data with a series of save states representing the game's exact status at any point in an ongoing televised game. Users at home may then download that gameplay data and play the game they just watched (or are watching) from any point in the game as it progresses, taking the place of any one of the players on the field.
  • The other players may be replaced with player characters via a network or by computer-controlled artificial intelligence.
  • A first element is the software manufacturer 90. This is the company or individual(s) that produces the software used to enable datacasting and data rendering.
  • A first software package is the datacasting and data capture software 92. As described above, this software 92 is used to gather gameplay sequence data for later rendering.
  • The second software package is the data rendering software 94, which is used to render a re-creation of the gameplay sequence based upon the gameplay data captured by the datacasting and data capture software 92.
  • The datacasting and data capture software 92 is provided to a game manufacturer 96.
  • The data rendering software 94 is provided to a game consumer 100.
  • This providing of the data rendering software 94 may be through a free download or through a license of the software, or the software may be provided, along with the game client, for free, to enable the game consumer 100 to render gameplay sequences that are created using the datacasting and data capture software 92 as he or she plays the game.
  • The game manufacturer creates a game client incorporating the datacasting API 98.
  • The consumer simply plays the game client 102, which includes the datacasting API 98.
  • In doing so, the consumer creates gameplay data 104.
  • This data is automatically created, logged and stored by the game client and the datacasting and data capture software 92.
  • The data rendering software 94, previously provided to the consumer, may then be used so that the consumer may create game video 106.
  • The consumer may then share that video 108 with whomever he or she desires.
  • This distribution model provides the data rendering software 94 to the consumer for his or her own enjoyment, filming and distributing video of his or her gameplay sequences to whomever he or she chooses.
  • The datacasting and data capture software 92 is provided to the game manufacturer for integration into games, such that the game manufacturer may provide the benefit of this software's capability to the game consumer through integrated datacasting and data capture software 92.
  • In this alternative embodiment, a software manufacturer 110 remains the primary creator of datacasting and data capture software 112 and data rendering software 114.
  • Both pieces of software are provided (by any number of means) to a game manufacturer 116.
  • “Game manufacturer” herein could refer to anyone who provides online game hosting services for a game, to a licensee of a game manufacturer, to the parent company of a game manufacturer, to a player group, to a professional video game league or to one of many other groups responsible for video game competition.
  • A game client integrating the datacasting and data capture software 118 is created by the game manufacturer 116, as is a game server (or series of game servers) integrating the datacasting and data capture software 120.
  • The consumer plays the game client 122 from a game created by the game manufacturer, and the game server integrating datacasting and data capture software 120 and the game client integrating datacasting and data capture software 118 work together, as the client plays, to create consumer-created data 124 of the gameplay sequence.
  • The game manufacturer may then create video 126 using the data, typically stored on one or more of the game servers 120.
  • The video may be put through some postproduction 128 and may subsequently be distributed 130 by the game manufacturer, for example to a media outlet for additional press coverage of the game.
  • Advertisements may be inserted so that the game manufacturer receives some additional benefit from the video's broadcast in the form of advertising revenue.
  • The distinction between these two business models is the involvement, or lack of involvement, of the consumer in the video creation process.
  • In the first, the video is created by the consumer, with tools granted by a game manufacturer.
  • In the second, the video is created by a party other than the video game consumer.
  • The first embodiment is purely for the consumer's enjoyment and for sharing gameplay experiences with others.
  • The second embodiment may be for those purposes, but may also be for advertising or marketing purposes, or simply to better display gameplay in action or an exciting portion of the game that a game manufacturer (or media outlet) wishes to call to the attention of game players or the general public.

Abstract

A video of a virtual world is created utilizing stored gameplay data and game states. The gameplay data is used to re-create a given gameplay sequence in order to create video of the gameplay sequence. The gameplay sequence may be re-created using enhanced graphics to create new video files from any number of visual perspectives.

Description

    CROSS REFERENCE TO RELATED APPLICATIONS
  • The present application is a continuation-in-part claiming priority from U.S. Provisional Patent Application No. 60/915,073, filed Apr. 30, 2007, and entitled METHOD OF CREATING VIDEO IN A VIRTUAL WORLD AND METHOD OF DISTRIBUTING AND USING SAME.
  • BACKGROUND
  • 1. Field of the Invention
  • The present invention relates to computer software and more specifically to a method of creating video in a virtual world and a method of distributing and using same. The present invention generally describes a novel method of creating a video presentation of an interactive video game (video of other virtual worlds may also be created) using data that has been created and stored as the interactive video game is played.
  • 2. Description of the Related Art
  • There exist other systems and methods whereby a video presentation of interactive video games may be created. For example, there are numerous software products available which simply “record” the actions taken on a user's computer or video game console. These software products are capable of creating a video presentation of interactive video games.
  • These software products are typically purchased and downloaded as stand-alone applications for use in recording whatever actions are taking place on the user's screen. This provides the functionality necessary for recording video game gameplay, but it is not designed specifically with the re-creation of video game gameplay in mind.
  • However, these programs are generally only capable of creating a video presentation of exactly what was displayed on a given player's screen (or a portion thereof). There is no capability, short of running similar software on multiple additional machines, to switch perspectives or user viewpoints, or to alter dynamically, as the game is ongoing, the viewpoint of the video being created, free of any game-created constraints.
  • From a television or video production standpoint, this results in a substantial limitation on the ability to create compelling “replays” or even compelling video presentations of gameplay events. While the gameplay or action within the game may be very exciting to a player or to a group of players, a single-perspective recording makes post-production videos based upon that gameplay or action very dull. A very rudimentary understanding of video production and post-production informs content-creators that a single perspective of an entire event is, generally, uninteresting. In most modern motion pictures and television shows, a multiplicity of angles and perspectives on the same event are used in order to tell a much more compelling story. The existing software programs are not capable, absent some substantial pre-planning, of providing this functionality.
  • Furthermore, the existing programs often result in user-created videos that are substantially inferior in quality. In contrast, video game manufacturers' videos of their own games are often of very high quality, but take a substantial amount of time and resources to create, due to the need to generate multiple perspectives, to plan and orchestrate beforehand, in pre-production, the ways in which various “shots” will be laid out, and other relevant factors.
  • For these and other reasons, there exists a need: there are no means by which a game player or game manufacturer can easily create compelling videos based upon an interactive video game “match” or gameplay sequence. Further, there are no means by which such videos may be created after a gameplay sequence has taken place. The programs of the prior art provide only a single view or complex pre-planned views of a gameplay experience. Therefore, there exists a need for a software product capable of addressing these limitations of the available technology.
  • SUMMARY OF THE INVENTION
  • The present invention provides a method of creating a video record of virtual worlds and a method of distributing and using same. The preferred embodiment of the present invention provides numerous benefits over the prior art.
  • The present invention generally provides a means by which a stand-alone or “add-in” software tool may be employed in two parts to re-create from stored data a three dimensional gameplay experience (or other rendition of a three-dimensional world) after it has occurred. The experience or sequence of events may then be stored in a video format after-the-fact, reproducing the event faithfully, but allowing for much more thoughtful pre-production, post-production and point of view, lighting, and sound placement.
  • The present invention generally works by means of a two-software-application process. The invention could be implemented using more or fewer software applications, but in the preferred embodiment only two are utilized. The first is a software application that is either built into a video game (or other, similar) software or provided as a stand-alone application. This first software application is designed to “grab” or “log” the data that is created as a video game is played.
  • This application is designed to log (or store) all of the data created. For example, in a first person shooter (FPS) game, the application would log shooter positions, weapons used, the locations of every player at each moment (typically x, y, z coordinates), the number of shots fired by each player, each player's name or “nickname” in the game, the angle of firing and location, the location in which other players are hit by “bullets” or “lasers” and various other data.
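The fields listed for an FPS log map naturally onto a per-player record. A sketch of such a record (an illustrative field set, not the patent's actual schema):

```python
from dataclasses import dataclass, field

@dataclass
class ShooterFrame:
    """One logged moment for one FPS player."""
    nickname: str
    position: tuple                            # (x, y, z) world coordinates
    weapon: str
    shots_fired: int
    aim_angle_deg: float                       # angle of firing
    hits: list = field(default_factory=list)   # where other players were hit

frame = ShooterFrame("Player1", (12.0, 3.5, 0.0), "laser", 4, 87.5)
frame.hits.append({"target": "Player2", "location": (12.5, 4.0, 1.1)})
```

One such record per player per tick, serialized in sequence, is sufficient for the second application to replay the match deterministically.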
  • The second application is designed, in the preferred embodiment, in conjunction with the game software, though in alternative embodiments it may stand alone as well; it re-creates the event exactly as it originally occurred during the game play. This application also provides the secondary functions of allowing a user to see the event at whatever speed desired, to place a virtual camera (or multiple virtual cameras) throughout the re-created gameplay sequence, and to similarly place virtual sound receivers or virtual lighting effects in various places throughout the gameplay sequence. This software is capable of moving through the gameplay sequence both in time and space, so as to provide the best possible re-creation of the event for “filming” and to provide a subsequent “director” of the filming with the most resources to re-create and edit the event in a compelling manner.
  • The first benefit is the ability to drastically improve rendering quality. Most video capture software captures only the quality of the image displayed on the player's screen (and only from the player's perspective). As it turns out, in order to increase response time, many gamers intentionally degrade the quality of the graphics displayed by the game. Similarly, they intentionally degrade the video capture software's resolution so that it does not needlessly hinder the gameplay experience by spending CPU cycles transcoding video in real time.
  • By “filming” (analogous to the operation of a camera) a re-creation of the gameplay sequence based upon saved data, the graphics may be “cranked up” to the maximum levels of resolution and graphical quality. Furthermore, the video of those high-resolution graphics may itself be captured at a very high resolution.
  • There is no concern, at the time of creating the video of the gameplay sequence, for compromising the performance of the machine upon which the game is being played. It is being done after-the-fact. This leads to an overall drastic improvement in the resulting video presentation. Higher-quality videos are much more suitable for broadcast by high definition television or for sharing via the web.
  • A second benefit of the present invention is to allow a “director” of the video file or virtual film creation to review the sequence and to select, much like a director would for a television or movie sequence, the best angles and virtual camera locations (POV). The “director” composing the subsequent film creation may be any user of the software, from a game player to a game software manufacturer to a third-party media creator or an interactive entertainment competition organizer. The ability to direct allows the most compelling or “best” angles for exciting moments in the game to be selected by the director, resulting in substantially better quality and more compelling video files being created.
  • An additional benefit is that the director of a video file or film creation based upon the gameplay sequence may also create multiple virtual cameras for use in capturing different angles and various times. Cutting between angles and scenes is an excellent way for a director to make lengthy gameplay sequences more exciting to watch and to provide the best angles for multiple different events or actions. The present invention provides the only current method whereby virtual camera locations and virtual camera angles (including multiple virtual cameras) may be used to record an event, other than scripting an event prior to recording or capture. Scripting is rarely desirable in gaming competitions or other similar situations.
  • It is therefore an object of the present invention to provide means for any user of stored data pertaining to a gameplay sequence to recreate the gameplay sequence using that data, to insert a multiplicity of “points of view” (using virtual cameras and virtual “microphones”) at various locations within the gameplay sequence and world, to record with multiple virtual cameras at various times throughout the gameplay sequence, to select those locations and times in which each virtual camera will be present and recording video at various locations, and to provide, as an output, a video file or video files suitable for post-production editing, viewing and distribution.
  • The novel features which are characteristic of the invention, both as to structure and method of the operation thereof, together with further objects and advantages thereof, will be understood from the following description, considered in connection with the accompanying drawings, in which the preferred embodiment of the invention is illustrated by way of example. It is to be expressly understood, however, that the drawings are for the purpose of illustration and description only, and they are not intended as a definition of the limits of the invention.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • FIG. 1 is a block diagram of the components and data flow in a preferred embodiment of the present invention;
  • FIG. 2 is a block diagram of an alternative embodiment;
  • FIG. 3 illustrates the placement of several virtual cameras and a virtual microphone in a gameplay sequence;
  • FIG. 4 is flowchart of the steps involved in the video-creation process;
  • FIG. 5 is a flowchart of the steps involved in the method of creating and distributing a video file; and
  • FIG. 6 is a flowchart of an alternative embodiment of the method of creating and distributing a video file.
  • DETAILED DESCRIPTION OF THE INVENTION
  • Several definitions may be useful during the reading of this document. These are defined in the following paragraphs. “Director” or “user” as used throughout the specification refers to any one of a number of individuals or companies who act to select “locations” at which to place virtual cameras, microphones and to provide any motion to either, during the course of recording a gameplay sequence. The director or user may be an individual who has taken part in a gameplay sequence, a third party game software manufacturer or developer, a media source or a third-party game competition organizer.
  • “Gameplay sequence” as used herein generally refers to the parsing of any saved gameplay data with a game software application to create (or re-create) a series of events over the course of time, within a game software application. “Gameplay sequence” is also intended to apply to the use of any data to create (or re-create) any events or activity within a virtual “world” generated by a computer. Gameplay sequence is not intended to be limited to video games only, as it may be applicable to any virtual world that is created by a computer in conjunction with suitable software and hardware.
  • Turning first to FIG. 1, the components and data flow of a preferred embodiment of the present invention are shown in block diagram form. The present invention generally relates to a method used to create video files or a “film” of a virtual world. This figure shows one of the ways in which the video files are created.
  • A consumer computer 10 may be a conventional desktop or laptop computer. As can be seen, this consumer computer 10 has at least one software application installed, in this instance a game software application 12. The game software application is preferably a video game of some type. The game software 12 may take the form of an online multiplayer game, a first person shooter (FPS), a massively multiplayer online (MMO) game, a real-time strategy (RTS) game or virtually any other form of video game. It may also take the form of any other virtual world environment capable of creation by a computer. In general, both multiplayer as well as single player games will be recorded or filmed using this method, it being understood that by “filming”, digital recording is intended.
  • In this embodiment, the game software 12 contains a datacasting API (Application Programming Interface) 14. The datacasting API 14 in the embodiment depicted in this figure is an “add on” or a “tool” which has been integrated into the game software 12 itself such that the game software 12 is capable, inherently, of utilizing the datacasting functionality.
  • The datacasting API 14 is the component part of the game which is capable of parsing and translating gameplay sequence data created by the game software 12. For example, in the case of a first person shooter, the datacasting API 14 would “log” data created by the game software representing player locations, guns or other weapons picked up or used, the locations and angles of bullets, projectiles or lasers fired or other weapons used, the results of any successful and unsuccessful attacks, player movement, player names, player statistics, animation data and virtually every piece of data used or manipulated by the game software 12 in carrying on the gameplay sequence (hereinafter referred to as “gameplay data”).
  • In the event that the game has an online-multiplayer component, all or a portion of this datacasting API 14 may instead be part of the game server software (not pictured). As a user connects to the game server software at a remote location (or at one of the players' computers), the datacasting API begins logging information pertaining to that player. In either case, the datacasting API “logs” or otherwise parses gameplay sequence data, which may then be provided to data capture software 16. In the preferred embodiment, the datacasting API 14 and the data capture software 16 capture data at a rate of 30 frames per second (FPS). This matches the standard North American television rate (approximately 30 FPS) and exceeds the motion picture rate (24 FPS); at 30 FPS, the human eye is virtually incapable of seeing any “chop” in the video subsequently created from this gameplay data.
  • The data capture software 16 is software which may be resident on the consumer computer 10, resident at a remote server location or resident at a third party site or computer and is used to capture all of the gameplay data. The data capture software 16 creates a game sequence data file that may be used to reproduce the video game gameplay sequence at a later time. The data capture software 16 creates a detailed log of the entire gameplay sequence for storage.
  • In order to create the gameplay data, the datacasting API 14 and the data capture software 16 together take “snapshots” of the game “state.” A state is a term of art used in computer science to refer to the status of the entire program in memory at a given time. It may also refer to the current setting of every (or a portion of every) variable currently defined and in use by the program as it runs. The total data pertaining to the on-going and running software program is the “state” of that program.
  • The present invention takes a “snapshot” of the “state” of all variables, the memory registers and values in memory locations allocated to the game program at a rate of 30 times per second. The game state includes any animation state, game data state, sound replay state, player location and action state, and various other states. This data is then written, in a reproducible format such as text, XML or directly as a RAM dump, to a state log for use in rendering later using a rendering computer 18. As the state is loaded into the rendering computer 18, the gameplay sequence is re-created, exactly, from the moment that the game state data was captured, including all avatar and character action.
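The snapshot-and-log mechanism described above can be sketched in a few lines, assuming a simple dictionary-based game state and a JSON-lines log format (both of which are illustrative assumptions, not details of the specification):

```python
import io
import json

SNAPSHOT_RATE = 30  # snapshots per second, as in the preferred embodiment

def snapshot_state(game_state, tick):
    """Capture the full game 'state' -- every tracked variable -- at one tick."""
    return {"t": tick / SNAPSHOT_RATE, **game_state}

def write_state_log(states, out):
    """Write snapshots in a reproducible text format (one JSON object per line)."""
    for s in states:
        out.write(json.dumps(s, sort_keys=True) + "\n")

# Two ticks of a hypothetical soccer-game state.
log = io.StringIO()
write_state_log(
    [snapshot_state({"player": "P2", "ball": [10.0, 4.0]}, tick=0),
     snapshot_state({"player": "P2", "ball": [10.5, 4.2]}, tick=1)],
    log,
)
lines = log.getvalue().splitlines()
```

A rendering computer could later replay the sequence by reading these lines back in order and restoring each captured state.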
  • The state data may be created or stored in a compressed format using differential analysis to record only the data that has changed since the last moment at which a game state was saved. This approach quickly analyzes the data, determines what elements have changed from moment to moment, and then saves only those elements in the new game state. For example, from moment to moment a player's nickname would not change, but the player's location may be changing constantly. Therefore, the player's name need not be logged at every instance. However, data pertaining to movement and action would be saved often for accuracy. This may result in a significantly smaller overall file size of the resulting saved gameplay data.
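A minimal sketch of this differential analysis, assuming flat dictionary states (a simplification of a real game state):

```python
def diff_state(prev, curr):
    """Keep only the keys whose values changed since the last snapshot."""
    return {k: v for k, v in curr.items() if prev.get(k) != v}

def apply_diff(base, delta):
    """Reconstruct a full state from the previous state plus a stored delta."""
    merged = dict(base)
    merged.update(delta)
    return merged

prev = {"name": "P1", "pos": (3, 4), "score": 0}
curr = {"name": "P1", "pos": (3, 5), "score": 0}
delta = diff_state(prev, curr)  # only the position changed; the name is not re-logged
```

Storing `delta` instead of `curr` for most ticks is what shrinks the saved gameplay file, at the cost of needing the previous state (or a periodic full snapshot) to reconstruct any given moment.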
  • The next component is the rendering computer 18. It is to be expressly understood that the rendering computer 18 serves at least two separate and distinct rendering functions; it is, notably, not only a computer capable of rendering video from data. The first function is to accept a data file produced by the data capture software 16 and to “re-create” a gameplay sequence based upon that data. Its second, and distinctly separate, function is to record the re-created gameplay sequence according to a user's direction. Methods of the prior art perform only the second function, and only in part, in that they record the event solely as it is happening (rather than a re-created gameplay sequence) and from a single perspective; it is this limitation of the prior art that the present invention overcomes.
  • First, the rendering computer 18 (which may, depending upon the implementation, be the same as the consumer computer 10, but is pictured separately for ease of explanation) is capable of recreating the gameplay sequence as if it is being played again. This is accomplished by data rendering software 20. The data rendering software 20 is software designed to interpret the gameplay data (such as player location, name, weapons used and the like) and to re-create the gameplay sequence based upon that data.
  • In order to accomplish this, the data rendering software 20 is used, virtually always, in conjunction with replay game software 22. The replay game software 22 is the same or is closely related to the game software 12, but may include higher resolution graphics and/or advertising audio and graphics, for example. The data rendering software 20 feeds the data pertaining to the gameplay sequence that was created by the data capture software 16 back into the replay game software 22 on the rendering computer 18. The replay game software 22, under the direction of the data rendering software 20 interprets the data and presents the gameplay sequence, in its entirety, subject to the user's direction, exactly as it occurred.
  • It is as if the gameplay is occurring all over again. However, at this stage, the data rendering results in a three-dimensional world through which the data rendering software 20 allows a user to move, and which the user may view from literally any location in the game world or map. This is not a single-viewpoint, locked-camera follow of one or more players; it is an exact re-creation of the gameplay sequence, from the complete gameplay data created as the gameplay sequence was first completed.
  • At this point, the user may use the rendering computer 18 and the data rendering software 20 to select a multiplicity of locations within the game world or gameplay sequence at which to place one or more “virtual” cameras, and to record video from one time to another at any one of those cameras. Similarly, a “virtual” microphone may be placed at any point in the gameplay space, or at a multiplicity of locations if so desired. This process may be more readily understood when described with reference to FIG. 3 below.
  • The user may then, once all of the virtual cameras and microphones are placed, “run” the entire gameplay sequence while one or more of the cameras record video and the microphones record sound. In the preferred embodiment, these virtual cameras and microphones output to individual time-stamped video files, audio files or audio/video files. These files may then be used for “post production” editing of a video. Video rendering software 24 is used in connection with the data rendering software 20 and the replay game software 22 to appropriately create video files of the sequence from each location.
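The per-camera recording windows described above might be represented as follows; the `VirtualCamera` class and its fields are hypothetical names chosen for illustration:

```python
from dataclasses import dataclass

@dataclass
class VirtualCamera:
    name: str
    position: tuple  # (x, y, z) in the game world
    start: float     # seconds into the gameplay sequence
    stop: float

    def is_recording(self, t):
        """A camera emits frames only inside its scripted time window."""
        return self.start <= t <= self.stop

cameras = [
    VirtualCamera("fixed", (0, 0, 5), start=0.0, stop=1.0),
    VirtualCamera("behind_p2", (12, 3, 2), start=0.5, stop=1.0),
]

def active_cameras(t):
    """Which cameras should record a frame at replay time t."""
    return [c.name for c in cameras if c.is_recording(t)]
```

During the replay “run,” each active camera would write frames to its own time-stamped file, which is what makes later synchronization and editing possible.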
  • The output from the video rendering software 24 is a video file 26. Of course this video file 26 may in fact be a multiplicity of video files, should a multiplicity of virtual cameras be used. Video file 26 is intended to represent the collective output of this process, including any audio files or audio/video files created. It is also intended to represent the possibility that analog audio and/or video files may be created using this functionality.
  • The video file 26 may be immediately shared via a video distribution channel 32. This video distribution channel 32 may be a professional video game website, a television show or compilation video of video game highlights. It may be a television network or an internet video sharing site. The video file 26 is ready to be shared, broadly, the moment it has been rendered by the video rendering software 24.
  • However, a user may, optionally, choose to use a video editing computer 28 equipped with video editing software 30 to combine the various video files created by the video rendering software 24 into a single, composite video. The optional nature of this choice is indicated by dotted connecting lines. The user may choose to use the video editing software 30 to insert graphics, video effects, transitions between cuts or to integrate multiple angles from which the video was created into a single video file for sharing.
  • In the film industry, this process is typically considered “post production”. It is also known as “editing.” This process allows a user to select the “best” shot, given the series of angles from which video was created. Even further, if the user is dissatisfied with the angles he or she has chosen or the outcome of one or more video files, the user may go back to the gameplay sequence data and use the data rendering software 20, the replay game software 22 and the video rendering software 24 to create a new angle or new portion of video for the resultant video file 26 or for subsequent distribution.
  • One of the advances over the prior art is that the user may return, directly, to the source of the video in order to create additional angles, movements, pans, or data points from which video may be created for subsequent inclusion in a post-production, final cut video program to be created. Once the final cut is created using the video editing software 30, the video may be distributed via the video distribution channel 32.
  • Referring next to FIG. 2, the components and data flow of an alternative embodiment are shown in block diagram form. The first component is a consumer computer 34. This may be the same consumer computer found in FIG. 1 (as element 10). The next component is game software 36. It is notable, however, that in this case, the game software 36 is separate from a datacasting API 38, unlike the combination shown in FIG. 1. This is the primary difference between this embodiment and the embodiment depicted in FIG. 1.
  • The datacasting API 38 in this embodiment is an application separate from the game software 36. This means that the datacasting API 38 in this embodiment is designed in such a way as to “listen” to the game software 36 as it works. The datacasting API 38 still creates a log, which it passes to data capture software 40, of all relevant information happening within the game. The capture of the gameplay data takes place, using whatever means are available. In some instances, a “plug-in” to a game will be used in connection with the datacasting API 38 when the datacasting API is not integrated into the game software 36.
  • In other cases, the datacasting API 38 will employ “listening” techniques to gather the information from the game software 36. In any event, the datacasting API 38, while not a part of the game software 36 (as will be described further below), is still capable of gathering all relevant gameplay data and passing it on to data capture software 40.
  • A rendering computer 42, corresponding to the rendering computer 18 in FIG. 1, performs the same two-part function. First, it renders the game using the gameplay data and data rendering software 44 and replay game software 46, then it renders a video file from that gameplay data using video rendering software 48, according to user direction.
  • The resultant video file 50 (which may, in fact, be a multiplicity of audio and/or video files) may be edited according to user wishes or may be immediately distributed via a video distribution channel 56. The video editing may take place on a video editing computer 52 using video editing software 54.
  • Referring next to FIG. 3, a simple example gameplay scenario is shown in order to better explain the process shown in FIGS. 1 and 2. FIG. 3 is intended to represent a soccer game field 58 within a video game. This soccer game field 58 is shown from a “top-down” perspective as a two-dimensional field. However, it is intended to represent a three-dimensional “game world” that may be filmed using the method of this invention.
  • The depicted field 58 includes only two players. Player one 60 and player two 62 are both shown on the field 58. This entire display is intended to represent the “gameplay area” or “gameplay space” and a simple gameplay sequence which may be recreated and filmed using the method of this invention.
  • A soccer ball 64 (at time=0) and 64′ (representing the same soccer ball at time=1) is also shown. For purposes of explanation, assume that the gameplay sequence is made up of two time periods, time=0 and time=1. There is, however, a smooth transition between these two time periods. It is apparent that player two 62 has just struck the ball 64 with his on-screen character or avatar. The ball 64 is traveling toward the goal 66.
  • In the preferred embodiment of the present invention, all of the data pertaining to player two 62, player one 60, the ball 64, the goal 66 and the field 58 is being stored thirty times per second. As the ball 64 moves toward the goal 66, the datacasting API 14 (See FIG. 1) captures all of this data and provides it to the data capture software 16.
  • After the event or gameplay sequence has occurred (and presumably after the game has ended) the data capture software 16 may input its data into the rendering computer 18 for use of the data rendering software 20. This software allows for the placement of a multiplicity of “virtual cameras” around the gameplay field.
  • The data rendering software 20 places these virtual cameras around the field 58. The first is a fixed camera 68, used to “film” the ball and its movement from a single angle for a period of time. For example, fixed camera 68 may be set to record only a portion of the transition time from time=0 to time=1, the portion in which the ball 64 is still in the frame of the camera. Alternatively, it may be set to film the entire period from time=0 to time=1.
  • The next camera, pan camera 70 is placed, using scripting protocols or “drag and drop” methodologies, such that it pans across the field moving past player one 60 and following the ball 64 from time=0 to time=1. This shot would follow the goal-scoring moment in the match from player two's 62 shot until the ball 64 (or later 64′) enters the net of the goal 66. As depicted, there is a single virtual microphone 72 (and at time=1, microphone 72′) which also follows or pans with the ball as it moves past player one 60.
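A pan such as the one performed by camera 70 can be sketched as linear interpolation between two logged ball positions; the positions and times here are illustrative only:

```python
def lerp(a, b, u):
    """Linearly interpolate between two points a and b, for u in [0, 1]."""
    return tuple(a_i + (b_i - a_i) * u for a_i, b_i in zip(a, b))

def pan_camera_position(ball_t0, ball_t1, t, t0=0.0, t1=1.0):
    """Camera position tracking the ball between two logged moments in time."""
    u = (t - t0) / (t1 - t0)
    return lerp(ball_t0, ball_t1, u)

# Halfway through the transition, the panning camera sits midway along
# the ball's path from (10, 4) at time=0 to (20, 4) at time=1.
pos = pan_camera_position((10.0, 4.0), (20.0, 4.0), t=0.5)
```

A real implementation would likely use smoother easing or spline paths, but the principle — deriving camera motion from the logged gameplay data — is the same.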
  • It is to be understood that a multiplicity of microphones may be placed throughout the game field 58. Each of these microphones may record time-stamped, position-based audio. If the microphone is “near” the stands of the match, for example, the voices of the “crowd” in the stands may be louder than if the microphone is further away. Similarly, if sound effects of the kick of the ball 64 are generated by the software in a spatial manner, a microphone closer to the ball 64 may record that sound more closely, accurately and loudly, so as to indicate a close proximity to the kick.
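The distance-dependent loudness described above is commonly modeled as inverse-distance attenuation; a minimal sketch (the reference distance and gain model are assumptions, not specified in the document):

```python
import math

def mic_gain(mic_pos, source_pos, ref_dist=1.0):
    """Inverse-distance attenuation: closer microphones record louder sound.
    Gain is 1.0 at or inside the reference distance, falling off beyond it."""
    d = math.dist(mic_pos, source_pos)
    return ref_dist / max(d, ref_dist)

near = mic_gain((0, 0), (2, 0))   # microphone 2 units from the kick
far = mic_gain((0, 0), (10, 0))   # microphone 10 units from the kick
```

Applying each microphone's gain to the source audio before mixing yields the positional effect: the nearer microphone's track carries the kick more loudly.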
  • Finally, a third camera 74 is placed directly behind player two 62. This third camera 74, similar to the “player following” cameras typically employed in video games, may record the player's movements precisely and watch as player two 62 watches the ball 64 (and later ball 64′) go into the net.
  • It is to be understood that the placement of these “cameras” around the game field 58 takes place after the events of the game have already ended. The re-creation of the gameplay sequence is done after the fact using the data rendering software 20, not during the actual match taking place. Similarly, no physical camera is placed anywhere. These are virtual cameras in the sense that they record action from a certain perspective within the gameplay sequence and the gameplay world in accordance with user requests or direction.
  • The cameras (and microphones) may or may not “simultaneously” record the event as it is re-created by the data rendering software 20. Because the re-created event is based upon stored data, not upon events as they occur, the user may review and replay it as many times as necessary in order to get an appropriate or desired shot of a particular portion (or all) of a gameplay sequence. Time-stamping of the video files is used to synch up the rendered video and audio files for later editing.
  • Because the video recording occurs only after the gameplay sequence is completed, and is based upon re-rendered or re-created gameplay performed by the data rendering software, the perfect shot and perfect “pre-production” (actually occurring after the event, but in effect planning the best shots to use, and in that sense “pre-production”) may be achieved. The cameras may be placed numerous times and in different locations. If the shot is not quite right, the user may alter the camera location only slightly. If the shot is not acceptable, the user may alter it completely or simply not use any video created from that “virtual” camera in a final video that is created.
  • It is to be understood that this technology may be applied to gameplay sequences of any length, type or kind. Video of two dimensional games, three dimensional games and blends of the two may be created using this method. Complex games involving any number of players, online or connected by means of a local area network or connected to a large-scale server or multiple servers may be filmed using the methodology described herein. For example, as described more fully below, the game server itself may be equipped with the ability to capture video game data (the data capture software 16) in some embodiments, such that it records data locally and subsequently videos may be created from that data at remote locations or locally.
  • Referring now to FIG. 4, a flowchart of the steps in the preferred embodiment is shown. This flowchart includes the steps used to capture game data, re-create a gameplay sequence and to subsequently create video from that gameplay data and re-created gameplay sequence.
  • A first step 76 is to play a game, which includes the creation of gameplay data. The gameplay data, as has previously been discussed, is gathered by a datacasting API 14 and captured by data capture software 16 (See FIG. 1). During this step one or more players plays a game and thereby generates data pertaining to the gameplay sequence. As described above, this data may include: player location within the game world, player weapons, player or computer player actions, items used, injuries or losses sustained, movement of player characters or units around the game world and virtually any and all data generated or used to generate a gameplay sequence (as further described above, “gameplay data”).
  • A next step 78 requires that the gameplay data be captured. In the preferred embodiment a piece of software “listens” to the game as the gameplay sequence moves forward. This is done by the datacasting API 14, either incorporated into the game or as a stand-alone module. The data capture software 16 captures the data and stores it for later use in re-creating the gameplay sequence.
  • It is to be understood that the gameplay data that is captured may be stored in virtually any location. In the preferred embodiment, the data is stored on the user's computer. For example, in the instance of a game player playing a game and using the datacasting API 14 (see FIG. 1), the data may be stored on the consumer computer 10. However, in some cases the data may be stored remotely, for example, on a multiplayer hosting server or on a media outlet's web server. The data may be stored in any location and may be stored for a set period of time or indefinitely.
  • Similarly, the gameplay data may be saved remotely. A user may take part in a gameplay sequence, continue playing for several minutes or hours, then request, after-the-fact, that the remote server provide him or her with the gameplay data created in the course of that gameplay sequence from a first time to a second time. The user may then be provided with that gameplay data, either as a downloadable file or through access on a remote server. This type of configuration would allow a user (or other party), not knowing in advance that a particular sequence will be meaningful, to create, after-the-fact, an excellent video of the meaningful gameplay sequence.
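Serving a sequence “from a first time to a second time” amounts to slicing the state log on its timestamps; a sketch assuming snapshots stored as dictionaries with a `t` field (an illustrative format, not one mandated by the specification):

```python
def extract_sequence(state_log, t_first, t_second):
    """Return only the snapshots recorded between two requested times,
    as a remote server might when a player asks, after the fact, for
    the gameplay data covering a memorable moment."""
    return [s for s in state_log if t_first <= s["t"] <= t_second]

# Four snapshots of a hypothetical log; the player requests 0.5 s through 1.0 s.
log = [{"t": 0.0}, {"t": 0.5}, {"t": 1.0}, {"t": 1.5}]
clip = extract_sequence(log, 0.5, 1.0)
```

The extracted slice is itself complete gameplay data, so it can be handed to the data rendering software and filmed like any other sequence.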
  • A next step 80 is to provide the gameplay data to data rendering software. The data rendering software 20 (described with reference to FIGS. 1 and 2) can use the gameplay data to recreate the gameplay sequence exactly as it originally happened. After the gameplay sequence has been recreated, the user or director of the process can move on to a next step 82 performing video preproduction. This is the process wherein the user or director may select the various locations for virtual cameras within the gameplay sequence, the places for microphones, any movement of cameras or microphones and the like.
  • As can be seen, this step 82 of preproduction does not actually occur before the gameplay sequence has completed. Instead, it occurs during the re-creation of the gameplay sequence, but prior to the video rendering. In the sense that various shots may be selected, microphones placed, and thoughtfulness given to the ways in which to film action sequences, it is a pre-production step 82. However, in the sense that it occurs after the gameplay sequence has completed or partially completed, it is not truly pre-production. It is pre-video-rendering production, which provides virtually all of the benefits of true pre-production, not previously available in the video creation process for interactive video games (and similar computer-created environments).
  • It is to be understood that the step 80 of providing the gameplay data to data rendering software may be carried out by any party, the user or game player, a gameplay host or game manufacturer or developer, a media outlet or a third-party video game competition organizer. The gameplay data may be readily available to the public on a website or available only to the gameplay host server or user.
  • The data rendering software 20 in conjunction with the replay game software 22 may also be used to place advertisements or avatars or sounds, not present in the original gameplay sequence, in various locations throughout the shot or shots to be taken. So, while in the in-game world a player never experiences an advertisement for a particular product, the advertisement may be added after-the-fact to the re-created game world for use in recording the video and in advertising, for example, if the video is to be played on a television network or displayed on-line.
  • The types of advertisements or avatars that may be placed are virtually limitless. For example, an object already present in the game world may be replaced, upon a subsequent rendering (at the user's request or automatically), by a different object, such as a billboard or recognizable product. This could act similarly to a “product placement” in a television or movie sequence.
  • Similarly, an object need not be present or “visible” within the game world in order to be placed during a subsequent rendering of a gameplay sequence. For example, in a massively multiplayer online game, a particular boss may be known to be on the “progression” list as a player or group of players progresses through the game. It is highly likely, therefore, that a video may be rendered of an encounter against that boss at some point. The game developer may therefore, create one or more “invisible” objects within the area immediately surrounding that boss.
  • Once a user recreates a video based upon the gameplay data of an encounter with this boss, the “invisible” objects may be replaced (automatically or manually) with any number of other objects or avatars. They may be replaced with advertisements of various types, or even with in-game avatars, for example, of commentators provided by a third-party renderer (or the users themselves). This allows for “play-by-play”-like functionality, “present” in the re-creation of the gameplay sequence, without it being intrusively present during the actual gameplay sequence as a user is playing.
  • It is to be further understood that any object or “invisible” object may also contain position-based audio as well. So, as a user moves a camera closer (or as a user approaches) a visible or invisible object, a sound or series of sounds may become louder, as if the user is actually closer to the object, in any subsequently rendered video based upon the gameplay data. For example, as a soccer ball moves closer to or further from the goal or one player moves closer to or further from another, during a gameplay sequence, a sound associated with either may grow louder or softer, building suspense or anticipation of a gameplay event within the gameplay sequence.
  • Finally, in-game avatars of commentators may also be an example of an “invisible” object that, when rendered, actually appears in the subsequent video. A commentator may follow a particularly good player, for example, as an invisible object. This following may take place as the game is ongoing or after the gameplay sequence has ended. However, in order not to distract the player, the commentator may remain invisible. But data pertaining to that commentator, the avatar's location and positional audio may be saved. In subsequent rendering, an avatar of the commentator and associated positional audio may be rendered, if desired, into the re-created gameplay sequence as if the commentator had been there throughout the gameplay sequence.
  • During the re-creation and preproduction process, the user may also insert “instant replay”-like functionality into the video being captured. For example, the video may slow down during a portion of the re-created sequence so as to better show a particular event or action. This may be scripted into the re-creation during the preproduction step 82. A portion of the re-creation may also be slowed down so as to appear to be in “slow motion” or sped up so as to appear in “double time” as it is filmed. This may be used to create dramatic effects or to speed through lengthy portions of gameplay in the resulting video.
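Slow motion and “double time” can both be expressed as a mapping from output (video) time back to source (gameplay) time; a sketch with hypothetical scripted segments:

```python
def remap_time(video_t, segments):
    """Map output (video) time back to source (gameplay) time.
    segments: list of (duration_in_video_seconds, playback_speed) pairs,
    e.g. speed 0.5 plays the recreated sequence at half speed."""
    source_t = 0.0
    remaining = video_t
    for duration, speed in segments:
        if remaining <= duration:
            return source_t + remaining * speed
        source_t += duration * speed
        remaining -= duration
    return source_t + remaining  # past the scripted segments: normal speed

# One second at normal speed, then two seconds of "slow motion" that
# replay only one second of gameplay at half speed.
segments = [(1.0, 1.0), (2.0, 0.5)]
```

At each video frame, the renderer would look up `remap_time(frame / fps, segments)` and restore the game state logged nearest that source time.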
  • Also during the re-creation and preproduction, a user or director (as defined above) is given complete control over the level of detail or the resolution of the gameplay. In normal play, individual players often turn down the resolution settings and “high quality” graphics features in order to aid in better performance. Because at the stage of video rendering there is no concern for the quality of gameplay (the gameplay sequence has already occurred), the director or user may turn the graphics settings to the maximum level. These settings often would reduce the ability of a player to play a game effectively, but for purposes of creating a re-creation of the event after-the-fact as a video file, they result in a higher-quality, better-looking video presentation.
  • The user may also add voice-over effects in this pre-production stage. There may be “announcers” added in or, in the case of a game manufacturer, voice-over regarding the plot of the game or the status of the event. These voice-overs may be synchronized with the re-creation of the gameplay sequence or may be added to the video created later.
  • Similarly, at this stage, a director or user may add “overlays” to the video, from no matter what angle within the gameplay sequence it is taken. These overlays, for example, may include the score of a sports game being played or, in the case of a video game competition, a leader board. Similarly, these overlays may contain advertisements or the “station identification” labels common during televised sports games.
  • These overlays may recreate portions of game data (such as player scores, timers or other relevant information) or may be manually set and edited by users. The overlays may further contain graphics or “picture in picture” functionality showing other locations or angles in the gameplay sequence or the actions of other players simultaneously. Similarly, transitions between multiple perspectives may be automatically inserted by means of the data rendering software 20.
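An overlay built from recreated game data rather than from pixels might look like the following; the field names are illustrative assumptions:

```python
def render_overlay(state):
    """Build overlay text (score, clock) from recreated game data, so it
    stays accurate no matter which camera angle the video is taken from."""
    minutes, seconds = divmod(int(state["clock"]), 60)
    return (f'{state["home"]} {state["home_score"]} - '
            f'{state["away_score"]} {state["away"]}  {minutes:02d}:{seconds:02d}')

# A hypothetical recreated game state 754 seconds into the match.
line = render_overlay({"home": "RED", "away": "BLU",
                       "home_score": 1, "away_score": 0, "clock": 754})
```

Because the overlay is regenerated from the state log each frame, it can be restyled, translated, or replaced with sponsor identification without re-recording any gameplay.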
  • Once the preproduction process is complete, planning out each camera location, any slow motions, any instant replays (or rewinds to replay), any advertisements or other indicia added to the “background” of the game, all microphone locations and all camera and microphone pans or movements; the next step 84 is to create the video. In this step, the re-creation is set to “run” such that it may be filmed from the various camera locations. The gameplay sequence runs, exactly as it did in the original gameplay sequence and video files are created for each of the camera locations simultaneously or sequentially. These video files are stored and saved in a format suitable for subsequent editing, transmission or storage.
  • A next step 86 is to perform video postproduction. At this point the user or director selects pieces of video files to splice together (typically with software) in order to create a single (or a few) video of the gameplay sequence. In some embodiments, this process may be automated. In the preferred embodiment, the video rendering software 24 is capable of some small-scale editing either automatically or subject to user input, such that a video may be created. For more complex video editing a stand-alone or separate application may be used for postproduction.
  • A final step 88 is to distribute the video. In this step a user may place the video on a video file-sharing website, on his or her own website or on a game-related website, or may make it available for purchase or review by some other means. Similarly, if the user of the software is the game manufacturer or the software company that made the game, it may make the video available on its website as a marketing tool. Finally, if the user of the software is a television network, a professional gamer group or an online game-related site, the video may be placed in a “highlights reel” or broadcast by any number of means to end users.
  • The gameplay data generated may be used to quickly provide commentary and/or add “preproduction” elements, as described above, prior to the broadcast of the gameplay sequence. This “almost real-time” broadcast, delaying the gameplay sequence by even a few seconds, gives a broadcaster the opportunity to add commentary, to insert advertisements, to select which “cameras” to use for broadcast and to perform a multiplicity of other preproduction tasks, as previously described.
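The few-second broadcast delay can be sketched as a simple frame buffer: captured frames enter, and each is released some fixed number of frames later, giving the broadcaster that window to act on it. The DelayedBroadcast class is an invented illustration, not part of the disclosed software.

```python
from collections import deque

# Sketch of the "almost real-time" delay: frames are released delay_frames
# after ingestion, creating a short production window for commentary,
# advertisements and camera selection before each frame goes to air.

class DelayedBroadcast:
    def __init__(self, delay_frames):
        self.delay = delay_frames
        self.buffer = deque()

    def push(self, frame):
        """Ingest one captured frame; return a frame ready to air, or None."""
        self.buffer.append(frame)
        if len(self.buffer) > self.delay:
            return self.buffer.popleft()   # aired delay_frames later
        return None

# At 30 frames per second, a 90-frame buffer would give a 3-second window;
# a tiny 3-frame buffer keeps this example readable.
stream = DelayedBroadcast(delay_frames=3)
aired = [stream.push(i) for i in range(5)]
print(aired)  # → [None, None, None, 0, 1]
```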
  • A user viewing this gameplay sequence would not, strictly speaking, be watching the game in “real time,” but would receive the broadcast so shortly after the fact, and with such added detail, that it would appear to be broadcast in virtually real time. This application of the software provides the best of both worlds: it brings preproduction to interactive games, an environment where preproduction was previously impossible, while delivering the content to expectant viewers as it happens.
  • The game state captured by the data capture software 16 may also be used in other unique ways. Because the game state data is captured at a rate of 30 frames per second, each of these “frames” may serve as a save file, and a user may then “pick up” the game from that point onward. In effect, the gameplay data is a series of rapidly-created game save files.
  • For example, in single-player games a user may resume playing, taking control of any avatar or player who was present during the original gameplay sequence. In the case of multiplayer games, a suitable replacement for each of the player-controlled avatars may be found or, alternatively, computer-controlled artificial intelligence may take the place of any absent players in the game's saved state.
  • In large multiplayer games, a single user may play forward from a particular point in the gameplay data, while each of the other players in the multiplayer sequence simply mimics the actions in the previously-recorded gameplay state or, alternatively, is replaced with computer artificial intelligence.
  • Additionally, this gameplay data may be “created” and subsequently distributed for later play. For example, a television company televising a particular football game may create gameplay data containing a series of save states representing the game's exact status at any point in the on-going televised game. Users at home may then download that gameplay data and play the game they just watched, or are watching, from any point in the game as it progresses, taking the place of any one of the players on the field. The other players may be replaced with other human players via a network or by computer-controlled artificial intelligence.
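The frames-as-save-files idea above can be sketched as follows. This is a hypothetical illustration under stated assumptions: make_save_state, the per-frame dict layout, and the "human"/"ai" controller labels are all invented for the example.

```python
# Hypothetical sketch: because game state is logged every frame, any frame
# index doubles as a save file. When the game is resumed from that point,
# avatars without an available human player are handed to AI control.

def make_save_state(gameplay_data, frame_index, available_players):
    """Build a resumable save from one captured frame of gameplay data."""
    state = dict(gameplay_data[frame_index])   # full variables at that frame
    controllers = {}
    for avatar in state["avatars"]:
        # Keep a human where one is available; otherwise assign AI.
        controllers[avatar] = "human" if avatar in available_players else "ai"
    state["controllers"] = controllers
    return state

data = [
    {"avatars": ["qb", "receiver"], "clock_s": 900},
    {"avatars": ["qb", "receiver"], "clock_s": 899},
]
save = make_save_state(data, 1, available_players={"qb"})
print(save["controllers"])  # → {'qb': 'human', 'receiver': 'ai'}
```

A broadcaster distributing such data would, in effect, be shipping one resumable save per captured frame of the televised game.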
  • Referring next to FIG. 5, a method of distribution of the software embodying the method described above is disclosed. A first element is the software manufacturer 90. This is the company or individual(s) which produces the software used to enable datacasting and data rendering. A first software package is the datacasting and data capture software 92. As described above, this software 92 is used to gather gameplay sequence data for later rendering. The second software package is a data rendering software 94, which is used to render a re-creation of the gameplay sequence based upon the gameplay data captured by the datacasting and data capture software 92.
  • In this embodiment, the datacasting and data capture software 92 is provided to a game manufacturer 96, while the data rendering software 94 is provided to a game consumer 100. The data rendering software 94 may be provided through a free download, through a license, or bundled at no cost with the game client, enabling the game consumer 100 to render gameplay sequences that are created by the datacasting and data capture software 92 as he or she plays the game.
  • The game manufacturer creates a game client incorporating the datacasting API 98. The consumer simply plays the game client 102, which includes the datacasting API 98. In the course of playing the game, the consumer creates gameplay data 104. This data is automatically created, logged and stored by the game client and the datacasting and data capture software 92. Using this gameplay data and the data rendering software 94 previously provided, the consumer may create game video 106. The consumer may then share that video 108 with whomever he or she desires.
  • This distribution model provides the data rendering software 94 to the consumer for his or her own enjoyment, filming and distributing video of his or her gameplay sequences to whomever he or she chooses. The datacasting and data capture software 92, however, is provided to the game manufacturer for integration into games, such that the game manufacturer may pass the benefit of this software's capability on to the game consumer through integrated datacasting and data capture software 92.
  • Referring now to FIG. 6, an alternative distribution model is disclosed. In this distribution model a software manufacturer 110 remains the primary creator of datacasting and data capture software 112 and data rendering software 114. However, in this embodiment, both pieces of software are provided (by any number of means) to a game manufacturer 116. It is to be understood that “game manufacturer” herein could refer to anyone that provides online game hosting services for a game, to a licensee of a game manufacturer, to the parent company of a game manufacturer, to a player group, to a professional video game league or to one of many other groups responsible for video game competition.
  • A game client integrating the datacasting and data capture software 118 is created by the game manufacturer 116, as is a game server (or series of game servers) integrating the datacasting and data capture software 120. The consumer plays the game client 122 of a game created by the game manufacturer, and as the consumer plays, the game server integrating datacasting and data capture software 120 and the game client integrating datacasting and data capture software 118 work together to create consumer-created data 124 of the gameplay sequence.
  • In this embodiment, the game manufacturer may create video 126 using the data, typically stored on one or more of the game servers 120. The video may be put through some postproduction 128 and may subsequently be distributed 130 by the game manufacturer, for example, to a media outlet for additional publicity for the game. As discussed previously, advertisements may be inserted so that the game manufacturer receives some additional benefit from the broadcast in the form of advertising revenue.
  • The distinction between these two business models is the involvement or lack of involvement of the consumer in the video creation process. In the first embodiment, the video is created by the consumer, with tools granted by a game manufacturer. In the second, the video is created by a party other than the video game consumer. The first embodiment is purely for enjoyment and sharing of gameplay experiences with others by the consumer. The second embodiment may be for those purposes, but may also be for advertising or marketing purposes or simply to better display gameplay in action or an exciting portion of the game that a game manufacturer (or media outlet) wishes to call to the attention of game players or the general public.
  • Accordingly, a method of creating video of virtual worlds and a method of distributing and using the same have been described. They provide numerous benefits over the prior-art practice of video recording of interactive games. It is to be understood that the foregoing description has been made with respect to specific embodiments thereof for illustrative purposes only. The scope of the present invention is limited only by the following claims, not by the foregoing description.

Claims (6)

1. A method of capturing and transmitting video game sequences for custom editing and retransmitting of virtual worlds, comprising the steps of:
generating gameplay data by game software during the playing of a game;
sending said gameplay data to a datacasting API embedded in said game software;
logging said gameplay data by said datacasting API;
sending said logged gameplay data to data capture software;
converting said logged gameplay data in said data capture software into a state of variables in a reproducible format;
sending said state of variables to data rendering software;
converting said state of variables into replay game data;
sending said replay game data to replay game software;
reproducing gameplay of the game from said replay game software;
sending reproduced gameplay of the game to video rendering software; and
creating a plurality of video files by said video rendering software from said reproduced gameplay.
2. The method of claim 1 in which said plurality of video files are sent to a video distribution channel.
3. The method of claim 1 in which said plurality of video files are sent to video editing software and including the additional steps of:
editing said plurality of video files to create a plurality of edited video files; and
sending said plurality of edited video files to a video distribution channel.
4. A method of capturing and transmitting video game sequences for custom editing and retransmitting of virtual worlds, comprising the steps of:
generating gameplay data by game software during the playing of a game;
sending said gameplay data to a datacasting API;
logging said gameplay data by said datacasting API;
sending said logged gameplay data to data capture software;
converting said logged gameplay data in said data capture software into a state of variables in a reproducible format;
sending said state of variables to data rendering software;
converting said state of variables into replay game data;
sending said replay game data to replay game software;
reproducing gameplay of the game from said replay game software;
sending reproduced gameplay of the game to video rendering software; and
creating a plurality of video files by said video rendering software from said reproduced gameplay.
5. The method of claim 4 in which said plurality of video files are sent to a video distribution channel.
6. The method of claim 4 including the additional steps of:
sending said plurality of video files to video editing software;
editing said plurality of video files to create a plurality of edited video files; and
sending said plurality of edited video files to a video distribution channel.
US12/112,975 2007-04-30 2008-04-30 Method of creating video in a virtual world and method of distributing and using same Abandoned US20080268961A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US12/112,975 US20080268961A1 (en) 2007-04-30 2008-04-30 Method of creating video in a virtual world and method of distributing and using same

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US91507307P 2007-04-30 2007-04-30
US12/112,975 US20080268961A1 (en) 2007-04-30 2008-04-30 Method of creating video in a virtual world and method of distributing and using same

Publications (1)

Publication Number Publication Date
US20080268961A1 true US20080268961A1 (en) 2008-10-30

Family

ID=39887644

Family Applications (1)

Application Number Title Priority Date Filing Date
US12/112,975 Abandoned US20080268961A1 (en) 2007-04-30 2008-04-30 Method of creating video in a virtual world and method of distributing and using same

Country Status (1)

Country Link
US (1) US20080268961A1 (en)

Cited By (75)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090253506A1 (en) * 2008-04-04 2009-10-08 Namco Bandai Games Inc. Game movie distribution method and system
US20090253507A1 (en) * 2008-04-04 2009-10-08 Namco Bandai Games Inc. Game movie distribution method and system
US20090312100A1 (en) * 2008-06-12 2009-12-17 Harris Scott C Face Simulation in Networking
EP2331222A1 (en) * 2008-08-11 2011-06-15 Haven Holdings Llc Interactive entertainment and competition system
US20110265058A1 (en) * 2010-04-26 2011-10-27 Microsoft Corporation Embeddable project data
US20120014658A1 (en) * 2009-03-19 2012-01-19 Tatsuya Suzuki Program, information storage medium, image processing device, image processing method, and data structure
US20120021828A1 (en) * 2010-02-24 2012-01-26 Valve Corporation Graphical user interface for modification of animation data using preset animation samples
US20120021827A1 (en) * 2010-02-25 2012-01-26 Valve Corporation Multi-dimensional video game world data recorder
US20120028706A1 (en) * 2010-02-24 2012-02-02 Valve Corporation Compositing multiple scene shots into a video game clip
US20120028707A1 (en) * 2010-02-24 2012-02-02 Valve Corporation Game animations with multi-dimensional video game data
WO2011143123A3 (en) * 2010-05-11 2012-04-05 Bungie, Inc. Method and apparatus for online rendering of game files
EP2519024A1 (en) * 2011-04-30 2012-10-31 Samsung Electronics Co., Ltd. Crowd sourcing
US20120296812A1 (en) * 2011-05-16 2012-11-22 Piccionelli Gregory A Systems and processes for providing performance content on a communication network
CN102799432A (en) * 2012-06-30 2012-11-28 邱东 Game video recording and replaying method based on recorded drawing instruction
US20130326374A1 (en) * 2012-05-25 2013-12-05 Electronic Arts, Inc. Systems and methods for a unified game experience in a multiplayer game
US20140004950A1 (en) * 2012-06-28 2014-01-02 Electronic Arts Inc. Adaptive learning system for video game enhancement
US20140274387A1 (en) * 2013-03-15 2014-09-18 Electronic Arts, Inc. Systems and methods for indicating events in game video
US20140364207A1 (en) * 2013-06-07 2014-12-11 Nintendo Co., Ltd. Information processing system, server machine, information processing device, recording medium and information processing method
US20140364205A1 (en) * 2013-06-07 2014-12-11 Nintendo Co., Ltd. Information processing system, information processing device, recording medium and information display method
WO2014197876A3 (en) * 2013-06-07 2015-01-29 Sony Computer Entertainment Inc. Sharing three-dimensional gameplay
US8968080B1 (en) * 2010-11-05 2015-03-03 Wms Gaming, Inc. Display of third party content on a wagering game machine
US20150062113A1 (en) * 2009-11-09 2015-03-05 International Business Machines Corporation Activity triggered photography in metaverse applications
EP2750032A3 (en) * 2012-12-27 2015-08-05 Sony Computer Entertainment America LLC Methods and systems for generation and execution of miniapp of computer application served by cloud computing system
US20150224396A1 (en) * 2012-03-05 2015-08-13 Capcom Co., Ltd. Game program and game system
US20150304697A1 (en) * 2014-04-18 2015-10-22 Microsoft Corporation Changing broadcast without interruption to active gameplay
US20160092069A1 (en) * 2014-09-26 2016-03-31 Bally Gaming, Inc. Modifying wagering game graphics
US9350787B2 (en) 2009-06-01 2016-05-24 Sony Interactive Entertainment America Llc Methods and systems for generation and execution of miniapp of computer application served by cloud computing system
US20160214012A1 (en) * 2015-01-28 2016-07-28 Gree, Inc. Method, non-transitory computer-readable recording medium, information processing system, and information processing device
US9473758B1 (en) * 2015-12-06 2016-10-18 Sliver VR Technologies, Inc. Methods and systems for game video recording and virtual reality replay
US20170106283A1 (en) * 2015-10-16 2017-04-20 Microsoft Technology Licensing, Llc Automated generation of game event recordings
US9675874B1 (en) * 2013-07-18 2017-06-13 nWay, Inc. Multi-player gaming system
WO2017160932A1 (en) * 2016-03-16 2017-09-21 Skillz Inc. Management of streaming video data
US9776085B2 (en) 2013-06-07 2017-10-03 Nintendo Co., Ltd. Information processing system, information processing device, server machine, recording medium and information processing method
US9782678B2 (en) 2015-12-06 2017-10-10 Sliver VR Technologies, Inc. Methods and systems for computer video game streaming, highlight, and replay
US20170368458A1 (en) * 2016-06-23 2017-12-28 Minkonet Corporation Method of recording and replaying game video by object state recording
US9901822B2 (en) 2014-01-09 2018-02-27 Square Enix Holding Co., Ltd. Video gaming device with remote rendering capability
US20190238908A1 (en) * 2016-12-28 2019-08-01 Tencent Technology (Shenzhen) Company Limited Information processing method, terminal, system, and computer storage medium
US10403018B1 (en) 2016-07-12 2019-09-03 Electronic Arts Inc. Swarm crowd rendering system
US20190272156A1 (en) * 2018-03-01 2019-09-05 Vreal Inc Virtual reality capture and replay systems and methods
US10471361B2 (en) * 2015-03-27 2019-11-12 Popbox Ltd. Video sharing method
US10484578B2 (en) 2018-03-30 2019-11-19 Cae Inc. Synchronizing video outputs towards a single display frequency
US10535174B1 (en) 2017-09-14 2020-01-14 Electronic Arts Inc. Particle-based inverse kinematic rendering system
US10713543B1 (en) 2018-06-13 2020-07-14 Electronic Arts Inc. Enhanced training of machine learning systems based on automatically generated realistic gameplay information
US10722793B2 (en) 2016-03-15 2020-07-28 Skillz Inc Synchronization model for virtual tournaments
US10726611B1 (en) 2016-08-24 2020-07-28 Electronic Arts Inc. Dynamic texture mapping using megatextures
US10733765B2 (en) 2017-03-31 2020-08-04 Electronic Arts Inc. Blendshape compression system
US10786736B2 (en) 2010-05-11 2020-09-29 Sony Interactive Entertainment LLC Placement of user information in a game space
US10792566B1 (en) 2015-09-30 2020-10-06 Electronic Arts Inc. System for streaming content within a game application environment
US10799798B2 (en) 2016-12-30 2020-10-13 Electronic Arts Inc. Systems and methods for automatically measuring a video game difficulty
US10807004B2 (en) 2016-03-08 2020-10-20 Electronic Arts Inc. Dynamic difficulty adjustment
US10839215B2 (en) 2018-05-21 2020-11-17 Electronic Arts Inc. Artificial intelligence for emulating human playstyles
US10860838B1 (en) 2018-01-16 2020-12-08 Electronic Arts Inc. Universal facial expression translation and character rendering system
US10878540B1 (en) 2017-08-15 2020-12-29 Electronic Arts Inc. Contrast ratio detection and rendering system
US10902618B2 (en) 2019-06-14 2021-01-26 Electronic Arts Inc. Universal body movement translation and character rendering system
US10940393B2 (en) 2019-07-02 2021-03-09 Electronic Arts Inc. Customized models for imitating player gameplay in a video game
US10953334B2 (en) 2019-03-27 2021-03-23 Electronic Arts Inc. Virtual character generation from image or video data
US11023729B1 (en) * 2019-11-08 2021-06-01 Msg Entertainment Group, Llc Providing visual guidance for presenting visual content in a venue
US11062569B2 (en) 2016-03-15 2021-07-13 Skillz Platform Inc. Across-match analytics in peer-to-peer gaming tournaments
CN113259770A (en) * 2021-05-11 2021-08-13 北京奇艺世纪科技有限公司 Video playing method, device, electronic equipment, medium and product
US11110353B2 (en) 2019-07-10 2021-09-07 Electronic Arts Inc. Distributed training for machine learning of AI controlled virtual entities on video game clients
US11217003B2 (en) 2020-04-06 2022-01-04 Electronic Arts Inc. Enhanced pose generation based on conditional modeling of inverse kinematics
US20220035442A1 (en) * 2020-07-29 2022-02-03 AniCast RM Inc. Movie distribution method
US11276216B2 (en) 2019-03-27 2022-03-15 Electronic Arts Inc. Virtual animal character generation from image or video data
US11413539B2 (en) 2017-02-28 2022-08-16 Electronic Arts Inc. Realtime dynamic modification and optimization of gameplay parameters within a video game application
US11452938B2 (en) 2014-08-12 2022-09-27 Utherverse Gaming Llc Method, system and apparatus of recording and playing back an experience in a virtual worlds system
US20220331704A1 (en) * 2021-04-15 2022-10-20 Sony Interactive Entertainment Inc. Video recording system and method
US11504625B2 (en) 2020-02-14 2022-11-22 Electronic Arts Inc. Color blindness diagnostic system
US11562523B1 (en) 2021-08-02 2023-01-24 Electronic Arts Inc. Enhanced animation generation based on motion matching using local bone phases
US11623146B2 (en) 2020-11-05 2023-04-11 Onmobile Global Solutions Canada Limited Game moment implementation system and method of use thereof
US11648480B2 (en) 2020-04-06 2023-05-16 Electronic Arts Inc. Enhanced pose generation based on generative modeling
US11670030B2 (en) 2021-07-01 2023-06-06 Electronic Arts Inc. Enhanced animation generation based on video with local phase
EP4194068A4 (en) * 2021-03-05 2023-07-19 Redefinearts Inc. System for creating gameplay video recording
US11830121B1 (en) 2021-01-26 2023-11-28 Electronic Arts Inc. Neural animation layering for synthesizing martial arts movements
US20240012530A1 (en) * 2022-07-08 2024-01-11 Shanghai Lilith Technology Corporation Video acquisition method, electronic device, and storage medium
US11887232B2 (en) 2021-06-10 2024-01-30 Electronic Arts Inc. Enhanced system for generation of facial models and animation

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6699127B1 (en) * 2000-06-20 2004-03-02 Nintendo Of America Inc. Real-time replay system for video game
US20080139301A1 (en) * 2006-12-11 2008-06-12 Ole-Ivar Holthe System and method for sharing gaming experiences

US11113860B2 (en) 2017-09-14 2021-09-07 Electronic Arts Inc. Particle-based inverse kinematic rendering system
US10860838B1 (en) 2018-01-16 2020-12-08 Electronic Arts Inc. Universal facial expression translation and character rendering system
US11163588B2 (en) 2018-03-01 2021-11-02 Vreal Inc. Source code independent virtual reality capture and replay systems and methods
US11169824B2 (en) 2018-03-01 2021-11-09 Vreal Inc Virtual reality replay shadow clients systems and methods
US20190272156A1 (en) * 2018-03-01 2019-09-05 Vreal Inc Virtual reality capture and replay systems and methods
US10484578B2 (en) 2018-03-30 2019-11-19 Cae Inc. Synchronizing video outputs towards a single display frequency
US10839215B2 (en) 2018-05-21 2020-11-17 Electronic Arts Inc. Artificial intelligence for emulating human playstyles
US11532172B2 (en) 2018-06-13 2022-12-20 Electronic Arts Inc. Enhanced training of machine learning systems based on automatically generated realistic gameplay information
US10713543B1 (en) 2018-06-13 2020-07-14 Electronic Arts Inc. Enhanced training of machine learning systems based on automatically generated realistic gameplay information
US11406899B2 (en) 2019-03-27 2022-08-09 Electronic Arts Inc. Virtual character generation from image or video data
US11276216B2 (en) 2019-03-27 2022-03-15 Electronic Arts Inc. Virtual animal character generation from image or video data
US10953334B2 (en) 2019-03-27 2021-03-23 Electronic Arts Inc. Virtual character generation from image or video data
US10902618B2 (en) 2019-06-14 2021-01-26 Electronic Arts Inc. Universal body movement translation and character rendering system
US11798176B2 (en) 2019-06-14 2023-10-24 Electronic Arts Inc. Universal body movement translation and character rendering system
US10940393B2 (en) 2019-07-02 2021-03-09 Electronic Arts Inc. Customized models for imitating player gameplay in a video game
US11110353B2 (en) 2019-07-10 2021-09-07 Electronic Arts Inc. Distributed training for machine learning of AI controlled virtual entities on video game clients
US11023729B1 (en) * 2019-11-08 2021-06-01 Msg Entertainment Group, Llc Providing visual guidance for presenting visual content in a venue
US11647244B2 (en) 2019-11-08 2023-05-09 Msg Entertainment Group, Llc Providing visual guidance for presenting visual content in a venue
US11872492B2 (en) 2020-02-14 2024-01-16 Electronic Arts Inc. Color blindness diagnostic system
US11504625B2 (en) 2020-02-14 2022-11-22 Electronic Arts Inc. Color blindness diagnostic system
US11232621B2 (en) 2020-04-06 2022-01-25 Electronic Arts Inc. Enhanced animation generation based on conditional modeling
US11836843B2 (en) 2020-04-06 2023-12-05 Electronic Arts Inc. Enhanced pose generation based on conditional modeling of inverse kinematics
US11648480B2 (en) 2020-04-06 2023-05-16 Electronic Arts Inc. Enhanced pose generation based on generative modeling
US11217003B2 (en) 2020-04-06 2022-01-04 Electronic Arts Inc. Enhanced pose generation based on conditional modeling of inverse kinematics
US20220035442A1 (en) * 2020-07-29 2022-02-03 AniCast RM Inc. Movie distribution method
US11623146B2 (en) 2020-11-05 2023-04-11 Onmobile Global Solutions Canada Limited Game moment implementation system and method of use thereof
US11830121B1 (en) 2021-01-26 2023-11-28 Electronic Arts Inc. Neural animation layering for synthesizing martial arts movements
EP4194068A4 (en) * 2021-03-05 2023-07-19 Redefinearts Inc. System for creating gameplay video recording
US20220331704A1 (en) * 2021-04-15 2022-10-20 Sony Interactive Entertainment Inc. Video recording system and method
CN113259770A (en) * 2021-05-11 2021-08-13 北京奇艺世纪科技有限公司 Video playing method, device, electronic equipment, medium and product
US11887232B2 (en) 2021-06-10 2024-01-30 Electronic Arts Inc. Enhanced system for generation of facial models and animation
US11670030B2 (en) 2021-07-01 2023-06-06 Electronic Arts Inc. Enhanced animation generation based on video with local phase
US11562523B1 (en) 2021-08-02 2023-01-24 Electronic Arts Inc. Enhanced animation generation based on motion matching using local bone phases
US20240012530A1 (en) * 2022-07-08 2024-01-11 Shanghai Lilith Technology Corporation Video acquisition method, electronic device, and storage medium
US11914837B2 (en) * 2022-07-08 2024-02-27 Shanghai Lilith Technology Corporation Video acquisition method, electronic device, and storage medium

Similar Documents

Publication Publication Date Title
US20080268961A1 (en) Method of creating video in a virtual world and method of distributing and using same
US20200222803A1 (en) Virtual playbook with user controls
US9782678B2 (en) Methods and systems for computer video game streaming, highlight, and replay
US11679333B2 (en) Methods and systems for generating a video game stream based on an obtained game log
KR102430130B1 (en) Management of streaming video data
CN104915542B (en) Method of network game video recording and playback based on data synchronization
US9573062B1 (en) Methods and systems for virtual reality streaming and replay of computer video games
US8665374B2 (en) Interactive video insertions, and applications thereof
US7446772B2 (en) Spectator experience for networked gaming
JP6947985B2 (en) Game video editing program and game video editing system
US9143721B2 (en) Content preparation systems and methods for interactive video systems
WO2020022405A1 (en) Three-dimensional content distribution system, three-dimensional content distribution method and computer program
KR20130103817A (en) System and method for creating, editing, and sharing video content relating to video game events
US20150040165A1 (en) Multi-source video navigation
WO2018063957A1 (en) Methods and systems for virtual reality streaming and replay of computer video games
WO2018106461A1 (en) Methods and systems for computer video game streaming, highlight, and replay
US10525348B2 (en) System for generating game replay video
US20120021827A1 (en) Multi-dimensional video game world data recorder
KR20160137924A (en) Method of Recording and Replaying Game Video by Object State Recording
Drucker et al. Spectator games: A new entertainment modality of networked multiplayer games
KR20160114481A (en) Method of Recording and Replaying Game Video by Object State Recording
US10137371B2 (en) Method of recording and replaying game video by using object state recording method
JP6989796B2 (en) Information processing system, information processing device, and program
CN111773738B (en) Game spectating method and device
JP7045727B2 (en) Distribution system, computer program for a distribution system, and method of creating video for distribution

Legal Events

Date Code Title Description
STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION