US20200014949A1 - Synchronizing session content to external content - Google Patents
- Publication number
- US20200014949A1 (U.S. application Ser. No. 16/577,123)
- Authority
- US
- United States
- Prior art keywords
- content
- content file
- external
- identified
- identifier
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/30—Interconnection arrangements between game servers and game devices; Interconnection arrangements between game devices; Interconnection arrangements between game servers
- A63F13/33—Interconnection arrangements between game servers and game devices; Interconnection arrangements between game devices; Interconnection arrangements between game servers using wide area network [WAN] connections
- A63F13/335—Interconnection arrangements between game servers and game devices; Interconnection arrangements between game devices; Interconnection arrangements between game servers using wide area network [WAN] connections using Internet
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N19/00—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
- H04N19/46—Embedding additional information in the video signal during the compression process
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/30—Interconnection arrangements between game servers and game devices; Interconnection arrangements between game devices; Interconnection arrangements between game servers
- A63F13/33—Interconnection arrangements between game servers and game devices; Interconnection arrangements between game devices; Interconnection arrangements between game servers using wide area network [WAN] connections
- A63F13/332—Interconnection arrangements between game servers and game devices; Interconnection arrangements between game devices; Interconnection arrangements between game servers using wide area network [WAN] connections using wireless networks, e.g. cellular phone networks
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/30—Interconnection arrangements between game servers and game devices; Interconnection arrangements between game devices; Interconnection arrangements between game servers
- A63F13/35—Details of game servers
- A63F13/355—Performing operations on behalf of clients with restricted processing capabilities, e.g. servers transform changing game scene into an encoded video stream for transmitting to a mobile phone or a thin client
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/50—Controlling the output signals based on the game progress
- A63F13/53—Controlling the output signals based on the game progress involving additional visual information provided to the game scene, e.g. by overlay to simulate a head-up display [HUD] or displaying a laser sight in a shooting game
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/60—Generating or modifying game content before or while executing the game program, e.g. authoring tools specially adapted for game development or game-integrated level editor
- A63F13/65—Generating or modifying game content before or while executing the game program, e.g. authoring tools specially adapted for game development or game-integrated level editor automatically by game devices or servers from real world data, e.g. measurement in live racing competition
- A63F13/655—Generating or modifying game content before or while executing the game program, e.g. authoring tools specially adapted for game development or game-integrated level editor automatically by game devices or servers from real world data, e.g. measurement in live racing competition by importing photos, e.g. of the player
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/85—Providing additional services to players
- A63F13/86—Watching games played by other players
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L65/00—Network arrangements, protocols or services for supporting real-time applications in data packet communication
- H04L65/40—Support for services or applications
- H04L65/401—Support for services or applications wherein the services involve a main real-time session and one or more additional parallel real-time or time sensitive sessions, e.g. white board sharing or spawning of a subconference
- H04L65/4015—Support for services or applications wherein the services involve a main real-time session and one or more additional parallel real-time or time sensitive sessions, e.g. white board sharing or spawning of a subconference where at least one of the additional parallel sessions is real time or time sensitive, e.g. white board sharing, collaboration or spawning of a subconference
-
- H04L65/4023—
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L65/00—Network arrangements, protocols or services for supporting real-time applications in data packet communication
- H04L65/40—Support for services or applications
- H04L65/402—Support for services or applications wherein the services involve a main real-time session and one or more additional parallel non-real time sessions, e.g. downloading a file in a parallel FTP session, initiating an email or combinational services
- H04L65/4025—Support for services or applications wherein the services involve a main real-time session and one or more additional parallel non-real time sessions, e.g. downloading a file in a parallel FTP session, initiating an email or combinational services where none of the additional parallel sessions is real time or time sensitive, e.g. downloading a file in a parallel FTP session, initiating an email or combinational services
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Computer Networks & Wireless Communication (AREA)
- Physics & Mathematics (AREA)
- Optics & Photonics (AREA)
- Information Transfer Between Computers (AREA)
- Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)
Abstract
Description
- The present application is a continuation and claims the priority benefit of U.S. patent application Ser. No. 15/814,368 filed Nov. 15, 2017, now U.S. Pat. No. 10,425,654, which is incorporated herein by reference in its entirety.
- The present invention generally relates to game session content. More specifically, the present invention relates to synchronization of session content to external content.
- Presently available digital content may allow for sharing of images, video, and other content generated during a game session with one or more players. For example, a player playing a digital game during a game session may have performed a notable feat, achieved a notable status, or otherwise wish to share content relating to an in-game event. Notwithstanding, images and even video captured during the game session may fail to engage other individuals (e.g., online audiences). One reason for such failure to engage is the impersonal nature of such content. Because such content is generated within the in-game environment of a game title, many images or videos captured therefrom may appear monotonous and lacking in emotion. While references herein may be made specifically to a game or game session, such reference should be understood to encompass any variety of different types of digital content made available via sessions as known in the art.
- Audience members may be engaged when they see the faces of people they know, when they hear personalized reactions and accounts, and when their experience of the game session includes human interactions, reactions, and emotions. One way to incorporate such human interactions into game session content is to generate a reaction video that is captured as one or more individuals (e.g., player(s), non-playing friends and family in the same room, remote players and non-players) watch the game transpire.
- Some game consoles may be associated with a peripheral camera or other device that captures images or video of the room in which a game is being played. Such peripheral cameras are usually fixed, however, and set at a distance from the player(s) and other individuals in the room. Such long shots may capture images and video of more area within the room, but lack the immediacy and emotional engagement of close-up shots. While personal devices (e.g., smartphone, webcam, Wi-Fi connected handheld camera) may be used to capture such close-up shots—whether by photo or video—there is presently no way for such content that is external to the game to be synchronized automatically to in-game content so that there may be context to the individuals' reactions.
- There is, therefore, a need in the art for improved systems and methods for synchronization of session content to external content.
- Embodiments of the present invention allow for synchronization of session content to external content. Session video of a plurality of game sessions may be captured at a content synchronization server. Each captured session video of each game session may be associated with an identifier of the respective game session. Additional content may be sent over a communication network to the content synchronization server. Such content may be external to the game session and identified as being associated with a game session identifier. One of the captured session videos may be identified as being associated with a game session identifier that matches the game session identifier associated with the received external content. The received external content may be synchronized to the identified session video based on the matching game session identifiers. A composite video may be generated that includes the received external content synchronized to the identified session video.
- Various embodiments of the present invention may include systems for synchronization of session content to external content. Such systems may include a content delivery server that hosts a plurality of different game sessions, captures session video for each of the different game sessions where each captured session video of each game session is associated with an identifier of the respective game session, receives content external to the game session, identifies that the external content is associated with a game session identifier, identifies one of the captured session videos as being associated with a game session identifier that matches the game session identifier associated with the received external content, synchronizes the received external content to the identified session video based on the matching game session identifiers, and generates a composite video comprising the received external content synchronized to the identified session video. Systems may further include one or more game consoles that generates session content captured in the session video during the respective game session associated with the matching game session identifier.
- Further embodiments of the present invention may include methods for synchronization of session content to external content. Such methods may include capturing session video for each of a plurality of different game sessions at a content synchronization server where each captured session video of each game session is associated with an identifier of the respective game session, receiving additional content sent over a communication network to the content synchronization server where the additional content is external to the game session, identifying that the external content is associated with a game session identifier, identifying one of the captured session videos as being associated with a game session identifier that matches the game session identifier associated with the received external content, synchronizing the received external content to the identified session video based on the matching game session identifiers, and generating a composite video comprising the received external content synchronized to the identified session video.
- Yet further embodiments of the present invention may include non-transitory computer-readable storage media having embodied thereon a program executable by a processor to perform a method for synchronization of session content to external content as described above.
- FIG. 1 illustrates a network environment in which a system for synchronization of session content to external content may be implemented.
- FIG. 2A illustrates an exemplary layout of a composite video in which session content has been synchronized to external content.
- FIG. 2B illustrates an alternative exemplary layout of a composite video in which session content has been synchronized to external content.
- FIG. 2C illustrates another alternative exemplary layout of a composite video in which session content has been synchronized to external content.
- FIG. 3 is a flowchart illustrating an exemplary method for synchronization of session content to external content.
- FIG. 4 is an exemplary electronic entertainment system that may be used in synchronization of session content to external content.
- Embodiments of the present invention allow for synchronization of session content to external content. Session video of a plurality of game sessions may be captured at a content synchronization server. Each captured session video of each game session may be associated with an identifier of the respective game session. Additional content may be sent over a communication network to the content synchronization server. Such additional content may be external to the game session and identified as being associated with a game session identifier. One of the captured session videos may be identified as being associated with a game session identifier that matches the game session identifier associated with the received external content. The received external content may be synchronized to the identified session video based on the matching game session identifiers. A composite video may be generated that includes the received external content synchronized to the identified session video.
- FIG. 1 illustrates a network environment 100 in which a system for synchronization of session content to external content may be implemented. The network environment 100 may include one or more content source servers 110 that provide digital content (e.g., games) for distribution, one or more content provider server application program interfaces (APIs) 120, content delivery network server 130, a content synchronization server 140, and one or more client devices 150.
- Content source servers 110 may maintain and provide a variety of content available for distribution. The content source servers 110 may be associated with any content provider that makes its content available for access over a communication network. Such content may include not only digital games, but also pre-recorded content (e.g., DVR content, music) and live broadcasts (e.g., live sporting events, live e-sporting events, broadcast premieres). Any images, video clips, or other portions of such content may also be maintained at content source servers 110.
- The content source servers 110 may maintain content associated with any content provider that makes its content available to be accessed, including individuals who upload content from their personal client devices 150. Such content may be generated at such personal client devices 150 using native cameras, microphones, and other components for capturing images, audio, and video.
- The content from content source server 110 may be provided through a content provider server API 120, which allows various types of content source servers 110 to communicate with other servers in the network environment 100 (e.g., content synchronization server 140). The content provider server API 120 may be specific to the particular language, operating system, protocols, etc. of the content source server 110 providing the content. In a network environment 100 that includes multiple different types of content source servers 110, there may likewise be a corresponding number of content provider server APIs 120 that allow for various formatting, conversion, and other cross-device and cross-platform communication processes for providing content (e.g., composites of different types) to different client devices 150, which may use different content media player applications to play such content. As such, content titles of different formats may be made available so as to be compatible with client device 150.
- The content provider server API 120 may further facilitate access of each of the client devices 150 to the content hosted by the content source servers 110, either directly or via content delivery network server 130. Additional information, such as metadata, about the accessed content can also be provided by the content provider server API 120 to the client device 150. As described below, the additional information (i.e., metadata) can be usable to provide details about the content being provided to the client device 150. Finally, additional services associated with the accessed content, such as chat services, ratings, and profiles, can also be provided from the content source servers 110 to the client device 150 via the content provider server API 120.
- The content delivery network server 130 may include a server that provides resources and files related to the content from content source servers 110, including promotional images and service configurations with client devices 150. The content delivery network server 130 can also be called upon by the client devices 150 that request to access specific content. Content delivery network server 130 may include game servers, streaming media servers, servers hosting downloadable content, and other content delivery servers known in the art.
- The content provider server API 120 may communicate with a content synchronization server 140 in order to synchronize and generate composite content (e.g., from two different content source servers 110) for the client device 150. As noted herein, one type of content source is an individual who uploads content to content source server 110. Such content may be external to another (e.g., content captured during a digital game session), but may nevertheless be in reaction to or otherwise relating to the game. Because a game session of a digital game may take place over a period of time, different in-game events may take place throughout the time period. Each reaction may therefore correspond to a particular point in time at which a respective in-game event occurs. Because the reaction content (e.g., mobile device-captured video of human reactions) is external to the digital game, however, such external content may be captured and saved separately (e.g., as a separate file) from content captured during a play session of a digital game on a game console and/or hosted by a game server.
- Content synchronization server 140 may identify that such external (e.g., reaction) content is associated with a game session identifier. Such game session identifier may be generated by the client devices 150 (e.g., game console) engaging in the game session in which the game is being played. The game session identifier may be communicated to the client device 150 (e.g., mobile device) that generated the external content. Such communication may occur via a mobile application downloaded to the mobile client device 150 (e.g., from a game server or other content delivery network server 130). The user of the mobile client device 150 may use the mobile application to select another client device 150 (e.g., a particular game console device) and request (e.g., via Bluetooth or WiFi connection) the game session identifier. The mobile application may then offer a variety of different session content with which to pair external content. Such session content may include in-game content from a game session, pre-recorded or live content made available during a game session, etc.
- The shared game session identifier may therefore create a pairing that associates the external content with the selected session content (e.g., video of the game session). In some embodiments, multiple external content files may be paired to the same session. Examples of external content may include reaction videos of the game player, reaction videos of audience members (local or remote), lip-synching videos in relation to music or video, commentary videos, different angles of the same live event, etc. Multiple different external content files may be associated with the same session content. Such external content may not be required to be captured or generated at the same time, however. One external content file may be captured in real-time during the game session, while another external content file may be captured in relation to a replay of a recording of the game session. Such external content files may nevertheless be synchronized to the in-game content based on the shared session identifier and timestamps.
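- To make the pairing flow above concrete, the sketch below models a console that generates a session identifier and a companion mobile application that requests it and stamps each captured clip with that identifier and session-relative timestamps. All class, method, and field names are assumptions for illustration only; the patent describes the exchange (e.g., over a Bluetooth or Wi-Fi connection) but does not prescribe any particular API.

```python
import uuid
from dataclasses import dataclass, field


@dataclass
class GameConsole:
    """Hypothetical console that owns the active game session."""
    device_name: str
    _session_id: str | None = None

    def start_session(self) -> str:
        # The console generates a unique identifier when a game session begins.
        self._session_id = str(uuid.uuid4())
        return self._session_id

    def share_session_id(self) -> str:
        # Returned to a paired companion device over Bluetooth/Wi-Fi (transport assumed).
        if self._session_id is None:
            raise RuntimeError("No active game session to pair with")
        return self._session_id


@dataclass
class CompanionApp:
    """Hypothetical mobile application that captures external (reaction) content."""
    paired_session_id: str | None = None
    pending_uploads: list[dict] = field(default_factory=list)

    def pair_with(self, console: GameConsole) -> None:
        # User selects a nearby console; the app requests its session identifier.
        self.paired_session_id = console.share_session_id()

    def tag_clip(self, clip_uri: str, start_ts: float, end_ts: float) -> dict:
        # Every externally captured clip is stamped with the shared session identifier
        # and timestamps relative to the game session timeline.
        record = {
            "session_id": self.paired_session_id,
            "clip_uri": clip_uri,
            "start_ts": start_ts,
            "end_ts": end_ts,
        }
        self.pending_uploads.append(record)
        return record


if __name__ == "__main__":
    console = GameConsole("LivingRoomConsole")
    console.start_session()

    app = CompanionApp()
    app.pair_with(console)
    print(app.tag_clip("file:///reactions/boss_fight.mp4", start_ts=95.0, end_ts=140.0))
```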
- The in-game content (e.g., clips captured during the game session) may therefore be matched to external content based on a common game session identifier. Further, the paired content (in-game content and external content) may be associated with timestamps regarding points in time within the game session. Using such timestamps that appear in the two or more different content files (e.g., in-game session video and external video), content synchronization server 140 may be able to synchronize the content files. The synchronized files may further be composited in a variety of different display configurations. The resulting composite video may thereafter be stored, accessed, and played, thereby presenting multiple synchronized content files within a single composite display. In some embodiments, composite videos may be generated based on default configurations (e.g., based on number of content files being composited, default settings), as well as on-the-fly based on input from producers, broadcasters, or other users. The composite videos may be maintained and made available for access, play, sharing, social media, streaming, broadcast, etc. by various client devices 150.
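- A minimal sketch of the timestamp-based synchronization described above follows, assuming every content file carries the shared game session identifier and timestamps expressed on the game session's clock. The data layout and function names are illustrative assumptions, not part of the claimed system.

```python
from dataclasses import dataclass


@dataclass
class ContentClip:
    """A captured content file (in-game session video or external reaction video)."""
    session_id: str
    source: str          # e.g., "game_console" or "mobile_phone_1"
    start_ts: float      # seconds on the shared game-session clock
    duration: float


def synchronize(session_video: ContentClip, external: list[ContentClip]) -> list[dict]:
    """Place each external clip on the identified session video's timeline.

    Clips whose session identifier does not match are ignored; matching clips
    are expressed as offsets from the start of the session video.
    """
    placements = []
    for clip in external:
        if clip.session_id != session_video.session_id:
            continue  # not part of this game session
        offset = clip.start_ts - session_video.start_ts
        placements.append({
            "source": clip.source,
            "offset_in_composite": max(offset, 0.0),
            "duration": clip.duration,
        })
    return sorted(placements, key=lambda p: p["offset_in_composite"])


# Example: one session video and two reaction clips captured at different times.
session = ContentClip("abc-123", "game_console", start_ts=0.0, duration=600.0)
reactions = [
    ContentClip("abc-123", "mobile_phone_1", start_ts=95.0, duration=45.0),
    ContentClip("abc-123", "mobile_phone_2", start_ts=240.0, duration=30.0),
    ContentClip("zzz-999", "mobile_phone_3", start_ts=10.0, duration=20.0),  # different session
]
print(synchronize(session, reactions))
```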
- The client device 150 may include a plurality of different types of computing devices. For example, the client device 150 may include any number of different gaming consoles, mobile devices, laptops, and desktops. A particular player may be associated with a variety of different client devices 150. Each client device 150 may be associated with the particular player by virtue of being logged into the same player account. Such client devices 150 may also be configured to access data from other storage media, such as, but not limited to, memory cards or disk drives, as may be appropriate in the case of downloaded services. Such devices 150 may include standard hardware computing components such as, but not limited to, network and media interfaces, non-transitory computer-readable storage (memory), and processors for executing instructions that may be stored in memory. These client devices 150 may also run using a variety of different operating systems (e.g., iOS, Android), applications or computing languages (e.g., C++, JavaScript). An exemplary client device 150 is described in detail herein with respect to FIG. 4.
- FIG. 2A illustrates an exemplary layout of a composite video in which session content has been synchronized to external content. As illustrated, the composite video 200A combines in-game content 210A with external content 220A in a picture-in-picture configuration. The external content 220A may be overlaid on top of the game environment displayed in the in-game content 210A. The placement of such external content 220A may be static or move (e.g., so as not to block the view of events in the game environment).
- FIG. 2B illustrates an alternative exemplary layout of a composite video in which session content has been synchronized to external content. As illustrated, the composite video 200B combines in-game content 210B with a plurality of different external content files 220B-D. The external content 220B-D may be captured by different end-user client devices 150 (e.g., mobile phones, tablets). Each such client device 150 may be local (e.g., in the same room) or remote from a client device 150 (e.g., game console) upon which the game is played. While the composite video 200B of FIG. 2B includes three sections for displaying different external content 220B-D, there may be even more external content (e.g., from other client devices 150), which may be switched in and out of the defined sections within the composite video.
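- The layouts of FIGS. 2A-2C can be thought of as simple compositing configurations that a content synchronization server might select by default or adjust on the fly. The pane structure and normalized coordinates below are one hypothetical way to express such layouts; the patent does not define a configuration format.

```python
from dataclasses import dataclass


@dataclass
class Pane:
    """One rectangular region of the composite frame (normalized 0..1 coordinates)."""
    source: str
    x: float
    y: float
    width: float
    height: float


# FIG. 2A-style picture-in-picture: external content overlaid on the game video.
picture_in_picture = [
    Pane("in_game_210A", 0.0, 0.0, 1.0, 1.0),
    Pane("external_220A", 0.70, 0.70, 0.25, 0.25),
]

# FIG. 2B-style multi-pane layout: game video plus three external reaction feeds.
multi_pane = [
    Pane("in_game_210B", 0.0, 0.0, 0.75, 1.0),
    Pane("external_220B", 0.75, 0.00, 0.25, 0.34),
    Pane("external_220C", 0.75, 0.34, 0.25, 0.33),
    Pane("external_220D", 0.75, 0.67, 0.25, 0.33),
]


def swap_pane(layout: list[Pane], old_source: str, new_source: str) -> list[Pane]:
    """Switch a different external feed into an existing pane, as described above."""
    return [
        Pane(new_source, p.x, p.y, p.width, p.height) if p.source == old_source else p
        for p in layout
    ]


print(swap_pane(multi_pane, "external_220D", "external_220E"))
```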
- FIG. 2C illustrates another alternative exemplary layout of a composite video in which session content has been synchronized to external content. As illustrated, the composite video 200C first displays in-game content 210C intercut with external content 220E and external content 220F before being switched back to the in-game content 210C.
- FIG. 3 illustrates a method 300 for synchronization of session content to external content. The method 300 of FIG. 3 may be embodied as executable instructions in a non-transitory computer readable storage medium including but not limited to a CD, DVD, or non-volatile memory such as a hard drive. The instructions of the storage medium may be executed by a processor (or processors) to cause various hardware components of a computing device hosting or otherwise accessing the storage medium to effectuate the method. The steps identified in FIG. 3 (and the order thereof) are exemplary and may include various alternatives, equivalents, or derivations thereof, including but not limited to the order of execution of the same.
- In method 300 of FIG. 3, session video of a plurality of game sessions may be captured at a game server. Each captured session video of each game session may be associated with an identifier of the respective game session. Additional content may be sent over a communication network to the game server. Such additional content may be external to the game session and identified as being associated with a game session identifier. One of the captured session videos may be identified as being associated with a game session identifier that matches the game session identifier associated with the received external content. The received external content may be synchronized to the identified session video based on the matching game session identifiers. A composite video may be generated that includes the received external content synchronized to the identified session video.
- In step 310, in-game video may be captured during a game session whereby a client device 150 (e.g., game console) is playing a game hosted by content delivery network server 130. Such in-game video may be provided to content source server 110 for storage in association with a session identifier that is unique to the particular game session from which the in-game video was captured.
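One minimal way to realize this step is sketched below; the helper names and the in-memory registry standing in for content source server 110 are assumptions made only for illustration. The key point is that each capture is stored under a session identifier that is unique to the particular game session.

```python
import time
import uuid

# Stand-in for the storage that content source server 110 would provide.
SESSION_VIDEO_REGISTRY: dict[str, dict] = {}


def start_game_session() -> str:
    """Generate an identifier that is unique to this game session."""
    return f"sess-{uuid.uuid4()}"


def register_session_capture(session_id: str, video_uri: str) -> None:
    """Record a captured in-game video under its game session identifier."""
    SESSION_VIDEO_REGISTRY[session_id] = {
        "video_uri": video_uri,
        "capture_started": time.time(),
    }


if __name__ == "__main__":
    sid = start_game_session()
    register_session_capture(sid, f"/captures/{sid}.mp4")
    print(SESSION_VIDEO_REGISTRY[sid])
```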
- In step 320, external content may be received over a communication network (e.g., Internet). A user wishing to capture external content in association with the in-game video may download a mobile application to their mobile client device 150. Such mobile application may allow for identification and selection of a particular other client device 150 with which to pair (e.g., the game console hosting the game session). The game console may then generate a unique game session identifier. External content later captured by the mobile client device 150 may be associated with the game session identifier, as well as be stamped with timestamp(s) related to the game session.
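The pairing and stamping described here could look roughly like the following; the record shape and clock handling are invented for the sketch and are not part of the disclosure. The essential idea is that once the mobile device has obtained the console's game session identifier, every clip it captures is tagged with that identifier plus a timestamp expressed relative to the session.

```python
import time
from dataclasses import dataclass


@dataclass
class PairedSession:
    session_id: str       # identifier generated by the game console during pairing
    session_start: float  # console's session start time (seconds since epoch)


def stamp_clip(clip_uri: str, paired: PairedSession, captured_at: float) -> dict:
    """Tag an externally captured clip with the game session identifier and a
    timestamp relative to the start of the game session."""
    return {
        "clip_uri": clip_uri,
        "session_id": paired.session_id,
        "offset_in_session": captured_at - paired.session_start,
    }


if __name__ == "__main__":
    # Hypothetical values returned by the console during pairing.
    paired = PairedSession(session_id="sess-42", session_start=time.time() - 90.0)
    record = stamp_clip("/sdcard/DCIM/reaction_001.mp4", paired, captured_at=time.time())
    print(record)  # offset_in_session will be roughly 90 seconds
```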
- In steps 330 and 340, content synchronization server 140 may identify the game session identifier associated with the external content and find a matching game session identifier associated with an in-game video.
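The bookkeeping on the content synchronization server side might reduce to something like the following grouping, shown only as an assumed sketch: external uploads are bucketed by the game session identifier they carry, and each bucket is then checked against the identifiers of stored in-game videos.

```python
from collections import defaultdict
from typing import Dict, Iterable, List, Tuple


def group_uploads_by_session(uploads: Iterable[dict]) -> Dict[str, List[dict]]:
    """Bucket external content records by their game session identifier."""
    buckets: Dict[str, List[dict]] = defaultdict(list)
    for upload in uploads:
        buckets[upload["session_id"]].append(upload)
    return buckets


def find_matches(buckets: Dict[str, List[dict]],
                 stored_session_ids: Iterable[str]) -> List[Tuple[str, List[dict]]]:
    """Return (session_id, external uploads) pairs that have a stored in-game video."""
    stored = set(stored_session_ids)
    return [(sid, clips) for sid, clips in buckets.items() if sid in stored]


if __name__ == "__main__":
    uploads = [{"session_id": "sess-42", "clip_uri": "a.mp4"},
               {"session_id": "sess-42", "clip_uri": "b.mp4"},
               {"session_id": "sess-77", "clip_uri": "c.mp4"}]
    matches = find_matches(group_uploads_by_session(uploads), stored_session_ids=["sess-42"])
    print(matches)
```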
- In step 350, the content files (both in-session and external) associated with the same game session identifier may therefore be synched with each other based on their respective timestamps. In step 360, a composite video may be generated based on the synchronized in-game video and external content. In step 370, the composite video may be made available to one or more client devices 150 for download or sharing (e.g., via social networks or other online forums, as well as with connections).
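The timestamp-based synchronization of step 350 can be illustrated with the small alignment calculation below (a sketch under assumed field names, not the patented method itself): the external clip's session-relative timestamp determines where it lands on the in-game video's timeline, clamped to the portion that actually overlaps, which is the information a composite-rendering step would need.

```python
from dataclasses import dataclass
from typing import Optional, Tuple


@dataclass
class Track:
    offset_in_session: float  # seconds after session start at which this track begins
    duration: float           # length of the track in seconds


def align(external: Track, in_game: Track) -> Optional[Tuple[float, float]]:
    """Return (start, end) of the external clip on the in-game video's own
    timeline, or None if the two do not overlap at all."""
    start = max(external.offset_in_session, in_game.offset_in_session)
    end = min(external.offset_in_session + external.duration,
              in_game.offset_in_session + in_game.duration)
    if end <= start:
        return None
    # Convert from session time to the in-game video's local timeline.
    return start - in_game.offset_in_session, end - in_game.offset_in_session


if __name__ == "__main__":
    in_game = Track(offset_in_session=0.0, duration=600.0)   # full 10-minute session capture
    reaction = Track(offset_in_session=90.0, duration=12.0)  # phone clip taken mid-session
    print(align(reaction, in_game))  # (90.0, 102.0): where to overlay it in the composite
```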
- FIG. 4 is an exemplary electronic entertainment system that may be used in synchronization of session content to external content. The entertainment system 400 of FIG. 4 includes a main memory 405, a central processing unit (CPU) 410, a vector unit 415, a graphics processing unit 420, an input/output (I/O) processor 425, an I/O processor memory 430, a peripheral interface 435, a memory card 440, a Universal Serial Bus (USB) interface 445, and a communication network interface 450. The entertainment system 400 further includes an operating system read-only memory (OS ROM) 455, a sound processing unit 460, an optical disc control unit 470, and a hard disc drive 465, which are connected via a bus 475 to the I/O processor 425.
- Entertainment system 400 may be an electronic game console. Alternatively, the entertainment system 400 may be implemented as a general-purpose computer, a set-top box, a hand-held game device, a tablet computing device, or a mobile computing device or phone. Entertainment systems may contain more or fewer operating components depending on a particular form factor, purpose, or design.
- The CPU 410, the vector unit 415, the graphics processing unit 420, and the I/O processor 425 of FIG. 4 communicate via a system bus 485. Further, the CPU 410 of FIG. 4 communicates with the main memory 405 via a dedicated bus 480, while the vector unit 415 and the graphics processing unit 420 may communicate through a dedicated bus 490. The CPU 410 of FIG. 4 executes programs stored in the OS ROM 455 and the main memory 405. The main memory 405 of FIG. 4 may contain pre-stored programs and programs transferred through the I/O processor 425 from a CD-ROM, DVD-ROM, or other optical disc (not shown) using the optical disc control unit 470. The I/O processor 425 of FIG. 4 may also allow for the introduction of content transferred over a wireless or other communications network (e.g., 4G, LTE, 1G, and so forth). The I/O processor 425 of FIG. 4 primarily controls data exchanges between the various devices of the entertainment system 400, including the CPU 410, the vector unit 415, the graphics processing unit 420, and the peripheral interface 435.
- The graphics processing unit 420 of FIG. 4 executes graphics instructions received from the CPU 410 and the vector unit 415 to produce images for display on a display device (not shown). For example, the vector unit 415 of FIG. 4 may transform objects from three-dimensional coordinates to two-dimensional coordinates and send the two-dimensional coordinates to the graphics processing unit 420. Furthermore, the sound processing unit 460 executes instructions to produce sound signals that are output to an audio device such as speakers (not shown). Other devices, such as wireless transceivers, may be connected to the entertainment system 400 via the USB interface 445 and the communication network interface 450; such devices may also be embedded in the system 400 or be part of some other component such as a processor.
- A user of the entertainment system 400 of FIG. 4 provides instructions via the peripheral interface 435 to the CPU 410, which allows for use of a variety of different available peripheral devices (e.g., controllers) known in the art. For example, the user may instruct the CPU 410 to store certain game information on the memory card 440 or other non-transitory computer-readable storage media, or instruct a character in a game to perform some specified action.
- The present invention may be implemented in an application that may be operable by a variety of end user devices. For example, an end user device may be a personal computer, a home entertainment system (e.g., Sony PlayStation2® or Sony PlayStation3® or Sony PlayStation4®), a portable gaming device (e.g., Sony PSP® or Sony Vita®), or a home entertainment system of a different albeit inferior manufacturer. The present methodologies described herein are fully intended to be operable on a variety of devices. The present invention may also be implemented with cross-title neutrality wherein an embodiment of the present system may be utilized across a variety of titles from various publishers.
- The present invention may be implemented in an application that may be operable using a variety of devices. Non-transitory computer-readable storage media refer to any medium or media that participate in providing instructions to a central processing unit (CPU) for execution. Such media can take many forms, including, but not limited to, non-volatile and volatile media such as optical or magnetic disks and dynamic memory, respectively. Common forms of non-transitory computer-readable media include, for example, a floppy disk, a flexible disk, a hard disk, magnetic tape, any other magnetic medium, a CD-ROM disk, digital video disk (DVD), any other optical medium, RAM, PROM, EPROM, a FLASHEPROM, and any other memory chip or cartridge.
- Various forms of transmission media may be involved in carrying one or more sequences of one or more instructions to a CPU for execution. A bus carries the data to system RAM, from which a CPU retrieves and executes the instructions. The instructions received by system RAM can optionally be stored on a fixed disk either before or after execution by a CPU. Various forms of storage may likewise be implemented as well as the necessary network interfaces and network topologies to implement the same.
- The foregoing detailed description of the technology has been presented for purposes of illustration and description. It is not intended to be exhaustive or to limit the technology to the precise form disclosed. Many modifications and variations are possible in light of the above teaching. The described embodiments were chosen in order to best explain the principles of the technology and its practical application, and to enable others skilled in the art to utilize the technology in various embodiments and with various modifications as are suited to the particular use contemplated. It is intended that the scope of the technology be defined by the claims.
Claims (19)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US16/577,123 US20200014949A1 (en) | 2017-11-15 | 2019-09-20 | Synchronizing session content to external content |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US15/814,368 US10425654B2 (en) | 2017-11-15 | 2017-11-15 | Synchronizing session content to external content |
US16/577,123 US20200014949A1 (en) | 2017-11-15 | 2019-09-20 | Synchronizing session content to external content |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/814,368 Continuation US10425654B2 (en) | 2017-11-15 | 2017-11-15 | Synchronizing session content to external content |
Publications (1)
Publication Number | Publication Date |
---|---|
US20200014949A1 true US20200014949A1 (en) | 2020-01-09 |
Family
ID=66432889
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/814,368 Active 2037-12-25 US10425654B2 (en) | 2017-11-15 | 2017-11-15 | Synchronizing session content to external content |
US16/577,123 Abandoned US20200014949A1 (en) | 2017-11-15 | 2019-09-20 | Synchronizing session content to external content |
Family Applications Before (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/814,368 Active 2037-12-25 US10425654B2 (en) | 2017-11-15 | 2017-11-15 | Synchronizing session content to external content |
Country Status (4)
Country | Link |
---|---|
US (2) | US10425654B2 (en) |
EP (1) | EP3710122A4 (en) |
JP (1) | JP7296379B2 (en) |
WO (1) | WO2019099202A1 (en) |
Families Citing this family (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10425654B2 (en) | 2017-11-15 | 2019-09-24 | Sony Interactive Entertainment LLC | Synchronizing session content to external content |
US10834478B2 (en) * | 2017-12-29 | 2020-11-10 | Dish Network L.L.C. | Methods and systems for an augmented film crew using purpose |
US10453496B2 (en) * | 2017-12-29 | 2019-10-22 | Dish Network L.L.C. | Methods and systems for an augmented film crew using sweet spots |
US10783925B2 (en) | 2017-12-29 | 2020-09-22 | Dish Network L.L.C. | Methods and systems for an augmented film crew using storyboards |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20140023348A1 (en) * | 2012-07-17 | 2014-01-23 | HighlightCam, Inc. | Method And System For Content Relevance Score Determination |
US20140164373A1 (en) * | 2012-12-10 | 2014-06-12 | Rawllin International Inc. | Systems and methods for associating media description tags and/or media content images |
US8949438B2 (en) * | 2008-04-25 | 2015-02-03 | Omniplug Technologies, Ltd. | Data synchronisation to automate content adaptation and transfer between storage devices and content servers |
US20150261389A1 (en) * | 2014-03-14 | 2015-09-17 | Microsoft Corporation | Communication Event History |
US20170251231A1 (en) * | 2015-01-05 | 2017-08-31 | Gitcirrus, Llc | System and Method for Media Synchronization and Collaboration |
US20180353855A1 (en) * | 2017-06-12 | 2018-12-13 | Microsoft Technology Licensing, Llc | Audio balancing for multi-source audiovisual streaming |
Family Cites Families (13)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9032465B2 (en) * | 2002-12-10 | 2015-05-12 | Ol2, Inc. | Method for multicasting views of real-time streaming interactive video |
US7824268B2 (en) | 2006-12-19 | 2010-11-02 | Electronic Arts, Inc. | Live hosted online multiplayer game |
US9498714B2 (en) | 2007-12-15 | 2016-11-22 | Sony Interactive Entertainment America Llc | Program mode switching |
US8424037B2 (en) | 2010-06-29 | 2013-04-16 | Echostar Technologies L.L.C. | Apparatus, systems and methods for accessing and synchronizing presentation of media content and supplemental media rich content in response to selection of a presented object |
US8798598B2 (en) * | 2012-09-13 | 2014-08-05 | Alain Rossmann | Method and system for screencasting Smartphone video game software to online social networks |
JP6306512B2 (en) * | 2012-11-05 | 2018-04-04 | 株式会社ソニー・インタラクティブエンタテインメント | Information processing device |
US9795871B2 (en) * | 2014-04-15 | 2017-10-24 | Microsoft Technology Licensing, Llc | Positioning a camera video overlay on gameplay video |
US10039979B2 (en) | 2015-06-15 | 2018-08-07 | Sony Interactive Entertainment America Llc | Capturing asynchronous commentary to pre-recorded gameplay |
JP6744080B2 (en) * | 2015-09-30 | 2020-08-19 | 株式会社バンダイナムコエンターテインメント | Game system, photographing device and program |
US11420114B2 (en) | 2015-09-30 | 2022-08-23 | Sony Interactive Entertainment LLC | Systems and methods for enabling time-shifted coaching for cloud gaming systems |
US11036458B2 (en) * | 2015-10-14 | 2021-06-15 | Google Llc | User interface for screencast applications |
CN105791958A (en) * | 2016-04-22 | 2016-07-20 | 北京小米移动软件有限公司 | Method and device for live broadcasting game |
US10425654B2 (en) | 2017-11-15 | 2019-09-24 | Sony Interactive Entertainment LLC | Synchronizing session content to external content |
- 2017-11-15 US US15/814,368 patent/US10425654B2/en active Active
- 2018-10-31 WO PCT/US2018/058538 patent/WO2019099202A1/en unknown
- 2018-10-31 EP EP18878426.8A patent/EP3710122A4/en active Pending
- 2018-10-31 JP JP2020527086A patent/JP7296379B2/en active Active
- 2019-09-20 US US16/577,123 patent/US20200014949A1/en not_active Abandoned
Also Published As
Publication number | Publication date |
---|---|
JP7296379B2 (en) | 2023-06-22 |
CN111918705A (en) | 2020-11-10 |
EP3710122A1 (en) | 2020-09-23 |
EP3710122A4 (en) | 2021-06-09 |
JP2021503258A (en) | 2021-02-04 |
US20190149833A1 (en) | 2019-05-16 |
WO2019099202A1 (en) | 2019-05-23 |
US10425654B2 (en) | 2019-09-24 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20200014949A1 (en) | Synchronizing session content to external content | |
JP7158858B2 (en) | Capturing asynchronous comments on pre-recorded gameplay | |
US10881962B2 (en) | Media-activity binding and content blocking | |
JP2020151494A (en) | Cloud game streaming using asset integration on client side | |
JP2023166519A (en) | Simulating local experience by live streaming sharable perspective of live event | |
US20180221762A1 (en) | Video generation system, control device, and processing device | |
US10321192B2 (en) | System and methods of communicating between multiple geographically remote sites to enable a shared, social viewing experience | |
US10843085B2 (en) | Media-activity binding and content blocking | |
US20200188796A1 (en) | Experience-based peer recommendations | |
CN105933757A (en) | Video playing method, device and system thereof | |
WO2016074325A1 (en) | Audience grouping association method, apparatus and system | |
CN108768832B (en) | Interaction method and device between clients, storage medium and electronic device | |
WO2010141172A1 (en) | Addition of supplemental multimedia content and interactive capability at the client | |
US11727959B2 (en) | Information processing device and content editing method | |
CN113272031A (en) | Integrated interface for dynamic user experience | |
US11539988B2 (en) | Real-time incorporation of user-generated content into third-party streams | |
CN111918705B (en) | Synchronizing session content to external content | |
JP2019170861A (en) | Content processing system, content processing method, timer control server and timer control program | |
US20220295135A1 (en) | Video providing system and program | |
EP4140553A1 (en) | Audio analytics and accessibility across applications and platforms | |
JP2019171006A (en) | Content processing system, content processing method, timer control server and timer control program |
Legal Events
Date | Code | Title | Description
---|---|---|---|
| AS | Assignment | Owner name: SONY INTERACTIVE ENTERTAINMENT LLC, CALIFORNIA. Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:BENEDETTO, WARREN M.;REEL/FRAME:050443/0160. Effective date: 20171115
| STPP | Information on status: patent application and granting procedure in general | Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION
| STPP | Information on status: patent application and granting procedure in general | Free format text: NON FINAL ACTION MAILED
| STPP | Information on status: patent application and granting procedure in general | Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER
| STPP | Information on status: patent application and granting procedure in general | Free format text: FINAL REJECTION MAILED
| STPP | Information on status: patent application and granting procedure in general | Free format text: RESPONSE AFTER FINAL ACTION FORWARDED TO EXAMINER
| STPP | Information on status: patent application and granting procedure in general | Free format text: NON FINAL ACTION MAILED
| STPP | Information on status: patent application and granting procedure in general | Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER
| STPP | Information on status: patent application and granting procedure in general | Free format text: FINAL REJECTION MAILED
| STCB | Information on status: application discontinuation | Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION