GB2624172A - Apparatus and methods for virtual events - Google Patents

Apparatus and methods for virtual events

Info

Publication number
GB2624172A
GB2624172A (Application GB2216590.6A)
Authority
GB
United Kingdom
Prior art keywords
virtual
video
data
scene
object data
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
GB2216590.6A
Other versions
GB202216590D0 (en)
Inventor
Mcguinness Tim
Witczak Tomasz
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Virtex Entertainment Ltd
Original Assignee
Virtex Entertainment Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Virtex Entertainment Ltd filed Critical Virtex Entertainment Ltd
Priority to GB2216590.6A priority Critical patent/GB2624172A/en
Publication of GB202216590D0 publication Critical patent/GB202216590D0/en
Priority to PCT/GB2023/052907 priority patent/WO2024100393A1/en
Publication of GB2624172A publication Critical patent/GB2624172A/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00Manipulating 3D models or images for computer graphics
    • G06T19/20Editing of 3D images, e.g. changing shapes or colours, aligning objects or positioning parts
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/30Interconnection arrangements between game servers and game devices; Interconnection arrangements between game devices; Interconnection arrangements between game servers
    • A63F13/35Details of game servers
    • A63F13/355Performing operations on behalf of clients with restricted processing capabilities, e.g. servers transform changing game scene into an encoded video stream for transmitting to a mobile phone or a thin client
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/30Interconnection arrangements between game servers and game devices; Interconnection arrangements between game devices; Interconnection arrangements between game servers
    • A63F13/35Details of game servers
    • A63F13/358Adapting the game course according to the network or server load, e.g. for reducing latency due to different connection speeds between clients
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/21Server components or server architectures
    • H04N21/218Source of audio or video content, e.g. local disk arrays
    • H04N21/2187Live feed
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/23Processing of content or additional data; Elementary server operations; Server middleware
    • H04N21/242Synchronization processes, e.g. processing of PCR [Program Clock References]
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/4302Content synchronisation processes, e.g. decoder synchronisation
    • H04N21/4307Synchronising the rendering of multiple content streams or additional data on devices, e.g. synchronisation of audio on a mobile phone with the video output on the TV screen
    • H04N21/43072Synchronising the rendering of multiple content streams or additional data on devices, e.g. synchronisation of audio on a mobile phone with the video output on the TV screen of multiple content streams on the same device
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/4302Content synchronisation processes, e.g. decoder synchronisation
    • H04N21/4307Synchronising the rendering of multiple content streams or additional data on devices, e.g. synchronisation of audio on a mobile phone with the video output on the TV screen
    • H04N21/43076Synchronising the rendering of multiple content streams or additional data on devices, e.g. synchronisation of audio on a mobile phone with the video output on the TV screen of the same content streams on multiple devices, e.g. when family members are watching the same movie on different devices
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/44Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs
    • H04N21/44012Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs involving rendering scenes according to scene graphs, e.g. MPEG-4 scene graphs
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/47End-user applications
    • H04N21/478Supplemental services, e.g. displaying phone caller identification, shopping application
    • H04N21/4781Games
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/80Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
    • H04N21/81Monomedia components thereof
    • H04N21/8146Monomedia components thereof involving graphical data, e.g. 3D object, 2D graphics
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/80Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
    • H04N21/81Monomedia components thereof
    • H04N21/816Monomedia components thereof involving special video data, e.g 3D video
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/80Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
    • H04N21/85Assembly of content; Generation of multimedia applications
    • H04N21/854Content authoring
    • H04N21/8547Content authoring involving timestamps for synchronizing content

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Computer Graphics (AREA)
  • Software Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • Computer Hardware Design (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Architecture (AREA)
  • Databases & Information Systems (AREA)
  • Computer Security & Cryptography (AREA)
  • Processing Or Creating Images (AREA)
  • Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)

Abstract

A method and user device 26 for generating a virtual 3D scene, such as an e-sports virtual stadium, are provided. The method comprises receiving 3D object data for generating at least one virtual 3D object, storing the 3D object data in a 3D object data buffer 504, receiving video data corresponding to a video stream, and storing the video data in a video data buffer 503. A reference time is also received and, based on the reference time, the video data and 3D object data to use to generate the virtual 3D scene are determined. A method and server for synchronising a virtual 3D scene generated at a user device are also provided. The server generates a reference time for the user devices to determine the video data and 3D object data to use to generate the virtual scene, where the reference time corresponds to a time in the video stream that is earlier than the time in the video stream associated with the packet of video data last received from a video server.

Description

APPARATUS AND METHODS FOR VIRTUAL EVENTS
Field of the Invention
The invention relates to apparatus and methods for providing virtual events - for example (but not limited to), for broadcasting competitive video games as part of an esports event.
Background to the Invention
The live streaming of video games and esports events to an online audience of viewers is becoming increasingly popular. Esports events involve competitive play of a video game in a virtual game world. Popular esports games include, for example, Counter-Strike: Global Offensive (CS:GO), League of Legends, Dota 2, Fortnite and Call of Duty. Large esports events may have a live attendance of several thousand spectators physically present at the event location, in addition to several million online viewers who view the event via an online video stream. The audience of spectators at the event location are typically able to view a stage on which the competitors play the videogame, as well as a screen that shows the gameplay. The online viewers of the event typically view a video broadcast that includes live footage of the game and commentary, similar to a broadcast of a live physical sporting event such as a football match.
Whilst the online audience for an esports event is typically many times larger than the audience physically present at the event, the online experience can be less immersive and interactive. Whilst online viewers are able to view a broadcast showing the gameplay, and may have access to a shared chat in which each viewer can post comments and reactions as the match unfolds, some of the excitement and shared experience of being physically present in the live audience may be lost. Moreover, it is difficult to achieve synchronisation of the broadcast of the virtual event between all of the viewers. Delays in presenting the gameplay to a viewer may be caused, for example, by the time taken to initially connect to the stream, or other processing or network delays. As a result, the video presented to some viewers may fall significantly behind the video presented to other viewers. Messages and reactions to the gameplay from other viewers may therefore be displayed out of synchronisation with the video feed of the match, resulting in a less cohesive and potentially confusing experience.
It will be appreciated, therefore, that there is a need for improved methods and apparatus for providing virtual events.
Summary of the Invention
Aspects of the present invention are set out in the appended independent claims, while details of certain embodiments are set out in the appended dependent claims.
In a first aspect the invention provides a method of generating a virtual 3D scene for display at a user device, the method comprising: receiving 3D object data for generating at least one virtual 3D object; storing the 3D object data in a 3D object data buffer; receiving video data corresponding to a video stream; storing the video data in a video data buffer; receiving a reference time; determining, based on the reference time, video data and 3D object data to use to generate the virtual scene; generating the virtual scene using the determined video data and 3D object data; and displaying the generated virtual 3D scene.
By virtue of the use of the reference time to determine the video data and 3D object data to use to generate the virtual scene, the synchronisation between the video data and 3D object data is beneficially improved. Moreover, the same reference time can be transmitted to a plurality of user devices, allowing synchronisation of the virtual scene to be achieved between each of the user devices.
Determining the video data and 3D object data to use to generate the virtual scene may comprise: determining a time difference between the reference time and a video playback time corresponding to video data displayed in the virtual 3D scene at the user device, wherein the playback time corresponds to a time that is later in the video stream than the reference time; setting, when the time difference is greater than a threshold time difference, the video playback time to be equal to the reference time; and generating the virtual scene using the video data and 3D object data corresponding to the reference time.
By virtue of the use of the threshold time difference to set the video playback time, whilst this results in a delay in the most recent video data and 3D object data being presented to the users in the virtual scene, the offset of the reference time from the most recent video timestamp enables the synchronisation to be improved, since the video data corresponding to the reference time is more likely to have been received at each of the user devices.
Each video frame of the video stream may be associated with a corresponding timestamp, and the video playback time may correspond to: the timestamp associated with a frame of the video stream last used to generate the virtual 3D scene; or the timestamp associated with the next frame of the video stream that is to be used to generate the virtual 3D scene.
Generating the virtual scene may comprise generating the at least one virtual 3D object within the virtual scene.
Generating the virtual scene may comprise generating a video frame of the video stream within the virtual scene.
The 3D object data may correspond to virtual 3D objects in a virtual game world.
The 3D object data may indicate a position of a player-controlled character within the virtual game world. By virtue of the 3D object data indicating the position of the player-controlled character, the user is beneficially more easily able to understand the relative position of the player-controlled character in the virtual game world, compared to viewing a simple 2D representation of the game world (e.g. a 2D map).
The virtual scene may comprise a virtual stadium.
Generating the virtual scene may comprise generating a 3D virtual avatar of a user within the virtual scene. By virtue of the improved synchronisation that can be achieved at each user device using the reference time, from the perspective of a user viewing the virtual event, the synchronisation of cheering and other actions or reactions of virtual avatars controlled by other users with the gameplay that is displayed in the event area is beneficially improved.
The video data may be received from a first server, and the 3D object data may be received from a second server that is different from the first server. By virtue of the use of the reference time to determine the video data and 3D object data to use to generate the virtual scene, the video data and 3D object data can be synchronised without requiring the transmission of the video data from the server that transmits the 3D object data to the user devices, beneficially reducing the bandwidth required between the server and the user devices.
In a second aspect the invention provides a user device for generating and displaying a virtual 3D scene, the user device comprising: means for receiving 3D object data for generating at least one virtual 3D object; a 3D object data buffer for storing the 3D object data; means for receiving video data corresponding to a video stream; a video data buffer for storing the video data; means for receiving a reference time; means for determining, based on the reference time, video data and 3D object data to use to generate the virtual scene; means for generating the virtual scene using the determined video data and 3D object data; and means for displaying the generated virtual 3D scene.
The means for displaying the generated virtual 3D scene may comprise a virtual reality, VR, or augmented reality, AR, headset.
In a third aspect the invention provides a method of synchronising a virtual 3D scene generated at each of a plurality of user devices, the method comprising: receiving, from a video server, a packet of video data corresponding to a video stream; generating 3D object data for generating, at a user device, at least one virtual 3D object in the virtual scene; transmitting the 3D object data to each of the plurality of user devices; generating a reference time for use by the user devices to determine video data and 3D object data to use to generate the virtual scene, wherein the reference time corresponds to a time in the video stream that is earlier than a time in the video stream associated with the packet of video data last received from the video server; and transmitting the reference time to each of the plurality of user devices.
The method may further comprise: determining a video duration corresponding to the packet of received video data; increasing the reference time by the video duration; and transmitting the updated reference time to each of the plurality of user devices.
The method may comprise transmitting the reference time to each of the plurality of user devices each time a packet of video data is received from the video server.
The method may further comprise: obtaining game data corresponding to a state of a virtual game world; and generating the 3D object data based on the game data; wherein the 3D object data corresponds to one or more virtual objects within the virtual game world.
The 3D object data may indicate a position of a player-controlled character within the virtual game world.
The virtual scene may comprise a virtual stadium.
In a fourth aspect the invention provides a server for synchronising a virtual 3D scene generated at each of a plurality of user devices, the server comprising: means for receiving, from a video server, a packet of video data corresponding to a video stream; means for generating 3D object data for generating, at a user device, at least one virtual 3D object in the virtual scene; means for transmitting the 3D object data to each of the plurality of user devices; means for generating a reference time for use by the user devices to determine video data and 3D object data to use to generate the virtual scene, wherein the reference time corresponds to a time in the video stream that is earlier than a time in the video stream associated with the packet of video data last received from the video server; and means for transmitting the reference time to each of the plurality of user devices.
In a fifth aspect the invention provides a computer program product comprising instructions which, when the program is executed by a computer, cause the computer to carry out the method of the first aspect.
In a sixth aspect the invention provides a computer program product comprising instructions which, when the program is executed by a computer, cause the computer to carry out the method of the third aspect.
Brief Description of the Drawings
Embodiments of the invention will now be described by way of example only with reference to the attached figures, in which:
Figure 1 shows a virtual stadium for providing a virtual event;
Figure 2 schematically illustrates a system for providing a virtual event;
Figure 3 shows a simplified schematic illustration of a media server;
Figure 4 shows a simplified schematic illustration of a data server;
Figure 5 shows a simplified schematic illustration of a user device;
Figure 6 shows an illustration of a method for generating a virtual scene;
Figure 7a shows a flow diagram of a method performed by a user device;
Figure 7b shows a further diagram illustrating a method of synchronisation; and
Figure 8 shows an example in which a 2D video feed is overlaid on an image presented to a user.
In the figures, like elements are indicated by like reference numerals throughout.
Detailed Description of Preferred Embodiments
The present embodiments represent the best ways known to the Applicant of putting the invention into practice. However, they are not the only ways in which this can be achieved.
Virtual Stadium
Figure 1 shows a 3D virtual stadium 100 for providing a virtual event. The virtual stadium 100 may be used, for example, to host an esports event, but could alternatively be used to host any other suitable type of virtual event. The virtual stadium 100 may be viewed by a user using any suitable virtual reality (VR) headset. The virtual stadium 100 may be arranged to accommodate approximately 200 users. However, this need not necessarily be the case, and the virtual stadium 100 may alternatively be arranged to accommodate a larger or smaller number of users depending on the size of the event and the resources available for hosting the event (e.g. processing, memory and/or bandwidth resources). As will be described in more detail later, a plurality of virtual stadiums 100 may be provided for a single event, in order to accommodate a larger number of users.
The virtual stadium 100 comprises an event area 10 in which one or more virtual 3D objects are displayed. For an esports event, a 3D representation of the game may be shown in the event area 10. A full 3D recreation of the virtual game world including the player-controlled characters 10b may be displayed, allowing the users to view the progression of the game in a manner similar to watching a live sporting event in a physical stadium. Advantageously, viewing the 3D game world in the event area 10 enables the user to more easily understand the relative position of the player-controlled characters 10b in the virtual game world, compared to viewing a simple 2D representation of the game world (e.g. a 2D map). For example, when two player-controlled characters 10b are on different floors of a building in the virtual game world, this may be easily understood from viewing the 3D representation in the event area 10, but may be difficult to discern from a 2D map. More generally, the event area 10 may be used to display one or more virtual 3D objects 10a related to the virtual event. The generation of the 3D objects for display in the event area 10 by a user device will be described in more detail later.
The virtual stadium 100 comprises one or more viewing areas 14 from which users can view the event area 10 in the virtual environment. The user may be provided with a virtual avatar 16 that the user can move around the virtual environment, in which case the images presented to the user via the VR headset may be such that it appears that the avatar 16 is a virtual body of the user. In other words, the view of the virtual stadium 100 via the VR headset is presented as if the user were viewing the stadium 100 from the perspective of the virtual avatar 16 within the stadium 100.
The virtual avatar 16 may be visible to all of the other users viewing the virtual environment to create an immersive shared viewing experience. The virtual avatar 16 may be fixed in position at a particular location within the virtual stadium 100 (for example, at a particular viewing position within a viewing area 14). Alternatively, the virtual avatar 16 may be moveable around the virtual stadium 100 by the user. For example, the user may be able to move the virtual avatar 16 to a different viewing position within the viewing areas 14 in order to view an alternative perspective of a 3D virtual game world displayed in the event area 10. The user may also be able to position the virtual avatar 16 within the event area 10, which provides a particularly immersive experience within the virtual game world. When a user positions the avatar 16 within the event area 10, the avatar 16 may be hidden from the other users in the virtual stadium 100, so as to avoid the avatar 16 obscuring the view of the event area 10 for the other users.
A user may control their avatar 16 in the virtual stadium 100 to interact with other users. The user may be able to control the avatar 16 to perform one of a set of preconfigured reactions such as a wave, a cheer, or an expression of disappointment. The user may also, or alternatively, be able to freely control the arms of the avatar 16 to perform a cheer or other reaction (for example, using motion-tracked handheld controllers). A cheer may also include a sound effect that is audible to the other users in the virtual stadium 100. Alternatively, when the user device comprises a microphone, the user's voice may be audible to other users in the virtual stadium. A user may control their avatar 16 to cheer in response to gameplay displayed in the event area 10 (for example, when a team wins a round of a match, or scores a point). Advantageously, as will be described in more detail later, from the perspective of each user viewing the event the cheering of other users is synchronised with the gameplay that is displayed in the event area 10. If the cheering of the avatars 16 were not synchronised with the images displayed in the event area 10 then, for example, a user may observe the avatars 16 of other users cheering in response to a point being scored before the user sees the point being scored in the event area 10, which could be confusing for the user. Methods of achieving this synchronisation between users are described in more detail later.
The virtual stadium 100 includes a display area 12 for displaying 2D video footage within the virtual 3D environment. For example, the display area 12 may comprise a virtual screen on which the 2D video footage is shown. The virtual screen is part of the virtual stadium 100, and is rendered in a 3D manner as part of the overall virtual 3D environment (even though the display area 12 displays 2D video footage). When the virtual event is an esports event, the video footage may be a livestream of the gameplay that corresponds to the 3D representation of the game in the event area 10. Any other suitable information related to the event may be shown in the display area 12. For example, for an esports event, a score or other statistics related to the match may be shown in the display area 12. It is advantageous for the images that are shown in the display area 12 to be synchronised with images presented at the event area 10. For example, when the display area 12 shows a livestream of the gameplay that corresponds to the 3D representation of the game in the event area 10, a lack of synchronisation may be confusing for the user. Methods of achieving this synchronisation between the display area 12 and the event area 10 are described in more detail later.
The virtual stadium 100 may also include one or more additional display areas 18 that could be used, for example, for displaying advertising or other information related to the event. Alternatively, when the event is an esports event, the display area 18 may show a 2D map illustrating the positions of the players in the virtual game world (corresponding to the 3D representation of the virtual game world shown in the event area 10). The additional display area 18 could also show, for example, a video feed of the competitors playing the videogame.
One or more virtual lobbies may also be provided. The lobby may be provided as a virtual 3D room, or any other suitable virtual 3D environment. The lobby may be provided as a waiting area for the users, before the users enter the virtual stadium 100. For example, the lobby may be used as a waiting area for a user whilst the virtual stadium is loading in the memory of the user's device, or may be used as a waiting area for a user waiting for an event to begin in the virtual stadium. The lobby may include a game (or so-called minigame) that the user can interact with in the lobby. The user may control an avatar 16 to move around within the lobby as described above for the virtual stadium 100. The lobby may be hosted on a different server from the server used to host the virtual stadium 100, or alternatively the lobby and the virtual stadium 100 may be hosted on the same server. The lobby may include an event area 10 and/or a display area 12 as described above for the virtual stadium 100, allowing the user to view the event from the lobby. When the event comprises a plurality of virtual stadiums 100 (e.g. when the event is an esports event comprising multiple matches being played in different virtual stadiums 100), the lobby may function as a gateway from which the user can select one of the virtual stadiums 100 to enter. The lobby may comprise a marketplace or shop interface with which the user can interact to purchase goods or services. For example, a user may interact with the marketplace to purchase merchandise related to the event, or to purchase a digital asset (such as music or artwork). The lobby may be configured to accommodate, for example, approximately 50 users. However, this need not necessarily be the case, and the lobby may alternatively be arranged to accommodate a larger or smaller number of users depending on the size of the event and the resources available for hosting the event (e.g. processing, memory and/or bandwidth resources).
Apparatus for Hosting a Virtual Event
With reference to Figure 2, the present disclosure provides a system 200 for hosting a virtual event. The virtual event may be presented in the virtual stadium 100 illustrated in Figure 1 and described above. The system 200 comprises a media server 20, a data server 22, a communication network 24 (e.g. the internet), and a plurality of user devices 26. The media server 20 and the data server 22 may be co-located, or alternatively may be provided remotely from each other. The media server 20 and the data server 22 communicate with the user devices 26 via the communication network 24. The media server 20 communicates with the data server 22 via the communication network 24. Alternatively the media server 20 may communicate directly with the data server 22 (e.g. via a direct wired connection when the media server 20 and the data server 22 are co-located).
Media Server
Figure 3 is a block diagram illustrating the main components of the media server 20 shown in Figure 2. The media server 20 (which may also be referred to as a 'video server') transmits video data related to a virtual event to each of the user devices 26. The video data may be a video for display in the display area 12 of the virtual stadium 100 shown in Figure 1. As described above, the video may be a livestream of competitive videogame gameplay.
As shown, the media server 20 includes a communication interface 305 which is operable to transmit signals to and to receive signals from the user devices 26 via the network 24 and to transmit signals to the data server 22 via the network 24 (and may also, but need not necessarily, be operable to receive signals from the data server 22). A controller 304 controls the operation of the media server 20 in accordance with software stored in a memory 301. The software may be pre-installed in the memory 301 and/or may be downloaded via the network 24 or from a removable data storage device (RMD), for example. The software includes, among other things, an operating system 302 and a media module 303.
The media module 303 controls the transmission of video data (and corresponding audio data) to the user devices 26A. For example, the media module 303 may control the transmission of video data for display in the display area 12 of the virtual stadium 100. The video data may be a video stream of live gameplay of a game of an esports event. Each frame or group of frames of the video is associated with a corresponding timestamp. The video data may be generated at the media server 20, or may be received from another device (e.g. a third party server) via the network 24.
Data Server
Figure 4 is a block diagram illustrating the main components of the data server 22 shown in Figure 2. As shown, the data server 22 includes a communication interface 407 which is operable to transmit signals to, and receive signals from, the user devices 26, and to receive signals from the media server 20 via the network 24 (and may also, but need not necessarily, be operable to transmit signals to the media server 20). The data server 22 and the user device 26 may communicate, for example, via a User Datagram Protocol (UDP). A controller 406 controls the operation of the data server 22 in accordance with software stored in a memory 401. The software may be pre-installed in the memory 401 and/or may be downloaded via the network 24 or from a removable data storage device (RMD), for example. The software includes, among other things, an operating system 402, a reference time module 403, a 3D object data module 404, and a supplementary data module 405.
The reference time module 403 controls the generation and transmission of a reference time (which may also be referred to as a 'master stream time') to the user devices 26. As will be described in more detail later, the master stream time is used to maintain synchronisation between the 3D images displayed in the event area 10 and video displayed in the display area 12, and to maintain synchronisation between the user devices 26. The reference time may correspond to a timestamp of a video stream that is transmitted to the user devices 26 from the media server 20. The reference time may be transmitted to the user devices 26 each time a packet of video data is received at the data server 22 from the media server 20. Each packet of video data received from the media server 20 corresponds to a certain duration of video. Each time a packet of video data is received, the data server 22 may add the duration of the video in the received packet to the current reference time, and transmit the updated reference time to the user devices.
Since the data server 22 transmits the reference time for synchronising a virtual scene generated at each of a plurality of user devices, the data server 22 may also be referred to as a 'synchronisation server'. It will also be appreciated that the virtual scene generated at each of the plurality of user devices need not necessarily be exactly synchronised, and may instead be synchronised within a threshold time range.
The reference time does not correspond to the timestamp of the most recent video data received from the media server 20, but instead corresponds to a previous timestamp of the video data (e.g. a timestamp that is one minute, or any other suitable time offset, behind the most recent timestamp). Advantageously, whilst this results in a delay in the video presented to the users, the offset of the reference time from the most recent video timestamp enables the synchronisation to be improved, since the video data corresponding to the reference time is more likely to have been received at each of the user devices 26, as will be described in more detail later with reference to Figure 7.
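By way of illustration only, the following minimal Python sketch shows one possible behaviour of the reference time module 403 consistent with the description above: the reference time is advanced by the duration of video in each received packet, but is held a fixed offset behind the newest received video timestamp. The class and method names, and the 60-second offset value, are assumptions for illustration and do not form part of this description.

```python
# Minimal sketch (assumed names and values) of the reference time module 403.
# The reference time trails the newest video timestamp by a fixed offset and is
# advanced by the duration of each packet of video data received from the media
# server 20, then transmitted to every connected user device 26.

class ReferenceTimeModule:
    def __init__(self, offset_seconds: float = 60.0):
        self.offset = offset_seconds      # how far the reference time lags behind
        self.reference_time = 0.0         # seconds into the video stream
        self.newest_timestamp = 0.0       # timestamp of the latest received video

    def on_video_packet(self, packet_timestamp: float, packet_duration: float) -> float:
        """Called each time a packet of video data arrives from the media server."""
        self.newest_timestamp = max(self.newest_timestamp,
                                    packet_timestamp + packet_duration)
        # Advance the reference time by the packet's video duration, but keep it
        # at least `offset` seconds behind the newest timestamp so that slower
        # clients are likely to have already buffered the referenced video data.
        self.reference_time = min(self.reference_time + packet_duration,
                                  max(0.0, self.newest_timestamp - self.offset))
        return self.reference_time        # value broadcast to the user devices 26
```

The returned value could then be transmitted to each user device 26, for example over the same UDP channel used for the 3D object data.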
The 3D object data module 404 is responsible for transmitting, to the user devices 26, 3D object data for use by the user devices 26 to generate the 3D objects for display in the event area 10. When the 3D objects displayed in the event area 10 correspond to a 3D representation of a videogame, the 3D object data module 404 obtains information corresponding to a current state of the video game (e.g. from the media server 20, or another entity connected to the network 24) and generates corresponding 3D object data. A user device 26 can then use the 3D object data to generate and display a 3D representation of the game world in the event area 10. For example, the 3D object data may indicate the position of various entities and characters in the virtual game world, which the user device 26 can use to generate the 3D representation of the virtual game world. The 3D object data transmitted to the user devices may include a corresponding timestamp, which may correspond to the reference time.
When user-controlled virtual avatars 16 are used in the virtual stadium 100, the data server 22 may receive user inputs from a user device 26 corresponding to movement and actions of a corresponding virtual avatar 16, or may receive information indicating a status (e.g. position and state) of the virtual avatar 16. For example, when a user controls a virtual avatar 16 to move to another area of the virtual stadium 100, the data server 22 receives, from the user device 26, information indicating that the user has moved the virtual avatar 16 to the new position. The data server 22 then transmits, to the other user devices 26A, information indicating the new status (e.g. position, or cheering status) of the virtual avatar 16 within the stadium 100, so that the virtual scene generated at the other user devices 26A includes the new position of the virtual avatar 16. In other words, the data server 22 receives data from the user devices 26, and transmits data to the user devices 26, so that positions and actions of each avatar 16 can be displayed at each of the connected user devices 26.
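As a rough illustration of this relay behaviour (the names and data structure below are hypothetical and not prescribed by the description), the data server could simply forward each avatar status update to every other connected user device:

```python
# Hypothetical sketch of the avatar-status relay performed by the data server 22:
# an update received from one user device is forwarded to all other user devices
# so that each client can display the avatar's new position or cheering status.

from dataclasses import dataclass
from typing import Callable, Dict, Tuple

@dataclass
class AvatarStatus:
    user_id: str
    position: Tuple[float, float, float]    # position within the virtual stadium 100
    action: str = "idle"                    # e.g. "cheer", "wave"

class AvatarRelay:
    def __init__(self) -> None:
        self._senders: Dict[str, Callable[[AvatarStatus], None]] = {}

    def register(self, user_id: str, send: Callable[[AvatarStatus], None]) -> None:
        self._senders[user_id] = send

    def on_status_update(self, status: AvatarStatus) -> None:
        # Forward the update to every user device other than the one that sent it.
        for user_id, send in self._senders.items():
            if user_id != status.user_id:
                send(status)
```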
The supplementary data module 405 is responsible for transmitting, to the user devices 26, supplementary data related to the event. For example, when the event is an esports event, the supplementary data module 405 may control the transmission of a score or round timer of the event for display in the display area 12. The supplementary data module 405 may also control the transmission of data for controlling the appearance of the virtual stadium 100, or for controlling a sound generated at the virtual stadium 100. For example, the supplementary data module 405 may also control the transmission of data for controlling a lighting effect in the virtual stadium 100.
The data server 22 may control the overall hosting of the virtual stadium 100. For example, the data server 22 may manage the connection of each of the user devices 26A to the virtual stadium(s) 100 or to the lobby. The data server 22 may control the number of hosted virtual stadiums 100 and/or lobbies, and may control the allocation of users to the virtual stadiums 100 and/or lobbies.
When the event includes the display of footage from a videogame (e.g. in a competitive esports event), the video game may be hosted on the media server 20, the data server 22, or a suitable external videogame server. Conceivably, the videogame could alternatively be hosted directly at the data server 22. When the video game is hosted on the media server 20 or an external videogame server, the data server 22 is configured for receiving a current status of the video game from the media server 20 or from the external videogame server, and is configured to generate the 3D object data (which may also be referred to as 'object generation data') and/or the supplementary data based on the current status of the video game.
For example, the data server 22 may receive a JavaScript Object Notation (JSON) file from the external videogame server or the media server 20, and may generate the 3D object data (for transmission to the user devices 26) based on the received JSON file. The 3D object data may explicitly indicate all of the data required to generate the 3D object at the event area 10. Alternatively, the 3D object data may only indicate differences between previous 3D object data and the current object data, in order to reduce the volume of data transmitted between the data server 22 and the user device 26. For example, when the 3D object data includes an indication of positions of characters in the virtual game world, the 3D object data may explicitly indicate all of the player positions each time the 3D object data is transmitted, or alternatively may only indicate the player positions that have changed since the previous 3D object data was transmitted.
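A minimal sketch of this full-versus-delta choice is given below; the function names and the JSON-like dictionary layout are assumptions made for illustration, not a format defined by this description.

```python
# Assumed-format sketch of building 3D object data from the game state, either
# as a full snapshot of character positions or as a delta containing only the
# positions that have changed since the previous update.

def build_object_data(current, previous=None, full=False):
    """current/previous map a character id to an (x, y, z) position."""
    if full or previous is None:
        return {"type": "full", "positions": dict(current)}
    changed = {cid: pos for cid, pos in current.items() if previous.get(cid) != pos}
    return {"type": "delta", "positions": changed}

def apply_object_data(known_positions, update):
    """User-device side: rebuild the full set of positions from an update."""
    if update["type"] == "full":
        return dict(update["positions"])
    merged = dict(known_positions)
    merged.update(update["positions"])
    return merged
```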
The data server 22 may receive, for example, a score or round timer from the videogame server or the media server 20, for transmission to the user devices 26 as supplementary data for display in the display area 12.
The data server 22 may also transmit, to the user devices 26, an indication of an address (e.g. network address such as an internet protocol (IP) address) at which the user devices 26 are to connect to receive the video data from the media server 20.
Since the data server 22 is responsible for transmitting the 3D object data for generating the 3D objects in the event area 10, is responsible for coordinating the movements and actions of the user-controlled avatars 16 between the user devices 26, and may be responsible for allocating each user to a particular virtual stadium 100 or lobby, the data server 22 may be said to host the virtual stadium 100 (and lobbies). The data server 22 may scale the number of hosted virtual stadiums 100 depending on the number of users and the available resources (e.g. network, processing and/or memory resources).
User Device
Figure 5 is a block diagram illustrating the main components of the user device 26 shown in Figure 2. The user device 26 may be, for example, a personal computer or mobile device, and may include a VR headset for viewing the virtual stadium 100.
As shown, the user device 26 includes a communication interface 510 which is operable to transmit signals to and to receive signals from the media server 20 and the data server 22 via the network 24. A controller 508 controls the operation of the user device 26 in accordance with software stored in a memory 501. The software may be pre-installed in the memory 501 and/or may be downloaded via the network 24 or from a removable data storage device (RMD), for example. The software includes, among other things, an operating system 502, a video data buffer 503, a 3D object data buffer 504, a supplementary data buffer 505, a scene generation module 506 and a synchronisation module 507.
The user interface 509 may comprise, for example, a mouse, a keyboard, a handheld controller (which may comprise a joystick and/or various control buttons, and may be a motion-tracked controller), or a touchscreen interface integral with a display screen (e.g. as in the case of a smartphone or a tablet computer). In this example, the user device 26 also includes a VR headset for displaying a virtual scene generated by the scene generation module 506 (e.g. the virtual stadium 100 illustrated in Figure 1).
The video data buffer 503 stores the video data received from the media server, and may control communication with the media server 20 to retrieve the video data (e.g. using a network address received from the data server 22).
The 3D object data buffer 504 stores the 3D object data received from the data server 22. The supplementary data buffer 505 stores the supplementary data received from the data server 22.
The scene generation module 506 is responsible for the generation of the overall virtual scene (described in more detail below, with reference to Figure 7). In this example, the scene generation module 506 controls the generation of the virtual stadium 100 for display to the user via the VR headset. However, it will be appreciated that the scene generation module 506 may alternatively generate any other suitable virtual scene. The scene generation module 506 generates the virtual stadium 100 based on the video data received from the media server 20, and based on the 3D object data and supplementary data received from the data server 22. For example, the scene generation module 506 controls the display of the video (and the output of corresponding audio), received from the media server, at the display area 12 of the virtual stadium 100. The scene generation module 506 also controls the generation of the 3D entities in the event area 10 of the virtual stadium 100, based on the 3D object data received from the data server 22. When the virtual event includes a virtual lobby, the scene generation module 506 controls the generation of the virtual lobby.
The synchronisation module 507 receives the reference time from the data server 22, and controls the synchronisation of the video data received from the media server 20, the 3D object data received from the data server 22 and the supplementary data received from the data server 22 based on the reference time. The synchronisation module 507 may store a playback time that corresponds to a timestamp of the video received from the media server 20 that is currently presented to the user via the VR headset. As will be described in more detail below, the synchronisation module 507 may determine a difference between the playback time and the reference time, and determine video data and generated 3D objects to present to the user in the virtual scene based on that difference. When the difference between the playback time and the reference time exceeds a threshold value (which may occur when there has been a significant loss of synchronisation and the playback time at the user device is significantly ahead of the reference time received from the data server 22), the synchronisation module 507 may determine that video data having a less recent timestamp (older video data) should be presented to the user, to reduce the difference between the playback time and the reference time and improve the synchronisation. Since the same reference time is transmitted to each of the user devices 26 from the data server 22, the synchronisation with respect to the reference time also achieves synchronisation between each of the user devices 26 (with respect to the video and audio data, including the generated 3D objects, that are presented to each of the users). The method achieves synchronisation between the video data and the generated 3D objects in the virtual scene, without the need for transmitting the actual video data from the data server 22 (that transmits the information used for generating the 3D objects) to the user devices 26.
Method of Generating a Virtual Scene
A method of generating a virtual scene will now be described, with reference to Figure 6. The term 'virtual scene' is used to refer to the overall virtual environment presented to the user (e.g. via the VR headset of the user device 26). For example, the virtual scene may include the virtual stadium 100 (including the event area 10 and the display area 12) and the virtual avatars 16 illustrated in Figure 1. The virtual scene is generated at the user device 26 by the scene generation module 506.
In step S601, the user device 26 receives video data from the media server 20.
In step S602, the user device 26 receives the 3D object data (which may also be referred to as object generation data for generating a virtual object) from the data server 22. As described above, the 3D object data includes information for generating one or more virtual 3D objects for display in the event area 10 of the virtual stadium 100. In other words, the 3D object data includes information for generating one or more 3D objects as part of the overall virtual scene.
In step S603, the user device 26 receives the supplementary data from the data server 22. As described above, the supplementary data may include, for example, a score or round timer for display in the virtual stadium 100.
In step S604, the user device 26 generates the virtual scene for display to the user via the VR headset (and generates corresponding audio for output to the user via speakers or headphones of the user device 26). For example, when the user device 26 generates the virtual stadium 100 illustrated in Figure 1, the user device 26 generates the stadium 100 so that video displayed in the display area of the virtual stadium 100 corresponds to the video received from the media server 20. Similarly, the user device 26 uses the 3D object data received from the data server 22 to generate one or more virtual 3D objects for display in the event area 10 of the virtual stadium (and may also generate the virtual stadium based on the supplementary data, e.g. to display a score or round timer received in the supplementary data). When the event is an esports event, the user device 26 may generate a 3D recreation of the virtual game world using the received 3D object data, for display in the event area 10. This may be achieved, for example, by processing the received 3D object data using the same game engine as used to generate the original videogame, to generate the 3D representation.
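By way of illustration only, the assembly of the scene from these three inputs might look broadly like the following sketch; the VirtualScene container and its field names are hypothetical and are not taken from this description.

```python
# Hypothetical sketch of step S604: combining the video data (S601), 3D object
# data (S602) and supplementary data (S603) into a single virtual scene.

from dataclasses import dataclass, field

@dataclass
class VirtualScene:
    display_area_frame: bytes = b""                          # 2D video frame for the display area 12
    event_area_objects: list = field(default_factory=list)   # virtual 3D objects for the event area 10
    supplementary: dict = field(default_factory=dict)         # e.g. score or round timer

def generate_scene(video_frame, object_data, supplementary_data):
    return VirtualScene(display_area_frame=video_frame,
                        event_area_objects=list(object_data),
                        supplementary=dict(supplementary_data))
```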
Synchronisation
A method of achieving synchronisation at the user device 26 will now be described, with reference to Figure 7. In this example, the user device 26 has received video data from the media server 20 for display in the display area 12 of the virtual stadium 100, and stores the received video data in the video data buffer 503. Similarly, the user device 26 has received at least the 3D object data from the data server 22 for generating 3D objects for display in the event area 10 of the virtual stadium, and stores the received 3D object data in the 3D object data buffer. The user device 26 may store, for example, video data corresponding to two minutes of video playback in the video data buffer 503. Similarly, the user device 26 may store, for example, 3D object data corresponding to two minutes of display in the event area 10. The scene generation module 506 of the user device 26 generates the virtual stadium 100, for presentation to the user via the VR headset, using the received video data and the received 3D object data.
As described above, each frame of the video received from the media server 20 is associated with a corresponding video timestamp. Similarly, the 3D object data is associated with a 3D object data timestamp. In order for the video displayed in the display area 12 to be synchronised with the scene displayed in the event area 10, the difference between the video timestamp corresponding to the displayed video and the 3D object data timestamp corresponding to the scene displayed in the event area should be small (ideally, zero). In other words, the object generation data used to generate the virtual 3D objects in the event area 10 is selected based on the video timestamp of the video that is presented within the virtual scene, to improve the synchronisation between the images presented in the display area 12 and the images presented in the event area 10. Moreover, in order for the cheering and other reactions of the avatars 16 of other users in the virtual stadium 100 to be synchronised with the images shown in the display area 12 and the event area 10, the display area 12 and the event area 10 should also be synchronised between each user device 26. In other words, when a video frame (and generated virtual 3D objects) corresponding to a particular timestamp is presented to the user at the display area 12, that video frame should be presented to all of the other users at approximately the same time.
In step S701, the user device 26 receives the reference time from the data server 22.
In step S702, the user device 26 determines whether the difference between the reference time and a video timestamp of the video frame displayed in the display area 12 is greater than a threshold time difference. In other words, the user device 26 determines whether the video playback at the user device 26 is further ahead of the reference time than a threshold time difference. Alternatively, for example, the user device 26 may determine whether the difference between the reference time and the video timestamp of the next video frame to be displayed in the display area 12 (or any other suitable video frame to be synchronised) is greater than a threshold time difference.
If the result of step S702 is YES (i.e. the difference between the reference time and the video timestamp of the video frame displayed in the display area 12 is greater than the threshold time difference), then the method proceeds to step S703A. In step S703A, the user device 26 selects a video frame for display in the display area 12 based on the reference time, to reduce the difference between the timestamp of the video frame presented to the user and the reference time. For example, the user device 26 may select the video frame that has the timestamp closest to the reference time for display in the display area 12, thereby minimising the difference between the timestamp of the displayed video frame and the reference time. In other words, the video displayed in the display area jumps backwards in time, to restore synchronisation with the reference time. It will be appreciated that the video data corresponding to the reference time is stored in the video data buffer 503. The user device 26 may store, for example, video data corresponding to one or several minutes of video playback in the video data buffer 503 (e.g. video data corresponding to 1 minute or 5 minutes of video playback). Corresponding 3D object data is stored in the 3D object data buffer 504.
If the result of step S702 is NO, then the method proceeds to step S703B. In step S703B, the video shown at the display area 12 continues playback from the current frame. In other words, when the user device 26 determines that the video playback is not too far ahead of the reference time, the video playback continues from the current video frame.
Advantageously, therefore, since the reference time is received and used at each of the user devices 26 to synchronise the video data, the video presented to each of the users in the display area 12 is synchronised (and the virtual 3D objects are similarly synchronised, since the 3D object data used to generate the virtual 3D objects is selected based on the timestamp of the video frame selected for display, as described above). Beneficially, since each user is able to react (e.g. cheer) to the synchronised video data, the movements and actions of the avatars 16 in the virtual stadium 100 can also be synchronised. Moreover, this method of synchronisation does not require the transmission of the video data from the data server 22 that transmits the 3D object data to the user devices 26, reducing the bandwidth required between the data server 22 and the user devices 26.
Figure 7b shows a further diagram illustrating the synchronisation method. Method steps performed by the synchronisation module 507 are shown in the dashed boxes. As shown in the figure, the user device 26 receives video data from the media server 20 and stores the received video data in the video data buffer 503. Similarly, the user device 26 receives 3D object data for generating virtual 3D objects from the data server 22 and stores the received 3D object data in the 3D object data buffer 504. The user device 26 also receives a reference time from the data server 22 for use by the synchronisation module 507.
The synchronisation module determines the time difference between the reference time and the local stream time (which may also be referred to as a 'local playback time' or 'video playback time'). The local stream time is the timestamp corresponding to the video frame (or set of video frames) currently displayed (or scheduled to be displayed next) to the user. If the difference between the reference time and the local stream time is greater than (or alternatively, greater than or equal to) a threshold time difference, then the local stream time is updated to reduce the difference between the local stream time and the reference time (e.g. by setting the local stream time to be equal to the reference time). The local stream time is then output to the scene generation module 506 for use in generating the virtual scene. The scene generation module 506 receives the local stream time (which may have been adjusted by the synchronisation module 507), retrieves corresponding video data from the video data buffer 503, and retrieves corresponding 3D object data from the 3D object data buffer 504. The scene generation module 506 then generates the virtual 3D scene using the retrieved data, for subsequent display to the user.
For example, when the threshold time difference is 2 seconds, the local stream time is 116 seconds (e.g. relative to the beginning of the video stream), and the reference time is 111 seconds, the synchronisation module 507 determines that the difference between the local stream time and the reference time (in this example, 5 seconds) is greater than the threshold time difference. Therefore, the synchronisation module 507 determines to reduce the time difference between the local stream time and the reference time, by setting a new value of the local stream time. In this example, the synchronisation module 507 determines to set the local stream time to 111 seconds, equal to the reference time, and transmits the updated local stream time to the scene generation module 506. The scene generation module 506 retrieves video data corresponding to the timestamp of 111 seconds from the video data buffer 503, retrieves the corresponding 3D object data from the 3D object data buffer 504, and generates the virtual 3D scene using the retrieved data. Since the local stream time has shifted backwards in time by 5 seconds, the video and 3D objects presented to the user will also shift backwards in time by 5 seconds. However, advantageously, since the same reference time is transmitted to all user devices by the data server 22, the method ensures that the local stream time at all of the user devices 26 is within the threshold time difference of the reference time, and therefore synchronisation of the virtual scene is achieved between the user devices.
Moreover, since the 3D object data is retrieved from the 3D object data buffer 504 based on the local stream time of the video playback, synchronisation between the virtual 3D objects in the event area 10 and the video displayed in the display area 12 is achieved at the user device 26.
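The retrieval of matching video data and 3D object data from the two buffers could be organised as in the following sketch. The buffer layouts, the closestByTimestamp helper and the assembleSceneInputs function are assumptions introduced for illustration; they are not a definitive implementation of the scene generation module 506.

```typescript
// Illustrative sketch only; buffer layouts and identifiers are assumptions,
// and both buffers are assumed to be non-empty.

interface TimestampedFrame {
  timestamp: number; // seconds from the start of the video stream
  data: Uint8Array;  // encoded video frame
}

interface ObjectSnapshot {
  timestamp: number; // 3D object data timestamp
  objects: { id: string; position: [number, number, number] }[];
}

// Return the buffered entry whose timestamp is closest to t, mirroring the
// "closest available timestamp" fallback behaviour described above.
function closestByTimestamp<T extends { timestamp: number }>(items: T[], t: number): T {
  return items.reduce((best, item) =>
    Math.abs(item.timestamp - t) < Math.abs(best.timestamp - t) ? item : best
  );
}

function assembleSceneInputs(
  localStreamTime: number,         // possibly adjusted by the synchronisation step
  videoBuffer: TimestampedFrame[], // video data buffer
  objectBuffer: ObjectSnapshot[]   // 3D object data buffer
) {
  const frame = closestByTimestamp(videoBuffer, localStreamTime);
  const objectData = closestByTimestamp(objectBuffer, localStreamTime);
  // The video frame would be shown in the display area and the object data used to
  // generate the virtual 3D objects in the event area; rendering is out of scope here.
  return { frame, objectData };
}
```

In the worked example above, once the local stream time has been reset to 111 seconds, a call such as assembleSceneInputs(111, videoBuffer, objectBuffer) would retrieve the video frame and 3D object data nearest the 111-second mark.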
As described above, the reference time may correspond to a timestamp of video data that is transmitted to the user devices 26 from the media server 20. However, the reference time does not correspond to the most recent timestamp (the timestamp furthest from the beginning of the video stream) of the video data received from the media server 20, but instead corresponds to older video data. The offset between the reference time and the timestamp of the most recent video data received from the media server 20 may be, for example, one minute, but alternatively any other suitable time offset could be used. Increasing the time offset increases the likelihood that video data corresponding to the reference time has been received at all of the user devices. If video data corresponding to the reference time is not available in the video data buffer 503 of the user device 26, then the user device 26 may simply use the video data in the data buffer having the timestamp that is closest to the reference time. The reference time may be transmitted to the user devices 26 each time a packet of video data is received at the data server 22 from the media server 20. The data server 22 may update the reference time based on the video playback duration corresponding to the packet of video data. For example, as illustrated in Figure 7b, the data server 22 receives the video data from the media server 20, and in this example determines that the received video data corresponds to 4 seconds of video playback. Therefore, the data server 22 increases the reference time by 4 seconds, and transmits the updated reference time to the user device 26.
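On the server side, the maintenance of the reference time described in this passage might be sketched as follows. The packet fields, the one-minute offset value and the broadcast callback are assumptions used only to illustrate the behaviour; the data server 22 is not limited to this form.

```typescript
// Illustrative sketch of reference-time maintenance at the data server;
// not a definitive implementation, and all identifiers are assumptions.

const REFERENCE_OFFSET_SECONDS = 60; // example offset behind the newest video data

let referenceTime: number | null = null; // seconds from the start of the video stream

function onVideoPacketReceived(
  packetStartTimestamp: number,  // timestamp of the first frame in the received packet
  packetDurationSeconds: number, // playback duration of the received packet
  broadcast: (t: number) => void // sends the reference time to every user device
): void {
  if (referenceTime === null) {
    // Initialise the reference time a fixed offset behind the newest video data, so
    // that user devices are likely to have already buffered the referenced frames.
    referenceTime = Math.max(0, packetStartTimestamp - REFERENCE_OFFSET_SECONDS);
  } else {
    // Advance by the playback duration of the packet just received
    // (4 seconds in the Figure 7b example).
    referenceTime += packetDurationSeconds;
  }
  broadcast(referenceTime);
}
```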
Whilst the scene generation module 506 is illustrated as a single module in Figure 7b, this need not necessarily be the case. For example, the user device 26 may comprise a data management module for selecting the 3D object data to use to generate the 3D virtual scene based on the time output by the synchronisation module 507, and a video playback module for selecting the video data to be presented based on the time output by the synchronisation module 507. The data management module may receive the time output from the synchronisation module and retrieve the corresponding 3D object data from the 3D object data buffer 504, and the video playback module may receive the time output from the synchronisation module and retrieve the corresponding video data from the video data buffer 503. The virtual scene can then be generated at the user device 26 using the 3D object data and the video data.
Modifications and Alternatives
Detailed embodiments and some possible alternatives have been described above.
As those skilled in the art will appreciate, a number of modifications and further alternatives can be made to the above embodiments whilst still benefiting from the inventions embodied therein. It will therefore be understood that the invention is not limited to the described embodiments and encompasses modifications apparent to those skilled in the art lying within the scope of the claims appended hereto.
Whilst the above examples have been described with reference to a 'reference time' and calculating a difference between the reference time and a local stream time, alternatively any other suitable parameter for indicating a reference point in the video stream could be used. For example, the data server could transmit a reference frame number corresponding to a frame of the video stream.
Whilst in some of the above examples the user device has been described as being configured to set the video playback time to be equal to the reference time when the difference between the video playback time and the reference time exceeds a threshold value, causing the video displayed to the user to jump back to older video data, this need not necessarily be the case. Alternatively, the video presented to the user may be paused (or 'frozen') until synchronisation with the reference time has been restored (within the threshold time difference).
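Under the same illustrative assumptions as the earlier client-side sketch, this alternative might simply gate playback rather than reselect a frame; the function name and return values below are assumptions.

```typescript
// Illustrative only: hold the current frame while playback is further ahead of the
// reference time than the threshold, instead of jumping backwards; playback resumes
// once the reference time has caught up to within the threshold.
function playbackAction(
  localStreamTime: number,
  referenceTime: number,
  thresholdSeconds: number
): 'play' | 'pause' {
  return localStreamTime - referenceTime > thresholdSeconds ? 'pause' : 'play';
}
```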
Whilst in the example shown in Figure 1 the display area 12 comprises a virtual screen in the virtual stadium 100, this need not necessarily be the case.
Alternatively, as illustrated in Figure 8, the display area 12 may comprise a 2D image that is overlaid on top of the 3D virtual stadium 100 presented to the user via the VR headset. In other words, the scene generation module 506 of the user device 26 may first generate the 3D virtual stadium 100 (without the display area 12), and then overlay the 2D display area 12 as a 2D image on top of the 3D representation in the final image frames presented to the user via the VR headset.
It will be appreciated that the term 'virtual event' as used herein should be interpreted broadly to encompass any suitable event and is not limited to esports events. For example, the event may be a virtual representation of a live sporting event, a live music event, a show, or a virtual conference.
Whilst the examples above have been described mainly with reference to a virtual 3D stadium that can be viewed using a suitable VR headset, this need not necessarily be the case. Whilst the 3D stadium provides a particularly immersive environment for the user, advantages of the invention are nevertheless obtained even when a 2D representation of the 3D virtual stadium is presented to the user.
Whilst the examples above have been described mainly with reference to a user device (e.g. a PC or mobile device) connected to a VR headset for display of the virtual stadium 100, alternatively a standalone VR headset could be used.
Whilst the examples above have been described with reference to a 'VR headset' for viewing virtual 3D objects, any other suitable apparatus for viewing virtual 3D imagery could alternatively be used. For example, augmented reality (AR) glasses could be used to view the virtual environment.

Claims (21)

CLAIMS
1. A method of generating a virtual 3D scene for display at a user device, the method comprising:
   receiving 3D object data for generating at least one virtual 3D object;
   storing the 3D object data in a 3D object data buffer;
   receiving video data corresponding to a video stream;
   storing the video data in a video data buffer;
   receiving a reference time;
   determining, based on the reference time, video data and 3D object data to use to generate the virtual scene;
   generating the virtual scene using the determined video data and 3D object data; and
   displaying the generated virtual 3D scene.
2. The method according to claim 1, wherein determining the video data and 3D object data to use to generate the virtual scene comprises:
   determining a time difference between the reference time and a video playback time corresponding to video data displayed in the virtual 3D scene at the user device, wherein the playback time corresponds to a time that is later in the video stream than the reference time;
   setting, when the time difference is greater than a threshold time difference, the video playback time to be equal to the reference time; and
   generating the virtual scene using the video data and 3D object data corresponding to the reference time.
3. The method according to claim 1 or 2, wherein each video frame of the video stream is associated with a corresponding timestamp, and the video playback time corresponds to:
   the timestamp associated with a frame of the video stream last used to generate the virtual 3D scene; or
   the timestamp associated with the next frame of the video stream that is to be used to generate the virtual 3D scene.
4. The method according to any preceding claim, wherein generating the virtual scene comprises generating the at least one virtual 3D object within the virtual scene.
5. The method according to any preceding claim, wherein generating the virtual scene comprises generating a video frame of the video stream within the virtual scene.
6. The method according to any preceding claim, wherein the 3D object data corresponds to virtual 3D objects in a virtual game world.
7. The method according to claim 6, wherein the 3D object data indicates a position of a player-controlled character within the virtual game world.
8. The method according to any preceding claim, wherein the virtual scene comprises a virtual stadium.
9. The method according to any preceding claim, wherein generating the virtual scene comprises generating a 3D virtual avatar of a user within the virtual scene.
10. The method according to any preceding claim, wherein the video data is received from a first server, and the 3D object data is received from a second server that is different from the first server.
11. A user device for generating and displaying a virtual 3D scene, the user device comprising:
   means for receiving 3D object data for generating at least one virtual 3D object;
   a 3D object data buffer for storing the 3D object data;
   means for receiving video data corresponding to a video stream;
   a video data buffer for storing the video data;
   means for receiving a reference time;
   means for determining, based on the reference time, video data and 3D object data to use to generate the virtual scene;
   means for generating the virtual scene using the determined video data and 3D object data; and
   means for displaying the generated virtual 3D scene.
12. The user device according to claim 11, wherein the means for displaying the generated virtual 3D scene comprises a virtual reality, VR, or augmented reality, AR, headset.
13. A method of synchronising a virtual 3D scene generated at each of a plurality of user devices, the method comprising:
   receiving, from a video server, a packet of video data corresponding to a video stream;
   generating 3D object data for generating, at a user device, at least one virtual 3D object in the virtual scene;
   transmitting the 3D object data to each of the plurality of user devices;
   generating a reference time for use by the user devices to determine video data and 3D object data to use to generate the virtual scene, wherein the reference time corresponds to a time in the video stream that is earlier than a time in the video stream associated with the packet of video data last received from the video server; and
   transmitting the reference time to each of the plurality of user devices.
14. The method according to claim 13, wherein the method further comprises:
   determining a video duration corresponding to the packet of received video data;
   increasing the reference time by the video duration; and
   transmitting the updated reference time to each of the plurality of user devices.
15. The method according to claim 14, wherein the method comprises transmitting the reference time to each of the plurality of user devices each time a packet of video data is received from the video server.
16. The method according to any one of claims 13 to 15, wherein the method further comprises:
   obtaining game data corresponding to a state of a virtual game world; and
   generating the 3D object data based on the game data;
   wherein the 3D object data corresponds to one or more virtual objects within the virtual game world.
17. The method according to claim 16, wherein the 3D object data indicates a position of a player-controlled character within the virtual game world.
18. The method according to any one of claims 13 to 17, wherein the virtual scene comprises a virtual stadium.
19. A server for synchronising a virtual 3D scene generated at each of a plurality of user devices, the server comprising:
   means for receiving, from a video server, a packet of video data corresponding to a video stream;
   means for generating 3D object data for generating, at a user device, at least one virtual 3D object in the virtual scene;
   means for transmitting the 3D object data to each of the plurality of user devices;
   means for generating a reference time for use by the user devices to determine video data and 3D object data to use to generate the virtual scene, wherein the reference time corresponds to a time in the video stream that is earlier than a time in the video stream associated with the packet of video data last received from the video server; and
   means for transmitting the reference time to each of the plurality of user devices.
20. A computer program product comprising instructions which, when the program is executed by a computer, cause the computer to carry out the method of any of claims 1 to 10.
21. A computer program product comprising instructions which, when the program is executed by a computer, cause the computer to carry out the method of any of claims 13 to 18.
GB2216590.6A 2022-11-08 2022-11-08 Apparatus and methods for virtual events Pending GB2624172A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
GB2216590.6A GB2624172A (en) 2022-11-08 2022-11-08 Apparatus and methods for virtual events
PCT/GB2023/052907 WO2024100393A1 (en) 2022-11-08 2023-11-07 Apparatus and methods for virtual events

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
GB2216590.6A GB2624172A (en) 2022-11-08 2022-11-08 Apparatus and methods for virtual events

Publications (2)

Publication Number Publication Date
GB202216590D0 GB202216590D0 (en) 2022-12-21
GB2624172A true GB2624172A (en) 2024-05-15

Family

ID=84839852

Family Applications (1)

Application Number Title Priority Date Filing Date
GB2216590.6A Pending GB2624172A (en) 2022-11-08 2022-11-08 Apparatus and methods for virtual events

Country Status (2)

Country Link
GB (1) GB2624172A (en)
WO (1) WO2024100393A1 (en)

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2000042773A1 (en) * 1999-01-19 2000-07-20 Sony Electronics Inc. System and method for implementing interactive video
US9965900B2 (en) * 2016-09-01 2018-05-08 Avid Technology, Inc. Personalized video-based augmented reality
US11025982B2 (en) * 2019-03-29 2021-06-01 Twizted Design, Inc. System and method for synchronizing content and data for customized display
KR102295264B1 (en) * 2019-11-28 2021-08-30 주식회사 알파서클 Apparaturs and method for playing vr video using single streaming video

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6285405B1 (en) * 1998-10-14 2001-09-04 Vtel Corporation System and method for synchronizing data signals
US20110234902A1 (en) * 2010-03-24 2011-09-29 Kishan Shenoi Synchronization of audio and video streams
US20160191592A1 (en) * 2014-12-24 2016-06-30 Sonus Networks, Inc. Methods and apparatus for communicating delay information and minimizing delays
GB2555410A (en) * 2016-10-25 2018-05-02 Sony Interactive Entertainment Inc Video content synchronisation method and apparatus
US20200236278A1 (en) * 2019-01-23 2020-07-23 Fai Yeung Panoramic virtual reality framework providing a dynamic user experience
US11082679B1 (en) * 2021-01-12 2021-08-03 Iamchillpill Llc. Synchronizing secondary audiovisual content based on frame transitions in streaming content

Also Published As

Publication number Publication date
GB202216590D0 (en) 2022-12-21
WO2024100393A1 (en) 2024-05-16

Similar Documents

Publication Publication Date Title
JP6976424B2 (en) Audience view of the interactive game world shown at live events held at real-world venues
US11179635B2 (en) Sound localization in an augmented reality view of a live event held in a real-world venue
US11794102B2 (en) Cloud-based game streaming
US8522160B2 (en) Information processing device, contents processing method and program
JP6707111B2 (en) Three-dimensional content distribution system, three-dimensional content distribution method, computer program
JP6714625B2 (en) Computer system
US11386681B2 (en) Information processing apparatus, information processing method, and program
US20220277493A1 (en) Content generation system and method
KR100952095B1 (en) Game machine, game machine control method, and information storage medium
JP5776954B2 (en) Information processing apparatus, control method, program, recording medium, and drawing system
JP2023169282A (en) Computer program, server device, terminal device, and method
JP6688378B1 (en) Content distribution system, distribution device, reception device, and program
KR101643102B1 (en) Method of Supplying Object State Transmitting Type Broadcasting Service and Broadcast Playing
GB2624172A (en) Apparatus and methods for virtual events
JP2011109371A (en) Server, terminal, program, and method for superimposing comment text on three-dimensional image for display
JP7225159B2 (en) 3D CONTENT DISTRIBUTION SYSTEM, 3D CONTENT DISTRIBUTION METHOD, COMPUTER PROGRAM
CN114011067A (en) Game fighting method and device, electronic equipment and storage medium
US20240114181A1 (en) Information processing device, information processing method, and program
JP2020102236A (en) Content distribution system, receiving device and program
EP4306192A1 (en) Information processing device, information processing terminal, information processing method, and program
JP2020102053A (en) Content distribution system, receiving device and program
JP7495558B1 (en) VIRTUAL SPACE CONTENT DELIVERY SYSTEM, VIRTUAL SPACE CONTENT DELIVERY PROGRAM, AND VIRTUAL SPACE CONTENT DELIVERY METHOD
WO2024101001A1 (en) Information processing system, information processing method, and program for communication points regarding events
US12041218B2 (en) Three-dimensional content distribution system, three-dimensional content distribution method and computer program
JP2023154058A (en) Game system and device