EP1047481A1 - Real time video game uses emulation of streaming over the internet in a broadcast event - Google Patents

Real time video game uses emulation of streaming over the internet in a broadcast event

Info

Publication number
EP1047481A1
Authority
EP
European Patent Office
Prior art keywords
user
animation
client
clients
real life
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
EP99944472A
Other languages
German (de)
English (en)
Inventor
Raoul Mallart
Atul Sinha
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Koninklijke Philips NV
Original Assignee
Koninklijke Philips Electronics NV
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from US09/138,782 (US6697869B1)
Application filed by Koninklijke Philips Electronics NV
Publication of EP1047481A1
Legal status: Withdrawn

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/4302Content synchronisation processes, e.g. decoder synchronisation
    • H04N21/4307Synchronising the rendering of multiple content streams or additional data on devices, e.g. synchronisation of audio on a mobile phone with the video output on the TV screen
    • H04N21/43072Synchronising the rendering of multiple content streams or additional data on devices, e.g. synchronisation of audio on a mobile phone with the video output on the TV screen of multiple content streams on the same device
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/30Interconnection arrangements between game servers and game devices; Interconnection arrangements between game devices; Interconnection arrangements between game servers
    • A63F13/31Communication aspects specific to video games, e.g. between several handheld game devices at close range
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/30Interconnection arrangements between game servers and game devices; Interconnection arrangements between game devices; Interconnection arrangements between game servers
    • A63F13/33Interconnection arrangements between game servers and game devices; Interconnection arrangements between game devices; Interconnection arrangements between game servers using wide area network [WAN] connections
    • A63F13/335Interconnection arrangements between game servers and game devices; Interconnection arrangements between game devices; Interconnection arrangements between game servers using wide area network [WAN] connections using Internet
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/30Interconnection arrangements between game servers and game devices; Interconnection arrangements between game devices; Interconnection arrangements between game servers
    • A63F13/35Details of game servers
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/30Interconnection arrangements between game servers and game devices; Interconnection arrangements between game devices; Interconnection arrangements between game servers
    • A63F13/35Details of game servers
    • A63F13/358Adapting the game course according to the network or server load, e.g. for reducing latency due to different connection speeds between clients
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/60Generating or modifying game content before or while executing the game program, e.g. authoring tools specially adapted for game development or game-integrated level editor
    • A63F13/65Generating or modifying game content before or while executing the game program, e.g. authoring tools specially adapted for game development or game-integrated level editor automatically by game devices or servers from real world data, e.g. measurement in live racing competition
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/80Special adaptations for executing a specific game genre or game mode
    • A63F13/803Driving vehicles or craft, e.g. cars, airplanes, ships, robots or tanks
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/20Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using video object coding
    • H04N19/27Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using video object coding involving both synthetic and natural picture components, e.g. synthetic natural hybrid coding [SNHC]
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/23Processing of content or additional data; Elementary server operations; Server middleware
    • H04N21/234Processing of video elementary streams, e.g. splicing of video streams or manipulating encoded video stream scene graphs
    • H04N21/2343Processing of video elementary streams, e.g. splicing of video streams or manipulating encoded video stream scene graphs involving reformatting operations of video signals for distribution or compliance with end-user requests or end-user device requirements
    • H04N21/234318Processing of video elementary streams, e.g. splicing of video streams or manipulating encoded video stream scene graphs involving reformatting operations of video signals for distribution or compliance with end-user requests or end-user device requirements by decomposing into objects, e.g. MPEG-4 objects
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00Television systems
    • H04N7/14Systems for two-way working
    • H04N7/15Conference systems
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00Television systems
    • H04N7/24Systems for the transmission of television signals using pulse code modulation
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/40Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterised by details of platform network
    • A63F2300/402Communication between platforms, i.e. physical link to protocol
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/40Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterised by details of platform network
    • A63F2300/407Data transfer via internet
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/50Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by details of game servers
    • A63F2300/53Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by details of game servers details of basic data processing
    • A63F2300/534Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by details of game servers details of basic data processing for network load management, e.g. bandwidth optimization, latency reduction
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/60Methods for processing data by generating or executing the game program
    • A63F2300/69Involving elements of the real world in the game world, e.g. measurement in live races, real video
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/80Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game specially adapted for executing a specific type of game
    • A63F2300/8017Driving on land or water; Flying
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/80Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game specially adapted for executing a specific type of game
    • A63F2300/8082Virtual reality

Definitions

  • Real time video game uses emulation of streaming over the internet in a broadcast event.
  • the invention relates to streaming multimedia files via a network.
  • the invention relates in particular to enabling the emulation of streaming graphics or video animation over the Internet within a broadcast context.
  • the invention relates in particular to enabling an end-user to interact with the environment created through the graphics or video animation.
  • streaming refers to transferring data from a server to a client so that it can be processed as a steady and continuous stream at the receiving end.
  • Streaming technologies are becoming increasingly important with the growth of the Internet because most users do not have fast enough access to download large multimedia files comprising, e.g., graphics animation, audio, video, or a combination thereof, etc.
  • Streaming enables the client's browser or plug-in to start processing the data before the entire file has been received.
  • the client side receiving the file must be able to collect the data and send it as a steady stream to the application that is processing the data. This means that if the client receives the data faster than required, the excess data needs to be buffered.
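  • As a minimal illustration of this buffering requirement (illustrative names only, not part of the patent), the sketch below queues bursty network arrivals and hands them to the decoder at a fixed, steady play-out rate:

```python
# Minimal client-side sketch of the buffering described above: bursty network
# arrivals are queued, and the decoder is fed at a steady play-out rate.
import collections
import time

class PlayoutBuffer:
    def __init__(self, playout_interval_s=1.0 / 15):
        self.queue = collections.deque()          # excess data is buffered here
        self.playout_interval_s = playout_interval_s

    def on_network_data(self, chunk):
        """Called whenever a chunk arrives from the network (possibly in bursts)."""
        self.queue.append(chunk)

    def run(self, decode, duration_s=1.0):
        """Hand chunks to the decoder at a steady rate for duration_s seconds."""
        deadline = time.monotonic() + duration_s
        while time.monotonic() < deadline:
            if self.queue:
                decode(self.queue.popleft())       # steady hand-off to the application
            time.sleep(self.playout_interval_s)

buffer = PlayoutBuffer()
for i in range(10):                                # a burst of ten chunks at once
    buffer.on_network_data(f"chunk-{i}")
buffer.run(decode=print)                           # played out at ~15 chunks/second
```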
  • file is used herein to indicate an entity of related data items available to a data processing system and capable of being processed as an entity.
  • file may refer to data generated in real-time as well as data retrieved from storage.
  • VRML 97 stands for "Virtual Reality Modeling Language", and is an International Standard (ISO/IEC 14772) file format for describing interactive 3D multimedia content on the Internet.
  • MPEG-4 is an ISO/IEC standard being developed by MPEG (Moving Picture Experts Group).
  • a scene graph is a family tree of coordinate systems and shapes that collectively describe a graphics world.
  • the top-most item in the scene family tree is the world coordinate system.
  • the world coordinate system acts as the parent for one or more child coordinate systems and shapes. Those child coordinate systems are, in turn, parents to further child coordinate systems and shapes, and so on.
  • VRML is a file format for describing objects.
  • VRML defines a set of objects useful for doing 3D graphics, multi-media, and interactive object/world building. These objects are called nodes, and contain elemental data which is stored in fields and events.
  • the scene graph comprises structural nodes, leaf nodes, interpolation nodes and sensor nodes.
  • the structural nodes define the spatial relationship of objects within a scene.
  • the leaf nodes define the physical appearance of the objects.
  • the interpolation nodes define animations.
  • the sensor nodes define user interaction for particular user input modalities.
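  • The sketch below is a Python stand-in (not VRML syntax) for the four node categories just listed, arranged as a scene graph rooted at the world coordinate system; the node and field names are illustrative:

```python
# Illustrative scene graph using the four VRML node categories listed above.
from dataclasses import dataclass, field
from typing import List

@dataclass
class Node:
    name: str
    children: List["Node"] = field(default_factory=list)

class Transform(Node): pass             # structural node: spatial relationship
class Shape(Node): pass                 # leaf node: physical appearance
class PositionInterpolator(Node): pass  # interpolation node: animation
class TouchSensor(Node): pass           # sensor node: user interaction

world = Transform("world")              # world coordinate system: the root
player = Transform("player_7", children=[
    Shape("player_7_mesh"),                       # how the player looks
    PositionInterpolator("player_7_run_cycle"),   # how the player moves
    TouchSensor("player_7_click"),                # how the user interacts
])
world.children.append(player)
print(len(world.children))              # -> 1
```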
  • VRML does not directly support streaming of data from a server into a client. Facilities such as synchronization between streams and time stamping that are essential in streaming do not exist in VRML.
  • VRML has a mechanism that allows external programs to interact with VRML clients. This has been used in sports applications to load animation data into the client. See, for example, "VirtuaLive Soccer" of Orad Hi-Tec Systems, Ltd at ⁇ http://www.virtualive.com>.
  • This web document discusses a process for producing realistic, animated, three-dimensional graphic clips that simulate actual soccer match highlights for transmission via the Internet.
  • the system generates content that complements television sports coverage with multimedia-rich Web pages in near real time.
  • the process works in two steps. First the graphics models of the stadium and of the soccer players are downloaded along with an external program, in this case a Java Applet. The user can then interact with the external program to request a particular animation.
  • the data for this animation is then downloaded into the client and interacted with by the user: the user can view scenes of the match from different points of view in the animation and in slow motion if desired.
  • this process first downloads the structural and leaf nodes, and thereupon the interpolation nodes.
  • the process used in this example is somewhat equivalent to a single-step process in which the user can choose the complete VRML file that contains all the models (structural nodes) and all the animation data (interpolator nodes). This approach leads to long download times before any content can be played on the client, which the user experiences as frustrating, especially compared to TV broadcast where content is available instantly.
  • MPEG-4 defines a binary description format for scenes (BIFS) that has a wide overlap with VRML 97.
  • MPEG-4 has been designed to support streaming of graphics as well as for video.
  • MPEG-4 defines two server/client protocols for updating and animating scenes: BIFS-Update and BIFS-Anim.
  • Some of the advantages of MPEG-4 over VRML are the coding of the scene description and of the animation data as well as the built-in streaming capability. The user does not have to wait for the complete download of the animation data. For example, in the soccer match broadcast application mentioned earlier the animation can start as soon as the models of the players and the stadium are downloaded.
  • MPEG-4 further has the advantage that it is more efficient owing to its BIFS transport protocol that uses a compressed binary format.
  • the known technologies mentioned above have several limitations with regard to bandwidth usage, packet-loss concealment or recovery and multi-user interactivity, especially in a broadcast to large numbers of clients.
  • as to bandwidth, the complete animation is generated at the server. This results in a large amount of data that needs to be transported over the network, e.g., the Internet, connecting the client to the server.
  • the 22 soccer players need to be animated.
  • Each animation data point per individual player comprises a position in 3D space and a set of, say, 15 joint rotations to model the player's posture. This represents 63 floating-point values. If it is assumed that the animation update rate is 15 data points per second, a bit-rate of 665 Kbps is required.
  • This bit-rate can be reduced through compression.
  • using BIFS reduces the bit-rate by a factor of 20, giving a bit-rate of about 33 Kbps.
  • this number does not take into account the overhead required for the Internet protocols (RTP, UDP and IP) and for additional data types, such as audio.
  • typical modems currently commercially available on the consumer market have a capacity of 28.8 Kbps or 33.6 Kbps.
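  • The figures quoted above can be reproduced with the small calculation below; the split of the 63 floats into a 3D position plus 15 four-component joint rotations is an assumption consistent with the text:

```python
# Worked version of the bandwidth arithmetic in the preceding paragraphs.
players = 22
floats_per_player = 3 + 15 * 4      # 3D position + 15 joint rotations (quaternions assumed)
bits_per_float = 32
updates_per_second = 15

raw_bps = players * floats_per_player * bits_per_float * updates_per_second
print(f"raw animation stream:        {raw_bps / 1000:.0f} Kbps")         # ~665 Kbps

compressed_bps = raw_bps / 20       # the ~20x BIFS compression factor quoted above
print(f"after ~20x BIFS compression: {compressed_bps / 1000:.0f} Kbps")  # ~33 Kbps
print("consumer modems: 28.8 or 33.6 Kbps, before RTP/UDP/IP and audio overhead")
```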
  • streaming animation causes a problem at the end user due to bandwidth limitations. In the case of a broadcast to a large number of clients, say 100,000 clients, the data stream will need to be duplicated at several routers. A router on the Internet determines the next network point to which a packet should be forwarded on its way toward its final destination.
  • the router decides which way to send each information packet based on its current understanding of the state of the networks it is connected to.
  • a router is located at any juncture of networks or gateway, including each Internet point-of-presence. It is clear that the broadcast could lead to an unmanageable data explosion across the Internet. To prevent that from happening, the actual bandwidth needs to be limited to much lower than 28.8 Kbps.
  • VRML-based systems utilize reliable protocols (TCP). Packet losses are not an issue here.
  • MPEG-4 BIFS uses RTP/UDP/IP. A packet loss recovery mechanism is therefore required.
  • retransmission of lost packets can be considered.
  • MPEG-4 reliability thus requires either higher bandwidth usage (redundancy) or higher latency (retransmission).
  • both VRML and MPEG-4 are essentially based on a server-client communication. No provisions exist to enable communication among multiple clients.
  • the invention provides a method of emulating streaming a multimedia file via a network to a receiving station connected to the network. Respective state information descriptive of respective states of the file is supplied. The receiving station is enabled to receive the respective state information via the network and is enabled to locally generate the multimedia file under control of the respective state information.
  • the invention relates to a method of supplying data via a network for enabling displaying graphics animation. Respective state information is supplied over the network descriptive of successive respective states of the animation. The respective state information is received via the network. The receiving station is enabled to generate the animation under control of the respective state information upon receipt. More particularly, the invention provides a method of enabling a user to navigate in a continuously evolving electronic virtual environment.
  • the method comprises: providing to the user a world model of the environment; sending state changes of the world model representative of the evolving; enabling the user to provide input for control of a position of an object relative to the virtual environment; and creating the environment from the world model and in response to the state changes and the user input.
  • the multimedia file (animation, video or audio file) is described as a succession of states. It is this state information that gets transmitted to the clients rather than the animation data itself.
  • the term "emulating" therefore emphasizes that the information communicated to the client need not be streamed.
  • the client generates the data for play-out locally and based on the state information received. Accordingly, the user perceives a continuous and steady stream of data during play-out as if the data were streamed over the network (under optimal conditions).
  • a shared-object protocol is used to accomplish the emulation. Both a server and a client have copies of a collection of objects.
  • An object is a data structure that holds state information.
  • an object is, for example, a graphics representation of one of the soccer players.
  • the server receives a streamed video file and monitors the objects. It is noted that MPEG-4 enables the creation of video objects that are processed as an entity. If the server changes the state of this object, the shared object protocol causes the copy at the client to change accordingly. This is explained in more detail with reference to the drawings.
  • This state information is at a higher level of abstraction than the animation data itself.
  • the state information comprises the current positions of the 22 players in the field and parameters specifying their current action (e.g., "running", "jumping", etc.).
  • the use of higher level information has several advantages, in particular in a broadcast application where animation is streamed over the Internet to a large audience.
  • the content of the state information as communicated over the Internet is very compact, thus requiring lower bandwidth than in case the animation data itself is streamed.
  • the animation is generated locally from a few parameters.
  • the update rate of animation data points is lower because the state of the animation changes at a slower rate than the animation data itself. This contributes to further lowering bandwidth requirements.
  • the invention provides enhanced possibilities for packet loss recovery or concealment and for network latency jitter masking. It is easy to interpolate or extrapolate between states and to implement dead reckoning concepts. User interaction with the animation is more easily programmable because of this higher level of abstraction. Another advantage is that multi-user interaction is feasible if clients are enabled to share state information. Still another advantage is the fact that clients are enabled to convert the state information into animation based on their individual processing power that might differ from client to client. The resources available at the client may be different per client or groups of clients.
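  • A hedged sketch of the interpolation and dead-reckoning ideas mentioned above (function names and the linear model are illustrative, not taken from the patent):

```python
# Conceal a late or lost state update by extrapolating from the last known state,
# or mask latency jitter by blending two received states.
def dead_reckon(last_position, velocity, seconds_since_update):
    """Linear extrapolation of a 3D position from the last received state."""
    return tuple(p + v * seconds_since_update
                 for p, v in zip(last_position, velocity))

def interpolate(state_a, state_b, alpha):
    """Blend two received states, with alpha between 0 and 1."""
    return tuple(a + (b - a) * alpha for a, b in zip(state_a, state_b))

# a player last reported at (10, 0, 5) moving at 2 m/s along x, 0.3 s ago:
print(dead_reckon((10.0, 0.0, 5.0), (2.0, 0.0, 0.0), 0.3))   # -> (10.6, 0.0, 5.0)
print(interpolate((10.0, 0.0, 5.0), (12.0, 0.0, 5.0), 0.5))  # -> (11.0, 0.0, 5.0)
```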
  • Fig. 1 is a diagram of a VRML client-server system
  • Fig. 2 is a diagram of an MPEG-4 client-server system
  • Figs. 3-6 are diagrams of systems used in the invention
  • Figs. 7-8 are diagrams illustrating the video game. Throughout the figures, the same reference numerals indicate similar or corresponding features.
  • Fig. 1 is a block diagram of a client-server system 100 based on VRML.
  • System 100 comprises a server 102 coupled with a client 104 via a communication channel 106, here the Internet.
  • System 100 may comprise more clients but these are not shown in order to not obscure the drawing.
  • Server 102 comprises a source encoder 108 and a channel encoder 110.
  • Client 104 comprises a channel decoder 112 and a source decoder 114.
  • Source encoder 108 is considered a content generation tool. For example, it can be a tool that generates the VRML animation data from motion capture devices (not shown) operating on video.
  • Channel encoder 110 is a sub-system that takes as input the VRML animation generated at source encoder 108 and transforms it into a form that can be transported over the Internet.
  • the VRML animation data is stored in a file.
  • the transport of this file uses a standard file transport protocol.
  • the channel decoder is contained in an external program 116. It gets the animation data from the downloaded file and sends it to a VRML player 118 that performs the source decoder function.
  • the source decoder function is essentially a management of the scene graph.
  • This server-client communication procedure is not a streaming solution.
  • the specification of VRML does not consider streaming a requirement. Facilities such as synchronization between streams and time stamping, both essential to streaming, do not exist in VRML.
  • Fig.2 is a block diagram of a client-server system 200 based on MPEG-4.
  • System 200 has a server 202 coupled to a client 204 via a communication channel 206.
  • Server 202 has a source encoder 208 and a channel encoder 210.
  • Client 204 has a channel decoder 212 and a source decoder 214.
  • MPEG-4 has been designed to support streaming.
  • MPEG-4 has defined a binary description format for scenes (BIFS) that has a wide overlap with VRML 97.
  • MPEG-4 defines two server/client protocols for updating and animating scenes, namely BIFS-Update and BIFS-Anim.
  • Source encoder 208 is, similarly to encoder 108, a content generation tool.
  • Channel encoder 210 is different from channel encoder 110. It generates a bit-stream in BIFS and BIFS-Anim formats. This bit-stream contains the graphics models of the players and the stadium (in the soccer match animation) as well as the animation data.
  • both systems 100 and 200 have several serious drawbacks when used in an environment for broadcasting animation to a large number of clients, say, 100 - 100,000 clients.
  • the limitations relate to network bandwidth usage, packet loss concealment and multi-user interactivity as already mentioned above.
  • a preferred embodiment of the invention provides a solution to these problems by emulating the streaming utilizing a communication protocol that supports the sharing of objects by an object owner and an object viewer (or listener).
  • a shared object is a data structure that holds state information.
  • the set of shared objects that defines the entire state is called the world model.
  • the clients and the server have their own copy of the world model.
  • an object within the context of a soccer match representation is the representation of a soccer player.
  • the object's state information is then, for example, the position of the soccer player in 3D space or an action state such as "running", "jumping", "sliding" or "lying on the ground, apparently hurt, but with a track record as a comedian".
  • Each shared object is owned by a particular party, e.g., the server.
  • the owner can change the state information contained in the object.
  • the protocol automatically synchronizes the state information across the network.
  • Such a protocol is referred to herein below as a shared-object support protocol.
  • the protocol ensures that all the world model copies remain consistent as the state of the world model evolves. Examples of protocols that can be used for this purpose are DIS (Distributed Interactive Simulation) and ISTP (Interactive Sharing Transfer Protocol).
  • An underlying idea of the invention is to describe the animation as a succession of states. For example, in the soccer software application, the animation is described as a succession of player positions on the field and action states of the players. The state at a given moment is represented by the world model. As time passes, the state evolves and the protocol synchronizes the state of the world model across the network. This can also be explained in terms of shared objects. These objects hold the state information that describes the game at a given time instant. Updates of the state information for each object result in the generation of messages that are being sent across the network to the clients.
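  • The sketch below illustrates this shared-object principle; it is not the DIS or ISTP wire format, and all names and the message shape are assumptions:

```python
# The owner changes an object's state; only that state change travels over the
# network, and every copy of the world model applies it to stay consistent.
class SharedObject:
    def __init__(self, object_id, **state):
        self.object_id = object_id
        self.state = dict(state)

class WorldModel:
    def __init__(self):
        self.objects = {}

    def add(self, obj):
        self.objects[obj.object_id] = obj

    def apply_update(self, object_id, changes):
        self.objects[object_id].state.update(changes)

# server side (owner): changing a player's state produces a compact message
server_world, client_world = WorldModel(), WorldModel()
for world in (server_world, client_world):
    world.add(SharedObject("player_7", position=(10.0, 0.0, 5.0), action="running"))

update_message = {"object_id": "player_7",
                  "changes": {"position": (12.0, 0.0, 5.0), "action": "jumping"}}
server_world.apply_update(**update_message)

# client side: the same message keeps the local world model copy consistent
client_world.apply_update(**update_message)
assert client_world.objects["player_7"].state == server_world.objects["player_7"].state
```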
  • Fig.3 is a block diagram of a system 300 in the invention.
  • System 300 comprises a server 302 coupled to a client 304 via a network 306.
  • Server 302 comprises a source encoder 308 and a channel encoder 310.
  • Client 304 comprises a channel decoder 312 and a source decoder 314.
  • Server 302 has a copy 316 of a world model and client 304 has a copy 318 of the world model.
  • Data is streamed to source encoder 308 at an input 320.
  • Source encoder 308 generates the required state information based on the input received and updates the state of the objects in world model copy 316 as the streaming process continues. This type of technology is used, for example, by the VirtuaLive Soccer system mentioned above.
  • Channel encoder 310 monitors world model copy 316 and encodes the state changes of the shared objects. The encoded state changes are sent to client 304 via network 306. Channel decoder 312 receives the state changes and updates local world model copy 318.
  • Source decoder 314 performs two tasks. First, it generates the animation based on the state information received. Secondly, source decoder 314 manages the scene graph according to the animation. Source decoder 314 is now an intelligent component: it performs animation computation and, in addition, it is capable of performing other tasks such as state interpolation or extrapolation to conceal packet losses or network latency jitter.
  • copies 316 and 318 of the world model need not be identical, e.g., in appearance when rendered, as long as an object in one copy of the world model and another object in another copy of the world model are being treated as shared in the sense that they share state changes.
  • the feasibility and extent of non-identity is application-dependent. For example, if one client's user likes to represent the soccer players as, say, penguins, and another client's user prefers the representation of, say, ballet dancers, the representations at both clients are kept consistent throughout the system by means of the shared state changes.
  • client 304 may enable the user to input additional state information to control the rendering of the world model at play-out.
  • the user may select a particular point of view when watching the VirtuaLive soccer match.
  • This state information is not, and need not be, present in server 302. It is noted that the rendering of the viewpoint based on state information and the world model is much less complicated and requires fewer resources than if the image were actually streamed into client 304 as a bitmap with depth information. Accordingly, in addition to the advantages of the invention mentioned earlier, the invention facilitates user-interactivity.
  • Fig.4 is a block diagram of a system 400 according to the invention.
  • System 400 comprises a server 302 that communicates with client 204 via a translator station 406. The configuration of server 302 and client 204 has been discussed above. Translator station 406 maintains a local copy of the world model.
  • This world model is updated by the messages from server 302 so that the model represents the current state. Based on this state information, translator station 406 computes the animation.
  • the animation data is encoded in BIFS-Anim format and transmitted to MPEG-4 client 204.
  • Server 302 is similar to the one in system 300.
  • Translator station 406 is a module that performs a conversion between messages transmitted under the shared-object support protocol on the one hand, and the BIFS-Anim bit-stream on the other hand.
  • Station 406 has a channel decoder 312 discussed above, a source transcoder 410 and a channel encoder 412. Decoder 312 interprets the messages received from server 302 and updates the local copy 318 of the world model.
  • Source transcoder 410 comprises a program that computes the animation based on state information.
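  • A sketch of the conversion performed by translator station 406, under the assumptions noted in the comments (the real BIFS-Anim encoding is defined by MPEG-4 and is represented here only by a placeholder callback):

```python
# Expand compact state-change messages into per-frame animation data and hand
# each frame to an encoder callback standing in for BIFS-Anim encoding.
def translate(state_updates, world_model, frames_per_update=15, encode_bifs_anim=print):
    for update in state_updates:
        obj = world_model[update["object_id"]]
        start = obj["position"]
        end = update["position"]
        # one compact state change becomes a sequence of animation frames
        for i in range(1, frames_per_update + 1):
            alpha = i / frames_per_update
            frame = {"object_id": update["object_id"],
                     "position": tuple(s + (e - s) * alpha for s, e in zip(start, end))}
            encode_bifs_anim(frame)          # placeholder for the BIFS-Anim bit-stream
        obj["position"] = end                # commit the new state to the local copy

world = {"player_7": {"position": (10.0, 0.0, 5.0)}}
translate([{"object_id": "player_7", "position": (13.0, 0.0, 5.0)}], world,
          frames_per_update=3)
```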
  • Fig.5 is a block diagram of a system 500 in the invention.
  • System 500 combines the configurations of systems 300 and 400.
  • System 500 comprises a server 302, a network 502, clients 504, 506, 508 and 510 connected to server 302 via network 502.
  • System 500 further comprises a translator station 406 and clients 512, 514 and 516.
  • Clients 512-516 are coupled to server 302 via network 502 and translator station 406.
  • Clients 512-516 are served by translator station 406 with BIFS bit streams, whereas clients 504-510 receive the state information in a protocol that supports shared objects and generate the animation themselves.
  • Fig.6 is a block diagram of a system 600 in the invention that enables interaction among clients.
  • System 600 comprises a server 302 coupled to clients 602 and 604 via network 606.
  • the configuration of server 302 is discussed above.
  • Server 302 has a copy of a world model with objects 608, 610, 612 and 614.
  • Clients 602 and 604 have similar copies of the world model with similar objects 608-614.
  • the copies of the world model are maintained consistent throughout system 600 through state information sent by server 302. This forms the basis for the emulation of streaming a graphics animation, a video animation or an audio file as discussed above.
  • Clients 602 and 604 now also share objects 616 and 618 with each other, but not with server 302.
  • client 602 is owner of an object "viewpoint" that represents the view of the graphics representation of the soccer match chosen by client 602.
  • Based on the state information received from server 302, client 602 renders a graphics image of the match as if it were viewed from a particular position in the stadium. The rendering of the image is based on the combination of current state information received from server 302, the local world model copy and the user input via user input means 620, e.g., a joystick or mouse, that enables selection of the viewpoint.
  • Client 604 shares the viewpoint object that is being kept consistent with the one at client 602 under the control of the latter and using the shared-object support protocol. Objects 616-618 are not shared with other clients on the system.
  • System 600 may even be a totally distributed system without a server with main ownership. Each respective one of multiple clients owns respective ones of the objects in the world model that is perceptible from all clients. The owner of an object triggers a state change that gets propagated through the network so as to maintain consistency within the shared world model. In a multi-user application the effect is a continuous play-out at each client without severe bandwidth limitations as a consequence of the emulation of streaming of the animation.
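  • An illustrative sketch of the client-to-client sharing of the "viewpoint" object described above (the message shape and names are assumptions; network delivery between the two clients is simulated by a direct update):

```python
# Client 602 owns the viewpoint object; its state changes reach client 604
# but not the server.
viewpoint_602 = {"owner": "client_602", "position": (50.0, 12.0, -30.0),
                 "look_at": (0.0, 0.0, 0.0)}
viewpoint_604 = dict(viewpoint_602)           # client 604's shared copy

def on_joystick_input(new_position):
    """Only the owner changes the object; the peer receives the change."""
    viewpoint_602["position"] = new_position
    peer_message = {"object": "viewpoint", "changes": {"position": new_position}}
    viewpoint_604.update(peer_message["changes"])   # stands in for network delivery

on_joystick_input((55.0, 12.0, -28.0))
assert viewpoint_604["position"] == viewpoint_602["position"]
```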
  • Figs. 7-8 are diagrams that illustrate using the above systems 300, 400, 500 and 600 within the context of a video game based on a broadcast event, here a motorcycle race such as the Dutch TT.
  • Fig. 7 is a diagram of a further system 700 in the invention.
  • Fig. 7 shows the lay-out of a real race track 702, here the famous circuit of Assen, the Netherlands, where the Dutch TT is held annually. The length of the TT circuit is 6.05 km (3.76 miles).
  • the real-life racers, represented by dots 704, 706, 708, ..., 718 on track 702, each have a transmitter onboard that sends to a receiving station 720 their current positions and velocities (speed and direction), and preferably parameters such as the roll (banking angle), pitch ("wheelie") and yaw (power slide when accelerating hard out of a bend) of the motorcycle.
  • Station 720 sends the data received from the racers as state changes to client machines 722, 724, ..., 726 of the end users over a network 728, e.g., the Internet.
  • Clients 722-726 have received in advance a copy of a 3D graphics world model (see above) of track 702, e.g., downloaded in advance via the Internet or on a diskette.
  • the positions and velocities of the objects representing the real life racers on track 702 are determined by the state changes received at the user's machine 722, 724 or 726.
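  • The kind of per-racer state change suggested by this description might look as follows on the client side (field names, types and units are assumptions, not the patent's format):

```python
# Apply an incoming racer state change to the client's local world model copy.
from dataclasses import dataclass

@dataclass
class RacerState:
    racer_id: int
    position: tuple    # position on the track
    velocity: tuple    # speed and direction
    roll: float        # banking angle
    pitch: float       # "wheelie"
    yaw: float         # power slide

def apply_state_change(world_model, update):
    """Overwrite the racer object's dynamics with the latest reported state."""
    racer = world_model.setdefault(update.racer_id, {})
    racer.update(position=update.position, velocity=update.velocity,
                 roll=update.roll, pitch=update.pitch, yaw=update.yaw)

world = {}                                   # local copy of the world model
apply_state_change(world, RacerState(704, (120.0, 4.5), (61.0, 0.12), 42.0, 0.0, 3.5))
print(world[704]["roll"])                    # -> 42.0
```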
  • the objects representing the real racers form the dynamically changing virtual environment as perceived by the end user.
  • the objects can be designed to resemble the real racers at the user's machine, complete with the racing numbers and the colors of the helmets, overalls and fairing, etc.
  • the graphics appearance can be part of the world model downloaded to the user's machine.
  • the end user has his or her controllable object that is designed to provide the user with the view as if he or she were on a motorcycle on the track and participating in the competition as one of the racers, but now safely in the virtual environment.
  • This user-controlled racer is interacted with by the user through his or her client machine 722, 724 or 726, via a suitable playstation or other user-input device 730, 732 or 734.
  • Video games wherein the user races against other users or against the program wherein racers are software program controlled are widely known and form major attractions in the arcades.
  • the quality of the 3D-graphics of video games currently commercially available for use at home is astonishingly high. This quality contributes substantially to the sense of immersion and of realism.
  • the invention now adds another dimension of realism by providing the nearest equivalent to racing against the real professionals in a real-life race, without any risk of the user causing danger or getting hurt.
  • the world model as provided may leave open as an option to the user how the objects are to be modeled in graphics on his or her client.
  • the graphics representation of the soccer players may be chosen to reflect any shape that the user can think of, especially after a six-pack.
  • the racers in the virtual environment that now is provided with user navigation capabilities may be represented as, say, camel riders or ostrich riders, or as members of the Swiss guard running along the track, or a combination thereof.
  • the appearance of the cars or motorcycles of real life may be replaced by that of yesteryear, or may be designed by the user him- or herself, etc.
  • Fig. 8 illustrates the surreal appearance when the objects representing the real-life racers are modeled after a combination of the examples introduced above, which may well contribute to the entertainment level of the video game.
  • the real life racers are represented as objects that undergo state changes.
  • the actual graphics appearance on clients 722-726 is arbitrary so long as the motion and position of the graphics representation is kept consistent with the dynamics of the world model.
  • Although the invention has been presented as an interactive application tied in real time to an event during the event's broadcasting, it is clear that the interactive application need not be used at the time the real-life event is actually taking place.
  • a service provider could broadcast the state changes representing the competitive event at any time and let subscribers race against the professionals.
  • the invention may be considered as the video-equivalent of karaoke.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Processing Or Creating Images (AREA)
  • Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)

Abstract

In a broadcast application over a client-server network, streaming of animation data is emulated over the Internet for a large number of clients. The animation is regarded as a succession of states. State information is sent to the clients instead of the graphics data itself. The clients generate the animation data themselves under control of the state information. The server and the clients communicate using a shared-object protocol. Streaming and broadcasting are thus achieved without running into excessive network bandwidth problems. This approach makes it possible to map real-life events, for example a motor race, onto a virtual environment and to let the user take part in a virtual race against real-life professional racers, the dynamics of the virtual environment being determined by the state changes sent to the user.
EP99944472A 1998-08-24 1999-08-23 Real time video game uses emulation of streaming over the internet in a broadcast event Withdrawn EP1047481A1 (fr)

Applications Claiming Priority (5)

Application Number Priority Date Filing Date Title
US09/138,782 US6697869B1 (en) 1998-08-24 1998-08-24 Emulation of streaming over the internet in a broadcast application
US138782 1998-08-24
US149950 1998-09-09
US09/149,950 US6557041B2 (en) 1998-08-24 1998-09-09 Real time video game uses emulation of streaming over the internet in a broadcast event
PCT/EP1999/006160 WO2000010663A1 (fr) 1998-08-24 1999-08-23 Real time video game uses emulation of streaming over the internet in a broadcast event

Publications (1)

Publication Number Publication Date
EP1047481A1 true EP1047481A1 (fr) 2000-11-02

Family

ID=26836547

Family Applications (1)

Application Number Title Priority Date Filing Date
EP99944472A Withdrawn EP1047481A1 (fr) 1998-08-24 1999-08-23 Des jeux videos en temps reels utilisent l'emulation du mode continu sur internet dans un evenement de diffusion

Country Status (4)

Country Link
EP (1) EP1047481A1 (fr)
JP (1) JP5160704B2 (fr)
CN (1) CN1275091A (fr)
WO (1) WO2000010663A1 (fr)

Families Citing this family (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR100429838B1 (ko) 2000-03-14 2004-05-03 Samsung Electronics Co., Ltd. Method and apparatus for processing user requests using an upstream channel in an interactive multimedia content service
KR100406554B1 (ko) * 2000-04-12 2003-11-22 유재우 Method for providing a car racing competition in a computer network system, and recording medium therefor
EP1217567A1 * 2000-12-21 2002-06-26 Volker Goeman Method for computer-simulated participation in a real race
NO314326B1 * 2001-04-30 2003-03-03 Gruenderene Stonebridge Da Method and system for supporting the execution of a computer-based game with multiple participants
US6894690B2 (en) 2001-06-20 2005-05-17 Engineering Technology Associates, Inc. Method and apparatus for capturing and viewing a sequence of 3-D images
KR20030035138A (ko) * 2001-10-30 2003-05-09 Electronics and Telecommunications Research Institute Method for transmitting state information in a client-server based networked virtual environment
JP2004104556A (ja) * 2002-09-11 2004-04-02 Nippon Telegr & Teleph Corp <Ntt> Video playback method, video playback device, video playback program, and recording medium for the video playback program
JP2004105671A (ja) * 2002-09-16 2004-04-08 Genki Kk Spatial position sharing system, data sharing system, network game system, and network game client
US8549574B2 (en) * 2002-12-10 2013-10-01 Ol2, Inc. Method of combining linear content and interactive content compressed together as streaming interactive video
JP4064429B2 (ja) 2006-07-26 2008-03-19 Konami Digital Entertainment Co., Ltd. Game system, game terminal, and server device
JP4064430B2 (ja) 2006-07-26 2008-03-19 Konami Digital Entertainment Co., Ltd. Game system, game terminal, and server device
AU2008243102B2 (en) 2007-11-05 2012-01-19 Videobet Interactive Sweden AB A gaming system and a method of managing bandwidth usage in a gaming system
JP2010142300A (ja) * 2008-12-16 2010-07-01 Toyota Motor Corp Terminal device for driving simulation, driving simulation execution method, and program therefor
JP6405479B1 (ja) * 2018-02-19 2018-10-17 Konami Digital Entertainment Co., Ltd. Game system, game terminal, and program

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5659691A (en) * 1993-09-23 1997-08-19 Virtual Universe Corporation Virtual reality network with selective distribution and updating of data to reduce bandwidth requirements
US5714997A (en) * 1995-01-06 1998-02-03 Anderson; David P. Virtual reality television system

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See references of WO0010663A1 *

Also Published As

Publication number Publication date
JP2002523156A (ja) 2002-07-30
CN1275091A (zh) 2000-11-29
WO2000010663A1 (fr) 2000-03-02
JP5160704B2 (ja) 2013-03-13

Similar Documents

Publication Publication Date Title
US6557041B2 (en) Real time video game uses emulation of streaming over the internet in a broadcast event
EP1391226B1 (fr) Méthode et appareils pour implémenter des services de divertissement hautement interactif en utilisant la technologie de flux de média, permettant la mise à disposition à distance de services de réalité virtuelle
WO2000010663A1 (fr) Real time video game uses emulation of streaming over the internet in a broadcast event
EP1690378B1 (fr) Dispositif et procede destines a transmettre et synchroniser des informations multisensorielles avec des donnees audio/video
US20010000962A1 (en) Terminal for composing and presenting MPEG-4 video programs
Taleb et al. Extremely interactive and low-latency services in 5G and beyond mobile systems
CN1218241A (zh) Apparatus and method for transmitting and receiving a data stream representing a three-dimensional virtual space
Battista et al. MPEG-4: A multimedia standard for the third millennium. 2
Hijiri et al. A spatial hierarchical compression method for 3D streaming animation
WO2000042773A9 (fr) Systeme et procede de mise en oeuvre de video interactive
MXPA00003828A (en) Emulation of streaming over the internet in a broadcast application
JPH11102450A (ja) Apparatus and method for transmitting and receiving a data stream representing a three-dimensional virtual space
Signès et al. MPEG-4: Scene Representation and Interactivity
Katto et al. System architecture for synthetic/natural hybrid coding and some experiments
Darlagiannis COSMOS: Collaborative system framework based on MPEG-4 objects and streams.
Doi et al. Art gallery information service system on IP over ATM network
Zoi et al. An open framework for prototyping the delivery of multimedia-rich virtual learning experiences through the Internet
Defée et al. COMIQS system for commercial presentations on the Internet
MXPA00012717A (en) Terminal for composing and presenting mpeg-4 video programs

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

17P Request for examination filed

Effective date: 20000904

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): DE FR GB

17Q First examination report despatched

Effective date: 20040405

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION IS DEEMED TO BE WITHDRAWN

18D Application deemed to be withdrawn

Effective date: 20050426