WO2006000786A1 - Real-time voice-chat system for a networked multiplayer game - Google Patents

Real-time voice-chat system for a networked multiplayer game

Info

Publication number
WO2006000786A1
Authority
WO
WIPO (PCT)
Prior art keywords
control terminal
games control
audio data
games
game
Prior art date
Application number
PCT/GB2005/002488
Other languages
French (fr)
Inventor
Gregory Duddle
Original Assignee
Sony Computer Entertainment Europe Limited
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sony Computer Entertainment Europe Limited
Publication of WO2006000786A1

Classifications

    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63F: CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00: Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/85: Providing additional services to players
    • A63F13/87: Communicating with other players during game play, e.g. by e-mail or chat
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04L: TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L9/00: Cryptographic mechanisms or cryptographic arrangements for secret or secure communications; Network security protocols
    • H04L9/40: Network security protocols
    • A63F13/12
    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63F: CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00: Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/30: Interconnection arrangements between game servers and game devices; Interconnection arrangements between game devices; Interconnection arrangements between game servers
    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63F: CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00: Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/50: Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by details of game servers
    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63F: CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00: Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/50: Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by details of game servers
    • A63F2300/57: Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by details of game servers details of game services offered to the player
    • A63F2300/572: Communication between players during game play of non game information, e.g. e-mail, chat, file transfer, streaming of audio and streaming of video

Definitions

  • This invention relates to electronic game processing.
  • Electronic games are well-known and may be categorised in many different ways, such as “racing games” (in which a player controls a vehicle around a course within the game environment); “fighting games” (in which a player controls a character within the game environment and combats other game characters); “maze games” (in which a player controls a character around a maze within the game environment); combinations of these types of game; etc.
  • In these games, a player may accumulate a score as the game is played; alternatively, a player's turn may be assessed by the level or position within the game that the player reaches. It is well-known for these games to provide special attributes to a player's game character if certain events occur.
  • For example, in a fighting game there may be armour-objects distributed within the game environment (or they may appear from time to time) and if a game character collects one of these armour-objects then that character gains advantages in combat.
  • As another example, the game may be arranged so that, if a game character reaches a certain level or is located at a certain position within the game environment, then various events happen, such as extending time limits or "healing" the game character (i.e. increasing the value of a health property associated with the game character).
  • It is also well-known for electronic games to be so-called multi-player games, in which more than one human player is involved in the game. The players may collaborate in teams or may play as individuals against each other.
  • Players may provide their input to a game by general hardware controllers (such as keyboards and mice) or by more specialised hardware controllers.
  • Several controllers may be connected to a single electronic games machine (such as a personal computer or a dedicated games console) to facilitate multi-player games.
  • With developments in network technology, several games machines may be connected via a communications network, such as a local area network (LAN), a wide area network (WAN), or the Internet.
  • Several players may then participate in a game even when they are located at geographically different locations.
  • Networks dedicated to the communication of game data have been developed.
  • The networks used for such multi-player networked games may, in addition to the games machines, also comprise other network machines such as network servers. It is known for such networked electronic games to allow the human players to communicate with each other over the network. This may be achieved, for example, by the players composing textual messages (for example, by typing on a keyboard) and sending these messages to other players over the network. It is also known for such networked electronic games to allow audio data (such as voice data) to be distributed across the network, the audio data being input from the player via a microphone, for example.
  • This invention provides a network comprising: a plurality of processing apparatus, at least two of the processing apparatus being games control terminals, each games control terminal being operable to transmit audio data to and receive audio data from another one of the games control terminals and to render received audio data; the network providing controlling logic operable to determine whether the games control terminals may perform an audio-communication-task; an audio-communication-task being one or more of the transmission of audio data from a games control terminal to another games control terminal; the reception by a games control terminal of audio data transmitted from another games control terminal; and the rendering by a games control terminal of audio data received from another games control terminal; the determination being dependent upon a game status associated with at least one of the games control terminals and/or at least one operator of a games control terminal, the game status comprising at least one game score associated with each games control terminal and/or each operator.
  • This invention recognises a problem with networked electronic games that allow audio data (and particularly voice data) to be communicated across the network. If several players participate in such a networked game and some or all of them talk over the network simultaneously, then it is very difficult for a player to discern who is talking to whom, and what is being said, as each player hears all of the audio data at once. This is in addition to any audio that is being generated and rendered locally to the player (i.e. not from the network). This will reduce the overall appeal of the electronic game, or, at the very least, will discourage players from participating in verbal communication across the network.
  • Accordingly, the invention provides a mechanism whereby the ability to talk to and/or listen to players across the network is either granted to or removed from a player depending on the current game status associated with one or more of the participating players.
  • In this way, the number of players who can communicate audio data simultaneously over the network may be reduced to a more practical level, thus preventing too many players from being able to talk simultaneously.
  • A player is therefore able to hear more clearly the content of audio communication occurring over the network and, if permitted, may contribute to the audio communication in the knowledge that the contribution will be discernible by the other players.
  • The invention also provides a network comprising: a plurality of processing apparatus, at least two of the processing apparatus being games control terminals, each games control terminal being operable to transmit audio data to and receive audio data from another one of the games control terminals and to render received audio data; the network providing controlling logic operable to determine whether the games control terminals may perform an audio-communication-task; an audio-communication-task being one or more of the transmission of audio data from a games control terminal to another games control terminal; the reception by a games control terminal of audio data transmitted from another games control terminal; and the rendering by a games control terminal of audio data received from another games control terminal; the determination being dependent upon a game status associated with at least one of the games control terminals and/or at least one operator of a games control terminal, the game status comprising one or more communication-enabling-actions performed by a game character, the network associating at least one game character with each games control terminal and/or each operator.
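To make the claimed controlling logic concrete, here is a minimal illustrative sketch (not taken from the patent) of how a permission check could gate the three audio-communication-tasks on a game status; the class, field and function names are assumptions for illustration only.

```python
from enum import Enum, auto
from dataclasses import dataclass, field
from typing import Dict, Optional

class AudioTask(Enum):
    TRANSMIT = auto()  # sending audio data to another games control terminal
    RECEIVE = auto()   # accepting audio data sent from another terminal
    RENDER = auto()    # playing received audio data to the operator

@dataclass
class GameStatus:
    # Illustrative status fields drawn from the embodiments described below.
    score: int = 0
    objects_collected: set = field(default_factory=set)

class ControllingLogic:
    """Gates each audio-communication-task on the terminals' game status."""

    def __init__(self, statuses: Dict[str, GameStatus]):
        self.statuses = statuses

    def talking_player(self) -> Optional[str]:
        # One possible policy (cf. Figure 5): the highest score holds the channel.
        if not self.statuses:
            return None
        return max(self.statuses, key=lambda t: self.statuses[t].score)

    def may_perform(self, terminal: str, task: AudioTask) -> bool:
        talker = self.talking_player()
        if task is AudioTask.TRANSMIT:
            return terminal == talker      # only the talking-player may send
        return talker is not None          # others may receive and render it
```

Each system unit (or a server) could host such logic and enforce the decision by blocking transmission, reception or rendering, matching the three handling options described in the embodiments below.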
  • Figure 1 schematically illustrates the overall system architecture of the PlayStation2
  • Figure 2 schematically illustrates the architecture of an Emotion Engine
  • Figure 3 schematically illustrates the configuration of a Graphics Synthesiser
  • Figure 4 schematically illustrates four system units networked together
  • Figure 5 is a schematic flow chart of a first embodiment for controlling audio communication in a networked game
  • Figure 6 is a schematic flow chart of a second embodiment for controlling audio communication in a networked game
  • Figure 7 is a schematic flow chart of a third embodiment for controlling audio communication in a networked game
  • Figure 8 is a schematic flow chart of a fourth embodiment for controlling audio communication in a networked game
  • Figure 9 schematically illustrates an example positioning of game characters within a game environment.
  • FIG. 1 schematically illustrates the overall system architecture of the PlayStation2.
  • A system unit 10 is provided, with various peripheral devices connectable to the system unit.
  • The system unit 10 comprises: an Emotion Engine 100; a Graphics Synthesiser 200; a sound processor unit 300 having dynamic random access memory (DRAM); a read only memory (ROM) 400; a compact disc (CD) and digital versatile disc (DVD) reader 450; a Rambus Dynamic Random Access Memory (RDRAM) unit 500; and an input/output processor (IOP) 700 with dedicated RAM 750.
  • An (optional) external hard disk drive (HDD) 390 may be connected.
  • The input/output processor 700 has two Universal Serial Bus (USB) ports 715 and an iLink or IEEE 1394 port (iLink is the Sony Corporation implementation of the IEEE 1394 standard).
  • The IOP 700 handles all USB, iLink and game controller data traffic. For example, when a user is playing a game, the IOP 700 receives data from the game controller and directs it to the Emotion Engine 100, which updates the current state of the game accordingly.
  • The IOP 700 has a Direct Memory Access (DMA) architecture to facilitate rapid data transfer rates. DMA involves transfer of data from main memory to a device without passing it through the CPU.
  • The USB interface is compatible with Open Host Controller Interface (OHCI) and can handle data transfer rates of between 1.5 Mbps and 12 Mbps.
  • Provision of these interfaces means that the PlayStation2 is potentially compatible with peripheral devices such as video cassette recorders (VCRs), digital cameras, microphones, set-top boxes, printers, keyboards, mice and joysticks.
  • In order for successful data communication to occur with a peripheral device connected to a USB port 715, an appropriate piece of software such as a device driver should be provided.
  • Device driver technology is very well known and will not be described in detail here, except to say that the skilled man will be aware that a device driver or similar software interface may be required in the embodiment described here.
  • In the present embodiment, a USB microphone 730 is connected to the USB port. It will be appreciated that the USB microphone 730 may be a hand-held microphone or may form part of a head-set that is worn by the human operator.
  • The microphone includes an analogue-to-digital converter (ADC) and a basic hardware-based real-time data compression and encoding arrangement, so that audio data are transmitted by the microphone 730 to the USB port 715 in an appropriate format, such as 16-bit mono PCM (an uncompressed format) for decoding at the PlayStation 2 system unit 10.
  • Apart from the USB ports, two other ports 705, 710 are proprietary sockets allowing the connection of a proprietary non-volatile RAM memory card 720 for storing game-related information, a hand-held game controller 725 or a device (not shown) mimicking a hand-held controller, such as a dance mat.
  • The system unit 10 may be connected to a network adapter 805 that provides an interface (such as an Ethernet interface) to a network.
  • This network may be, for example, a LAN, a WAN or the Internet.
  • The network may be a general network or one that is dedicated to game related communication.
  • The network adapter 805 allows data to be transmitted to and received from other system units 10 that are connected to the same network (the other system units 10 also having corresponding network adapters 805).
  • The Emotion Engine 100 is a 128-bit Central Processing Unit (CPU) that has been specifically designed for efficient simulation of 3 dimensional (3D) graphics for games applications.
  • The Emotion Engine components include a data bus, cache memory and registers, all of which are 128-bit. This facilitates fast processing of large volumes of multi-media data.
  • The Emotion Engine also comprises MPEG2 decoder circuitry which allows for simultaneous processing of 3D graphics data and DVD data.
  • The Emotion Engine performs geometrical calculations including mathematical transforms and translations and also performs calculations associated with the physics of simulation objects, for example, calculation of friction between two objects. It produces sequences of image rendering commands which are subsequently utilised by the Graphics Synthesiser 200.
  • The image rendering commands are output in the form of display lists.
  • A display list is a sequence of drawing commands that specifies to the Graphics Synthesiser which primitive graphic objects (e.g. points, lines, triangles, sprites) to draw on the screen and at which co-ordinates.
  • The Emotion Engine 100 can asynchronously generate multiple display lists.
  • The Graphics Synthesiser 200 is a video accelerator that performs rendering of the display lists produced by the Emotion Engine 100.
  • The Graphics Synthesiser 200 includes a graphics interface unit (GIF) which handles, tracks and manages the multiple display lists.
  • The rendering function of the Graphics Synthesiser 200 can generate image data that supports several alternative standard output image formats, i.e., NTSC/PAL, High Definition Digital TV and VESA.
  • In general, the rendering capability of graphics systems is defined by the memory bandwidth between a pixel engine and a video memory, each of which is located within the graphics processor.
  • Conventional graphics systems use external Video Random Access Memory (VRAM) connected to the pixel logic via an off-chip bus which tends to restrict available bandwidth.
  • However, the Graphics Synthesiser 200 of the PlayStation2 provides the pixel logic and the video memory on a single high-performance chip which allows for a comparatively large 38.4 Gigabyte per second memory access bandwidth.
  • The Graphics Synthesiser is theoretically capable of achieving a peak drawing capacity of 75 million polygons per second. Even with a full range of effects such as textures, lighting and transparency, a sustained rate of 20 million polygons per second can be drawn continuously.
  • Accordingly, the Graphics Synthesiser 200 is capable of rendering a film-quality image.
  • The Sound Processor Unit (SPU) 300 is effectively the soundcard of the system and is capable of recognising 3D digital sound such as Digital Theater Surround (DTS®) sound and AC-3 (also known as Dolby Digital), which is the sound format used for DVDs.
  • A display and sound output device 305, such as a video monitor or television set with an associated loudspeaker arrangement 310, is connected to receive video and audio signals from the Graphics Synthesiser 200 and the Sound Processor Unit 300.
  • The main memory supporting the Emotion Engine 100 is the RDRAM (Rambus Dynamic Random Access Memory) module 500 produced by Rambus Incorporated.
  • This RDRAM memory subsystem comprises RAM, a RAM controller and a bus connecting the RAM to the Emotion Engine 100.
  • Figure 2 schematically illustrates the architecture of the Emotion Engine 100 of Figure 1.
  • The Emotion Engine 100 comprises: a floating point unit (FPU) 104; a central processing unit (CPU) core 102; vector unit zero (VU0) 106; vector unit one (VU1) 108; a graphics interface unit (GIF) 110; an interrupt controller (INTC) 112; a timer unit 114; a direct memory access controller (DMAC) 116; an image data processor unit (IPU) 118; a dynamic random access memory controller (DRAMC) 120; and a sub-bus interface (SIF) 122. All of these components are connected via a 128-bit main bus 124.
  • The CPU core 102 is a 128-bit processor clocked at 300 MHz.
  • The CPU core has access to 32 MB of main memory via the DRAMC 120.
  • The CPU core 102 instruction set is based on MIPS III RISC with some MIPS IV RISC instructions together with additional multimedia instructions.
  • MIPS III and IV are Reduced Instruction Set Computer (RISC) instruction set architectures proprietary to MIPS Technologies, Inc. Standard instructions are 64-bit, two-way superscalar, which means that two instructions can be executed simultaneously.
  • Multimedia instructions use 128-bit instructions via two pipelines.
  • The CPU core 102 comprises a 16KB instruction cache, an 8KB data cache and a 16KB scratchpad RAM which is a portion of cache reserved for direct private usage by the CPU.
  • The FPU 104 serves as a first co-processor for the CPU core 102.
  • The vector unit 106 acts as a second co-processor.
  • The FPU 104 comprises a floating point product sum arithmetic logic unit (FMAC) and a floating point division calculator (FDIV). Both the FMAC and FDIV operate on 32-bit values, so when an operation is carried out on a 128-bit value (composed of four 32-bit values) the operation can be carried out on all four parts concurrently. For example, two vectors can be added together at the same time.
  • The vector units 106 and 108 perform mathematical operations and are essentially specialised FPUs that are extremely fast at evaluating the multiplication and addition of vector equations.
  • Vector unit zero 106 can work as a coprocessor to the CPU core 102 via a dedicated 128-bit bus so it is essentially a second specialised FPU.
  • Vector unit one 108 has a dedicated bus to the Graphics Synthesiser 200 and thus can be considered as a completely separate processor.
  • The inclusion of two vector units allows the software developer to split up the work between different parts of the CPU and the vector units can be used in either serial or parallel connection.
  • Vector unit zero 106 comprises 4 FMACS and 1 FDIV. It is connected to the CPU core 102 via a coprocessor connection. It has 4 KB of vector unit memory for data and 4 KB of micro-memory for instructions. Vector unit zero 106 is useful for performing physics calculations associated with the images for display. It primarily executes non-patterned geometric processing together with the CPU core 102. Vector unit one 108 comprises 5 FMACS and 2 FDIVs. It has no direct path to the CPU core 102, although it does have a direct path to the GIF unit 110. It has 16 KB of vector unit memory for data and 16 KB of micro-memory for instructions. Vector unit one 108 is useful for performing transformations.
  • The GIF 110 is an interface unit to the Graphics Synthesiser 200. It converts data according to a tag specification at the beginning of a display list packet and transfers drawing commands to the Graphics Synthesiser 200 whilst mutually arbitrating multiple transfers.
  • The interrupt controller (INTC) 112 serves to arbitrate interrupts from peripheral devices, except the DMAC 116.
  • The timer unit 114 comprises four independent timers with 16-bit counters. The timers are driven either by the bus clock (at 1/16 or 1/256 intervals) or via an external clock.
  • The DMAC 116 handles data transfers between main memory and peripheral processors or between main memory and the scratch pad memory. It arbitrates the main bus 124 at the same time.
  • The image processing unit (IPU) 118 is an image data processor that is used to expand compressed animations and texture images. It performs I-PICTURE Macro-Block decoding, colour space conversion and vector quantisation.
  • The sub-bus interface (SIF) 122 is an interface unit to the IOP 700. It has its own memory and bus to control I/O devices such as sound chips and storage devices.
  • Figure 3 schematically illustrates the configuration of the Graphics Synthesiser 200.
  • The Graphics Synthesiser comprises: a host interface 202; a set-up/rasterizing unit; a pixel pipeline 206; a memory interface 208; a local memory 212 including a frame page buffer 214 and a texture page buffer 216; and a video converter 210.
  • The host interface 202 transfers data with the host (in this case the CPU core 102 of the Emotion Engine 100). Both drawing data and buffer data from the host pass through this interface.
  • The output from the host interface 202 is supplied to the Graphics Synthesiser 200, which develops the graphics to draw pixels based on vertex information received from the Emotion Engine 100, and calculates information such as RGBA value, depth value (i.e. Z-value), texture value and fog value for each pixel.
  • The RGBA value specifies the red, green, blue (RGB) colour components and the A (Alpha) component represents the opacity of an image object.
  • The Alpha value can range from completely transparent to totally opaque.
  • The pixel data is supplied to the pixel pipeline 206, which performs processes such as texture mapping, fogging and Alpha-blending and determines the final drawing colour based on the calculated pixel information.
  • The pixel pipeline 206 comprises 16 pixel engines PE1, PE2, ..., PE16 so that it can process a maximum of 16 pixels concurrently.
  • The pixel pipeline 206 runs at 150 MHz with 32-bit colour and a 32-bit Z-buffer.
  • The memory interface 208 reads data from and writes data to the local Graphics Synthesiser memory 212.
  • The memory interface 208 also reads from local memory 212 the RGBA values for the current contents of the frame buffer.
  • The local memory 212 is a 32 Mbit (4 MB) memory that is built in to the Graphics Synthesiser 200. It can be organised as a frame buffer 214, a texture buffer 216 and a 32-bit Z-buffer 215.
  • The frame buffer 214 is the portion of video memory where pixel data such as colour information is stored.
  • The Graphics Synthesiser uses a 2D to 3D texture mapping process to add visual detail to 3D geometry. Each texture may be wrapped around a 3D image object and is stretched and skewed to give a 3D graphical effect.
  • The texture buffer is used to store the texture information for image objects.
  • The Z-buffer 215 is also known as the depth buffer.
  • Images are constructed from basic building blocks known as graphics primitives or polygons. When a polygon is rendered with Z-buffering, the depth value of each of its pixels is compared with the corresponding value stored in the Z-buffer.
  • The local memory 212 has a 1024-bit read port and a 1024-bit write port for accessing the frame buffer and Z-buffer and a 512-bit port for texture reading.
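As an illustration of the Z-buffering comparison described above, here is a small sketch (not from the patent) of the depth test applied when drawing a polygon's pixel; the buffer names and resolution are illustrative assumptions.

```python
import numpy as np

HEIGHT, WIDTH = 480, 640

# Illustrative buffers: colour (frame buffer) and depth (Z-buffer).
frame_buffer = np.zeros((HEIGHT, WIDTH, 4), dtype=np.uint8)    # RGBA per pixel
z_buffer = np.full((HEIGHT, WIDTH), np.inf, dtype=np.float32)  # depth per pixel

def draw_pixel(x: int, y: int, depth: float, rgba: tuple) -> None:
    """Draw a polygon's pixel only if it is nearer than the stored depth."""
    if depth < z_buffer[y, x]:      # compare with the corresponding Z-buffer value
        z_buffer[y, x] = depth      # record the new nearest depth
        frame_buffer[y, x] = rgba   # overwrite the colour in the frame buffer
```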
  • The video converter 210 is operable to display the contents of the frame memory in a specified output format.
  • Figure 4 schematically illustrates four system units 10a, 10b, 10c and 10d networked together over a network 800, which may be, for example, a local area network (LAN), a wide area network (WAN), and/or the Internet.
  • The network 800 may be a network that is dedicated to the communication of game data, designed to be suited to the communication of such data.
  • The network 800 may comprise other devices (not shown) such as network servers.
  • The system units 10a, 10b, 10c and 10d are located sufficiently far apart that words spoken by one player cannot be heard directly by another. This represents a normal use of such system units but, of course, is not an essential technical feature of the invention.
  • Each of the system units 10a, 10b, 10c and 10d is connected to a network adapter 805a, 805b, 805c and 805d respectively.
  • The network adapters 805a, 805b, 805c and 805d provide an Ethernet interface to the network 800 for the system units 10a, 10b, 10c and 10d.
  • The system units 10a, 10b, 10c and 10d may be networked together via different means, for example by making use of their iLink ports.
  • The four system units 10a, 10b, 10c and 10d are arranged to collaborate so that a networked game may be played by human players 810a, 810b, 810c', 810c'' and 810d.
  • The third system unit 10c is being used by more than one player. It will be appreciated that this is an example arrangement of system units 10 and players and that, in principle, any number of system units 10 and players may be involved. In practice, due to bandwidth and data manageability constraints, there may be an upper limit on the number of system units 10 and/or players. For example, currently, up to twenty players may be involved in a networked game.
  • The data flowing over the network 800 between the system units 10a, 10b, 10c and 10d may comprise a variety of information, such as one or more of: (i) the inputs of the players 810a, 810b, 810c', 810c'' and 810d via their respective hand-held game controllers 725; (ii) data relating to a game character associated with each of the players 810a, 810b, 810c', 810c'' and 810d, such as position, health and game items collected; (iii) data relating to other game characters (such as computer generated and controlled game monsters); (iv) the current scores of the players 810a, 810b, 810c', 810c'' and 810d; (v) actions performed by the players 810a, 810b, 810c', 810c'' and 810d and how they affect the game environment, such as opening a door in the game environment; and (vi) audio data, such as voice data input by the players 810a, 810b, 810c', 810c'' and 810d.
  • Each of the system units 10a, 10b, 10c and 10d is provided with its own DVD storing a version of the game software.
  • A game session is initiated by the player 810a and the other players 810b, 810c', 810c'' and 810d are then invited to join the game session. Once all of the players 810a, 810b, 810c', 810c'' and 810d have joined the game session, they may then play the game together.
  • Each of the system units 10a, 10b, 10c and 10d may process game tasks that are essentially local, such as rendering video and audio data and processing a player's input.
  • Other game processing tasks concern more than one of the system units 10a, 10b, 10c and 10d, such as maintaining a table of game scores and deciding the actions of computer generated monsters (which are not controlled by any of the players 810a, 810b, 810c', 810c'' or 810d).
  • Such tasks may be performed by just one of the system units 10a, 10b, 10c or 10d, each of the system units 10a, 10b, 10c and 10d being informed of the outcome of the processing as appropriate, so that all of the system units 10a, 10b, 10c and 10d can operate together with a consistent game environment.
  • One or more other networked devices such as a network server (not shown) may undertake some of the processing in order to provide the networked game, for example deciding the actions of computer generated monsters or maintaining a table of game scores.
  • Figure 5 is a schematic flow chart of a first embodiment for controlling audio communication in a networked game.
  • In this embodiment, the control of audio communication (such as a talk-channel) is granted to the player who currently possesses or has achieved the highest score.
  • The player with the highest score can provide audio data across the network to other players (i.e. talk at other players), but none of the other players can provide audio data across the network (i.e. they can only listen).
  • A player's actions are thus rewarded with control of audio communication if those actions result in the player holding the highest score.
  • One of the system units 10 involved in the networked game may maintain a table of the players' current game scores. This system unit 10 shall be referred to as the scoring system unit.
  • At a step S900, the scoring system unit uses the table of scores to determine which player currently has the highest score. If two or more players share the highest score, then one of them is selected. This may be done, for example, by a random selection; alternatively, the selection may be based on other game statistics (such as accuracy of aim in a game involving firing weapons).
  • At a step S902, the control of audio communication is granted to the player who has been determined to have the highest score. This player shall be referred to as the talking-player. The result of this is that audio data can be transferred across the network from the talking-player to the other players; in contrast, no audio data from the other players can be transferred across the network.
  • The talking-player can therefore talk to/at the other players over the network whilst the other players are not able to talk to anybody at all over the network.
  • The scoring system unit instructs the system units 10 involved in the networked game as to which player is the talking-player; each of the system units 10 then handles audio communication appropriately.
  • The scoring system unit provides explicit instructions to the other system units 10 about how to handle audio communication.
  • Appropriate handling of audio communication may be achieved by: (i) the talking-player's system unit 10 allowing itself to transmit audio data across the network and the other system units 10 prohibiting themselves from transmitting audio data across the network; (ii) every system unit 10 allowing itself to receive audio data from the talking player's system unit 10 and prohibiting itself from receiving audio data from any other system unit 10 (by blocking certain network addresses for example); or (iii) every system unit 10 allowing itself to render audio data received from the talking player's system unit 10 and prohibiting itself from rendering audio data received from any other system unit 10.
  • At a step S904, some or all of the players involved in the network game are informed of the identity of the talking-player.
  • The talking-player is given control of audio communication for at least a minimum period of time, for example 20 seconds.
  • To achieve this, the scoring system unit resets a wait period.
  • The scoring system unit then tests for the expiration of the wait period. Once the wait period has expired, processing returns to the step S900.
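A minimal sketch of this score-based arbitration loop follows (illustrative only, not from the patent); `grant_talk_channel` and `announce` are assumed callbacks supplied by the game, and the wait period matches the 20-second example above.

```python
import random
import time

WAIT_PERIOD_SECONDS = 20  # minimum time a talking-player keeps the channel

def highest_scorer(scores: dict) -> str:
    """Return the player with the highest score, breaking ties randomly."""
    top = max(scores.values())
    leaders = [p for p, s in scores.items() if s == top]
    return random.choice(leaders)  # could instead use accuracy of aim, etc.

def scoring_loop(scores: dict, grant_talk_channel, announce) -> None:
    while True:
        talker = highest_scorer(scores)   # step S900: find the leader
        grant_talk_channel(talker)        # step S902: only the talker transmits
        announce(talker)                  # step S904: inform the players
        time.sleep(WAIT_PERIOD_SECONDS)   # wait period, then re-determine
```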
  • FIG. 6 is a schematic flow chart of a second embodiment for controlling audio communication in a networked game.
  • In this embodiment, control of audio communication (such as a talk-channel) is granted to a player (the talking-player) whose game character has collected one or more game objects of a particular type or types. This may be achieved, for example, by a player controlling a game character to move to a certain location and then instructing the game character to collect an object at that location.
  • For example, a microphone-object may be provided in the game environment which, if collected by a player's game character, provides that player with control of audio communication.
  • Control of audio communication allows the talking-player to provide audio data across the network to other players (i.e. talk at other players) whilst none of the other players can provide audio data across the network (i.e. they can only listen).
  • At a step S1000, a player's system unit 10 waits for input from the player, for example via the hand-held game controller 725. This input may be, for example, to move the player's game character within the game environment.
  • The system unit 10 then determines whether or not the player's character has collected a communication-object (i.e. a game object that permits the player to communicate audio data across the network). If the player's character has not collected a communication-object, processing returns to the step S1000; otherwise processing continues at a step S1004.
  • At the step S1004, the player's system unit 10 gives control of audio communication to the player and the player becomes the talking-player. This is achieved in a similar manner as at the step S902 of Figure 5.
  • The talking-player's system unit 10 informs some or all of the players involved in the network game of the identity of the talking-player. This is done in a similar manner as at the step S904 of Figure 5.
  • At a step S1008, the talking-player's system unit 10 starts a wait period and tests for its expiration at a step S1010. The wait period grants the talking-player control of audio communication for a limited period of time.
  • At a step S1012, the talking-player's system unit 10 removes control of audio communication from the talking-player, i.e. the player is no longer the talking-player and is not able to communicate audio information to other players across the network.
  • At a step S1014, a new communication-object is created within the game environment which may then be collected.
  • The creation may occur, for example, immediately, at a random time after the preceding communication-object was collected, or at a predetermined time after the preceding communication-object was collected. This flow is sketched in code after the variants listed below.
  • It will be appreciated that other variants of the second embodiment exist. For example: the steps S1008, S1010 and S1012 may be omitted, thereby granting the player control of audio communication until another player collects a communication-object;
  • a player's game character may need to collect more than one communication-object (potentially of different types) before that player is granted control of audio communication;
  • a player may need to perform further actions, such as instructing the game character to activate/use a collected communication-object, before control of audio communication is granted to the player;
  • the step S1014 may be omitted so that communication-objects are not replaced after having been collected and/or used;
  • multiple communication-objects may be distributed at different positions within the game environment (such as different rooms); and/or…
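Drawing the steps S1000 to S1014 together, the following is a minimal illustrative sketch of the communication-object mechanism; the callback names, talk time and respawn delay are assumptions, not part of the patent.

```python
import random
import time

TALK_TIME_SECONDS = 15        # assumed wait period (steps S1008/S1010)
RESPAWN_DELAY_SECONDS = 10    # assumed upper bound on respawn delay (step S1014)

def communication_object_loop(poll_input, has_collected_object,
                              grant_talk_channel, revoke_talk_channel,
                              announce, spawn_object) -> None:
    while True:
        player = poll_input()                 # S1000: wait for player input
        if not has_collected_object(player):  # collected a communication-object?
            continue                          # no: return to S1000
        grant_talk_channel(player)            # S1004: player becomes talking-player
        announce(player)                      # inform the other players (cf. S904)
        time.sleep(TALK_TIME_SECONDS)         # S1008/S1010: limited talk period
        revoke_talk_channel(player)           # S1012: remove control
        time.sleep(random.uniform(0, RESPAWN_DELAY_SECONDS))
        spawn_object()                        # S1014: replace the object
```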
  • FIG. 7 is a schematic flow chart of a third embodiment for controlling audio communication in a networked game.
  • In this embodiment, the control of audio communication (such as a talk-channel) is granted to a player (the talking-player) who has controlled their game character to be located at a specific location in the game environment and/or to a player who has reached a certain level/stage within the game.
  • Control of audio communication allows the talking-player to provide audio data across the network to other players (i.e. talk at other players) whilst none of the other players can provide audio data across the network (i.e. they can only listen).
  • The player's system unit 10 tests whether the player's game character is located at a specific location in the game environment and/or whether the player has reached a certain level/stage within the game.
  • If so, the player's system unit 10 gives control of audio communication to the player and the player becomes the talking-player. This is achieved in a similar manner as at the step S902 of Figure 5.
  • The talking-player's system unit 10 informs some or all of the players involved in the network game of the identity of the talking-player. This is done in a similar manner as at the step S904 of Figure 5.
  • The talking-player's system unit 10 starts a wait period and tests for its expiration at a step S1108.
  • The wait period grants the talking-player control of audio communication for a limited period of time.
  • Once the wait period has expired, the talking-player's system unit 10 removes control of audio communication from the talking-player, i.e. the player is no longer the talking-player and is not able to communicate audio information to other players across the network. This may be achieved in a similar manner as at the step S1012 of Figure 6. It will be appreciated that other variants of the third embodiment exist.
  • FIG. 8 is a schematic flow chart of a fourth embodiment for controlling audio communication in a networked game, similar to the third embodiment.
  • In this embodiment, the control of audio communication (such as a talk-channel) is granted to a player (the talking-player) whose game character has just performed a certain action.
  • Control of audio communication allows the talking-player to provide audio data across the network to other players (i.e. talk at other players) whilst none of the other players can provide audio data across the network (i.e. they can only listen). This allows the talking-player to, for example, gloat about the action that has just been performed.
  • The player's system unit 10 tests whether the player's game character has performed a certain game action (as described above).
  • If so, the player's system unit 10 gives control of audio communication to the player and the player becomes the talking-player. This is achieved in a similar manner as at the step S902 of Figure 5.
  • The talking-player's system unit 10 informs some or all of the players involved in the network game of the identity of the talking-player. This is done in a similar manner as at the step S904 of Figure 5.
  • The talking-player's system unit 10 starts a wait period and tests for its expiration at a step S1508.
  • The wait period grants the talking-player control of audio communication for a limited period of time.
  • Once the wait period has expired, the talking-player's system unit 10 removes control of audio communication from the talking-player, i.e. the player is no longer the talking-player and is not able to communicate audio information to other players across the network. This may be achieved in a similar manner as at the step S1012 of Figure 6. It will be appreciated that other variants of the fourth embodiment exist. For example, the steps S1506, S1508 and S1510 may be omitted so that a player retains control of audio communication. The trigger-then-timed-grant pattern shared by the third and fourth embodiments is sketched below.
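The third and fourth embodiments share one pattern: a game-status trigger (reaching a location or level/stage, or performing an action) earns a timed grant of the talk channel. A minimal illustrative sketch follows; the predicate and callback names are assumptions.

```python
import time
from typing import Callable

def timed_grant_on_trigger(trigger: Callable[[str], bool],
                           grant_talk_channel: Callable[[str], None],
                           revoke_talk_channel: Callable[[str], None],
                           announce: Callable[[str], None],
                           player: str,
                           talk_time_seconds: float = 15.0) -> None:
    """Grant the talk channel for a limited time once the trigger fires.

    The trigger might test the character's location or the player's
    level/stage (third embodiment), or a just-performed action (fourth).
    """
    while not trigger(player):      # poll the game status
        time.sleep(0.1)
    grant_talk_channel(player)      # cf. step S902 of Figure 5
    announce(player)                # cf. step S904 of Figure 5
    time.sleep(talk_time_seconds)   # wait period (e.g. S1108 / S1508)
    revoke_talk_channel(player)     # cf. step S1012 of Figure 6
```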
  • In a fifth embodiment, two players may communicate audio data with each other across the network (for example within a talk-channel) if they have positioned their game characters in the game environment so that the distance between their game characters (as measured in the game environment) is less than a threshold distance.
  • Only players whose game characters are near to each other may talk with each other over the network.
  • Several game characters may be sufficiently close to each other to allow a group of players to hold a conversation; these game characters will be said to form a conversation group.
  • Figure 9 schematically illustrates an example positioning of game characters within a game environment. Two game characters 1200a and 1200b are located near each other (i.e. the distance between them is less than the threshold distance), thus allowing their controlling players to talk to each other over the network.
  • The game characters 1200a and 1200b thus form a conversation group 1210ab.
  • Three game characters 1200c, 1200d and 1200e are located near each other, thus allowing their controlling players to talk to each other over the network.
  • The game characters 1200c, 1200d and 1200e thus form a conversation group 1210cde.
  • A game character 1200f is located near to a game character 1200g, thus allowing their controlling players to talk to each other over the network.
  • The game characters 1200f and 1200g thus form a conversation group 1210fg.
  • The game character 1200f is also located near to a game character 1200h, thus allowing their controlling players to talk to each other over the network.
  • The game characters 1200f and 1200h thus form a conversation group 1210fh.
  • As the game character 1200h is not located close enough to the game character 1200g, the players controlling these game characters cannot talk to each other over the network.
  • A game character 1200i is not located close to any of the other game characters within the game environment. The player controlling the game character 1200i therefore cannot talk to any of the other players over the network.
  • A player's system unit 10 is provided with positional information relating to the game characters of the other players involved in the network game.
  • This information may be provided to the system unit 10 directly from each of the other system units 10 involved in the network game; alternatively, as described above, to maintain game consistency one of the system units 10 may be responsible for collating the positional information of each of the players' game characters and then forwarding it to every system unit 10 involved in the network game. The player's system unit 10 then calculates the distance between the game characters. This may be a direct point-to-point distance; alternatively, it may be the shortest distance within the confines of the game environment (such as the distance of a route within a maze). The system unit 10 of a first player only renders audio information received from a second player if the game character of the second player is sufficiently close to the game character of the first player.
  • This may be achieved by: (i) the first player's system unit 10 allowing itself to transmit audio information to the second player's system unit if their game characters are sufficiently near each other, otherwise such transmission is prohibited; (ii) the first player's system unit 10 allowing itself to receive audio information from the second player's system unit if their game characters are sufficiently near each other, otherwise such receipt is prohibited (by blocking the network address of the second player's system unit for example); or (iii) the first player's system unit 10 allowing itself to render audio information received from the second player's system unit if their game characters are sufficiently near each other, otherwise such rendering is prohibited. It will be appreciated that other variants of the fifth embodiment exist.
  • For example: (i) the conversation groups 1210fg and 1210fh may be merged to allow the players controlling the game characters 1200f, 1200g and 1200h to communicate with each other across the network;
  • (ii) the player controlling the game character 1200f may be provided with the ability to select with whom he would like to talk (i.e. whether to talk to the player controlling the game character 1200g and/or the player controlling the game character 1200h);
  • (iii) the constituents of the conversation groups may be determined by a single system unit 10 which then informs all of the system units 10 involved in the network game about the conversation groups; and (iv) in order to control the number of game characters that form a conversation group, the threshold distance used to determine the constituents of a conversation group may be dynamically adjusted to enforce an upper limit on the size of the conversation group. A sketch of the underlying proximity test follows.
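A minimal illustrative sketch of the distance test (not from the patent): it uses a direct point-to-point distance and treats audibility pairwise, so that, as in Figure 9, character 1200f can talk with both 1200g and 1200h even though 1200g and 1200h cannot talk to each other. The positions and the threshold value are assumptions.

```python
import math

THRESHOLD_DISTANCE = 10.0  # assumed in-game units

def distance(a: tuple, b: tuple) -> float:
    """Direct point-to-point distance; a maze game might instead use the
    shortest route within the confines of the game environment."""
    return math.dist(a, b)

def may_talk(pos_a: tuple, pos_b: tuple,
             threshold: float = THRESHOLD_DISTANCE) -> bool:
    """Pairwise audibility test applied before rendering received audio."""
    return distance(pos_a, pos_b) < threshold

# Example positions loosely mirroring Figure 9.
positions = {
    "1200f": (0.0, 0.0),
    "1200g": (6.0, 0.0),   # near 1200f -> conversation group 1210fg
    "1200h": (-6.0, 0.0),  # near 1200f -> conversation group 1210fh
}
assert may_talk(positions["1200f"], positions["1200g"])
assert may_talk(positions["1200f"], positions["1200h"])
assert not may_talk(positions["1200g"], positions["1200h"])  # 12 units apart
```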
  • A computer controlled game character may, by virtue of its position, score, actions, and/or game items collected, be able to communicate audio data to one or more of the human players.
  • Such audio data may vary depending on the current status of the game.
  • For example, a computer controlled game character may boast that it is about to defeat a human controlled opponent.
  • The audio data controlled by any of the embodiments described above may be only a subset of the total audio data communicated during the game.
  • For example, the communication of data concerning verbal inputs by the players may be controlled according to any of the embodiments described above whilst the communication of other audio data (such as background music) may be controlled by other means.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Computer Security & Cryptography (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • Telephonic Communication Services (AREA)
  • Information Transfer Between Computers (AREA)

Abstract

A network comprises a plurality of processing apparatus, at least two of the processing apparatus being games control terminals, each games control terminal being operable to transmit audio data to and receive audio data from another one of the games control terminals and to render received audio data; the network providing controlling logic operable to determine whether the games control terminals may perform an audio-communication-task; an audio-communication-task being one or more of the transmission of audio data from a games control terminal to another games control terminal; the reception by a games control terminal of audio data transmitted from another games control terminal; and the rendering by a games control terminal of audio data received from another games control terminal; the determination being dependent upon a game status associated with at least one of the games control terminals and/or at least one operator of a games control terminal.

Description

REAL-TIME VOICE-CHAT SYSTEM FOR A NETWORKED MULTIPLAYER GAME
This invention relates to electronic game processing. Electronic games are well-known and may be categorised in many different ways, such as "racing games" (in which a player controls a vehicle around a course within the game environment); "fighting games" (in which a player controls a character within the game environment and combats other game characters); "maze games" (in which a player controls a character around a maze within the game environment); combinations of these types of game; etc. In these games, a player may accumulate a score as the game is played; alternatively, a player's turn may be assessed by the level or position within the game that the player reaches. It is well-known for these games to provide special attributes to a player's game character if certain events occur. For example, in a fighting game there may be armour-objects distributed within the game environment (or they may appear from time to time) and if a game character collects one of these armour-objects then that character gains advantages in combat. As another example, the game may be arranged so that, if a game character reaches a certain level or is located at a certain position within the game environment then various events happen, such as extending time limits or "healing" the game character (i.e. increasing the value of a health property associated with the game character). It is also well-known for electronic games to be so-called multi-player games, in which more than one human player is involved in the game. The players may collaborate in teams or may play as individuals against each other. Players may provide their input to a game by general hardware controllers (such as keyboards and mice) or by more specialised hardware controllers. Several controllers may be connected to a single electronic games machine (such as a personal computer or a dedicated games console) to facilitate multi-player games. With developments in network technology and its associated bandwidth, it is now well-known to connect several games machines, such as personal computers or dedicated games consoles, via a communications network, such as a local area network (LAN), a wide area network (WAN), or the Internet. Several players may then participate in a game even when they are located at geographically different locations. Networks dedicated to the communication of game data (and designed to be suited to the communication of such data) have been developed. The networks used for such multi-player networked games may, in addition to the games machines, also comprise other network machines such as network servers. It is known for such networked electronic games to allow the human players to communicate with each other over the network. This may be achieved, for example, by the players composing textual messages (for example, by typing on a keyboard) and sending these messages to other players over the network. It is also known for such networked electronic games to allow audio data (such as voice data) to be distributed across the network, the audio data being input from the player via a microphone, for example. Currently, this involves each player being able to talk to every other player who is involved in the network game, or at least to a fixed subset of these players if, for example, the players are divided into teams and a player is only allowed to talk to other team members.
This invention provides a network comprising: a plurality of processing apparatus, at least two of the processing apparatus being games control terminals, each games control terminal being operable to transmit audio data to and receive audio data from another one of the games control terminals and to render received audio data; the network providing controlling logic operable to determine whether the games control terminals may perform an audio-communication-task; an audio-communication-task being one or more of the transmission of audio data from a games control terminal to another games control terminal; the reception by a games control terminal of audio data transmitted from another games control terminal; and the rendering by a games control terminal of audio data received from another games control terminal; the determination being dependent upon a game status associated with at least one of the games control terminals and/or at least one operator of a games control terminal, the game status comprising at least one game score associated with each games control terminal and/or each operator. This invention recognises a problem with networked electronic games that allow audio data (and particularly voice data) to be communicated across the network. If several players participate in such a networked game and some or all of them talk over the network simultaneously, then it is very difficult for a player to discern who is talking to whom, and what is being said, as each player hears all of the audio data at once. This is in addition to any audio that is being generated and rendered locally to the player (i.e. not from the network). This will reduce the overall appeal of the electronic game, or, at the very least, will discourage players from participating in verbal communication across the network. Accordingly, the invention provides a mechanism whereby the ability to talk to and/or listen to players across the network is either granted to or removed from a player depending on the current game status associated with one or more of the participating players. In this way, the number of players who can communicate audio data simultaneously over the network may be reduced to a more practical level, thus preventing too many players from being able to talk simultaneously. A player is therefore able to hear more clearly the content of audio communication occurring over the network and, if permitted, may contribute to the audio communication in the knowledge that the contribution will be discernible by the other players. 
The invention also provides a network comprising: a plurality of processing apparatus, at least two of the processing apparatus being games control terminals, each games control terminal being operable to transmit audio data to and receive audio data from another one of the games control terminals and to render received audio data; the network providing controlling logic operable to determine whether the games control terminals may perform an audio-communication-task; an audio-communication-task being one or more of the transmission of audio data from a games control terminal to another games control terminal; the reception by a games control terminal of audio data transmitted from another games control terminal; and the rendering by a games control terminal of audio data received from another games control terminal; the determination being dependent upon a game status associated with at least one of the games control terminals and/or at least one operator of a games control terminal, the game status comprising one or more communication-enabling-actions performed by a game character, the network associating at least one game character with each games control terminal and/or each operator. Further respective aspects and features of the invention are defined in the appended claims. Embodiments of the invention will now be described, by way of example only, with reference to the accompanying drawings in which: Figure 1 schematically illustrates the overall system architecture of the PlayStation2; Figure 2 schematically illustrates the architecture of an Emotion Engine; Figure 3 schematically illustrates the configuration of a Graphics Synthesiser; Figure 4 schematically illustrates four system units networked together; Figure 5 is a schematic flow chart of a first embodiment for controlling audio communication in a networked game; Figure 6 is a schematic flow chart of a second embodiment for controlling audio communication in a networked game; Figure 7 is a schematic flow chart of a third embodiment for controlling audio communication in a networked game; Figure 8 is a schematic flow chart of a fourth embodiment for controlling audio communication in a networked game; and Figure 9 schematically illustrates an example positioning of game characters within a game environment. Figure 1 schematically illustrates the overall system architecture of the PlayStation2. A system unit 10 is provided, with various peripheral devices connectable to the system unit. The system unit 10 comprises: an Emotion Engine 100; a Graphics Synthesiser 200; a sound processor unit 300 having dynamic random access memory (DRAM); a read only memory (ROM) 400; a compact disc (CD) and digital versatile disc (DVD) reader 450; a Rambus Dynamic Random Access Memory (RDRAM) unit 500; an input/output processor (IOP) 700 with dedicated RAM 750. An (optional) external hard disk drive (HDD) 390 may be connected. The input/output processor 700 has two Universal Serial Bus (USB) ports 715 and an iLink or IEEE 1394 port (iLink is the Sony Corporation implementation of the IEEE 1394 standard). The IOP 700 handles all USB, iLink and game controller data traffic. For example when a user is playing a game, the IOP 700 receives data from the game controller and directs it to the Emotion Engine 100 which updates the current state of the game accordingly. The IOP 700 has a Direct Memory Access (DMA) architecture to facilitate rapid data transfer rates.
DMA involves the transfer of data from main memory to a device without passing it through the CPU. The USB interface is compatible with the Open Host Controller Interface (OHCI) and can handle data transfer rates of between 1.5 Mbps and 12 Mbps. Provision of these interfaces means that the PlayStation2 is potentially compatible with peripheral devices such as video cassette recorders (VCRs), digital cameras, microphones, set-top boxes, printers, keyboards, mice and joysticks. Generally, in order for successful data communication to occur with a peripheral device connected to a USB port 715, an appropriate piece of software such as a device driver should be provided. Device driver technology is very well known and will not be described in detail here, except to say that the skilled man will be aware that a device driver or similar software interface may be required in the embodiment described here.

In the present embodiment, a USB microphone 730 is connected to the USB port. It will be appreciated that the USB microphone 730 may be a hand-held microphone or may form part of a head-set that is worn by the human operator. The advantage of wearing a head-set is that the human operator's hands are free to perform other actions. The microphone includes an analogue-to-digital converter (ADC) and a basic hardware-based real-time data compression and encoding arrangement, so that audio data are transmitted by the microphone 730 to the USB port 715 in an appropriate format, such as 16-bit mono PCM (an uncompressed format), for decoding at the PlayStation2 system unit 10.

Apart from the USB ports, two other ports 705, 710 are proprietary sockets allowing the connection of a proprietary non-volatile RAM memory card 720 for storing game-related information, a hand-held game controller 725 or a device (not shown) mimicking a hand-held controller, such as a dance mat.

The system unit 10 may be connected to a network adapter 805 that provides an interface (such as an Ethernet interface) to a network. This network may be, for example, a LAN, a WAN or the Internet. The network may be a general network or one that is dedicated to game-related communication. The network adapter 805 allows data to be transmitted to and received from other system units 10 that are connected to the same network (the other system units 10 also having corresponding network adapters 805).

The Emotion Engine 100 is a 128-bit Central Processing Unit (CPU) that has been specifically designed for efficient simulation of three-dimensional (3D) graphics for games applications. The Emotion Engine components include a data bus, cache memory and registers, all of which are 128-bit. This facilitates fast processing of large volumes of multimedia data. Conventional PCs, by way of comparison, have a basic 64-bit data structure. The floating point calculation performance of the PlayStation2 is 6.2 GFLOPs. The Emotion Engine also comprises MPEG2 decoder circuitry which allows for simultaneous processing of 3D graphics data and DVD data. The Emotion Engine performs geometrical calculations including mathematical transforms and translations, and also performs calculations associated with the physics of simulation objects, for example the calculation of friction between two objects. It produces sequences of image rendering commands which are subsequently utilised by the Graphics Synthesiser 200. The image rendering commands are output in the form of display lists. A display list is a sequence of drawing commands that specifies to the Graphics Synthesiser which primitive graphic objects (e.g. points, lines, triangles, sprites) to draw on the screen and at which co-ordinates. Thus a typical display list will comprise commands to draw vertices, commands to shade the faces of polygons, commands to render bitmaps and so on. The Emotion Engine 100 can asynchronously generate multiple display lists.
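Although the Graphics Synthesiser actually consumes a packed binary packet format, the idea of a display list can be pictured with a short, purely hypothetical sketch (none of these type names come from the patent):

    #include <cstdint>
    #include <vector>

    // Illustrative sketch only: a display list as an ordered sequence of
    // drawing commands of the kinds mentioned in the text.
    enum class DrawCommand : std::uint8_t { DrawVertex, ShadeFace, RenderBitmap };

    struct DisplayListEntry {
        DrawCommand command;
        float x, y, z;          // co-ordinates for vertex-style commands
        std::uint32_t argument; // e.g. a colour or a bitmap handle
    };

    using DisplayList = std::vector<DisplayListEntry>;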
The Graphics Synthesiser 200 is a video accelerator that performs rendering of the display lists produced by the Emotion Engine 100. The Graphics Synthesiser 200 includes a graphics interface unit (GIF) which handles, tracks and manages the multiple display lists. The rendering function of the Graphics Synthesiser 200 can generate image data that supports several alternative standard output image formats, i.e. NTSC/PAL, High Definition Digital TV and VESA. In general, the rendering capability of graphics systems is defined by the memory bandwidth between a pixel engine and a video memory, each of which is located within the graphics processor. Conventional graphics systems use external Video Random Access Memory (VRAM) connected to the pixel logic via an off-chip bus, which tends to restrict the available bandwidth. However, the Graphics Synthesiser 200 of the PlayStation2 provides the pixel logic and the video memory on a single high-performance chip, which allows for a comparatively large 38.4 gigabyte per second memory access bandwidth. The Graphics Synthesiser is theoretically capable of achieving a peak drawing capacity of 75 million polygons per second. Even with a full range of effects such as textures, lighting and transparency, a sustained rate of 20 million polygons per second can be drawn continuously. Accordingly, the Graphics Synthesiser 200 is capable of rendering a film-quality image.

The Sound Processor Unit (SPU) 300 is effectively the soundcard of the system and is capable of handling 3D digital sound such as Digital Theater Surround (DTS®) sound and AC-3 (also known as Dolby Digital), which is the sound format used for DVDs. A display and sound output device 305, such as a video monitor or television set with an associated loudspeaker arrangement 310, is connected to receive video and audio signals from the Graphics Synthesiser 200 and the Sound Processor Unit 300. The main memory supporting the Emotion Engine 100 is the RDRAM (Rambus Dynamic Random Access Memory) module 500 produced by Rambus Incorporated. This RDRAM memory subsystem comprises RAM, a RAM controller and a bus connecting the RAM to the Emotion Engine 100.

Figure 2 schematically illustrates the architecture of the Emotion Engine 100 of Figure 1. The Emotion Engine 100 comprises: a floating point unit (FPU) 104; a central processing unit (CPU) core 102; vector unit zero (VU0) 106; vector unit one (VU1) 108; a graphics interface unit (GIF) 110; an interrupt controller (INTC) 112; a timer unit 114; a direct memory access controller 116; an image data processor unit (IPU) 118; a dynamic random access memory controller (DRAMC) 120; and a sub-bus interface (SIF) 122. All of these components are connected via a 128-bit main bus 124. The CPU core 102 is a 128-bit processor clocked at 300 MHz. The CPU core has access to 32 MB of main memory via the DRAMC 120. The CPU core 102 instruction set is based on MIPS III RISC, with some MIPS IV RISC instructions together with additional multimedia instructions.
MIPS III and IV are Reduced Instruction Set Computer (RISC) instruction set architectures proprietary to MIPS Technologies, Inc. Standard instructions are 64-bit, two-way superscalar, which means that two instructions can be executed simultaneously. Multimedia instructions, on the other hand, use 128-bit instructions via two pipelines. The CPU core 102 comprises a 16 KB instruction cache, an 8 KB data cache and a 16 KB scratchpad RAM, which is a portion of cache reserved for direct private usage by the CPU.

The FPU 104 serves as a first co-processor for the CPU core 102. The vector unit 106 acts as a second co-processor. The FPU 104 comprises a floating point product sum arithmetic logic unit (FMAC) and a floating point division calculator (FDIV). Both the FMAC and FDIV operate on 32-bit values, so when an operation is carried out on a 128-bit value (composed of four 32-bit values) an operation can be carried out on all four parts concurrently. For example, two vectors can be added together in a single operation.

The vector units 106 and 108 perform mathematical operations and are essentially specialised FPUs that are extremely fast at evaluating the multiplication and addition of vector equations. They use Floating-Point Multiply-Adder Calculators (FMACs) for addition and multiplication operations and Floating-Point Dividers (FDIVs) for division and square root operations. They have built-in memory for storing micro-programs and interface with the rest of the system via Vector Interface Units (VIFs). Vector unit zero 106 can work as a co-processor to the CPU core 102 via a dedicated 128-bit bus, so it is essentially a second specialised FPU. Vector unit one 108, on the other hand, has a dedicated bus to the Graphics Synthesiser 200 and thus can be considered as a completely separate processor. The inclusion of two vector units allows the software developer to split up the work between different parts of the CPU, and the vector units can be used in either serial or parallel connection.

Vector unit zero 106 comprises four FMACs and one FDIV. It is connected to the CPU core 102 via a co-processor connection. It has 4 KB of vector unit memory for data and 4 KB of micro-memory for instructions. Vector unit zero 106 is useful for performing physics calculations associated with the images for display. It primarily executes non-patterned geometric processing together with the CPU core 102. Vector unit one 108 comprises five FMACs and two FDIVs. It has no direct path to the CPU core 102, although it does have a direct path to the GIF unit 110. It has 16 KB of vector unit memory for data and 16 KB of micro-memory for instructions. Vector unit one 108 is useful for performing transformations. It primarily executes patterned geometric processing and directly outputs a generated display list to the GIF 110.

The GIF 110 is an interface unit to the Graphics Synthesiser 200. It converts data according to a tag specification at the beginning of a display list packet and transfers drawing commands to the Graphics Synthesiser 200 whilst mutually arbitrating multiple transfers. The interrupt controller (INTC) 112 serves to arbitrate interrupts from peripheral devices, except the DMAC 116. The timer unit 114 comprises four independent timers with 16-bit counters. The timers are driven either by the bus clock (at 1/16 or 1/256 intervals) or via an external clock. The DMAC 116 handles data transfers between main memory and peripheral processors, or between main memory and the scratchpad memory.
It arbitrates the main bus 124 at the same time. Performance optimisation of the DMAC 116 is a key way by which to improve Emotion Engine performance. The image processing unit (IPU) 118 is an image data processor that is used to expand compressed animations and texture images. It performs I-PICTURE macro-block decoding, colour space conversion and vector quantisation. Finally, the sub-bus interface (SIF) 122 is an interface unit to the IOP 700. It has its own memory and bus to control I/O devices such as sound chips and storage devices.

Figure 3 schematically illustrates the configuration of the Graphics Synthesiser 200. The Graphics Synthesiser comprises: a host interface 202; a set-up/rasterizing unit; a pixel pipeline 206; a memory interface 208; a local memory 212 including a frame page buffer 214 and a texture page buffer 216; and a video converter 210.

The host interface 202 transfers data with the host (in this case the CPU core 102 of the Emotion Engine 100). Both drawing data and buffer data from the host pass through this interface. The output from the host interface 202 is supplied to the Graphics Synthesiser 200, which develops the graphics to draw pixels based on vertex information received from the Emotion Engine 100, and calculates information such as the RGBA value, depth value (i.e. Z-value), texture value and fog value for each pixel. The RGBA value specifies the red, green and blue (RGB) colour components, and the A (Alpha) component represents the opacity of an image object. The Alpha value can range from completely transparent to totally opaque. The pixel data is supplied to the pixel pipeline 206, which performs processes such as texture mapping, fogging and Alpha-blending and determines the final drawing colour based on the calculated pixel information. The pixel pipeline 206 comprises 16 pixel engines PE1, PE2, ..., PE16, so that it can process a maximum of 16 pixels concurrently. The pixel pipeline 206 runs at 150 MHz with 32-bit colour and a 32-bit Z-buffer.

The memory interface 208 reads data from and writes data to the local Graphics Synthesiser memory 212. It writes the drawing pixel values (RGBA and Z) to memory at the end of a pixel operation and reads the pixel values of the frame buffer 214 from memory. These pixel values read from the frame buffer 214 are used for pixel testing or Alpha-blending. The memory interface 208 also reads from local memory 212 the RGBA values for the current contents of the frame buffer.

The local memory 212 is a 32 Mbit (4 MB) memory that is built into the Graphics Synthesiser 200. It can be organised as a frame buffer 214, a texture buffer 216 and a 32-bit Z-buffer 215. The frame buffer 214 is the portion of video memory where pixel data such as colour information is stored. The Graphics Synthesiser uses a 2D-to-3D texture mapping process to add visual detail to 3D geometry. Each texture may be wrapped around a 3D image object and is stretched and skewed to give a 3D graphical effect. The texture buffer is used to store the texture information for image objects. The Z-buffer 215 (also known as the depth buffer) is the memory available to store the depth information for a pixel. Images are constructed from basic building blocks known as graphics primitives or polygons. When a polygon is rendered with Z-buffering, the depth value of each of its pixels is compared with the corresponding value stored in the Z-buffer. If the value stored in the Z-buffer is greater than or equal to the depth of the new pixel then the new pixel is determined to be visible: it is rendered and the Z-buffer is updated with the new pixel depth. If, however, the Z-buffer depth value is less than the new pixel depth, the new pixel lies behind what has already been drawn and is not rendered.
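The depth test just described can be summarised in a short, hypothetical sketch (illustrative only; the convention assumed here is that a smaller stored value means a nearer pixel, which matches the greater-or-equal test above):

    #include <cstddef>
    #include <cstdint>
    #include <limits>
    #include <vector>

    // Minimal sketch of Z-buffered plotting; names are illustrative.
    struct ZBufferedTarget {
        int width, height;
        std::vector<float>         depth;  // the Z-buffer 215
        std::vector<std::uint32_t> colour; // the frame buffer 214 (packed RGBA)

        ZBufferedTarget(int w, int h)
            : width(w), height(h),
              depth(static_cast<std::size_t>(w) * h,
                    std::numeric_limits<float>::max()),
              colour(static_cast<std::size_t>(w) * h, 0) {}

        void plot(int x, int y, float z, std::uint32_t rgba) {
            std::size_t i = static_cast<std::size_t>(y) * width + x;
            if (z <= depth[i]) {   // stored depth >= new depth: pixel is visible
                depth[i] = z;      // update the Z-buffer with the new depth
                colour[i] = rgba;  // draw the pixel into the frame buffer
            }
        }
    };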
The local memory 212 has a 1024-bit read port and a 1024-bit write port for accessing the frame buffer and Z-buffer, and a 512-bit port for texture reading. The video converter 210 is operable to display the contents of the frame memory in a specified output format.

Figure 4 schematically illustrates four system units 10a, 10b, 10c and 10d networked together over a network 800, which may be, for example, a local area network (LAN), a wide area network (WAN), and/or the Internet. The network 800 may be a network that is dedicated to the communication of game data, designed to be suited to the communication of such data. The network 800 may comprise other devices (not shown) such as network servers. For the purposes of the present description, it is assumed that the system units 10a, 10b, 10c and 10d are located sufficiently far apart that words spoken by one player cannot be heard directly by another. This represents a normal use of such system units but, of course, is not an essential technical feature of the invention.

In the example illustrated in Figure 4, each of the system units 10a, 10b, 10c and 10d is connected to a network adapter 805a, 805b, 805c and 805d respectively. The network adapters 805a, 805b, 805c and 805d provide an Ethernet interface to the network 800 for the system units 10a, 10b, 10c and 10d. In other configurations, though, the system units 10a, 10b, 10c and 10d may be networked together by different means, for example by making use of their iLink ports. The four system units 10a, 10b, 10c and 10d are arranged to collaborate so that a networked game may be played by human players 810a, 810b, 810c', 810c'' and 810d. Note that the third system unit 10c is being used by more than one player. It will be appreciated that this is an example arrangement of system units 10 and players and that, in principle, any number of system units 10 and players may be involved. In practice, due to bandwidth and data manageability constraints, there may be an upper limit on the number of system units 10 and/or players. For example, currently, up to twenty players may be involved in a networked game.

The data flowing over the network 800 between the system units 10a, 10b, 10c and 10d may comprise a variety of information, such as one or more of: (i) the inputs of the players 810a, 810b, 810c', 810c'' and 810d via their respective hand-held game controllers 725; (ii) data relating to a game character associated with each of the players 810a, 810b, 810c', 810c'' and 810d, such as position, health and game items collected; (iii) data relating to other game characters (such as computer generated and controlled game monsters); (iv) the current scores of the players 810a, 810b, 810c', 810c'' and 810d; (v) actions performed by the players 810a, 810b, 810c', 810c'' and 810d and how they affect the game environment, such as opening a door in the game environment; and (vi) audio data, such as voice data input by the players 810a, 810b, 810c', 810c'' and 810d via their respective microphones 730.
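Purely as an illustrative sketch (the patent prescribes no packet format, and all of these names are hypothetical), the per-player items (i) to (vi) above might be bundled for transmission as follows:

    #include <cstdint>
    #include <vector>

    // Hypothetical bundling of the data items (i)-(vi) listed above; data
    // for other, computer controlled characters (item (iii)) would be
    // carried in a similar structure.
    struct PlayerUpdatePacket {
        std::uint32_t playerId;          // which player this update describes
        std::uint16_t controllerButtons; // (i) input state of the controller 725
        float posX, posY, posZ;          // (ii) game character position
        std::int32_t health;             // (ii) game character health
        std::int32_t score;              // (iv) the player's current score
        std::uint32_t actionCode;        // (v) an action, e.g. "door opened"
        std::vector<std::uint8_t> voice; // (vi) voice data from the microphone 730
    };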
It will be appreciated that, depending on the particular game being played, other information may be transferred across the network. Methods by which one system unit 10 collaborates with other system units 10 to provide a networked game are well known and will not be described in detail. However, as an example, each of the system units 10a, 10b, 10c and 10d is provided with its own DVD storing a version of the game software. A game session is initiated by the player 810a, and the other players 810b, 810c', 810c'' and 810d are then invited to join the game session. Once all of the players 810a, 810b, 810c', 810c'' and 810d have joined the game session, they may then play the game together. Each of the system units 10a, 10b, 10c and 10d may process game tasks that are essentially local, such as rendering video and audio data and processing a player's input. Other game processing tasks concern more than one of the system units 10a, 10b, 10c and 10d, such as maintaining a table of game scores and deciding the actions of computer generated monsters (which are not controlled by any of the players 810a, 810b, 810c', 810c'' or 810d). Such tasks may be performed by just one of the system units 10a, 10b, 10c or 10d, each of the system units 10a, 10b, 10c and 10d being informed of the outcome of the processing as appropriate, so that all of the system units 10a, 10b, 10c and 10d can operate together with a consistent game environment. It will be appreciated that in other configurations, one or more other networked devices, such as a network server (not shown), may undertake some of the processing in order to provide the networked game, for example deciding the actions of computer generated monsters or maintaining a table of game scores.

Figure 5 is a schematic flow chart of a first embodiment for controlling audio communication in a networked game. In this first embodiment, the control of audio communication (such as a talk-channel) is granted to the player who currently possesses or has achieved the highest score. This means that the player with the highest score can provide audio data across the network to other players (i.e. talk at other players), but none of the other players can provide audio data across the network (i.e. they can only listen). As such, a player's actions are rewarded with control of audio communication if those actions result in the player holding the highest score.

At a step S900, it is determined which player currently has the highest score. As described above, one of the system units 10 involved in the networked game may maintain a table of the players' current game scores. This system unit 10 shall be referred to as the scoring system unit. At the step S900, the scoring system unit uses the table of scores to determine which player currently has the highest score. If two or more players share the highest score, then one of them is selected. This may be done, for example, by a random selection; alternatively, the selection may be based on other game statistics (such as accuracy of aim in a game involving firing weapons). At a step S902, the control of audio communication is granted to the player who has been determined to have the highest score. This player shall be referred to as the talking-player. The result of this is that audio data can be transferred across the network from the talking-player to the other players; in contrast, no audio data from the other players can be transferred across the network.
In other words, the talking-player can talk to/at the other players over the network whilst the other players are not able to talk to anybody at all over the network. The scoring system unit instructs the system units 10 involved in the networked game as to which player is the talking-player; each of the system units 10 then handles audio communication appropriately. Alternatively, the scoring system unit provides explicit instructions to the other system units 10 about how to handle audio communication. Appropriate handling of audio communication may be achieved by: (i) the talking-player's system unit 10 allowing itself to transmit audio data across the network and the other system units 10 prohibiting themselves from transmitting audio data across the network; (ii) every system unit 10 allowing itself to receive audio data from the talking-player's system unit 10 and prohibiting itself from receiving audio data from any other system unit 10 (by blocking certain network addresses, for example); or (iii) every system unit 10 allowing itself to render audio data received from the talking-player's system unit 10 and prohibiting itself from rendering audio data received from any other system unit 10.

At a step S904, some or all of the players involved in the networked game are informed of the identity of the talking-player. This may involve an indication using text or icons displayed on the display device 305; alternatively, the appearance of the talking-player's game character may be altered to give an indication to the other players (for example, the game character may flash or take on a glowing appearance). If the current identity of the talking-player is the same as the identity of the immediately preceding talking-player then the players need not be re-informed of the identity of the talking-player.

In a game involving rapidly changing scores, it is conceivable that the identity of the player with the highest score also changes rapidly. In order to prevent the identity of the talking-player changing too rapidly (which would result in very short, meaningless or incomplete communication), the talking-player is given control of audio communication for at least a minimum period of time, for example 20 seconds. Therefore, at a step S906, the scoring system unit resets a wait period. At a step S908, the scoring system unit tests for the expiration of the wait period. Once the wait period has expired, processing returns to the step S900.

It will be appreciated that other variants of this first embodiment exist. For example: (i) more than one player may be granted the ability to talk over the network depending on the players' scores; and (ii) the processing of the wait period at the steps S906 and S908 may be omitted (or the wait period set to zero) to allow a continual assessment of control of audio communication.
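Before turning to the second embodiment, the main flow of this first embodiment (steps S900 to S908) may be sketched as follows. This is an informal illustration only: the names are hypothetical, and the two helper functions merely stand in for the network messages that the scoring system unit would send.

    #include <algorithm>
    #include <chrono>
    #include <thread>
    #include <vector>

    struct PlayerScore { int playerId; int score; };

    // Stubs standing in for messages sent by the scoring system unit.
    void grantTalkChannel(int /*playerId*/)      { /* step S902 */ }
    void announceTalkingPlayer(int /*playerId*/) { /* step S904 */ }

    // Assumes at least one player is present in the score table.
    void scoringUnitLoop(const std::vector<PlayerScore>& scores,
                         const bool& gameRunning) {
        int previousTalker = -1;
        while (gameRunning) {
            // Step S900: find the player with the highest score; a tie is
            // broken here simply by taking the first such player.
            auto top = std::max_element(scores.begin(), scores.end(),
                [](const PlayerScore& a, const PlayerScore& b) {
                    return a.score < b.score;
                });
            grantTalkChannel(top->playerId);              // step S902
            if (top->playerId != previousTalker) {        // step S904
                announceTalkingPlayer(top->playerId);
                previousTalker = top->playerId;
            }
            // Steps S906/S908: hold the grant for a minimum wait period so
            // that the talking-player does not change too rapidly.
            std::this_thread::sleep_for(std::chrono::seconds(20));
        }
    }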
Figure 6 is a schematic flow chart of a second embodiment for controlling audio communication in a networked game. In this second embodiment, the control of audio communication (such as a talk-channel) is granted to a player (the talking-player) whose game character has collected one or more game objects of a particular type or types. This may be achieved, for example, by a player controlling a game character to move to a certain location and then instructing the game character to collect an object at that location. For example, a microphone-object may be provided in the game environment which, if collected by a player's game character, provides that player with control of audio communication. As with the first embodiment, control of audio communication allows the talking-player to provide audio data across the network to other players (i.e. talk at other players) whilst none of the other players can provide audio data across the network (i.e. they can only listen).

At a step S1000, a player's system unit 10 waits for input from the player, for example via the hand-held game controller 725. This input may be, for example, to move the player's game character within the game environment. At a step S1002, the system unit 10 determines whether or not the player's character has collected a communication-object (i.e. a game object that permits the player to communicate audio data across the network). If the player's character has not collected a communication-object, processing returns to the step S1000; otherwise processing continues at a step S1004. At the step S1004, the player's system unit 10 gives control of audio communication to the player and the player becomes the talking-player. This is achieved in a similar manner as at the step S902 of Figure 5. At a step S1006, the talking-player's system unit 10 informs some or all of the players involved in the networked game of the identity of the talking-player. This is done in a similar manner as at the step S904 of Figure 5. At a step S1008, the talking-player's system unit 10 starts a wait period and tests for its expiration at a step S1010. The wait period grants the talking-player control of audio communication for a limited period of time. Once the wait period has expired, at a step S1012, the talking-player's system unit 10 removes control of audio communication from the talking-player, i.e. the player is no longer the talking-player and is not able to communicate audio information to other players across the network. This may be achieved, for example, by (i) the talking-player's system unit 10 prohibiting itself from transmitting audio data across the network; (ii) the talking-player's system unit 10 instructing every system unit 10 involved in the networked game to prohibit itself from receiving audio data from the talking-player's system unit 10 (by blocking the network address of the talking-player's system unit, for example); or (iii) the talking-player's system unit 10 instructing every system unit 10 involved in the networked game to prohibit itself from rendering audio data received from the talking-player's system unit 10.

At a step S1014, a new communication-object is created within the game environment, which may then be collected. The new communication-object may be created, for example, immediately, at a random point in time after the preceding communication-object was collected, or at a predetermined point in time after the preceding communication-object was collected.
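A possible shape for the collection-triggered grant (steps S1004 to S1014) is sketched below. Again the names are hypothetical; a blocking wait is used only to keep the sketch short, where a real game would schedule a timer instead.

    #include <chrono>
    #include <thread>

    struct CommunicationObject { float x, y, z; bool collected = false; };

    // Stubs standing in for the behaviour described in the text.
    void grantTalkChannel(int /*playerId*/)  { /* step S1004 */ }
    void revokeTalkChannel(int /*playerId*/) { /* step S1012 */ }
    void createCommunicationObject()         { /* step S1014 */ }

    void onCommunicationObjectCollected(CommunicationObject& object,
                                        int playerId) {
        object.collected = true;
        grantTalkChannel(playerId);                        // step S1004
        // Steps S1008/S1010: the grant lasts only for a limited wait period.
        std::this_thread::sleep_for(std::chrono::seconds(20));
        revokeTalkChannel(playerId);                       // step S1012
        createCommunicationObject();                       // step S1014
    }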
It will be appreciated that other variants of the second embodiment exist. For example: (i) the steps S1008, S1010 and S1012 may be omitted, thereby granting the player control of audio communication until another player collects a communication-object; (ii) a player's game character may need to collect more than one communication-object (potentially of different types) before that player is granted control of audio communication; (iii) a player may need to perform further actions, such as instructing the game character to activate/use a collected communication-object, before control of audio communication is granted to the player; (iv) the step S1014 may be omitted so that communication-objects are not replaced after having been collected and/or used; (v) multiple communication-objects may be distributed at different positions within the game environment (such as different rooms); and/or (vi) two or more players may collect communication-objects and be granted overlapping or concurrent talk periods.

Figure 7 is a schematic flow chart of a third embodiment for controlling audio communication in a networked game. In this third embodiment, the control of audio communication (such as a talk-channel) is granted to a player (the talking-player) who has controlled their game character to be located at a specific location in the game environment and/or to a player who has reached a certain level/stage within the game. As with the first two embodiments, control of audio communication allows the talking-player to provide audio data across the network to other players (i.e. talk at other players) whilst none of the other players can provide audio data across the network (i.e. they can only listen).

At a step S1100, the player's system unit 10 tests whether the player's game character is located at a specific location in the game environment and/or whether the player has reached a certain level/stage within the game. At a step S1102, the player's system unit 10 gives control of audio communication to the player and the player becomes the talking-player. This is achieved in a similar manner as at the step S902 of Figure 5. At a step S1104, the talking-player's system unit 10 informs some or all of the players involved in the networked game of the identity of the talking-player. This is done in a similar manner as at the step S904 of Figure 5. At a step S1106, the talking-player's system unit 10 starts a wait period and tests for its expiration at a step S1108. The wait period grants the talking-player control of audio communication for a limited period of time. Once the wait period has expired, at a step S1110, the talking-player's system unit 10 removes control of audio communication from the talking-player, i.e. the player is no longer the talking-player and is not able to communicate audio information to other players across the network. This may be achieved in a similar manner as at the step S1012 of Figure 6.

It will be appreciated that other variants of the third embodiment exist. For example, the steps S1106, S1108 and S1110 may be omitted so that a player retains control of audio communication whilst that player's game character is located at a specific location.
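The test at the step S1100 could, for instance, combine a positional check against designated communication-positions with a level/stage check; the following assumed-name sketch shows one such combination:

    #include <cmath>
    #include <vector>

    struct Vec3 { float x, y, z; };

    // True if the character stands within `radius` of any designated
    // communication-position.
    bool atCommunicationPosition(const Vec3& p, const std::vector<Vec3>& spots,
                                 float radius) {
        for (const Vec3& s : spots) {
            float dx = p.x - s.x, dy = p.y - s.y, dz = p.z - s.z;
            if (std::sqrt(dx * dx + dy * dy + dz * dz) <= radius)
                return true;
        }
        return false;
    }

    // Step S1100: either condition suffices in this sketch.
    bool talkTriggered(const Vec3& characterPos, int playerLevel,
                       const std::vector<Vec3>& commPositions,
                       float radius, int requiredLevel) {
        return atCommunicationPosition(characterPos, commPositions, radius)
            || playerLevel >= requiredLevel;
    }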
Figure 8 is a schematic flow chart of a fourth embodiment for controlling audio communication in a networked game, similar to the third embodiment. In this fourth embodiment, the control of audio communication (such as a talk-channel) is granted to a player (the talking-player) whose game character has just performed a certain action. These actions could be actions against another player's game character (such as severely wounding or killing the other player's game character, or overtaking the other player's game character in a race game); alternatively, they could be more individualistic actions (such as achieving a good score or firing a particularly accurate shot in a shooting game). As with the first three embodiments, control of audio communication allows the talking-player to provide audio data across the network to other players (i.e. talk at other players) whilst none of the other players can provide audio data across the network (i.e. they can only listen). This allows the talking-player to, for example, gloat about the action that has just been performed.

At a step S1500, the player's system unit 10 tests whether the player's game character has performed a certain game action (as described above). At a step S1502, the player's system unit 10 gives control of audio communication to the player and the player becomes the talking-player. This is achieved in a similar manner as at the step S902 of Figure 5. At a step S1504, the talking-player's system unit 10 informs some or all of the players involved in the networked game of the identity of the talking-player. This is done in a similar manner as at the step S904 of Figure 5. At a step S1506, the talking-player's system unit 10 starts a wait period and tests for its expiration at a step S1508. The wait period grants the talking-player control of audio communication for a limited period of time. Once the wait period has expired, at a step S1510, the talking-player's system unit 10 removes control of audio communication from the talking-player, i.e. the player is no longer the talking-player and is not able to communicate audio information to other players across the network. This may be achieved in a similar manner as at the step S1012 of Figure 6.

It will be appreciated that other variants of the fourth embodiment exist. For example, the steps S1506, S1508 and S1510 may be omitted so that a player retains control of audio communication.
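As an illustrative sketch of this fourth embodiment (assumed names throughout; the patent does not fix any particular event model), the grant could be driven from a game-event callback:

    #include <chrono>
    #include <thread>

    enum class GameEvent { KilledOpponent, Overtook, AccurateShot, Other };

    void grantTalkChannel(int /*playerId*/)  { /* step S1502 */ }
    void revokeTalkChannel(int /*playerId*/) { /* step S1510 */ }

    // Steps S1500 onwards: a qualifying action earns a temporary talk grant.
    // A blocking wait keeps the sketch short; a real game would use a timer.
    void onGameEvent(GameEvent event, int playerId) {
        if (event == GameEvent::Other)
            return;                                        // not a qualifying action
        grantTalkChannel(playerId);                        // step S1502
        std::this_thread::sleep_for(std::chrono::seconds(10)); // steps S1506/S1508
        revokeTalkChannel(playerId);                       // step S1510
    }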
In a fifth embodiment, two players may communicate audio with each other across the network (for example within a talk-channel) if they have positioned their game characters in the game environment so that the distance between their game characters (as measured in the game environment) is less than a threshold distance. Thus only players whose game characters are near to each other may talk with each other over the network. Several game characters may be sufficiently close to each other to allow a group of players to hold a conversation; these game characters will be said to form a conversation group.

Figure 9 schematically illustrates an example positioning of game characters within a game environment. Two game characters 1200a and 1200b are located near each other (i.e. the distance between them as measured in the game environment is less than the threshold distance), thus allowing their controlling players to talk to each other over the network. The game characters 1200a and 1200b thus form a conversation group 1210ab. Similarly, three game characters 1200c, 1200d and 1200e are located near each other, thus allowing their controlling players to talk to each other over the network. The game characters 1200c, 1200d and 1200e thus form a conversation group 1210cde. A game character 1200f is located near to a game character 1200g, thus allowing their controlling players to talk to each other over the network. The game characters 1200f and 1200g thus form a conversation group 1210fg. The game character 1200f is also located near to a game character 1200h, thus allowing their controlling players to talk to each other over the network. The game characters 1200f and 1200h thus form a conversation group 1210fh. However, as the game character 1200h is not located close enough to the game character 1200g, the players controlling these game characters cannot talk to each other over the network. Finally, a game character 1200i is not located close to any of the other game characters within the game environment. The player controlling the game character 1200i therefore cannot talk to any of the other players over the network.

During the game, a player's system unit 10 is provided with positional information relating to the game characters of the other players involved in the networked game. This information may be provided to the system unit 10 directly from each of the other system units 10 involved in the networked game; alternatively, as described above, to maintain game consistency one of the system units 10 may be responsible for collating the positional information of each of the players' game characters and then forwarding it to every system unit 10 involved in the networked game. The player's system unit 10 then calculates the distance between the game characters. This may be a direct point-to-point distance; alternatively, it may be the shortest distance within the confines of the game environment (such as the distance of a route within a maze). The system unit 10 of a first player only renders audio information received from a second player if the game character of the second player is sufficiently close to the game character of the first player. This may be achieved in several ways, for example: (i) the first player's system unit 10 allowing itself to transmit audio information to the second player's system unit if their game characters are sufficiently near each other, otherwise such transmission is prohibited; (ii) the first player's system unit 10 allowing itself to receive audio information from the second player's system unit if their game characters are sufficiently near each other, otherwise such receipt is prohibited (by blocking the network address of the second player's system unit, for example); or (iii) the first player's system unit 10 allowing itself to render audio information received from the second player's system unit if their game characters are sufficiently near each other, otherwise such rendering is prohibited.
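The distance rule of this fifth embodiment might be sketched as follows (hypothetical names; a direct point-to-point distance is used, although, as noted above, a shortest-route distance is equally possible):

    #include <cmath>
    #include <cstddef>
    #include <utility>
    #include <vector>

    struct CharacterPosition { int playerId; float x, y, z; };

    // Two players may talk when their game characters are within the
    // threshold distance of each other.
    bool mayTalk(const CharacterPosition& a, const CharacterPosition& b,
                 float threshold) {
        float dx = a.x - b.x, dy = a.y - b.y, dz = a.z - b.z;
        return std::sqrt(dx * dx + dy * dy + dz * dz) < threshold;
    }

    // Enumerate the talking pairs, as with the groups 1210fg and 1210fh of
    // Figure 9: 1200f may pair with both 1200g and 1200h even though 1200g
    // and 1200h are too far apart to pair with each other.
    std::vector<std::pair<int, int>>
    conversationPairs(const std::vector<CharacterPosition>& chars,
                      float threshold) {
        std::vector<std::pair<int, int>> pairs;
        for (std::size_t i = 0; i < chars.size(); ++i)
            for (std::size_t j = i + 1; j < chars.size(); ++j)
                if (mayTalk(chars[i], chars[j], threshold))
                    pairs.emplace_back(chars[i].playerId, chars[j].playerId);
        return pairs;
    }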
It will be appreciated that other variants of the fifth embodiment exist. For example: (i) as the conversation groups 1210fg and 1210fh share a common game character (namely the game character 1200f), the conversation groups 1210fg and 1210fh may be merged to allow the players controlling the game characters 1200f, 1200g and 1200h to communicate with each other across the network; (ii) the player controlling the game character 1200f may be provided with the ability to select with whom he would like to talk (i.e. with the player controlling the game character 1200g, with the player controlling the game character 1200h, or with both); (iii) the constituents of the conversation groups may be determined by a single system unit 10, which then informs all of the system units 10 involved in the networked game about the conversation groups; and (iv) in order to control the number of game characters that form a conversation group, the threshold distance used to determine the constituents of a conversation group may be dynamically adjusted to enforce an upper limit on the size of the conversation group.

In any of the embodiments described above, there may be game characters that are generated and controlled by one or more of the system units 10. For example, in a fighting game, a player may combat one or more opponent game characters that are generated and controlled by a system unit 10. It will be appreciated that, during the game, a computer controlled game character may, by virtue of its position, score, actions and/or game items collected, be able to communicate audio data to one or more of the human players. Such audio data may vary depending on the current status of the game. For example, a computer controlled game character may boast that it is about to defeat a human controlled opponent.

It will be appreciated that the audio data controlled by any of the embodiments described above may be only a subset of the total audio data communicated during the game. For example, the communication of data concerning verbal inputs by the players may be controlled according to any of the embodiments described above whilst the communication of other audio data (such as background music) may be controlled by other means. It will be appreciated that the embodiments described may be combined to provide a variety of means by which a player may gain control of audio communication.

In so far as the embodiments of the invention described above are implemented, at least in part, using software-controlled data processing apparatus, it will be appreciated that a computer program providing such software control, a storage medium by which such a computer program is stored and a transmission medium by which such a computer program is transmitted are envisaged as aspects of the present invention.

Claims

1. A network comprising: a plurality of processing apparatus, at least two of the processing apparatus being games control terminals, each games control terminal being operable to transmit audio data to and receive audio data from another one of the games control terminals and to render received audio data; the network providing controlling logic operable to determine whether the games control terminals may perform an audio-communication-task; an audio-communication-task being one or more of the transmission of audio data from a games control terminal to another games control terminal; the reception by a games control terminal of audio data transmitted from another games control terminal; and the rendering by a games control terminal of audio data received from another games control terminal; the determination being dependent upon a game status associated with at least one of the games control terminals and/or at least one operator of a games control terminal, the game status comprising at least one game score associated with each games control terminal and/or each operator.
2. A network according to claim 1, in which the game status associated with a games control terminal and/or operator is dependent on a comparison of the magnitude of the game score associated with that terminal and/or operator and other game scores.
3. A network according to claim 2 in which the controlling logic determines that a games control terminal may perform an audio-communication-task if the game score associated with that games control terminal or the operator of that games control terminal is one of the N largest game scores, where N is a positive integer.
4. A network according to claim 2 or claim 3, in which the controlling logic determines that a games control terminal may not perform an audio-communication-task in respect of another games control terminal if the game score associated with that other games control terminal or an operator of that other games control terminal is not one of the N largest game scores, where N is a positive integer.
5. A network according to claim 3 or claim 4, in which N is 1.
6. A network comprising: a plurality of processing apparatus, at least two of the processing apparatus being games control terminals, each games control terminal being operable to transmit audio data to and receive audio data from another one of the games control terminals and to render received audio data; the network providing controlling logic operable to determine whether the games control terminals may perform an audio-communication-task; an audio-communication-task being one or more of the transmission of audio data from a games control terminal to another games control terminal; the reception by a games control terminal of audio data transmitted from another games control terminal; and the rendering by a games control terminal of audio data received from another games control terminal; the determination being dependent upon a game status associated with at least one of the games control terminals and/or at least one operator of a games control terminal, the game status comprising one or more communication-enabling-actions performed by a game character, the network associating at least one game character with each games control terminal and/or each operator.
7. A network according to claim 6, in which the controlling logic determines that a games control terminal may perform an audio-communication-task if the game character associated with the games control terminal or the game character associated with the operator of the games control terminal performs one or more communication-enabling-actions.
8. A network according to claim 6 or claim 7, in which the controlling logic determines that a games control terminal may not perform an audio-communication-task in respect of another games control terminal if the game character associated with the other games control terminal or the game character associated with the operator of the other games control terminal does not perform one or more communication-enabling-actions.
9. A network according to any one of the preceding claims, in which the controlling logic is operable to wait for at least a predetermined period of time after making a determination before making a further determination.
10. A network according to any one of the preceding claims, the network providing logic operable to provide a game environment.
11. A network according to claim 10, operable to provide one or more communication-items within the game environment, the game status comprising one or more associations made between communication-items and operators.
12. A network according to claim 11, in which the controlling logic determines that a games control terminal may perform an audio-communication-task in dependence on the operator of the games control terminal being associated with one or more of the communication-items.
13. A network according to claim 11 or claim 12, in which the controlling logic determines that a games control terminal may not perform an audio-communication-task in respect of another games control terminal in dependence on the operator of the other games control terminal not being associated with one or more of the communication-items.
14. A network according to any one of claims 11 to 13, in which an operator becomes disassociated from a communication-item a predetermined period of time after the beginning of the association of the operator with the communication-item.
15. A network according to any one of claims 10 to 14, the network providing logic operable to associate an operator with a position within the game environment.
16. A network according to claim 15, the network providing logic operable to designate one or more positions within the game environment as a communication-position, the game status comprising the determinations of whether the positions associated with the operators are communication-positions.
17. A network according to claim 16, in which the controlling logic determines that a games control terminal may perform an audio-communication-task in dependence on whether the position associated with an operator of that games control terminal is a communication-position.
18. A network according to claim 16 or claim 17, in which the controlling logic determines that a games control terminal may not perform an audio-communication-task in respect of another games control terminal in dependence on whether the position associated with an operator of that other games control terminal is not a communication-position.
19. A network according to claim 15, in which the game status comprises the distance between the positions associated with the operators.
20. A network according to claim 19, in which the controlling logic determines that a games control terminal may perform an audio-communication-task in respect of another games control terminal in dependence on whether the distance between the position associated with an operator of that games control terminal and the position associated with an operator of the other games control terminal is less than a threshold distance.
21. A network according to claim 20, in which the threshold distance is a predetermined distance.
22. A network according to claim 20, operable to dynamically adjust the threshold distance.
23. A network according to any one of the preceding claims, the controlling logic being operable to provide information to an operator of one of the games control terminals about a determination made by the controlling logic.
24. A network according to claim 23, in which the information is visual and/or audible information.
25. A network according to any one of the preceding claims, in which audio data is input by an operator of a games control terminal via a microphone connected to the games control terminal.
26. A network according to any one of the preceding claims, in which the audio data controlled by the controlling logic forms a subset of the total amount of audio data communicated between the games control terminals.
27. A method of controlling audio data within a network, the network comprising: a plurality of processing apparatus, at least two of the processing apparatus being games control terminals, each games control terminal being operable to communicate audio data to and receive audio data from another one of the games control terminals and to render received audio data; the method comprising the steps of: (a) forming a game status associated with at least one operator of at least one of the games control terminals, the game status comprising at least one game score associated with each games control terminal and/or each operator; and (b) in dependence upon the game status, allowing or disallowing one or more of: a games control terminal communicating audio data to another games control terminal; a games control terminal receiving audio data from another games control terminal; and a games control terminal rendering audio data received from another games control terminal.
28. A method of controlling audio data within a network, the network comprising: a plurality of processing apparatus, at least two of the processing apparatus being games control terminals, each games control terminal being operable to communicate audio data to and receive audio data from another one of the games control terminals and to render received audio data; the method comprising the steps of: (a) forming a game status associated with at least one operator of at least one of the games control terminals, the game status comprising one or more communication-enabling-actions performed by a game character, the network associating at least one game character with each games control terminal and/or each operator; and (b) in dependence upon the game status, allowing or disallowing one or more of: a games control terminal communicating audio data to another games control terminal; a games control terminal receiving audio data from another games control terminal; and a games control terminal rendering audio data received from another games control terminal.
29. Computer software having program code for carrying out a method according to claim 27 or claim 28.
30. A providing medium by which software according to claim 29 is provided.
31. A medium according to claim 30, the medium being a transmission medium.
32. A medium according to claim 30, the medium being a storage medium.
33. A games control terminal connectable to a network and operable to transmit audio data to and receive audio data from another games control terminal and to render received audio data; the terminal providing controlling logic operable to determine whether the terminal may perform an audio-communication-task; an audio-communication-task being one or more of the transmission of audio data from the terminal to another games control terminal connected to the network; the reception by the terminal of audio data transmitted from another games control terminal; and the rendering by the terminal of audio data received from another games control terminal; the determination being dependent upon a game status associated with the terminal or one or more other games control terminals connected to the network and/or at least one operator of a games control terminal connected to the network, the game status comprising at least one game score associated with each games control terminal and/or each operator.
34. A games control terminal connectable to a network and operable to transmit audio data to and receive audio data from another games control terminal and to render received audio data; the terminal providing controlling logic operable to determine whether the terminal may perform an audio-communication-task; an audio-communication-task being one or more of the transmission of audio data from the terminal to another games control terminal connected to the network; the reception by the terminal of audio data transmitted from another games control terminal; and the rendering by the terminal of audio data received from another games control terminal; the determination being dependent upon a game status associated with the terminal or one or more other games control terminals connected to the network and/or at least one operator of a games control terminal connected to the network, the game status comprising one or more communication-enabling-actions performed by a game character, the network associating at least one game character with each games control terminal and/or each operator.
PCT/GB2005/002488 2004-06-25 2005-06-24 Real-time voice-chat system for a networked multiplayer game WO2006000786A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
GB0414308.7 2004-06-25
GB0414308A GB2415392B (en) 2004-06-25 2004-06-25 Game processing

Publications (1)

Publication Number Publication Date
WO2006000786A1

Family

ID=32800217

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/GB2005/002488 WO2006000786A1 (en) 2004-06-25 2005-06-24 Real-time voice-chat system for a networked multiplayer game

Country Status (2)

Country Link
GB (2) GB2415392B (en)
WO (1) WO2006000786A1 (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7553920B2 (en) 2004-06-30 2009-06-30 Dow Corning Corporation Fluorocarbon elastomer silicon vulcanizates
US8137191B2 (en) 2006-10-18 2012-03-20 Konami Digital Entertainment Co., Ltd. Game device, message display method, information recording medium and program
WO2020143256A1 (en) * 2019-01-11 2020-07-16 珠海格力电器股份有限公司 Group chat voice information processing method and device, storage medium and server

Families Citing this family (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5017013B2 (en) 2007-08-08 2012-09-05 株式会社コナミデジタルエンタテインメント Network game system, network game system control method and program
US20090049128A1 (en) * 2007-08-17 2009-02-19 Sony Computer Entertainment America Inc. Schemes for game chat routing and taunt control
JP5957177B2 (en) 2007-12-21 2016-07-27 ドルビー ラボラトリーズ ライセンシング コーポレイション Asynchronous audio for network games
JP2011510409A (en) * 2008-01-17 2011-03-31 ヴィヴォックス インコーポレイテッド A scalable technique for providing real-time avatar-specific streaming data in a virtual reality system using an avatar-rendered environment
US9401937B1 (en) 2008-11-24 2016-07-26 Shindig, Inc. Systems and methods for facilitating communications amongst multiple users
US8405702B1 (en) 2008-11-24 2013-03-26 Shindig, Inc. Multiparty communications systems and methods that utilize multiple modes of communication
US8647206B1 (en) 2009-01-15 2014-02-11 Shindig, Inc. Systems and methods for interfacing video games and user communications
US9344745B2 (en) 2009-04-01 2016-05-17 Shindig, Inc. Group portraits composed using video chat systems
US9712579B2 2009-04-01 2017-07-18 Shindig, Inc. Systems and methods for creating and publishing customizable images from within online events
US8779265B1 (en) 2009-04-24 2014-07-15 Shindig, Inc. Networks of portable electronic devices that collectively generate sound
US10271010B2 (en) 2013-10-31 2019-04-23 Shindig, Inc. Systems and methods for controlling the display of content
US9952751B2 (en) 2014-04-17 2018-04-24 Shindig, Inc. Systems and methods for forming group communications within an online event
US9733333B2 (en) 2014-05-08 2017-08-15 Shindig, Inc. Systems and methods for monitoring participant attentiveness within events and group assortments
US9711181B2 2014-07-25 2017-07-18 Shindig, Inc. Systems and methods for creating, editing and publishing recorded videos
US9734410B2 (en) 2015-01-23 2017-08-15 Shindig, Inc. Systems and methods for analyzing facial expressions within an online classroom to gauge participant attentiveness
US10133916B2 (en) 2016-09-07 2018-11-20 Steven M. Gottlieb Image and identity validation in video chat events
EP4311585A1 (en) * 2022-07-29 2024-01-31 Utopia Music AG Method for tracking audio consumption in virtual environments

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20030216181A1 (en) * 2002-05-16 2003-11-20 Microsoft Corporation Use of multiple player real-time voice communications on a gaming device
US20040109023A1 (en) * 2002-02-05 2004-06-10 Kouji Tsuchiya Voice chat system
EP1519531A2 (en) * 2003-09-25 2005-03-30 Microsoft Corporation Server control of peer to peer communications

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6241612B1 (en) * 1998-11-09 2001-06-05 Cirrus Logic, Inc. Voice communication during a multi-player game
JP2001314657A (en) * 2000-05-08 2001-11-13 Sega Corp Network system and storage medium
US7503006B2 (en) * 2003-09-25 2009-03-10 Microsoft Corporation Visual indication of current voice speaker

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20040109023A1 (en) * 2002-02-05 2004-06-10 Kouji Tsuchiya Voice chat system
US20030216181A1 (en) * 2002-05-16 2003-11-20 Microsoft Corporation Use of multiple player real-time voice communications on a gaming device
EP1519531A2 (en) * 2003-09-25 2005-03-30 Microsoft Corporation Server control of peer to peer communications

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7553920B2 (en) 2004-06-30 2009-06-30 Dow Corning Corporation Fluorocarbon elastomer silicone vulcanizates
US8137191B2 (en) 2006-10-18 2012-03-20 Konami Digital Entertainment Co., Ltd. Game device, message display method, information recording medium and program
WO2020143256A1 (en) * 2019-01-11 2020-07-16 Gree Electric Appliances, Inc. of Zhuhai Group chat voice information processing method and device, storage medium and server

Also Published As

Publication number Publication date
GB2415392A (en) 2005-12-28
GB0414308D0 (en) 2004-07-28
GB2446529A (en) 2008-08-13
GB2446529B (en) 2008-11-05
GB2415392B (en) 2008-11-05
GB0805290D0 (en) 2008-04-30

Similar Documents

Publication Publication Date Title
WO2006000786A1 (en) Real-time voice-chat system for a networked multiplayer game
US10099145B2 (en) Video game recording and playback with visual display of game controller manipulation
EP1880576B1 (en) Audio processing
US20020142834A1 (en) Game screen switching method performed in game machine and network game system, and program for executing the method
US20090318223A1 (en) Arrangement for audio or video enhancement during video game sequences
US20050245317A1 (en) Voice chat in game console application
US20060015560A1 (en) Multi-sensory emoticons in a communication system
JPH11272841A (en) Image processing method, video game device, and recording medium
JP2010535363A (en) Virtual world avatar control, interactivity and communication interactive messaging
JP2005505358A (en) System and method for storing game data
US20090247249A1 (en) Data processing
GB2426169A (en) Controlling the respective volume of each of a plurality of loudspeakers
JP2000279637A (en) Game device, game display control method, and computer-readable recording medium
US20100203971A1 (en) Entertainment apparatus and method
KR100865005B1 (en) Image generation device, automatic generation method, and medium for recording the program
Nilsen et al. Tankwar: Tabletop war gaming in augmented reality
US20100035678A1 (en) Video game
US11115442B2 (en) Initiating multiuser sessions
JP4508719B2 (en) Program, information storage medium, and game system
GB2417846A (en) Rendering an image of a display object to generate a reflection of a captured video image
KR20010049884A (en) Display method for confrontation type video game capable of displaying different information to respective participating players, storage medium storing programs, and confrontation type video game system
Thomasson Retrogaming
Cermak-Sassenrath et al. AirKanoid—visual presentation vs. physical proximity in mixed reality entertainment applications
JP2024078144A (en) Information processing system, information processing device, and program
WO2008035027A1 (en) Video game

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A1

Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BW BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE EG ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KM KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NA NG NI NO NZ OM PG PH PL PT RO RU SC SD SE SG SK SL SM SY TJ TM TN TR TT TZ UA UG US UZ VC VN YU ZA ZM ZW

AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): GM KE LS MW MZ NA SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IS IT LT LU MC NL PL PT RO SE SI SK TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG

121 EP: The EPO has been informed by WIPO that EP was designated in this application
DPEN Request for preliminary examination filed prior to expiration of 19th month from priority date (PCT application filed from 20040101)
NENP Non-entry into the national phase

Ref country code: DE

WWW WIPO information: Withdrawn in national office

Country of ref document: DE

122 EP: PCT application non-entry in European phase
ENP Entry into the national phase

Ref document number: PI0611622

Country of ref document: BR

Kind code of ref document: A2