GB2447020A - Transmitting game data from an entertainment device and rendering that data in a virtual environment of a second entertainment device - Google Patents


Info

Publication number
GB2447020A
Authority
GB
United Kingdom
Prior art keywords
game
game action
data
action data
operable
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
GB0704227A
Other versions
GB0704227D0 (en)
Inventor
Tomas Owen Gillo
Scott Christopher Waugaman
Mitchell Goodwin
Mark Andrew Horneff
Nick Ryan
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Sony Interactive Entertainment Europe Ltd
Original Assignee
Sony Computer Entertainment Europe Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from GBGB0703974.6A external-priority patent/GB0703974D0/en
Application filed by Sony Computer Entertainment Europe Ltd filed Critical Sony Computer Entertainment Europe Ltd
Priority to GB0704227A priority Critical patent/GB2447020A/en
Publication of GB0704227D0 publication Critical patent/GB0704227D0/en
Priority to EP08730776A priority patent/EP2132650A4/en
Priority to PCT/US2008/055037 priority patent/WO2008109299A2/en
Priority to JP2009551806A priority patent/JP2010533006A/en
Priority to PCT/US2008/002644 priority patent/WO2008106197A1/en
Priority to PCT/US2008/002643 priority patent/WO2008106196A1/en
Priority to EP08726220A priority patent/EP2118840A4/en
Priority to PCT/US2008/002630 priority patent/WO2008108965A1/en
Priority to JP2009551727A priority patent/JP2010535364A/en
Priority to JP2009551726A priority patent/JP2010535363A/en
Priority to EP08726207A priority patent/EP2126708A4/en
Priority to EP08726219A priority patent/EP2118757A4/en
Priority to JP2009551722A priority patent/JP2010535362A/en
Priority to PCT/GB2008/000680 priority patent/WO2008104783A1/en
Publication of GB2447020A publication Critical patent/GB2447020A/en
Priority to JP2014039137A priority patent/JP5756198B2/en
Legal status: Withdrawn (current)


Classifications

    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/85Providing additional services to players
    • A63F13/86Watching games played by other players
    • A63F13/12
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/30Interconnection arrangements between game servers and game devices; Interconnection arrangements between game devices; Interconnection arrangements between game servers
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/30Interconnection arrangements between game servers and game devices; Interconnection arrangements between game devices; Interconnection arrangements between game servers
    • A63F13/34Interconnection arrangements between game servers and game devices; Interconnection arrangements between game devices; Interconnection arrangements between game servers using peer-to-peer connections
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/70Game security or game management aspects
    • A63F13/77Game security or game management aspects involving data related to game devices or game servers, e.g. configuration data, software version or amount of memory
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/70Game security or game management aspects
    • A63F13/79Game security or game management aspects involving player-related data, e.g. identities, accounts, preferences or play histories
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L67/00Network arrangements or protocols for supporting network services or applications
    • H04L67/01Protocols
    • H04L67/131Protocols for games, networked simulations or virtual reality
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/40Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterised by details of platform network
    • A63F2300/407Data transfer via internet
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/40Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterised by details of platform network
    • A63F2300/408Peer to peer connection
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/50Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by details of game servers
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/80Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game specially adapted for executing a specific type of game
    • A63F2300/807Role playing or strategy games
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/80Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game specially adapted for executing a specific type of game
    • A63F2300/8082Virtual reality

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Business, Economics & Management (AREA)
  • Computer Security & Cryptography (AREA)
  • General Business, Economics & Management (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • Health & Medical Sciences (AREA)
  • Child & Adolescent Psychology (AREA)
  • Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)
  • Processing Or Creating Images (AREA)
  • Pinball Game Machines (AREA)

Abstract

An entertainment system comprises a first gaming device 3100 which executes a game application program 3122 associated with a game, the gaming device including processing means 3120 which generates game action data indicative of graphical content of the game, and transmitting means (not shown) which transmits the game action data such that the game action can be rendered within the virtual environment of a second gaming device 3300 in data communication with the first gaming device. The virtual environment is associated with a virtual environment application program 3324 which is different from the game application program. The data communication between the gaming devices may comprise a peer-to-peer or client-server communication protocol. The entertainment system allows a user of the second gaming device 3300 to observe, within a virtual environment displayed on the second gaming device, transmitted game action of a game being played on one or more first gaming devices 3100, 3200. The game action data may be transmitted and observed in real-time, or may be stored and transmitted at a selected time. Transmission of the game action data may be dependent upon a payment status of a user of the second gaming device.

Description

ENTERTAINMENT DEVICE AND METHOD
The present invention relates to an entertainment device and corresponding method.
Recently, video game devices have become available that enable connection to an online gaming system or servers via the internet. Examples of such systems are the Sony PlayStation 3 (PS3) entertainment device and the Xbox 360 device manufactured by Microsoft, which is provided with an online gaming service known as the Xbox Live service.
Additionally, in the field of sports entertainment, live sports such as football or boxing can be watched live by streaming the sports action to a user over the internet. Furthermore, edited highlights programs may be generated from sports footage of, for example, several boxing matches so that a user can watch the highlights program when they desire, perhaps on payment of a fee to the relevant distributor of the sports action so that the highlights program is downloaded to the user.
However, in the field of video game devices and in contrast to the above, game action of game play by highly skilled gamers is difficult for other users to obtain or watch.
Additionally, gamers are becoming ever more competitive and there is a desire among gamers to be able to watch game action of particularly skilled gamers. Furthermore, virtual gaming environments are becoming more popular but tend to be limited in the experiences provided within the virtual environment with respect to game action of other games.
The present invention seeks to alleviate or mitigate the above problems.
In a first aspect, there is provided an entertainment device operable to execute a game application program associated with a game, the device comprising: processing means operable to generate game action data from game action of the game; and transmitting means operable to transmit the game action data such that the game action can be rendered within a virtual environment by a second entertainment device in data communication with the entertainment device, the virtual environment being associated with the second entertainment device, in which: the virtual environment is associated with a virtual environment application program that is different from the game application program; and the game action data comprises graphics data indicative of graphical content of the game action data.
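Purely as a non-limiting sketch, the following Python fragment shows one way a first entertainment device could package game action data of this kind and transmit it to a peer. The JSON-over-TCP framing, the host name and every field name are assumptions introduced for illustration only and are not taken from the claims.

```python
# Minimal, illustrative sketch of a sender-side "game action data" message.
# The framing (length-prefixed JSON over TCP) and all field names are hypothetical.
import json
import socket
import struct

def build_game_action_data(frame_number, entities):
    """Package graphics-related state (model ids, positions, animations) for one frame."""
    return {
        "game_id": "fighting-game-demo",   # identifies the originating game application
        "frame": frame_number,
        "entities": entities,              # e.g. [{"model": "fighter_a", "pos": [1.0, 0.0, 2.5], "anim": "kick"}]
    }

def send_game_action_data(sock, message):
    """Length-prefix and send one message so the receiver can re-frame the byte stream."""
    payload = json.dumps(message).encode("utf-8")
    sock.sendall(struct.pack(">I", len(payload)) + payload)

if __name__ == "__main__":
    # Example usage against a hypothetical receiving device or relay server.
    with socket.create_connection(("spectator-device.local", 9000)) as sock:
        msg = build_game_action_data(42, [{"model": "fighter_a", "pos": [1.0, 0.0, 2.5], "anim": "kick"}])
        send_game_action_data(sock, msg)
```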
In a second aspect, there is provided an entertainment device, comprising: receiving means operable to communicate with a second entertainment device using a data communications link and to receive, from the data communications link, game action data from the second entertainment device; processing means operable to generate game action of a game from the received game action data, the game being associated with the second entertainment device; and rendering means operable to render the game action within a virtual environment associated with the entertainment device in dependence upon the game action generated by the processing means, in which: the game is associated with a game application program and the virtual environment is associated with a virtual environment application program that is different from the game application program; and the game action data comprises graphics data indicative of graphical content of the game action data.
In a third aspect, there is provided an entertainment network server, comprising: receiving means operable to receive game action data from one or more entertainment devices, the game action data being generated by the one or more entertainment devices in dependence upon game action of a game associated with the one or more of the entertainment devices; storage means operable to store the game action data received from the entertainment device; and transmitting means operable to transmit the stored game action data to at least one recipient entertainment device such that the game action data can be rendered as game action within a virtual environment associated with the recipient entertainment device, in which: the game is associated with a game application program and the virtual environment is associated with a virtual environment application program that is different from the game application program; and the game action data comprises graphics data indicative of graphical content of the game action data.
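Again purely by way of illustration, a minimal store-and-forward relay in the spirit of this third aspect might look as follows. The line-delimited JSON protocol and the "role" field are assumptions, and an in-memory list stands in for the storage means.

```python
# Illustrative store-and-forward relay: receive game action data from playing
# devices, retain it, and pass it on to spectator devices. Not a real server design.
import asyncio
import json

stored_messages = []   # "storage means": retained game action data
spectators = set()     # stream writers for connected recipient devices

async def read_lines(reader):
    while True:
        line = await reader.readline()
        if not line:
            return
        yield line

async def handle_connection(reader, writer):
    async for line in read_lines(reader):
        msg = json.loads(line)
        if msg.get("role") == "spectator":
            spectators.add(writer)                 # recipient device: replay history, then live feed
            for old in stored_messages:
                writer.write((json.dumps(old) + "\n").encode())
            await writer.drain()
        else:
            stored_messages.append(msg)            # game action data from a playing device
            for s in list(spectators):
                s.write((json.dumps(msg) + "\n").encode())
            await asyncio.gather(*(s.drain() for s in spectators), return_exceptions=True)

async def main():
    server = await asyncio.start_server(handle_connection, "0.0.0.0", 9000)
    async with server:
        await server.serve_forever()

if __name__ == "__main__":
    asyncio.run(main())
```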
In a fourth aspect, there is provided a method of transmitting game action data using an entertainment device operable to execute a first application program associated with a game, the method comprising: generating game action data from game action of the game; and transmitting the game action data such that the game action can be rendered within a virtual environment by a second entertainment device in data communication with the entertainment device, the virtual environment being associated with the second entertainment device, in which: the virtual environment is associated with a second application program that is different from the first application program; and the game action data comprises graphics data indicative of graphical content of the game action data.
In a fifth aspect, there is provided a method of receiving game action data using an entertainment device having receiving means operable to communicate with a second entertainment device via a communications link, the method comprising: receiving, from the communications link, game action data generated by the second entertainment device; generating game action of a game from the received game action data, the game being associated with the second entertainment device; and rendering the game action within a virtual environment associated with the entertainment device in dependence upon the game action generated by the processing means, in which: the game is associated with a first application program and the virtual environment is associated with a second, different, application program; and the game action data comprises graphics data indicative of graphical content of the game action data.
In a sixth aspect, there is provided a method of communicating game action data using an entertainment network server, the method comprising: receiving game action data from one or more entertainment devices, the game action data being generated by the one or more entertainment devices in dependence upon game action of a game associated with the one or more of the entertainment devices; storing the game action data received from the entertainment device; and transmitting the stored game action data to at least one recipient entertainment device such that the game action data can be rendered as game action within a virtual environment associated with the recipient entertainment device, in which: the game is associated with a first application program and the virtual environment is associated with a second, different, application program; and the game action data comprises graphics data indicative of graphical content of the game action data.
Advantageously, the above aspects allow game action of a game associated with an entertainment device to be viewed within a virtual environment of a different entertainment device. Therefore, for example, two highly skilled and competitive gamers could play each other at a fighting game and the game action of that game could be viewed within the virtual environment of at least another entertainment device. Additionally, the game action of the fighting game could be viewed by a plurality of users, each with their own entertainment device, in a situation analogous to watching live footage of a boxing match or snooker tournament using a television. Alternatively, for example, the above aspects allow edited highlights of exciting game action to be viewed within the virtual environment. Furthermore, by providing a way in which game action of a different game may be rendered within a virtual environment of the entertainment device, a user's experience of the virtual environment is greatly enriched.
Further respective aspects and features of the invention are defined in the appended claims.
Embodiments of the present invention will now be described by way of example with reference to the accompanying drawings, in which:
Figure 1 is a schematic diagram of an entertainment device;
Figure 2 is a schematic diagram of a cell processor;
Figure 3 is a schematic diagram of a video graphics processor;
Figure 4 is a schematic diagram of an interconnected set of game zones in accordance with an embodiment of the present invention;
Figure 5 is a schematic diagram of a Home environment online client/server arrangement in accordance with an embodiment of the present invention;
Figure 6a is a schematic diagram of a lobby zone in accordance with an embodiment of the present invention;
Figure 6b is a schematic diagram of a lobby zone in accordance with an embodiment of the present invention;
Figure 6c is a schematic diagram of a cinema zone in accordance with an embodiment of the present invention;
Figure 6d is a schematic diagram of a developer/publisher zone in accordance with an embodiment of the present invention;
Figure 7 is a flow diagram of a method of on-line transaction in accordance with an embodiment of the present invention;
Figure 8a is a schematic diagram of an apartment zone in accordance with an embodiment of the present invention;
Figure 8b is a schematic diagram of a trophy room zone in accordance with an embodiment of the present invention;
Figure 9 is a schematic diagram of a communication menu in accordance with an embodiment of the present invention;
Figure 10 is a schematic diagram of an interactive virtual user device in accordance with an embodiment of the present invention;
Figure 11 is a schematic diagram of an online client/server arrangement of entertainment devices in accordance with an embodiment of the present invention;
Figure 12 is a schematic diagram of a further online client/server arrangement of the entertainment devices in accordance with an embodiment of the present invention;
Figure 13 is a schematic diagram of a peer-to-peer arrangement of entertainment devices in accordance with an embodiment of the present invention;
Figure 14 is a schematic diagram of a file structure used to store representations of the trophies in accordance with an embodiment of the present invention; and
Figures 15a and 15b show a schematic representation of 3D polygon scaling in accordance with an embodiment of the present invention.
An entertainment device and method is disclosed. In the following description, a number of specific details are presented in order to provide a thorough understanding of embodiments of the present invention. It will be apparent however to a person skilled in the art that these specific details need not be employed to practice the present invention.
Conversely, specific details known to the person skilled in the art are omitted for the purposes of clarity in presenting the embodiments.
Figure 1 schematically illustrates the overall system architecture of the Sony Playstation 3 entertainment device. A system unit 10 is provided, with various peripheral devices connectable to the system unit.
The system unit 10 comprises: a Cell processor 100; a Rambus dynamic random access memory (XDRAM) unit 500; a Reality Synthesiser graphics unit 200 with a dedicated video random access memory (VRAM) unit 250; and an I/O bridge 700.
The system unit 10 also comprises a Blu Ray Disk BD-ROM optical disk reader 430 for reading from a disk 440 and a removable slot-in hard disk drive (HDD) 400, accessible through the I/O bridge 700. Optionally the system unit also comprises a memory card reader 450 for reading compact flash memory cards, Memory Stick memory cards and the like, which is similarly accessible through the I/O bridge 700.
The I/O bridge 700 also connects to six Universal Serial Bus (USB) 2.0 ports 710; a gigabit Ethernet port 720; an IEEE 802.11b/g wireless network (Wi-Fi) port 730; and a Bluetooth wireless link port 740 capable of supporting up to seven Bluetooth connections.
In operation the I/O bridge 700 handles all wireless, USB and Ethernet data, including data from one or more game controllers 751. For example when a user is playing a game, the I/O bridge 700 receives data from the game controller 751 via a Bluetooth link and directs it to the Cell processor 100, which updates the current state of the game accordingly.
The wireless, USB and Ethernet ports also provide connectivity for other peripheral devices in addition to game controllers 751, such as: a remote control 752; a keyboard 753; a mouse 754; a portable entertainment device 755 such as a Sony Playstation Portable entertainment device; a video camera such as an EyeToy video camera 756; and a microphone headset 757. Such peripheral devices may therefore in principle be connected to the system unit 10 wirelessly; for example the portable entertainment device 755 may communicate via a Wi-Fi ad-hoc connection, whilst the microphone headset 757 may communicate via a Bluetooth link.
The provision of these interfaces means that the Playstation 3 device is also potentially compatible with other peripheral devices such as digital video recorders (DVRs), set-top boxes, digital cameras, portable media players, Voice over IP telephones, mobile telephones, printers and scanners.
In addition, a legacy memory card reader 410 may be connected to the system unit via a USB port 710, enabling the reading of memory cards 420 of the kind used by the Playstation or Playstation 2 devices.
In the present embodiment, the game controller 751 is operable to communicate wirelessly with the system unit 10 via the Bluetooth link. However, the game controller 751 can instead be connected to a USB port, thereby also providing power by which to charge the battery of the game controller 751. In addition to one or more analogue joysticks and conventional control buttons, the game controller is sensitive to motion in 6 degrees of freedom, corresponding to translation and rotation in each axis. Consequently gestures and movements by the user of the game controller may be translated as inputs to a game in addition to or instead of conventional button or joystick commands. Optionally, other wirelessly enabled peripheral devices such as the Playstation Portable device may be used as a controller. In the case of the Playstation Portable device, additional game or control information (for example, control instructions or number of lives) may be provided on the screen of the device. Other alternative or supplementary control devices may also be used, such as a dance mat (not shown), a light gun (not shown), a steering wheel and pedals (not shown) or bespoke controllers, such as a single or several large buttons for a rapid-response quiz game (also not shown).
The remote control 752 is also operable to communicate wirelessly with the system unit 10 via a Bluetooth link. The remote control 752 comprises controls suitable for the operation of the Blu Ray Disk BD-ROM reader 430 and for the navigation of disk content.
The Blu Ray Disk BD-ROM reader 430 is operable to read CD-ROMs compatible with the Playstation and PlayStation 2 devices, in addition to conventional pre-recorded and recordable CDs, and so-called Super Audio CDs. The reader 430 is also operable to read DVD-ROMs compatible with the Playstation 2 and PlayStation 3 devices, in addition to conventional pre-recorded and recordable DVDs. The reader 430 is further operable to read BD-ROMs compatible with the Playstation 3 device, as well as conventional pre-recorded and recordable Blu-Ray Disks.
The system unit 10 is operable to supply audio and video, either generated or decoded by the Playstation 3 device via the Reality Synthesiser graphics unit 200, through audio and video connectors to a display and sound output device 300 such as a monitor or television set having a display 305 and one or more loudspeakers 310. The audio connectors 210 may include conventional analogue and digital outputs whilst the video connectors 220 may variously include component video, S-video, composite video and one or more High Definition Multimedia Interface (HDMI) outputs. Consequently, video output may be in formats such as PAL or NTSC, or in 720p, 1080i or 1080p high definition.
Audio processing (generation, decoding and so on) is performed by the Cell processor 100. The Playstation 3 device's operating system supports Dolby 5.1 surround sound, Dolby Theatre Surround (DTS), and the decoding of 7.1 surround sound from Blu-Ray disks.
In the present embodiment, the video camera 756 comprises a single charge coupled device (CCD), an LED indicator, and hardware-based real-time data compression and encoding apparatus so that compressed video data may be transmitted in an appropriate format such as an intra-image based MPEG (motion picture expert group) standard for decoding by the system unit 10. The camera LED indicator is arranged to illuminate in response to appropriate control data from the system unit 10, for example to signify adverse lighting conditions. Embodiments of the video camera 756 may variously connect to the system unit 10 via a USB, Bluetooth or Wi-Fi communication port. Embodiments of the video camera may include one or more associated microphones and also be capable of transmitting audio data. In embodiments of the video camera, the CCD may have a resolution suitable for high-definition video capture. In use, images captured by the video camera may for example be incorporated within a game or interpreted as game control inputs.
In general, in order for successful data communication to occur with a peripheral device such as a video camera or remote control via one of the communication ports of the system unit 10, an appropriate piece of software such as a device driver should be provided.
Device driver technology is well-known and will not be described in detail here, except to say that the skilled man will be aware that a device driver or similar software interface may be required in the present embodiment described.
Referring now to Figure 2, the Cell processor 100 has an architecture comprising four basic components: external input and output structures comprising a memory controller 160 and a dual bus interface controller 170A,B; a main processor referred to as the Power Processing Element 150; eight co-processors referred to as Synergistic Processing Elements (SPEs) 110A-H; and a circular data bus connecting the above components referred to as the Element Interconnect Bus 180. The total floating point performance of the Cell processor is 218 GFLOPS, compared with the 6.2 GFLOPs of the Playstation 2 device's Emotion Engine.
The Power Processing Element (PPE) 150 is based upon a two-way simultaneous multithreading Power 970 compliant PowerPC core (PPU) 155 running with an internal clock of 3.2 GHz. It comprises a 512 kB level 2 (L2) cache and a 32 kB level 1 (L1) cache. The PPE 150 is capable of eight single precision operations per clock cycle, translating to 25.6 GFLOPs at 3.2 GHz. The primary role of the PPE 150 is to act as a controller for the Synergistic Processing Elements 110A-H, which handle most of the computational workload.
In operation the PPE 150 maintains a job queue, scheduling jobs for the Synergistic Processing Elements 110A-H and monitoring their progress. Consequently each Synergistic Processing Element 110A-H runs a kernel whose role is to fetch a job, execute it and synchronise with the PPE 150.
Each Synergistic Processing Element (SPE) 110A-H comprises a respective Synergistic Processing Unit (SPU) 120A-H, and a respective Memory Flow Controller (MFC) 140A-H comprising in turn a respective Dynamic Memory Access Controller (DMAC) 142A-H, a respective Memory Management Unit (MMU) 144A-H and a bus interface (not shown).
Each SPU 120A-H is a RISC processor clocked at 3.2 GHz and comprising 256 kB local RAM 130A-H, expandable in principle to 4 GB. Each SPE gives a theoretical 25.6 GFLOPS of single precision performance. An SPU can operate on 4 single precision floating point numbers, 4 32-bit numbers, 8 16-bit integers, or 16 8-bit integers in a single clock cycle. In the same clock cycle it can also perform a memory operation. The SPU 120A-H does not directly access the system memory XDRAM 500; the 64-bit addresses formed by the SPU 120A-H are passed to the MFC 140A-H which instructs its DMA controller 142A-H to access memory via the Element Interconnect Bus 180 and the memory controller 160.
The Element Interconnect Bus (EIB) 180 is a logically circular communication bus internal to the Cell processor 100 which connects the above processor elements, namely the PPE 150, the memory controller 160, the dual bus interface 170A,B and the 8 SPEs 110A-H, totalling 12 participants. Participants can simultaneously read and write to the bus at a rate of 8 bytes per clock cycle. As noted previously, each SPE 110A-H comprises a DMAC 142A-H for scheduling longer read or write sequences. The EIB comprises four channels, two each in clockwise and anti-clockwise directions. Consequently for twelve participants, the longest step-wise data-flow between any two participants is six steps in the appropriate direction. The theoretical peak instantaneous EIB bandwidth for 12 slots is therefore 96 bytes per clock, in the event of full utilisation through arbitration between participants. This equates to a theoretical peak bandwidth of 307.2 GB/s (gigabytes per second) at a clock rate of 3.2 GHz.
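As a quick arithmetic check of the figures quoted in the preceding paragraph (assuming 8 bytes per participant per clock across 12 slots at 3.2 GHz):

```python
# Verify the quoted peak EIB bandwidth from the stated parameters.
bytes_per_slot_per_clock = 8
slots = 12
clock_hz = 3.2e9

bytes_per_clock = bytes_per_slot_per_clock * slots           # 96 bytes per clock
peak_bandwidth_gb_s = bytes_per_clock * clock_hz / 1e9       # 307.2 GB/s
print(f"{bytes_per_clock} bytes/clock -> {peak_bandwidth_gb_s:.1f} GB/s")
```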
The memory controller 160 comprises an XDRAM interface 162, developed by Rambus Incorporated. The memory controller interfaces with the Rambus XDRAM 500 with s a theoretical peak bandwidth of 25.6 GB/s.
The dual bus interface 170A,B comprises a Rambus FlexIO system interface 172A,B. The interface is organised into 12 channels each being 8 bits wide, with five paths being inbound and seven outbound. This provides a theoretical peak bandwidth of 62.4 GB/s (36.4 GB/s outbound, 26 GB/s inbound) between the Cell processor and the I/O Bridge 700 via controller 170A and the Reality Simulator graphics unit 200 via controller 170B.
Data sent by the Cell processor 100 to the Reality Simulator graphics unit 200 will typically comprise display lists, being a sequence of commands to draw vertices, apply textures to polygons, specify lighting conditions, and so on.
Referring now to Figure 3, the Reality Simulator graphics (RSX) unit 200 is a video accelerator based upon the NVidia G70/71 architecture that processes and renders lists of commands produced by the Cell processor 100. The RSX unit 200 comprises a host interface 202 operable to communicate with the bus interface controller 170B of the Cell processor 100; a vertex pipeline 204 (VP) comprising eight vertex shaders 205; a pixel pipeline 206 (PP) comprising 24 pixel shaders 207; a render pipeline 208 (RP) comprising eight render output units (ROPs) 209; a memory interface 210; and a video converter 212 for generating a video output. The RSX 200 is complemented by 256 MB double data rate (DDR) video RAM (VRAM) 250, clocked at 600 MHz and operable to interface with the RSX 200 at a theoretical peak bandwidth of 25.6 GB/s. In operation, the VRAM 250 maintains a frame buffer 214 and a texture buffer 216. The texture buffer 216 provides textures to the pixel shaders 207, whilst the frame buffer 214 stores results of the processing pipelines. The RSX can also access the main memory 500 via the EIB 180, for example to load textures into the VRAM 250.
The vertex pipeline 204 primarily processes deformations and transformations of vertices defining polygons within the image to be rendered.
The pixel pipeline 206 primarily processes the application of colour, textures and lighting to these polygons, including any pixel transparency, generating red, green, blue and alpha (transparency) values for each processed pixel. Texture mapping may simply apply a graphic image to a surface, or may include bump-mapping (in which the notional direction of a surface is perturbed in accordance with texture values to create highlights and shade in the lighting model) or displacement mapping (in which the applied texture additionally perturbs vertex positions to generate a deformed surface consistent with the texture).
The render pipeline 208 performs depth comparisons between pixels to determine which should be rendered in the final image. Optionally, if the intervening pixel process will not affect depth values (for example in the absence of transparency or displacement mapping) then the render pipeline and vertex pipeline 204 can communicate depth information between them, thereby enabling the removal of occluded elements prior to pixel processing, and so improving overall rendering efficiency. In addition, the render pipeline 208 also applies subsequent effects such as full-screen anti-aliasing over the resulting image.
Both the vertex shaders 205 and pixel shaders 207 are based on the shader model 3.0 standard. Up to 136 shader operations can be performed per clock cycle, with the combined pipeline therefore capable of 74.8 billion shader operations per second, outputting up to 840 million vertices and 10 billion pixels per second. The total floating point performance of the RSX 200 is 1.8 TFLOPS.
Typically, the RSX 200 operates in close collaboration with the Cell processor 100; for example, when displaying an explosion, or weather effects such as rain or snow, a large number of particles must be tracked, updated and rendered within the scene. In this case, the PPU 155 of the Cell processor may schedule one or more SPEs 110A-H to compute the trajectories of respective batches of particles. Meanwhile, the RSX 200 accesses any texture data (e.g. snowflakes) not currently held in the video RAM 250 from the main system memory 500 via the element interconnect bus 180, the memory controller 160 and a bus interface controller 170B. The or each SPE 110A-H outputs its computed particle properties (typically coordinates and normals, indicating position and attitude) directly to the video RAM 250; the DMA controller 142A-H of the or each SPE 110A-H addresses the video RAM 250 via the bus interface controller 170B. Thus, in effect the assigned SPEs become part of the video processing pipeline for the duration of the task.
In general, the PPU 155 can assign tasks in this fashion to six of the eight SPEs available; one SPE is reserved for the operating system, whilst one SPE is effectively disabled. The disabling of one SPE provides a greater level of tolerance during fabrication of the Cell processor, as it allows for one SPE to fail the fabrication process. Alternatively if all eight SPEs are functional, then the eighth SPE provides scope for redundancy in the event of subsequent failure by one of the other SPEs during the life of the Cell processor.
The PPU 155 can assign tasks to SPEs in several ways. For example, SPEs may be chained together to handle each step in a complex operation, such as accessing a DVD, video and audio decoding, and error masking, with each step being assigned to a separate SPE.
Alternatively or in addition, two or more SPEs may be assigned to operate on input data in parallel, as in the particle animation example above.
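The following fragment is a schematic illustration of these two assignment patterns, using ordinary Python threads and queues to stand in for SPEs; it is not PS3 code, and the stage functions are placeholders.

```python
# Schematic illustration of chained (pipeline) versus parallel task assignment.
# Threads and queues stand in for SPEs; stage names are placeholders.
import queue
import threading

def pipeline_stage(func, inbox, outbox):
    """One 'SPE' in a chained pipeline: fetch a job, process it, pass it on."""
    while True:
        item = inbox.get()
        if item is None:           # sentinel: shut the stage down and propagate it
            outbox.put(None)
            return
        outbox.put(func(item))

def parallel_worker(func, jobs, results):
    """One 'SPE' working on independent batches in parallel (e.g. particle batches)."""
    while True:
        try:
            batch = jobs.get_nowait()
        except queue.Empty:
            return
        results.put(func(batch))

if __name__ == "__main__":
    # Chained example: read -> decode -> error-mask, one stage per worker.
    read_q, dec_q, out_q = queue.Queue(), queue.Queue(), queue.Queue()
    stages = [
        threading.Thread(target=pipeline_stage, args=(lambda b: b + "|decoded", read_q, dec_q)),
        threading.Thread(target=pipeline_stage, args=(lambda b: b + "|masked", dec_q, out_q)),
    ]
    for t in stages:
        t.start()
    read_q.put("block0")
    read_q.put(None)
    for t in stages:
        t.join()
    print(list(iter(out_q.get, None)))   # -> ['block0|decoded|masked']
```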
Software instructions implemented by the Cell processor 100 and/or the RSX 200 may be supplied at manufacture and stored on the HDD 400, and/or may be supplied on a data carrier or storage medium such as an optical disk or solid state memory, or via a transmission medium such as a wired or wireless network or internet connection, or via combinations of these.
The software supplied at manufacture comprises system firmware and the Playstation 3 device's operating system (OS). In operation, the OS provides a user interface enabling a user to select from a variety of functions, including playing a game, listening to music, viewing photographs, or viewing a video. The interface takes the form of a so-called cross media-bar (XMB), with categories of function arranged horizontally. The user navigates by moving through the functions horizontally using a game controller 751, remote control 752 or other suitable control device so as to highlight the desired function, at which point options pertaining to that function appear as a vertically scrollable list centred on that function, which may be navigated in analogous fashion. However, if a game, audio or movie disk 440 is inserted into the BD-ROM optical disk reader 430, the Playstation 3 device may select appropriate options automatically (for example, by commencing the game), or may provide relevant options (for example, to select between playing an audio disk or compressing its content to the HDD 400).
In addition, the OS provides an on-line capability, including a web browser, an interface with an on-line store from which additional game content, demos and other media may be downloaded, and a friends management capability, providing on-line communication with other Playstation 3 device users nominated by the user of the current device; for example, by text, audio or video depending on the peripheral devices available. The on-line capability also provides for on-line communication, content download and content purchase during play of a suitably configured game, and for updating the firmware and OS of the Playstation 3 device itself. It will be appreciated that the term 'on-line' does not imply the physical presence of wires, as the term can also apply to wireless connections of various types.
In an embodiment of the present invention, the above-mentioned online capability comprises interaction with a virtual environment populated by avatars (graphical representations) of the user of the PS3 10 and of other PS3 users who are currently online.
The software to enable the virtual interactive environment is typically resident on the HDD 400, and can be upgraded and/or expanded by software that is downloaded, or stored on optical disk 440, or accessed by any other suitable means. Alternatively, the software may reside on a flash memory card 420, optical disk 440 or a central server (not shown).
In an embodiment of the present invention, the virtual interactive environment (hereafter called the Home environment) is selected from the cross-media bar. The Home environment then starts in a conventional manner similar to a 3D video game by loading and executing control software, loading 3D models and textures into video memory 250, and rendering scenes depicting the Home environment. Alternatively or in addition, the Home environment can be initiated by other programs, such as a separate game.
Referring now to Figure 4, which displays a notional map of the Home environment, and Figure 5, which is a schematic diagram of a Home environment online client/server arrangement, the user's avatar is spawned within a lobby zone 1010 by default. However, a user can select among other zones 1010-1060 (detailed below) of the map, causing the selected zone to be loaded and the avatar to be spawned within that zone. In an embodiment of the present invention, the map screen further comprises a sidebar on which the available zones may be listed, together with management tools such as a ranking option, enabling zones to be listed in order of user preference, or such as most recently added and/or A-Z listings. In addition a search interface may allow the user to search for a zone by name. In an embodiment of the present invention, there may be many more zones available than are locally stored on the user's PS3 at any one time; the local availability may be colour coded on the list, or the list may be filtered to only display locally available zones. If the user selects a locally unavailable zone, it can be downloaded from a Home environment Server 2010.
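As an illustrative sketch only, the zone listing and download-on-demand behaviour described above might be organised along the following lines; the storage path and server URL are invented placeholders.

```python
# Sketch of zone selection: list zones, mark local availability, fetch missing
# zones from a (hypothetical) Home environment server before spawning the avatar.
import os
import urllib.request

ZONE_DIR = os.path.expanduser("~/.home_env/zones")
SERVER_URL = "https://home-env.example.com/zones"   # placeholder address

def locally_available(zone_name):
    return os.path.exists(os.path.join(ZONE_DIR, zone_name + ".zone"))

def list_zones(all_zones, sort_by_rank=None):
    """Return (name, available) pairs, optionally ordered by a user-preference rank."""
    zones = sorted(all_zones, key=sort_by_rank) if sort_by_rank else sorted(all_zones)
    return [(z, locally_available(z)) for z in zones]

def ensure_zone(zone_name):
    """Download the zone package from the server if it is not stored locally."""
    if locally_available(zone_name):
        return
    os.makedirs(ZONE_DIR, exist_ok=True)
    urllib.request.urlretrieve(f"{SERVER_URL}/{zone_name}.zone",
                               os.path.join(ZONE_DIR, zone_name + ".zone"))

def select_zone(zone_name):
    ensure_zone(zone_name)
    print(f"Loading {zone_name} and spawning avatar there")

if __name__ == "__main__":
    print(list_zones(["lobby", "cinema", "apartment"]))
```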
Referring now to Figure 6a, the lobby zone 1010 typically resembles a covered piazza, and may comprise parkland (grass, trees, sculptures etc.), and gathering spaces (such as open areas, single benches or rows of seats etc.) where users can meet through their avatars.
The lobby zone 1010 typically also comprises advertisement hoardings, for displaying either still or moving adverts for games or other content or products. These may be on the walls of the lobby, or may stand alone.
The lobby zone 1010 may also include an open-air cinema 1012 showing trailers, high-profile adverts or other content from third-party providers. Such content is typically streamed or downloaded from a Home environment server 2010 to which the PS3 10 connects when the Home environment is loaded, as described in more detail later.
The cinema screen is accompanied by seating for avatars in front of it, such that when an avatar sits down, the camera angle perceived by the user of the avatar also encompasses the screen.
Referring now also to Figure 6b, the lobby zone 1010 may also include general amusements 1014, such as functioning pool tables, bowling alleys, and/or a video arcade.
Games of pool or bowling may be conducted via the avatar, such that the avatar holds the pool cue or bowling ball, and is controlled in a conventional manner for such games. In the video arcade, if an avatar approaches a videogame machine, the home environment may switch to a substantially full-screen representation of the videogame selected. Such games may, for example, be classic arcade or console games such as Space Invaders (RTM), or Pac-Man (RTM), which are comparatively small in terms of memory and processing and can be emulated by the PS3 within the Home environment or run as plug-ins to the Home environment. In this case, typically the user will control the game directly, without representation by the avatar. The game will switch back to the default Home environment view if the user quits the game, or causes the avatar to move away from the videogame machine. In addition to classic arcade games, user-created game content may be featured on one or more of the virtual video game machines. Such content may be the subject of on-line competitions to be featured in such a manner, with new winning content downloaded on a regular basis.
In addition to the lobby zone 1010, other zones (e.g. zones 1020, 1030, 1040, 1050 and 1060, which may be rooms, areas or other constructs) are available. These may be accessed either via a map screen similar in nature to that of Figure 4, or alternatively the user can walk to these other areas by guiding their avatar to various exits 1016 from the lobby.
Typically, an exit 1016 takes the form of a tunnel or corridor (but may equally take the form of an anteroom) to the next area. While the avatar is within the tunnel or anteroom, the next zone is loaded into memory. Both the lobby and the next zone contain identical models of the tunnel or anteroom, or the model is a common resource to both. In either case, the user's avatar is relocated from the lobby-based version to the new zone-based version of the tunnel or anteroom at the same position. In this way the user's avatar can apparently walk seamlessly throughout the Home environment, without the need to retain the whole environment in memory at the same time.
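A minimal sketch of this hand-over, assuming a background loading thread and an avatar represented as a simple dictionary (all names are illustrative), is given below.

```python
# Sketch of the seamless zone transition: load the next zone in the background
# while the avatar walks the tunnel, then relocate the avatar to the matching
# position in the new zone's copy of the tunnel. Class and method names are illustrative.
import threading

class Zone:
    def __init__(self, name, tunnel_name):
        self.name = name
        self.tunnel_name = tunnel_name      # model shared (or duplicated) with the adjacent zone

    @staticmethod
    def load(name, tunnel_name):
        # Stand-in for loading models and textures into video memory.
        return Zone(name, tunnel_name)

def transition(avatar, current_zone, next_zone_name):
    loaded = {}
    loader = threading.Thread(
        target=lambda: loaded.update(zone=Zone.load(next_zone_name, current_zone.tunnel_name)))
    loader.start()                 # load the next zone while the avatar walks the tunnel
    # ... avatar keeps walking inside the tunnel of current_zone here ...
    loader.join()
    next_zone = loaded["zone"]
    # Relocate the avatar to the same position inside the new zone's copy of the tunnel,
    # so the walk appears continuous without holding the whole environment in memory.
    avatar["zone"] = next_zone.name
    return next_zone

if __name__ == "__main__":
    lobby = Zone("lobby", "tunnel_A")
    avatar = {"zone": lobby.name, "tunnel_pos": 0.3}
    transition(avatar, lobby, "cinema")
    print(avatar)
```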
Referring now also to Figure 6c, one available zone is a Cinema zone 1020. The Cinema zone 1020 resembles a multiplex cinema, comprising a plurality of screens that may show content such as trailers, movies, TV programmes, or adverts downloaded or streamed from a Home environment server 2010 as noted previously and detailed below, or may show content stored on the HDD 400 or on an optical disk 440, such as a Blu-Ray disk.
Typically, the multiplex cinema will have an entrance area featuring a screen 1022 on which high-profile trailers and adverts may be shown to all visitors, together with poster adverts 1024, typically but not limited to featuring upcoming movies. Specific screens and the selection and display of the trailers and posters can each be restricted according to the age of the user, as registered with the PS3. This age restriction can be applied to any displayed content to which an age restriction tag is associated, in any of the zones within the Home environment.
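A minimal sketch of such an age-restriction check, assuming each item carries an "age_restriction" tag (an invented field name), is as follows.

```python
# Minimal sketch of the age-restriction filter: content carrying an age tag is
# only shown if the age registered with the PS3 meets it. Field names are illustrative.
def visible_content(items, registered_age):
    """Filter out any item whose 'age_restriction' tag exceeds the user's registered age."""
    return [i for i in items if i.get("age_restriction", 0) <= registered_age]

if __name__ == "__main__":
    posters = [{"title": "Family film", "age_restriction": 0},
               {"title": "Horror trailer", "age_restriction": 18}]
    print(visible_content(posters, registered_age=15))   # only the family film is shown
```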
In addition, in an embodiment of the present invention the multiplex cinema provides a number of screen rooms in which featured content is available, and amongst which the user can select. Within a screen room downloaded, streamed or locally stored media can be played within a virtual cinema environment, in which the screen is set in a room with rows of seats, screen curtains, etc. The cinema is potentially available to all users in the Home environment, and so the avatars of other users may also be visible, for example watching commonly streamed material such as a web broadcast. Alternatively, the user can zoom in so that the screen occupies the full viewing area.
Referring now also to Figure 6d, another type of zone is a developer or publisher zone 1030. Typically, there may be a plurality of such zones available. Optionally, each may have its own exit from the lobby area 1010, or alternatively some or all may share an exit from the lobby and then have separate exits from within a tunnel or ante-room model common to or replicated by each available zone therein. Alternatively they may be selected from a menu, either in the form of a pop-up menu, or from within the Home environment, such as by selecting from a set of signposts. In these latter cases the connecting tunnel or anteroom will appear to link only to the selected developer or publisher zone 1030. Alternatively or in addition, such zones may be selected via the map screen, resulting in the zone being loaded in to memory, and the avatar re-spawning within the selected zone.
Developer or publisher zones 1030 provide additional virtual environments, which may reflect the look and feel of the developer or publisher's products, brands and marks.
The developer or publisher zones 1030 are supplementary software modules to the Home environment and typically comprise additional 3D models and textures to provide the structure and appearance of the zone.
In addition, the software operable to implement the Home environment supports the integration of third party software via an application program interface (API). Therefore, developers can integrate their own functional content within the Home environment of their own zone. This may take the form of any or all of:
i. Downloading / streaming of specific content, such as game trailers or celebrity endorsements;
ii. Changes in avatar appearance, behaviour and/or communication options within the zone;
iii. The provision of one or more games, such as basketball 1032 or a golf range 1034, optionally branded or graphically reminiscent of the developer's or publisher's games;
iv. One or more interactive scenes or vignettes representative of the developer's or publisher's games, enabling the player to experience an aspect of the game, hone a specific skill of the game, or familiarise themselves with the controls of a game;
v. An arena, ring, dojo, court or similar area 1036 in which remotely played games may be represented live by avatars 1038, for spectators to watch.
Thus, for example, a developer's zone resembles a concourse in the developer's signature colours and featuring their logos, onto which gaming areas open, such as soccer nets or a skeet range for shooting. In addition, a booth (not shown) manned by game-specific characters allows the user's avatar to enter and either temporarily change into the lead character of the game, or zoom into a first person perspective, and enter a further room resembling a scene from the featured game. Here the user interacts with other characters from the game, and plays out a key scene. Returning to the concourse, adverts for the game and other content are displayed on the walls. At the end of the zone, the concourse opens up into an arena where a 5-a-side football match is being played, where the positions of the players and the ball correspond to a game currently being played by a popular group, such as a high-ranking game clan, in another country.
In embodiments of the present invention, developer / publisher zones are available to download. Alternatively or in addition, to reduce bandwidth they may be supplied as demo content on magazine disks, or may be installed/upgraded from disk as part of the installation process for a purchased game of the developer or publisher. In the latter two examples, subsequent purchase or registration of the game may result in further zone content being unlocked or downloaded. In any event, further modifications, and timely advert and trailer media, may be downloaded as required.
A similar zone is the commercial zone 1040. Again, there may be a plurality of such commercial zones accessible in similar manner to the developer and publisher zones. Like developer / publisher zones 1030, Commercial zones 1040 may comprise representative virtual assets of one or more commercial vendors in the form of 3D models, textures etc., enabling a rendering of their real-world shops, brands and identities, and these may be geographically and/or thematically grouped within zones.
Space within commercial zones may be rented as so-called virtual real-estate by third parties. For example, a retailer may pay to have a rendering of their shop included within a commercial zone 1040 as part of a periodic update of the Home environment supplied via the Home environment server 2010, for example on a monthly or annual renewal basis. A retailer may additionally pay for the commerce facilities described above, either on a periodic basis or per item. In this way they can provide users of the Home environment with a commercial presence.
Again, the commercial zone comprises supplementary software that can integrate with the home environment via an API, to provide additional communication options (shop-specific names, goods, transaction options etc), and additional functionality, such as accessing an online database of goods and services for purchase, determining current prices, the availability of goods, and delivery options. Such functions may be accessed either via a menu (either as a pop-up or within the Home environment, for example on a wall) or via communication with automated avatars. Communication between avatars is described in more detail later.
It will be appreciated that developers and publishers can also provide stores within commercial zones, and in addition that connecting tunnels between developer / publisher and commercial zones may be provided. For example, a tunnel may link a developer zone to a store that sells the developer's games. Such a tunnel may be of a 'many to one' variety, such that exits from several zones emerge from the same tunnel in-store. In this case, if re-used, typically the tunnel would be arranged to return the user to the previous zone rather than one of the possible others.
In an embodiment of the present invention, the software implementing the Home environment has access to an online-content purchase system provided by the PS3 OS.
Developers, publishers and store owners can use this system via an interface to specify the IP address and query text that facilitates their own on-line transaction. Alternatively, the user can allow their PS3 registration details and credit card details to be used directly, such that by selecting a suitably enabled object, game, advert, trailer or movie anywhere within the Home environment, they can select to purchase that item or service. In particular, the Home environment server 2010 can store and optionally validate the user's credit card and other details so that the details are ready to be used in a transaction without the user having to enter them. In this way the Home environment acts as an intermediary in the transaction.
Alternatively such details can be stored at the PS3 and validated either by the PS3 or by the Home environment server.
Thus, referring now also to Figure 7, in an embodiment of the present invention a method of sale comprises in a step s2102 a user selecting an item (goods or a service) within the Home environment. In step s2104, the PS3 10 transmits identification data corresponding with the object to the Home environment server 2010, which in step s2106 verifies the item's availability from a preferred provider (preferably within the country corresponding to the IP address of the user). If the item is unavailable then in step s2107 it informs the user by transmitting a message to the user's PS3 10. Alternatively, it first checks for availability from one or more secondary providers, and optionally confirms whether supply from one of these providers is acceptable to the user. In step s2108, the Home environment server retrieves from data storage the user's registered payment details and validates them. If there is no valid payment method available, then the Home environment may request that the user enters new details via a secure (i.e. encrypted) connection. Once a valid payment method is available, then in step s2110 the Home environment server requests from the appropriate third party payment provider a transfer of payment from the user's account. Finally, in s2112 the Home environment server places an order for the item with the preferred provider, giving the user's delivery address or IP address as applicable, and transferring appropriate payment to the preferred provider's account.
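For illustration, the sequence of steps s2102 to s2112 could be mirrored in code along the following lines; every external interaction (availability check, payment, order placement) is replaced by a placeholder function and the data shapes are assumptions.

```python
# Sketch of the transaction flow of Figure 7 (steps s2102 to s2112). All helpers
# are placeholders; this only mirrors the sequence of decisions described above.
def check_availability(item, providers):
    """s2106: return the first provider that can supply the item, else None."""
    return next((p for p in providers if item in p["stock"]), None)

def take_payment(user, item):
    print(f"charging {user['name']} for {item}")

def place_order(provider, item, address):
    print(f"ordering {item} from {provider['name']} for delivery to {address}")

def process_order(user, item, providers):
    provider = check_availability(item, providers)            # s2106
    if provider is None:
        return "item unavailable: user informed"              # s2107
    if not user.get("payment_valid"):
        return "new payment details requested over a secure connection"   # part of s2108
    take_payment(user, item)                                   # s2110: via third-party payment provider
    place_order(provider, item, user["delivery_address"])      # s2112: order placed, payment forwarded
    return "order placed"

if __name__ == "__main__":
    providers = [{"name": "preferred-shop", "stock": {"game-disc"}}]
    user = {"name": "avatar42", "payment_valid": True, "delivery_address": "1 Example Street"}
    print(process_order(user, "game-disc", providers))         # s2102/s2104 are the selection and upload
```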
In this way, commerce is not limited specifically to shops. Similarly, it is not necessary for shops to provide their own commerce applications if the preferred provider for goods or services when displayed within a shop is set to be that shop's owner. Where the goods or service may be digitally provided, then optionally it is downloaded from the preferred provider directly or via a Home environment server 2010.
In addition to the above public zones, there are additional zones that are private to the individual user and may only be accessed by them or by invitation from them. These zones also have exits from the communal lobby area, but when entered by the avatar (or chosen via the map screen), load a respective version of the zone that is private to that user.
Referring to Figure 8a, the first of these zones is an apartment zone 1050. In an embodiment of the present invention, this is a user-customisable zone in which such features 1052 as wallpaper, flooring, pictures, furniture, outside scenery and lighting may be selected and positioned. Some of the furniture is functional furniture 1054, linked to PS3 functionality.
For example, a television may be placed in the apartment 1050 on which can be viewed one of several streamed video broadcasts, or media stored on the PS3 HDD 400 or optical disk 440. Similarly, a radio or hi-fi may be selected that contains pre-selected links to internet radio streams. In addition, user artwork or photos may be imported into the room in the form of wall hangings and pictures.
Optionally, the user (represented in Figure 8a by their avatar 1056) may purchase a larger apartment, and/or additional goods such as a larger TV, a pool table, or automated non-player avatars. Other possible items include a gym, swimming pool, or disco area. In these latter cases, additional control software or configuration libraries providing the additional functionality will integrate with the Home environment via the API in a similar fashion to that described previously for the commercial and developer / publisher zones 1030, 1040.
Such purchases may be made using credit card details registered with the Home environment server. In return for a payment, the server downloads an authorisation key to unlock the relevant item for use within the user's apartment. Alternatively, the 3D model, textures and any software associated with an item may also be downloaded from the Home environment server or an authorised third-party server, optionally again associated with an authorisation key. The key may, for example, require correspondence with a firmware digital serial number of the PS3 10, thereby preventing unauthorised distribution.
A user's apartment can only be accessed by others upon invitation from the respective user. This invitation can take the form of a standing invitation for particular friends from within a friends list, or of a single-session pass conferred on another user and only valid whilst that user remains in the current Home environment session. Such invitations may take the form of an association maintained by a Home environment server 2010, or a digital key supplied between PS3 devices on a peer-to-peer basis that enables confirmation of status as an invitee.
In an embodiment of the present invention, invited users can only enter the apartment when the apartment's user is present within the apartment, and are automatically returned to the lobby if the apartment's user leaves. Whilst within the apartment, all communication between the parties present (both user data and positional data) is purely peer-to-peer.
The apartment thus also provides a user with the opportunity to share home created content such as artwork, slideshows, audio or video with invited guests, and also to interact with friends without potential interference from other users within the public zones.
When invited guests enter a user's apartment, the configuration of the room and the furnishings within it are transmitted in a peer-to-peer fashion between the attendees using ID codes for each object and positional data. Where a room or item is not held in common between the user and a guest, the model, textures and any code required to implement it on the guest's PS3 may also be transmitted, together with a single-use key or similar constraint, such as use only whilst in the user's apartment and whilst the user and guest remain online in this session.
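As an illustrative sketch only (the field names, ID scheme and transport are assumptions, not part of the present description), the peer-to-peer apartment configuration described above might be represented as follows:

#include <cstdint>
#include <vector>

// Each furnishing is referenced by an ID code plus positional data, so guests
// who already hold the corresponding asset need receive no model or texture data.
struct PlacedObject {
    uint32_t objectId;          // ID code of the room item or furnishing
    float    x, y, z;           // position within the apartment
    float    yaw;               // orientation
    bool     requiresTransfer;  // true if the guest lacks the model / textures
};

struct ApartmentState {
    uint32_t roomId;                    // apartment / room variant
    std::vector<PlacedObject> objects;  // furnishings and their placement
};

// Illustrative peer-to-peer send; the actual transport is not specified here.
void sendApartmentState(const ApartmentState& state) {
    for (const auto& obj : state.objects) {
        if (obj.requiresTransfer) {
            // Transmit the model, textures and code together with a single-use key,
            // valid only whilst the host and guest remain online in this session.
        }
        // Otherwise only the ID code and positional data need be sent.
    }
}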
Referring to Figure 8b, a further private space that may similarly be accessed only by invitation is the user's Trophy Room 1060. The Trophy Room 1060 provides a space within which trophies 1062 earned during game play may be displayed.
For example, a third party game comprises seeking a magical crystal. If the player succeeds in finding the crystal, the third party game nominates this as a trophy for the Trophy Room 1060, and places a 3D model and texture representative of the crystal in a file area accessed by the Home environment software when loading the Trophy Room 1060. The software implementing the Home environment can then render the crystal as a trophy within the Trophy Room.
When parties are invited to view a user's trophy room, the models and textures required to temporarily view the trophies are sent from the user's PS3 to those of the other parties on a peer-to-peer basis. This may be done as a background activity following the initial invitation, in anticipation of entering the trophy room, or may occur when parties enter a connecting tunnel / anteroom or select the user's trophy room from the map screen.
Optionally, where another party also has that trophy, they will not download the corresponding trophy from the user they are visiting. Therefore, in an embodiment of the present invention, each trophy comprises an identifying code.
Alternatively or in addition, a trophy room may be shared between members of a group or so-called 'clan', such that a trophy won by any member of the clan is transmitted to other members of the clan on a peer-to-peer basis. Therefore all members of the clan will see a common set of trophies.
Alternatively or in addition, a user can have a standing invitation to all members of the Home environment, allowing anyone to visit their trophy room. As with the commercial and developer/publisher zones, a plurality of rooms is therefore possible, for example a private, a group-based and a public trophy room. This may be managed either by selection from a pop-up menu or signposts within the Home environment as described previously, or by identifying the relevant user by walking up to their avatar, and then selecting to enter their (public) trophy room upon using the trophy room exit from the lobby.
Alternatively or in addition, a public trophy room may be provided. This room may display the trophies of the person in the current instance of the Home environment who has the most trophies or a best overall score according to a trophy value scoring scheme.
Alternatively it may be an aggregate trophy room, showing the best, or a selection of, trophies from some or all of the users in that instance of the Home environment, together with the ID of the user. Thus, for example, a user could spot a trophy from a game they are having difficulty with, identify who in the Home environment won it, and then go and talk to them about how they won it. Alternatively, a public trophy room could contain the best trophies across a plurality of Home environments, identifying the best gamers within a geographical, age-specific or game-specific group, or even worldwide. Alternatively or in addition, a leader board of the best scoring gamers can be provided and updated live.
It will be appreciated that potentially a large number of additional third party zones may become available, each comprising additional 3D models, textures and control software.
As a result a significant amount of space on HDD 400 may become occupied by Home environment zones.
Consequently, in an embodiment of the present invention the number of third party zones currently associated with a user's Home environment can be limited. In a first instance, a maximum memory allocation can be used to prevent additional third party zones being added until an existing one is deleted. Alternatively or in addition, third party zones may be limited according to geographical relevance or user interests (declared on registration or subsequently via an interface with the Home environment server 2010), such that only third party zones relevant to the user by these criteria are downloaded. Under such a system, if a new third party zone becomes available, its relevance to the user is evaluated according to the above criteria, and if it is more relevant than at least one of those currently stored, it replaces the currently least relevant third party zone stored on the user's PS3.
Other criteria for relevance may include interests or installed zones of nominated friends, or the relevance of zones to games or other media that have been played on the user's PS3.
Further zones may be admitted according to whether the user explicitly installs them, either by download or by disk.
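A minimal C++ sketch of the relevance-based replacement scheme described above follows; the Zone fields, the relevance score and the helper considerNewZone are assumptions made purely for illustration rather than a definitive implementation.

#include <algorithm>
#include <cstddef>
#include <string>
#include <vector>

struct Zone {
    std::string name;
    float       relevance;   // higher = more relevant to this user (geography, interests, friends)
    std::size_t sizeBytes;   // space occupied on the HDD
};

// Install the candidate zone if the memory allocation allows it, or if it is
// more relevant than the least relevant zone currently stored (which it replaces).
bool considerNewZone(std::vector<Zone>& installed, const Zone& candidate,
                     std::size_t usedBytes, std::size_t maxAllocationBytes) {
    if (usedBytes + candidate.sizeBytes <= maxAllocationBytes) {
        installed.push_back(candidate);
        return true;
    }
    auto least = std::min_element(installed.begin(), installed.end(),
        [](const Zone& a, const Zone& b) { return a.relevance < b.relevance; });
    if (least != installed.end() && candidate.relevance > least->relevance) {
        *least = candidate;   // replace the currently least relevant stored zone
        return true;
    }
    return false;             // candidate is not relevant enough to be installed
}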
As noted above, within the Home environment users are represented by avatars. The software implementing the Home environment enables the customisation of a user's avatar from a selection of pre-set options in a similar manner to the customisation of the user's apartment. The user may select gender and skin tone, and customise the facial features and hair by combining available options for each. The user may also select from a wide range of clothing. To support this facility, a wide range of 3D models and textures for avatars are provided. In an embodiment of the present invention, users may import their own textures to display on their clothing. Typically, the parameters defining the appearance of each avatar only occupy around 40 bytes, enabling fast distribution via the Home server when joining a populated Home environment.
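The exact layout of those avatar parameters is not given in the present description; the following packed structure is therefore only a hypothetical example of how a roughly 40-byte appearance record built from preset option indices might look.

#include <cstdint>

#pragma pack(push, 1)
struct AvatarAppearance {
    uint8_t  gender;
    uint8_t  skinTone;
    uint8_t  faceShape;
    uint8_t  eyes, nose, mouth;
    uint8_t  hairStyle, hairColour;
    uint16_t clothingTop, clothingBottom, shoes, accessory;
    uint32_t customTextureId;   // optional user-imported clothing texture
    char     nickname[20];      // displayed in the bubble above the avatar
};
#pragma pack(pop)

static_assert(sizeof(AvatarAppearance) <= 40,
              "kept small for fast distribution via the Home server");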
Each avatar in the Home environment can be identified by the user's ID or nickname, displayed in a bubble above the avatar. To limit the proliferation of bubbles, these fade into view when the avatar is close enough that the text they contain could easily be read, or alternatively when the avatar is close enough to interact with and/or is close to the centre of the user's viewpoint.
The avatar is controlled by the user in a conventional third-person gaming manner (e.g. using the game controller 751), allowing them to walk around the Home environment.
Some avatar behaviour is contextual; thus for example the option to sit down will only be available when the avatar is close to a seat. Other avatar behaviour is available at all times, such as for example the expression of a selected emotion or gesture, or certain communication options. Avatar actions are determined by use of the game controller 751, either directly for actions such as movement, or by the selection of actions via a pop-up menu, summoned by pressing an appropriate key on the game controller 751.
Options available via such a menu include further modification of the avatar's appearance and clothing, and the selection of emotions, gestures and movements. For example, the user can select that their avatar smiles, waves and jumps up and down when the user sees someone they know in the Home environment.
Users can also communicate with each other via their avatars using text or speech.
To communicate by text, in an embodiment of the present invention, messages appear in pop-up bubbles above the relevant avatar, replacing their name bubble if necessary.
Referring now also to Figure 9, to generate a message the user can activate a pop-up menu 1070 in which a range of preset messages is provided. These may be complete messages, or alternatively or in addition may take the form of nested menus, the navigation of which generates a message by concatenating selected options.
Alternatively or in addition, a virtual keyboard may be displayed, allowing free generation of text by navigation with the game controller 751. If a real keyboard 753 is connected via Bluetooth, then text may be typed into a bubble directly.
In an embodiment of the present invention, the lobby also provides a chat channel hosted by the Home environment server, enabling conventional chat facilities.
To communicate by speech, a user must have a microphone, such as a Bluetooth headset 757, available. Then in an embodiment of the present invention, either by selection of a speech option by pressing a button on the game controller 751, or by use of a voice activity detector within the software implementing the Home environment, the user can speak within the Home environment. When speaking, a speech icon may appear above the head of the avatar for example to alert other users to adjust volume settings if necessary.
The speech is sampled by the user's PS3, encoded using a Code Excited Linear Prediction (CELP) codec (or other known VoIP applicable codec), and transmitted in a peer-to-peer fashion to the eight nearest avatars (optionally provided they are within a preset area within the virtual environment surrounding the user's avatar). Where more than eight other avatars are within the preset area, one or more of the PS3s that received the speech may forward it to other PS3s having respective user avatars within the area that did not receive the speech, in an ad-hoc manner. To co-ordinate this function, in an embodiment of the present invention the PS3 will transmit a speech flag to all PS3s whose avatars are within the preset area, enabling them to place a speech icon above the relevant (speaking) avatar's head (enabling their user to identify the speaker more easily) and also to notify the PS3s of a transmission. Each PS3 can determine from the relative positions of the avatars which ones will not receive the speech, and can elect to forward the speech to the PS3 of whichever avatar they are closest to within the virtual environment. Alternatively, the PS3s within the area can ping each other, and whichever PS3 has the lowest lag with a PS3 that has not received the speech can elect to forward it.
It will be appreciated that the limitation to eight is exemplary, and the actual number depends upon such factors as the speech compression ratio and the available bandwidth.
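A simplified C++ sketch of the "nearest avatars" selection described above is given below. It only covers choosing the direct recipients of the voice data; the Peer fields, the limit of eight and the forwarding bookkeeping are illustrative assumptions.

#include <algorithm>
#include <cstddef>
#include <vector>

struct Peer {
    int   id;
    float x, y, z;        // avatar position within the virtual environment
    bool  receivedSpeech; // set once a voice packet has reached this peer
};

float distanceSq(const Peer& a, const Peer& b) {
    float dx = a.x - b.x, dy = a.y - b.y, dz = a.z - b.z;
    return dx * dx + dy * dy + dz * dz;
}

// Select up to maxDirect (e.g. eight) nearest peers within the preset area for
// direct transmission; any remaining peers rely on ad-hoc forwarding by recipients.
std::vector<Peer*> selectDirectRecipients(std::vector<Peer>& peersInArea,
                                          const Peer& speaker,
                                          std::size_t maxDirect = 8) {
    std::vector<Peer*> recipients;
    for (auto& p : peersInArea) recipients.push_back(&p);
    std::sort(recipients.begin(), recipients.end(),
              [&](const Peer* a, const Peer* b) {
                  return distanceSq(*a, speaker) < distanceSq(*b, speaker);
              });
    if (recipients.size() > maxDirect) recipients.resize(maxDirect);
    for (Peer* p : recipients) p->receivedSpeech = true;
    return recipients;
}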
In an embodiment of the present invention, such speech can also be relayed to other networks, such as a mobile telephony network, upon specification of a mobile phone number.
This may be achieved either by routing the speech via the Home environment server to a gateway server of the mobile network, or by Bluetooth transmission to the user's own mobile phone. In this latter case, the mobile phone may require middleware (e.g. a Java applet) to interface with the PS3 and route the call.
Thus a user can contact a person on their phone from within the Home environment.
In a similar manner, the user can also send a text message to a person on their mobile phone.
In a similar manner to speech, in an embodiment of the present invention users whose PS3s are equipped with a video camera such as the Sony Eye Toy video camera can use a video chat mode, for example via a pop-up screen, or via a TV or similar device within the Home environment, such as a Sony Playstation Portable (PSP) held by the avatar. In this case video codecs are used in addition to or instead of the audio codecs.
Optionally, the avatars of users with whom the user has spoken recently can be highlighted, and those with whom the user has spoken most may be highlighted more prominently, for example by an icon next to their name, or a level of glow around their avatar.
Referring back to Figure 5, when a user selects to activate the Home environment on their PS3 10, the locally stored software generates the graphical representation of the Home environment, and connects to a Home environment server 2010 that assigns the user to one of a plurality of online Home environments 2021, 2022, 2023, 2024. Only four Home environments are shown for clarity.
It will be understood that potentially many tens of thousands of users may be online at any one time. Consequently to prevent overcrowding, the Home environment server 2010 will support a large plurality of separate online Home environments. Likewise, there may be many separate Home environment servers, for example in different countries.
Once assigned to a Home environment, a PS3 initially uploads information regarding the appearance of the avatar, and then in an ongoing fashion provides the Home environment server with positional data for its own avatar, and receives from the Home environment server the positional data of the other avatars within that online Home environment. In practice this positional update is periodic (for example every 2 seconds) to limit bandwidth, so other PS3s must interpolate movement. Such interpolation of character movement is well-known in on-line games. In addition, each update can provide a series of positions, improving the replication of movement (with some lag), or improving the extrapolation of current movement.
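By way of a hedged illustration of the interpolation mentioned above, the following sketch estimates a remote avatar's drawn position between two periodic updates; the function names, the 2 second default interval and the clamping choice are assumptions for this example only.

struct Vec3 { float x, y, z; };

Vec3 lerp(const Vec3& a, const Vec3& b, float t) {
    return { a.x + (b.x - a.x) * t,
             a.y + (b.y - a.y) * t,
             a.z + (b.z - a.z) * t };
}

// Given the previous and latest received positions and the time elapsed since
// the latest update arrived, estimate where to draw the remote avatar.
Vec3 interpolatedPosition(const Vec3& previous, const Vec3& latest,
                          float secondsSinceLatest, float updateInterval = 2.0f) {
    float t = secondsSinceLatest / updateInterval;
    if (t > 1.0f) t = 1.0f;   // clamp; extrapolation of current movement handled separately
    return lerp(previous, latest, t);
}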
In addition, the IP addresses of the other PS3s 2031, 2032, 2033 within that Home environment 2024 are shared so that they can transmit other data such as speech in a peer-to-peer fashion between themselves, thereby reducing the required bandwidth of data handled by the Home environment server.
To prevent overcrowding within the Home environments, each will support a maximum of, for example, 64 users.
The selection of a Home environment to which a user will be connected can take account of a number of factors, either supplied by the PS3 and/or known to the Home environment server via a registration process. These include but are not limited to:
i. The geographical location of the PS3;
ii. The user's preferred language;
iii. The user's age;
iv. Whether any users within the current user's friends list are in a particular Home environment already;
v. What game disk is currently within the user's PS3;
vi. What games have recently been played on the user's PS3.
Thus, for example, a Swiss teenager may be connected to a Home environment on a Swiss server, with a maximum user age of 16 and a predominant language of French. In another example, a user with a copy of 'Revolution' mounted in their PS3 may be connected to a Home environment where a predominant number of other users also currently have the same game mounted, thereby facilitating the organisation of multiplayer games. In this latter case, the PS3 10 detects the game loaded within the BD-Rom 430 and informs the Home environment server 2010. The server then chooses a Home environment accordingly.
In a further example, a user is connected to a Home environment in which three users identified on his friends list can be found. In this latter example, the friends list is a list of user names and optionally IP addresses that have been received from other users that the given user wishes to meet regularly. Where different groups of friends are located on different Home environment servers (e.g. where the current user is the only friend common to both sets) then the user may either be connected to the one with the most friends, or given the option to choose.
Conversely, a user may invite one or more friends to switch between Home environments and join them. In this case, the user can view their friends list via a pop-up menu or from within the Home environment (for example via a screen on the wall or an information booth) and determine who is on-line. The user may then broadcast an invite to their friends, either using a peer-to-peer connection or, if the friend is within a Home environment or the IP address is unknown, via the Home environment server. The friend can then accept or decline the invitation to join.
To facilitate invitation, generally a Home environment server will assign fewer than the maximum supported number of users to a specific Home environment, thereby allowing such additional user-initiated assignments to occur. This so-called soft-limit may, for example, be 90% of capacity, and may be adaptive, for example changing in the early evening or at weekends when people are more likely to meet up with friends on-line.
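One possible, purely illustrative way of combining the selection criteria listed above with the soft-limit is sketched below; the UserProfile and HomeEnvironment fields, the weightings and the score function are assumptions and not part of the present description.

#include <string>
#include <vector>

struct UserProfile {
    std::string country, language, mountedGame;
    int age;
    std::vector<std::string> friends;
};

struct HomeEnvironment {
    std::string country, language, predominantGame;
    int maxAge;
    int occupancy, softLimit;          // soft limit below full capacity (e.g. 90%)
    std::vector<std::string> users;
};

// Higher scores indicate a better match; a negative score marks an ineligible environment.
int score(const UserProfile& u, const HomeEnvironment& e) {
    if (u.age > e.maxAge || e.occupancy >= e.softLimit) return -1;
    int s = 0;
    if (e.country == u.country)   s += 2;
    if (e.language == u.language) s += 2;
    if (!u.mountedGame.empty() && e.predominantGame == u.mountedGame) s += 3;
    for (const auto& f : u.friends)
        for (const auto& other : e.users)
            if (f == other) s += 4;    // friends already present weigh heavily
    return s;
}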
Where several friends are within the same Home environment, in an embodiment of the present invention the map screen may also highlight those zones in which the friends can currently be found, either by displaying their name on the map or in association with the zone name on the side bar.
Referring now also to Figure 10, in addition, preferences, settings, functions of the Home environment and optionally other functionality may be viewed, adjusted or accessed as appropriate by use of a virtual Sony Playstation Portable (PSP) entertainment device 1072 that can be summoned by use of the game controller 751 to pop up on screen. The user can then access these options, settings and functionality via a PSP cross-media bar 1074 displayed on the virtual PSP. As noted above, the PSP could also be used as an interface for video chat.
When a user wishes to leave the Home environment, in embodiments of the present invention they may do so by selection of an appropriate key on the game controller 751, by selection of an exit option from a pop-up menu, by selection of an exit from within the map screen, by selection of an option via their virtual PSP or by walking through a master exit within the lobby zone.
Typically, exiting the Home environment will cause the PS3 10 to return to the PS3 cross media bar.
Finally, it will be appreciated that additional, separate environments based upon the Home environment software and separately accessible from the PS3 cross-media bar are envisaged. For example, a supermarket may provide a free disk upon which a Supermarket environment, supported in similar fashion by the Home environment servers, is provided.
Upon selection, the user's avatar can browse displayed goods within a virtual rendition of the supermarket (either as 3D models or textures applied to shelves) and click on them to purchase as described above. In this way retailers can provide and update online shopping facilities for their own user base.
The integration of third party software into the Home environment developer or publisher zones 1030 will now be described; in particular, the operation of the arena area 1036 shown in Figure 6d will be described in more detail.
As described above, the arena area 1036 may be a ring, dojo, court or the like in which remotely played games may be represented live by avatars 1038, for spectators to watch.
Alternatively, spectators may watch game action of a remotely played game at a particular time as determined by a user of the entertainment device 10 implementing the Home environment or in accordance with other criteria.
Figure 11 shows a plurality of entertainment devices according to an embodiment of the present invention that are operable to communicate with a network server 3400. The network server 3400 comprises a processor 3410 and a memory 3420. The memory 3420 may comprise any or all of: a hard disk drive; RAM; and PROM. It will be appreciated that the memory 3420 can be any memory device suitable for use in a network server. Optionally, the network server 3400 may comprise the Home environment server 2010 or the Home environment server 2010 may comprise the network server 3400.
The network server 3400 is operable to communicate with the entertainment device 3100, the entertainment device 3200 and the entertainment device 3300 each using a respective data communication link (3520, 3530, and 3540 respectively). Typically the entertainment devices 3100, 3200 and 3300 are PS3 devices. Each entertainment device comprises a memory (3130, 3230 and 3330 respectively) that is operable to store data that is transmitted and/or received to and/or from the network server 3400 using at least one of the data communication links. This will be described in more detail later.
Additionally, each entertainment device 3100, 3200 and 3300 comprises a processor (3120, 3220 and 3320 respectively) that is operable to execute game program code (Game code 3122, 3222 and 3322 respectively), and implement and execute virtual environment program code (VE code 3124, 3224 and 3324 respectively) that comprises software operable to implement the Home environment on the PS3.
The operation of the client server arrangement shown in Figure 11 will now be described.
In the embodiment shown in Figure 11, a player controlling the actions of a game character (Player A) within, for example, a fighting game or third party game executed by the game code 3122 of the entertainment device 3100 is playing against a second player who is controlling the actions of a different game character (Player B) within that same fighting game executed by the game code 3222 of the entertainment device 3200. In order to enable the game characters of the two players to interact within the game environment, the entertainment devices 3100 and 3200 communicate via the communications link 3510 using a peer-to-peer protocol to exchange game action data. Typically, the game action data comprises graphics data indicative of graphical content of the game action data. For example, entertainment device 3100 generates game action data and transmits it via the data communications link 3510 to entertainment device 3200 which receives the game action data.
Alternatively, entertainment device 3200 transmits and entertainment device 3100 receives the game action data.
Here the game code 3122 executed on entertainment device 3100 is the same as the game code 3222 being executed on entertainment device 3200. However, it will be appreciated that the game code need not necessarily be the same. For example, the game code could be associated with the same game having the same game character actions and scenery, but the versions of the game could be different as long as the game play is substantially the same between the two and the versions share a common games engine. Here, the term 'games engine' is taken to mean the program code necessary to allow the processor 3120 or processor 3220 to control how game characters move, to generate audio material, to render the game action within the game and the like.
The processor 3120 and/or the processor 3220 are operable to generate the game action data from the game action of the game. Game action may relate to any or all of: game action performed by a game character or game characters within the game; game action relating to the position and movement of game objects within the game; and interaction of the game character or game characters with game objects within the game.
Additionally, the game action data may comprise event synchronisation data that is synchronised by the processor 3120 and/or the processor 3220 with the game action from which the game action data is generated. The event synchronisation data is used by the entertainment devices 3100 and 3200 to synchronise the game action data such that, for example, when the game character of the player using the entertainment device 3100 hits the game character of the player using the entertainment device 3200, the game action on each device is displayed to the players substantially simultaneously.
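A minimal sketch of a game action packet carrying both graphics data and event synchronisation data, as described above, is given below; the structure, field names and tick-based scheme are assumptions for illustration and not a definitive wire format.

#include <cstdint>
#include <vector>

struct GameActionPacket {
    uint32_t playerId;              // which game character the action belongs to
    uint64_t eventTick;             // event synchronisation data (shared game clock)
    std::vector<uint8_t> graphics;  // graphics data indicative of the game action
};

// Receiver side: hold a packet until the local game clock reaches its event tick,
// so that, for example, a hit is displayed on both devices substantially simultaneously.
bool readyToPresent(const GameActionPacket& pkt, uint64_t localTick) {
    return localTick >= pkt.eventTick;
}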
In the embodiment shown in Figure 11, the software implementing the Home environment is executed by the processor 3320 of the entertainment device 3300. In order that the game action of the game may be rendered within the Home environment of the entertainment device 3300, the entertainment devices 3100 and 3200 transmit the game action data generated during the playing of the game to the network server 3400 via the data communication links 3520 and 3530. For example, the game action data is transmitted to the entertainment device 3300 using a client server protocol such as the transport control protocol (TCP) or the user datagram protocol (UDP). The network server 3400 is operable to receive the game action data that is transmitted from the entertainment devices using a communications interface operable to receive data from the data communication links 3520 and 3530. The network server then transmits the received game action data to entertainment device 3300 using a data transmission interface operable to transmit data to the entertainment device 3300 via communications link 3540.
In an embodiment of the present invention, the network server is operable to stream the game action data to the entertainment device 3300 substantially in real time such that the game action of the game associated with the entertainment devices 3100 and 3200 is rendered at substantially the same time within the Home environment associated with the entertainment device 3300. Alternatively, the network server can store the received game action data in the memory 3420. Then, at a particular time, the server transmits the game action data to the entertainment device 3300 and the entertainment device 3300 can render the game action associated with the game action data as game action within the Home environment. For example, the network server could store game action data that relates to a particularly exciting game that previously took place between players using the entertainment devices 3100 and 3200.
At a predetermined time, the network server then transmits that game action data to the entertainment device 3300. Alternatively, the user of entertainment device 3300 can choose when they would like to watch that game action. Optionally, the network server 3400 only transmits the game action data to the entertainment device 3300 if the user of the entertainment device 3300 has paid to download that content from the server. In the embodiment shown in Figure 11, the processor 3410 of the network server 3400 determines whether the user has paid to view that game action. Alternatively, this function may be performed by the processor 3320 of the entertainment device 3300. Online download transactions are well known in the art and will not be described in further detail here.
In an embodiment of the present invention, the entertainment devices 3100 and 3200 may only transmit game action data to the server 3400 that relates to game characters associated with each respective entertainment device. For example, entertainment device 3100 may only transmit game action data relating to Player A and entertainment device 3200 may only transmit game action data relating to Player B. The processor 3410 of the network server 3400 then synchronises the game action data received from the entertainment devices based on synchronisation data supplied by the respective entertainment device and associated with the game action data. The server 3400 can then transmit the synchronised game action data to the entertainment device 3300 such that the synchronised game action data can be rendered as game action within the Home environment of the entertainment device 3300.
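The server-side merge described above might, purely as an illustrative sketch, interleave the two per-player streams into a single time-ordered sequence before forwarding it for rendering; the ActionRecord fields and the mergeStreams helper are assumptions for this example.

#include <algorithm>
#include <cstdint>
#include <vector>

struct ActionRecord {
    uint32_t sourceDevice;   // e.g. 3100 (Player A) or 3200 (Player B)
    uint64_t syncTick;       // synchronisation data supplied by the originating device
    std::vector<uint8_t> payload;
};

std::vector<ActionRecord> mergeStreams(std::vector<ActionRecord> fromA,
                                       std::vector<ActionRecord> fromB) {
    std::vector<ActionRecord> merged;
    merged.reserve(fromA.size() + fromB.size());
    merged.insert(merged.end(), fromA.begin(), fromA.end());
    merged.insert(merged.end(), fromB.begin(), fromB.end());
    std::stable_sort(merged.begin(), merged.end(),
                     [](const ActionRecord& a, const ActionRecord& b) {
                         return a.syncTick < b.syncTick;
                     });
    return merged;   // synchronised game action data, transmitted for rendering
}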
Figure 12 shows an embodiment of the invention in which the game action data is transmitted from the network server 3400 to the entertainment device 3200 via a communications link 4100. For example, Player A using entertainment device 3100 could play Player B (using entertainment device 3200) at a fighting game and be ignominiously defeated. The entertainment devices 3100 and 3200 generate and transmit the relevant game action data to the network server 3400 which stores the received game action data in the memory 3420. At a later time Player B may wish to visit the developer zone 1030 of the Home environment associated with VE code 3224 executing on the entertainment device 3200. The RSX 200 may then render the game action using the entertainment device 3200 in dependence upon the game action data transmitted from the server 3400 via the communications link 4100 so that Player B may view the game action of their glorious victory over Player A within the arena area 1036 of the developer zone 1030 whenever they so desire. Furthermore, avatars of other users may view Player B's glorious victory within developer zone 1030 of the Home environment.
Figure 13 shows a further embodiment of the present invention. Here, the entertainment devices 3100, 3200 and 3300 communicate using a peer-to-peer communications protocol via the data communication links 3510, 5100 and 5200. As described above, Player A may, for example, play against Player B using the entertainment devices 3100 and 3200 respectively.
In this embodiment, the game action data is transmitted from entertainment devices 3100 and 3200 via the data communication links 5100 and 5200 and is received by the entertainment device 3300. The processor 3320 of the entertainment device 3300 then generates the game action from the received game action data. As described above, in order that the game action may be generated correctly, the game action data transmitted from the entertainment device 3100 and the entertainment device 3200 comprises synchronisation data. The game action may then be generated by the processor 3320 and rendered within the Home environment associated with the entertainment device 3300 using the VE code 3324 in dependence upon the synchronisation data.
Alternatively, the entertainment device 3100 or entertainment device 3200 generates the game action data based on the synchronisation data transmitted via the data communications link 3510 during the playing of the game. The entertainment device 3100 (3200) can then transmit the game action data via the data communications link 5100 (5200) to the entertainment device 3300 such that the entertainment device 3300 can render the game action within the Home environment substantially in real time as described above.
Optionally, the entertainment devices 3100 and 3200 may store the game action data in their respective memories 3130 and 3230 and transmit the game action data to the entertainment device 3300 at a time determined by the user of the entertainment device 3300 or a time specified by some other user.
As described above, the game action data comprises graphics data indicative of the graphical content of the game action data. In an embodiment of the present invention, the graphical content of the game action data comprises a two-dimensional representation of the game action. The two-dimensional representation of the game action can be generated by the processor 3120, 3220 or 3320 of one of the entertainment devices 3100, 3200, or 3300 respectively. Typically, the processor generates the two-dimensional representation of the game action using a predetermined projection of the game action with respect to a virtual camera position within the game. This could be, for example, a virtual view that a virtual spectator within the third party game has of the game action.
Alternatively, the graphical content of the game action data comprises a three-dimensional representation of the game action generated by the processor (3120, 3220 or 3320). Here, the three dimensional graphics data may comprise any or all of: mesh geometry data that relates to the shape of virtual objects within the game; transform hierarchy data that relates to virtual spatial transformations of virtual objects within the game; shading data that relates to the virtual shading of virtual objects within the game; texture data that relates to a virtual texture of virtual objects within the game; physics data that relates to the interaction of virtual objects within the game; and animation data that relates to the evolution of attributes of virtual objects within the game with respect to time. It will be appreciated that the processors 3120 and 3220 could collaborate via the data communication link 3510 to generate the graphical content of the game action data or that a plurality of entertainment devices could collaborate via a data communications link to generate the game action data.
The file format of the game action data will now be described with reference to Figure 14. Figure 14 shows a schematic representation of a file format of an embodiment of the present invention that is used to transfer 3D graphics information between a third party game and the developer zone 1030 of the Home environment. The 3D graphics file 6100 comprises a header 6110 and 3D graphics data 6120 that relate to the game action of the third party game.
The header 6110 contains data describing the contents of the file and the format used to create the 3D graphics. For example, third party games could generate 3D graphics using OpenGL or Direct3D formats. Alternatively, an interchange file format for interactive 3D applications (e.g. the COLLADA format) can be used by the third party game to generate 3D graphics which can be rendered directly in the Home environment by the RSX 200 of the PS3 device.
The 3D graphics data 6120 of Figure 14 comprises data defining how to draw and render the 3D graphics that relate to game action of the game. The 3D graphics data includes data relating to mesh geometry (vertices and polygon formation etc.), transform hierarchy (rotation, translation, shear and scaling etc.), effects, shading, textures, physics (e.g. rigid bodies, constraints, rag doll models, and collision volumes), and animation data.
As described above, game action data is generated in third party games during the playing of that game, either over a communications network as described above or on a single entertainment device. Accordingly, the third party game generates the 3D graphics file 6100 so that it can be transmitted to another entertainment device. Here, the header data 6110 may include metadata about the game action, the date and time the game action took place and similar attributes. Additionally, the header 6110 may comprise the synchronisation data used to synchronise the game action data with the game action.
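An in-memory view of the file layout of Figure 14 might, as a hedged sketch only, look as follows; the type names and the choice of byte buffers per category are assumptions, and the enumerated formats simply reflect the examples given above.

#include <cstdint>
#include <string>
#include <vector>

enum class GraphicsFormat : uint8_t { OpenGL, Direct3D, COLLADA };

struct GraphicsFileHeader {                  // corresponds to header 6110
    GraphicsFormat format;                   // format used to create the 3D graphics
    std::string    gameTitle;                // metadata about the game action
    uint64_t       timestamp;                // date and time the game action took place
    uint64_t       syncTick;                 // optional synchronisation data
};

struct GraphicsData {                        // corresponds to 3D graphics data 6120
    std::vector<uint8_t> meshGeometry;       // vertices, polygon formation
    std::vector<uint8_t> transformHierarchy; // rotation, translation, shear, scaling
    std::vector<uint8_t> shading;
    std::vector<uint8_t> textures;
    std::vector<uint8_t> physics;            // rigid bodies, constraints, rag doll models
    std::vector<uint8_t> animation;
};

struct GameActionGraphicsFile {              // corresponds to file 6100
    GraphicsFileHeader header;
    GraphicsData       data;
};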
The rendering of the game action within the developer zone 1030 will now be described in more detail.
In the embodiments described above, the game action data comprises graphics data indicative of the graphical content of the game action data. In the case where the graphical content of the game action data is a two dimensional representation of the game action, the RSX 200 simply renders the game action on a virtual screen within the developer zone 1030.
Alternatively, when the graphical content of the game action data is a three dimensional representation of the game action of the third party game, the RSX 200 is operable to render the game action as a virtual three dimensional representation of the game action within the arena area 1036. Additionally, the RSX 200 is operable to render appearance attributes of game characters within the third party game onto avatars 1038 within the developer zone 1030 of the Home environment.
It will be appreciated that the arena area of the developer zone may be of a finite virtual size. Therefore, the processor 100 is operable to cooperate with the RSX 200 such that the game action can be scaled to fit within the arena area 1036. Typically, this is accomplished by applying a geometric scaling transformation to the three dimensional graphics data in accordance with techniques well known in the art.
Alternatively, the processor 100 cooperates with the RSX 200 to determine the virtual physical extent of the game action generated by the third party game. The processor then applies a scaling transformation to the arena area such that the game action will not substantially exceed that virtual area. To achieve this, the processor buffers the game action and reviews the game action to determine the virtual physical extent of the game action. An appropriate scaling transformation is then applied to the arena area 1036 in accordance with techniques well known in the art.
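As a minimal illustration of the scaling described above, the following sketch derives a single uniform scale factor from the virtual physical extent of the (buffered) game action and the extent of the arena area 1036; the Extent type, the fitScale helper and the choice of a uniform, shrink-only factor are assumptions made for this example.

#include <algorithm>

struct Extent { float width, height, depth; };

// Returns a uniform scale factor to apply to the three dimensional graphics data
// (or, in the alternative described above, to the arena area itself).
float fitScale(const Extent& gameAction, const Extent& arena) {
    float sx = arena.width  / gameAction.width;
    float sy = arena.height / gameAction.height;
    float sz = arena.depth  / gameAction.depth;
    float s  = std::min({sx, sy, sz});
    return std::min(s, 1.0f);   // illustrative choice: only shrink, never enlarge
}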
Additionally, the game action data may comprise audio material generated during the playing of the third party game. The entertainment apparatus can then reproduce the audio material when the game action is rendered within the Home environment.
In the case where the game action is rendered at a particular time (rather than viewed live as the game takes place between other players), in order to reduce the amount of data that has to be transferred over the data communications links when rendering the game action within the developer zone 1030, different levels of graphical detail can be used to render the game action generated by the third party game. According to an embodiment of the present invention, the different levels of detail can be rendered by varying the number of polygons that create the 3D mesh of the game action, as shown in Figures 15a and 15b.
Figure 15a shows a collection of polygons that may be used to form part of the game action to be rendered. Here, textures are mapped onto the polygons so as to create the 3D image to be displayed. The polygons shown in Figure 15a represent a higher level of detail than that shown in Figure 15b. For example, the level of detail shown in Figure 15a is used when the avatar is within a predetermined distance of the arena area 1036 within the developer zone 1030. When the processor 100 determines that the avatar is further away from the game action than the predetermined distance, the game action need not be rendered in as much detail as that shown in Figure 15a and instead the level of detail shown in Figure 15b, which uses fewer polygons to render the image, is used. Therefore, the image will require fewer pixels to be calculated during the rendering process. The polygons that are used for the lower resolution image shown in Figure 15b have substantially the same shape and vertex points as the higher detail image of Figure 15a.
The use of different levels of detail as shown in Figures 15a and 15b can advantageously be used to reduce the amount of data that needs to be transferred on a peer-to-peer basis between users or server-client basis when a user enters the developer zone 1030.
On entry into the developer zone 1030, a user's avatar is likely to be some distance from the game action being rendered within the developer zone 1030. Therefore, the lower level of detail is initially used to render the game action so that the amount of data that needs to be transmitted is reduced. As the user's avatar moves around the developer zone 1030, data transfer of higher detail rendering models of the game action continues as a background process so that after some elapsed time, all the higher detail models will have been downloaded to that user's PS3 device and stored on the HDD 400. Then, instead of using the transmitted game action data to render the game action, the RSX may use the game action data stored on the HDD 400 to render the game action.
Alternatively, the higher level of detail is not downloaded until a user's avatar approaches within a first predetermined distance of the game action. Then, on approaching closer to the game action, when the avatar comes within a second predetermined distance of the game action the higher level of detail is used to render the game action. Therefore, if the position of the user's avatar does not come within the first predetermined distance, the higher detail image need never be downloaded and bandwidth costs are reduced.
Additionally, a lower resolution texture may also be mapped onto the polygons.
Therefore, bandwidth can be reduced either by reducing the number of polygons used to render the game action, by mapping a lower resolution texture onto the polygons, or by a combination of both. Optionally, the image may be rendered at the higher resolution regardless of the position of the avatar with respect to the game action.
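The distance-based level-of-detail selection described above can be sketched as follows; the LodPolicy thresholds (first and second predetermined distances), the enum and the chooseDetail helper are illustrative assumptions rather than a definitive implementation.

enum class DetailLevel { Low, High };

struct LodPolicy {
    float downloadDistance;   // first predetermined distance: begin fetching higher detail
    float renderDistance;     // second predetermined distance: render the higher detail
};

DetailLevel chooseDetail(float avatarDistance, const LodPolicy& policy,
                         bool highDetailAvailable, bool& startDownload) {
    // Trigger the background download only once the avatar is close enough
    // and the higher detail models are not yet stored locally.
    startDownload = (avatarDistance <= policy.downloadDistance) && !highDetailAvailable;
    if (avatarDistance <= policy.renderDistance && highDetailAvailable)
        return DetailLevel::High;
    return DetailLevel::Low;   // fewer polygons and lower resolution textures
}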
It will be appreciated that the above description need not apply only to the entertainment device, and that a method of causing the above described apparatus to operate as described above is envisaged as an embodiment of the present invention.
It will be appreciated that in embodiments of the present invention, elements of the above method may be implemented in the entertainment device in any suitable manner. Thus adapting existing parts of a conventional entertainment device may comprise for example reprogramming of one or more processors therein. As such the required adaptation may be implemented in the form of a computer program product comprising processor-implementable instructions stored on a data carrier such as a floppy disk, optical disk, hard disk, PROM, RAM, flash memory or any combination of these or other storage media, or transmitted via data signals on a network such as an Ethernet, a wireless network, the internet, or any combination of these or other networks.
It will be appreciated that the term 'entertainment device' can encompass various types of data processing apparatus, and in particular that the devices used to view the game action (as played by others) could be, for example, internet terminals or mobile telephones.

Claims (52)

1. An entertainment device operable to execute a game application program associated with a game, the device comprising: processing means operable to generate game action data from game action of the game; and transmitting means operable to transmit the game action data such that the game action can be rendered within a virtual environment by a second entertainment device in data communication with the entertainment device, the virtual environment being associated with the second entertainment device, in which: the virtual environment is associated with a virtual environment application program that is different from the game application program; and the game action data comprises graphics data indicative of graphical content of the game action data.
2. A device according to claim 1, in which: the game action data comprises event synchronisation data; and the processing means is operable to synchronise the event synchronisation data with the game action from which the game action data is generated.
3. A device according to claim 1 or claim 2, in which the transmitting means is operable to transmit the game action data to the second entertainment device using a peer-to-peer communication protocol.
4. A device according to claim 1 or claim 2, in which the transmitting means is operable to transmit the game action data to an entertainment network server operable to receive the game action data from the transmitting means and to transmit the game action data to the second entertainment device using a client server communication protocol.
5. A device according to any one of the preceding claims, in which the transmitting means is operable to transmit the game action data such that the second entertainment device can render the game action within the virtual environment substantially in real time.
6. A device according to any one of claims 1 to 4, comprising storage means operable to store the game action data, and in which the processing means is operable to selectively trigger the transmission of the stored game action data at a game action data transmission time.
7. A device according to claim 6, in which the game action data transmission time is determined in accordance with a start time defined by a user of the entertainment device or the second entertainment device.
8. A device according to any one of the preceding claims, in which the processing means is operable to selectively trigger the transmission of the game action data in dependence upon a payment status of a user of the second entertainment device.
9. A device according to any one of the preceding claims, in which the graphical content of the game action data comprises a two-dimensional representation of the game action.
10. A device according to claim 9, in which the processing means is operable to generate the two-dimensional representation of the game action using a predetermined projection of the game action with respect to a virtual camera position within the game.
11. A device according to any one of claims 1 to 8, in which the graphical content of the game action data is a three-dimensional representation of the game action.
12. A device according to claim 11, in which the processing means is operable to generate the three-dimensional representation of the game action, and the graphics data comprises any or all of: mesh geometry data that relates to the shape of virtual objects within the game; transform hierarchy data that relates to virtual spatial transformations of virtual objects within the game; shading data that relates to the virtual shading of virtual objects within the game; texture data that relates to a virtual texture of virtual objects within the game; physics data that relates to the interaction of virtual objects within the game; and animation data that relates to the evolution of attributes of virtual objects within the game with respect to time.
13. A device according to any one of the preceding claims, in which the game action data comprises audio material.
14. A device according to any one of the preceding claims, in which the game action comprises any or all of: game action performed by a game character or game characters within the game; game action relating to the position and movement of game objects within the game; and interaction of the game character or game characters with game objects within the game.
15. A device according to any one of the preceding claims, in which the transmitting means is operable to transmit the game application program to the second entertainment device, the game application program being associated with game engine program code and the game engine program code comprising game engine program instructions that relate to the rendering of the game action within the game associated with the entertainment device.
16. An entertainment device, comprising: receiving means operable to communicate with a second entertainment device using a data communications link and to receive, from the data communications link, game action data from the second entertainment device; processing means operable to generate game action of a game from the received game action data, the game being associated with the second entertainment device; and rendering means operable to render the game action within a virtual environment associated with the entertainment device in dependence upon the game action generated by the processing means, in which: the game is associated with a game application program and the virtual environment is associated with a virtual environment application program that is different from the game application program; and the game action data comprises graphics data indicative of graphical content of the game action data.
17. A device according to claim 16, in which: the receiving means is operable to communicate with at least a third entertainment device and receive, using the communications link, second game action data generated by the third entertainment device; the game is also associated with the third entertainment device; the game action data and the second game action data comprise synchronisation data; and the processing means is operable to synchronise the first game action data with the second game action data using the synchronisation data and generate the game action in dependence upon the received game action data and the received second game action data.
18. A device according to claim 16 or claim 17, in which the communication link is operable to communicate between the entertainment device and the second entertainment device using a peer-to-peer communication protocol.
19. A device according to claim 16 or claim 17, in which the communication link comprises an entertainment network server operable to receive the game action data from the second entertainment device, and the communication link is operable to communicate using a client server communication protocol.
20. A device according to any one of claims 16 to 19, in which the game action data is received from the communications link such that the game action can be rendered within the virtual environment by the rendering means substantially in real time.
21. A device according to any one of claims 16 to 19, comprising storage means operable to store the received game action data, and in which the rendering means is operable to render the game action within the virtual environment in accordance with the received game action data stored in the storage means.
22. A device according to any one of claims 16 to 21, in which the receiving means is operable to selectively receive the game action data in dependence upon a payment status of a user of the entertainment device.
23. A device according to any one of claims 16 to 22, in which the graphical content of the game action data comprises a two-dimensional representation of the game action.
24. A device according to claim 23, in which the rendering means is operable to render the two-dimensional representation of the game action on a virtual screen within the virtual environment.
25. A device according to any one of claims 16 to 22, in which the graphical content of the game action data comprises a three-dimensional representation of the game action.
26. A device according to claim 25, in which the rendering means is operable to render the three-dimensional representation of the game action within the virtual environment, and the graphics data comprises any or all of: mesh geometry data that relates to the shape of virtual objects within the game; transform hierarchy data that relates to virtual spatial transformations of virtual objects within the game; shading data that relates to the virtual shading of virtual objects within the game; texture data that relates to a virtual texture of virtual objects within the game; physics data that relates to the interaction of virtual objects within the game; and animation data that relates to the evolution of attributes of virtual objects within the game with respect to time.
27. A device according to claim 25 or 26, in which the rendering means is operable to render the game action within a predefined virtual game action display area within the virtual environment.
28. A device according to claim 27, in which the processing means is operable to cooperate with the rendering means so as to apply a scaling transformation to the graphics data such that the rendering means is operable to render the game action substantially within the virtual game action display area.
29. A device according to claim 27, in which the processing means is operable to cooperate with the rendering means so as to apply a scaling transformation to the virtual game action display area such that the rendering means is operable to render the game action substantially within the virtual game action display area.
30. A device according to any one of claims 16 to 29, comprising audio reproduction means operable to reproduce audio material, and in which: the game action data comprises audio material; and the audio reproduction means is operable to reproduce the audio material of the received game action data.
31. A device according to any one of claims 16 to 30, in which the rendering means is operable to render the game action within the virtual environment in dependence upon game engine program code associated with the game application program, the game engine program code comprising game engine program instructions that relate to the rendering of the game action within the game associated with the second entertainment device.
32. A device according to claim 31, in which the entertainment device comprises a removable storage medium reading means operable to read information stored on a removable storage medium, and in which the removable storage medium comprises the game engine program code.
33. A device according to claim 32, in which the removable storage medium comprises a demonstration version of the game.
34. A device according to claim 32, in which the removable storage medium comprises a full version of the game.
35. A device according to claim 31, in which the entertainment device receives, from the communication link, the game engine program code sent by the second entertainment device.
36. An entertainment network server, comprising: receiving means operable to receive game action data from one or more entertainment devices, the game action data being generated by the one or more entertainment devices in dependence upon game action of a game associated with the one or more of the entertainment devices; storage means operable to store the game action data received from the entertainment device; and transmitting means operable to transmit the stored game action data to at least one recipient entertainment device such that the game action data can be rendered as game action within a virtual environment associated with the recipient entertainment device, in which: the game is associated with a game application program and the virtual environment is associated with a virtual environment application program that is different from the game application program; and the game action data comprises graphics data indicative of graphical content of the game action data.
37. A server according to claim 36, comprising processing means operable to control the operation of any or all of: the receiving means; the storage means; the transmitting means.
38. A server according to claim 36 or claim 37, in which: the game action data comprises synchronisation data; the processing means is operable to synchronise the game action data received from a first entertainment device with game action data received from a second entertainment device so as to generate synchronised game action data; and the transmitting means is operable to transmit the synchronised game action data to the recipient entertainment device such that the synchronised game action data can be rendered as game action within the virtual environment associated with the recipient entertainment device.
39. A server according to claim 37 or claim 38, in which the processing means is operable to selectively trigger the transmission of the game action data at a game action data transmission time.
40. A server according to claim 39, in which the game action data transmission time is determined in accordance with a start time defined by a user.
41. A server according to any one of claims 37 to 40, in which the processing means is operable to selectively trigger the transmission of the game action data in dependence upon a payment status of a user of the recipient device.
42. A server according to any one of claims 37 to 40, in which the processing means is operable to control the receiving means and the transmitting means such that the game action data is streamed to the recipient device and the game action can be rendered within the virtual environment by the rendering means of the recipient device substantially in real time.
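Claims 39 to 41 describe selectively triggering transmission, for instance once a user-defined start time has been reached and only if the recipient user's payment status permits it. The gating check below is a hypothetical sketch; the parameter names and the 'paid' policy are assumptions.

```python
# Hypothetical gating logic for claims 39-41: trigger transmission of the
# stored game action data only after the scheduled start time and only for a
# recipient whose payment status allows it. Names and policy are assumptions.
import time
from typing import Optional

def should_transmit(start_time_epoch: float, payment_status: str,
                    now: Optional[float] = None) -> bool:
    now = time.time() if now is None else now
    return now >= start_time_epoch and payment_status == "paid"
```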
43. A data carrier comprising computer readable instructions that, when executed by a computer, cause the computer to operate as an entertainment device in accordance with any one of claims 1 to 35 or a server in accordance with any one of claims 36 to 42.
44. A method of transmitting game action data using an entertainment device operable to execute a first application program associated with a game, the method comprising: generating game action data from game action of the game; and transmitting the game action data such that the game action can be rendered within a virtual environment by a second entertainment device in data communication with the entertainment device, the virtual environment being associated with the second entertainment device, in which: the virtual environment is associated with a second application program that is different from the first application program; and the game action data comprises graphics data indicative of graphical content of the game action data.
45. A method of receiving game action data using an entertainment device having receiving means operable to communicate with a second entertainment device via a communications link, the method comprising: receiving, from the communications link, game action data generated by the second entertainment device; generating game action of a game from the received game action data, the game being associated with the second entertainment device; and rendering the game action within a virtual environment associated with the entertainment device in dependence upon the generated game action, in which: the game is associated with a first application program and the virtual environment is associated with a second, different, application program; and the game action data comprises graphics data indicative of graphical content of the game action data.
46. A method of communicating game action data using an entertainment network server, the method comprising: receiving game action data from one or more entertainment devices, the game action data being generated by the one or more entertainment devices in dependence upon game action of a game associated with the one or more entertainment devices; storing the game action data received from the entertainment device; and transmitting the stored game action data to at least one recipient entertainment device such that the game action data can be rendered as game action within a virtual environment associated with the recipient entertainment device, in which: the game is associated with a first application program and the virtual environment is associated with a second, different, application program; and the game action data comprises graphics data indicative of graphical content of the game action data.
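Read together, method claims 44 and 45 amount to one device packaging its game action (including graphics data) for transmission and a second device regenerating and rendering that action inside its own, separate virtual-environment application. The end-to-end sketch below is illustrative only; every function, field and the virtual_screen object are assumptions.

```python
# Illustrative end-to-end sketch of claims 44-45. The sending device packages
# game action data containing graphics data; the receiving device regenerates
# the game action and renders it in its virtual environment (for example on an
# in-world screen). All names here are assumptions made for illustration.
import json

def package_game_action(frame_graphics: bytes, timestamp: float) -> bytes:
    """Sending side (claim 44): generate game action data from game action."""
    return json.dumps({"timestamp": timestamp,
                       "graphics": frame_graphics.hex()}).encode()

def render_game_action(payload: bytes, virtual_screen) -> None:
    """Receiving side (claim 45): regenerate the game action and render it
    within the local virtual environment."""
    data = json.loads(payload.decode())
    virtual_screen.draw(bytes.fromhex(data["graphics"]), data["timestamp"])
```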
47. A data carrier comprising computer readable instructions that, when executed by a computer, cause the computer to carry out the method in accordance with claim 44, 45 or 46.
48. An entertainment device substantially as described herein with reference to the accompanying drawings.
49. An entertainment network server substantially as described herein with reference to the accompanying drawings.
50. A method of transmitting game action data using a first entertainment device, the method substantially as described herein with reference to the accompanying drawings.
51. A method of receiving game action data using a second entertainment device, the method substantially as described herein with reference to the accompanying drawings.
52. A method of communicating game action data using an entertainment network server, the method substantially as described herein with reference to the accompanying drawings.
GB0704227A 2007-03-01 2007-03-05 Transmitting game data from an entertainment device and rendering that data in a virtual environment of a second entertainment device Withdrawn GB2447020A (en)

Priority Applications (15)

Application Number Priority Date Filing Date Title
GB0704227A GB2447020A (en) 2007-03-01 2007-03-05 Transmitting game data from an entertainment device and rendering that data in a virtual environment of a second entertainment device
EP08730776A EP2132650A4 (en) 2007-03-01 2008-02-26 System and method for communicating with a virtual world
PCT/US2008/055037 WO2008109299A2 (en) 2007-03-01 2008-02-26 System and method for communicating with a virtual world
JP2009551806A JP2010533006A (en) 2007-03-01 2008-02-26 System and method for communicating with a virtual world
JP2009551722A JP2010535362A (en) 2007-03-01 2008-02-27 Monitoring the opinions and reactions of users in the virtual world
EP08726219A EP2118757A4 (en) 2007-03-01 2008-02-27 Virtual world avatar control, interactivity and communication interactive messaging
PCT/US2008/002630 WO2008108965A1 (en) 2007-03-01 2008-02-27 Virtual world user opinion & response monitoring
PCT/US2008/002643 WO2008106196A1 (en) 2007-03-01 2008-02-27 Virtual world avatar control, interactivity and communication interactive messaging
EP08726220A EP2118840A4 (en) 2007-03-01 2008-02-27 Interactive user controlled avatar animations
PCT/US2008/002644 WO2008106197A1 (en) 2007-03-01 2008-02-27 Interactive user controlled avatar animations
JP2009551727A JP2010535364A (en) 2007-03-01 2008-02-27 Interactive user-controlled avatar animation
JP2009551726A JP2010535363A (en) 2007-03-01 2008-02-27 Virtual world avatar control, interactivity and communication interactive messaging
EP08726207A EP2126708A4 (en) 2007-03-01 2008-02-27 Virtual world user opinion & response monitoring
PCT/GB2008/000680 WO2008104783A1 (en) 2007-03-01 2008-02-29 Entertainment device and method
JP2014039137A JP5756198B2 (en) 2007-03-01 2014-02-28 Interactive user-controlled avatar animation

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
US89239707P 2007-03-01 2007-03-01
GBGB0703974.6A GB0703974D0 (en) 2007-03-01 2007-03-01 Entertainment device
GB0704227A GB2447020A (en) 2007-03-01 2007-03-05 Transmitting game data from an entertainment device and rendering that data in a virtual environment of a second entertainment device

Publications (2)

Publication Number Publication Date
GB0704227D0 GB0704227D0 (en) 2007-04-11
GB2447020A true GB2447020A (en) 2008-09-03

Family

ID=39462022

Family Applications (1)

Application Number Title Priority Date Filing Date
GB0704227A Withdrawn GB2447020A (en) 2007-03-01 2007-03-05 Transmitting game data from an entertainment device and rendering that data in a virtual environment of a second entertainment device

Country Status (2)

Country Link
GB (1) GB2447020A (en)
WO (1) WO2008104783A1 (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB0703974D0 (en) 2007-03-01 2007-04-11 Sony Comp Entertainment Europe Entertainment device
EP2375694B1 (en) 2010-03-31 2018-10-24 Sony Interactive Entertainment Europe Limited Networking system and method for a social networking server, client and entertainment device
US10898802B2 (en) * 2018-05-31 2021-01-26 Sony Interactive Entertainment LLC Bifurcation of shared controls and passing controls in a video game

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0752678A2 (en) * 1995-06-30 1997-01-08 Sony Corporation Apparatus and method for executing game programs having advertisements therein
WO2001085293A1 (en) * 2000-05-10 2001-11-15 Simation, Inc. Method and system for providing a dynamic virtual environment using data streaming
US20020183115A1 (en) * 2001-05-30 2002-12-05 Konami Computer Entertainment Osaka, Inc. Server device for net game, net game management method, net game management program and recording medium which stores net game management program
US20030038805A1 (en) * 2001-08-22 2003-02-27 Wong Curtis G. System and method to provide a spectator experience for networked gaming
GB2409417A (en) * 2003-12-22 2005-06-29 Nokia Corp Online gaming with spectator interaction
EP1637197A1 (en) * 2004-09-15 2006-03-22 Microsoft Corporation Online gaming spectator system

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20060129632A1 (en) * 2004-12-14 2006-06-15 Blume Leo R Remote content rendering for mobile viewing

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10489795B2 (en) 2007-04-23 2019-11-26 The Nielsen Company (Us), Llc Determining relative effectiveness of media content items
US11222344B2 (en) 2007-04-23 2022-01-11 The Nielsen Company (Us), Llc Determining relative effectiveness of media content items
WO2010063100A1 (en) * 2008-12-01 2010-06-10 Nortel Networks Limited Method and apparatus for providing a video representation of a three dimensional computer-generated virtual environment
WO2010125435A2 (en) * 2009-04-27 2010-11-04 Jagex Ltd. Position tracking in a virtual world
WO2010125435A3 (en) * 2009-04-27 2010-12-23 Jagex Ltd. Position tracking in a virtual world
US8441486B2 (en) 2009-04-27 2013-05-14 Jagex Ltd. Position tracking in a virtual world
US9203880B2 (en) 2009-04-27 2015-12-01 Jagex Ltd. Position tracking in a virtual world
US9901822B2 (en) 2014-01-09 2018-02-27 Square Enix Holding Co., Ltd. Video gaming device with remote rendering capability

Also Published As

Publication number Publication date
WO2008104783A1 (en) 2008-09-04
GB0704227D0 (en) 2007-04-11

Similar Documents

Publication Publication Date Title
US9259641B2 (en) Entertainment device and method
EP2131934B1 (en) Entertainment device and method
EP2131935B1 (en) Apparatus and method of data transfer
US9345970B2 (en) Switching operation of an entertainment device and method thereof
EP2044987B1 (en) Apparatus and method of on-line reporting
US20100030660A1 (en) Apparatus and method of on-line transaction
US20130132837A1 (en) Entertainment device and method
US20100203968A1 (en) Apparatus And Method Of Avatar Customisation
EP2058756A1 (en) Apparatus and method of administering modular online environments
WO2008104783A1 (en) Entertainment device and method
GB2461175A (en) A method of transferring real-time multimedia data in a peer to peer network using polling of peer devices

Legal Events

Date Code Title Description
WAP Application withdrawn, taken to be withdrawn or refused ** after publication under section 16(1)