GB2450757A - Avatar customisation, transmission and reception - Google Patents


Info

Publication number
GB2450757A
GB2450757A (application GB0713186A)
Authority
GB
United Kingdom
Prior art keywords
avatar
user
dimensional mesh
modified
computer
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
GB0713186A
Other versions
GB0713186D0 (en)
Inventor
Andrew George Gill
Keith Thomas Ribbons
John Foster
Mark Horneff
Nick Ryan
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Sony Interactive Entertainment Europe Ltd
Original Assignee
Sony Computer Entertainment Europe Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sony Computer Entertainment Europe Ltd filed Critical Sony Computer Entertainment Europe Ltd
Priority to GB0713186A
Publication of GB0713186D0
Priority to JP2010514129A
Priority to PCT/GB2008/002321
Priority to US12/667,775
Priority to EP08762523A
Publication of GB2450757A


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T13/00 Animation
    • G06T13/20 3D [Three Dimensional] animation
    • G06T13/40 3D [Three Dimensional] animation of characters, e.g. humans, animals or virtual beings
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00 Manipulating 3D models or images for computer graphics
    • G06T19/20 Editing of 3D images, e.g. changing shapes or colours, aligning objects or positioning parts
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L67/00 Network arrangements or protocols for supporting network services or applications
    • H04L67/01 Protocols
    • H04L67/131 Protocols for games, networked simulations or virtual reality
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/50 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by details of game servers
    • A63F2300/55 Details of game data or player data management
    • A63F2300/5546 Details of game data or player data management using player registration data, e.g. identification, account, preferences, game history
    • A63F2300/5553 Details of game data or player data management using player registration data, e.g. identification, account, preferences, game history user representation in the game field, e.g. avatar
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/60 Methods for processing data by generating or executing the game program
    • A63F2300/6009 Methods for processing data by generating or executing the game program for importing or creating game content, e.g. authoring tools during game development, adapting content to different platforms, use of a scripting language to create content
    • A63F2300/6018 Methods for processing data by generating or executing the game program for importing or creating game content, e.g. authoring tools during game development, adapting content to different platforms, use of a scripting language to create content where the game content is authored by the player, e.g. level editor or by game device at runtime, e.g. level is created from music data on CD
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/60 Methods for processing data by generating or executing the game program
    • A63F2300/66 Methods for processing data by generating or executing the game program for rendering three dimensional images
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2219/00 Indexing scheme for manipulating 3D models or images for computer graphics
    • G06T2219/20 Indexing scheme for editing of 3D models
    • G06T2219/2021 Shape modification

Abstract

An entertainment device comprises skeletal modelling means to control the placement of a three dimensional mesh representing some or all of a user avatar in response to the position of one or more skeletal components of the user avatar, skeleton modification means to modify one or more physical properties of one or more skeletal components of the user avatar via a user interface, and rendering means to render the user avatar responsive to the modified user avatar skeleton. Also disclosed is a server for administering a multi-player online virtual environment, comprising reception means to receive data descriptive of respective modified avatar skeletons from a plurality of remote entertainment devices, and transmission means to transmit data descriptive of respective modified avatar skeletons to a plurality of remote entertainment devices.

Description

APPARATUS AND METHOD OF AVATAR CUSTOMISATION
This invention relates to an apparatus and method of avatar customisation.
In online-gaming, it is conventional for players to adopt distinctive names for their in-game characters (generally termed 'avatars'). In addition, these avatars may also be customised, for example according to race (real or fictional) or gender, and a range of different heads and bodies are often provided. In addition, features such as hair styles, skin tone and age may be customised. The purpose of such naming and customisation is typically to project the user's personality within the game, and/or to be as distinctive as possible. For example, see http://starwarsgalaxies.station.sony.com/players/guides.vm?id=70000.
As on-line gaming continues to grow there is an increasing move to explore the social aspect of these virtual environments, and consequently a greater need for the user's avatar within such an environment to be distinctive enough to fulfil the requirements of social interaction between many individuals (e.g. see www.selectparks.net/blundell_charcust.pdf).
Conventional means of further customising a user's avatar for such a purpose may include uploading the user's own face as a texture to use on the avatar (for example, see http://research.microsoft.com/~zhanFace/redherringReport.htm), or modifying the gestures and expressions of the avatar to reflect a particular mood that the user wishes to express.
However, gestures and expressions are not instantly recognisable as they first need to be carried out. Meanwhile, uploading images of users' faces is potentially intrusive, and rendering the images in a consistent manner when each face may be captured under different lighting conditions and at different effective resolutions is difficult. Moreover, the user may be dissatisfied with the result if it is a poor approximation. Finally, many people online wish to present a fictional appearance whilst remaining true to their personality, or wish to appear appropriately 'in character' within the game environment; in this case a captured image is not a desirable solution.
It has therefore been suggested that it would be desirable if the end-user could have access to customisation options down to the level of 'the shape of a nostril' (see the introduction to www.selectparks.net/blundell_charcust.pdf). However, this would result in a bewildering array of options for the user to work through, and significantly would also result in considerable work in providing the different customisation options during initial game development. In addition, a significant data overhead in terms of transmission of configuration data in a massively multiplayer game is also likely. Consequently such systems do not appear to have been realised in-game (see again http://starwarsgalaxies.station.sony.com/players/guides.vm?id=70000).
The present invention seeks to address the above concerns.
In a first aspect of the present invention, an entertainment device comprises skeletal modelling means to configure a three dimensional mesh representing some or all of a user avatar in response to at least a first property of one or more skeletal components of the user avatar, skeleton modification means to modify one or more properties of one or more skeletal components of the user avatar via a user interface, and rendering means to render the user avatar in accordance with the three dimensional mesh as configured in response to the modified user avatar skeleton.
By configuring the three-dimensional mesh used to render the user avatar in accordance with a skeletal model, then by manipulation of one or more skeletal components the user can create distinctive faces for their avatars in a comparatively simple fashion before applying any further, more conventional changes such as texture or accessory selection to the mesh.
In another aspect of the present invention, a server operable to administer a multi-player online virtual environment comprises reception means to receive data descriptive of respective modified avatar skeletons from a plurality of remote entertainment devices, and transmission means to transmit data descriptive of respective modified avatar skeletons to a plurality of remote entertainment devices.
In another aspect of the present invention, a system comprising a server and two or more entertainment devices as described in the above aspects co-operate to allow the two or more entertainment devices to render the modified avatars of the users of the respective other devices.
By enabling the distribution of user-modified skeletal models for avatars, advantageously a population of avatars within an on-line environment can therefore be more easily differentiated when the populated environment is rendered by each participating remote entertainment device.
Further respective aspects and features of the invention are defined in the appended claims, including corresponding methods of operation as appropriate.
Embodiments of the present invention will now be described by way of example with reference to the accompanying drawings, in which:
Figure 1 is a schematic diagram of an entertainment device;
Figure 2 is a schematic diagram of a cell processor;
Figure 3 is a schematic diagram of a video graphics processor;
Figure 4 is a schematic diagram of an interconnected set of game zones in accordance with an embodiment of the present invention;
Figure 5 is a schematic diagram of a Home environment online client/server arrangement in accordance with an embodiment of the present invention;
Figure 6a is a schematic diagram of a lobby zone in accordance with an embodiment of the present invention;
Figure 6b is a schematic diagram of a lobby zone in accordance with an embodiment of the present invention;
Figure 6c is a schematic diagram of a cinema zone in accordance with an embodiment of the present invention;
Figure 6d is a schematic diagram of a developer/publisher zone in accordance with an embodiment of the present invention;
Figure 7 is a flow diagram of a method of on-line transaction in accordance with an embodiment of the present invention;
Figure 8a is a schematic diagram of an apartment zone in accordance with an embodiment of the present invention;
Figure 8b is a schematic diagram of a trophy room zone in accordance with an embodiment of the present invention;
Figure 9 is a schematic diagram of a communication menu in accordance with an embodiment of the present invention;
Figure 10 is a schematic diagram of an interactive virtual user device in accordance with an embodiment of the present invention;
Figure 11 is a schematic diagram of a user interface in accordance with an embodiment of the present invention;
Figure 12 is a schematic diagram of a user interface in accordance with an embodiment of the present invention;
Figure 13 is a schematic diagram of a user interface in accordance with an embodiment of the present invention;
Figures 14A and 14B are schematic diagrams of a user interface in accordance with an embodiment of the present invention;
Figures 15A, 15B and 15C are schematic diagrams of a user avatar in accordance with an embodiment of the present invention;
Figure 16 is a flow diagram of a method of user identification in accordance with an embodiment of the present invention.
An apparatus and method of user identification are disclosed. In the following description, a number of specific details are presented in order to provide a thorough understanding of the embodiments of the present invention. It will be apparent, however, to a person skilled in the art that these specific details need not be employed to practice the present invention. Conversely, specific details known to the person skilled in the art are omitted for the purposes of clarity where appropriate.
In a summary embodiment of the present invention, a user of an entertainment device connected to an on-line virtual environment selects and customises an avatar using conventional options such as gender, clothing and skin-tone. In addition, however, the user can also modify the three-dimensional mesh used to define the surface of the user's avatar, upon which textures relating to gender, skin tone, age etc., can be applied. This configuration is achieved using a comparatively simple user interface by positioning vertices of the avatar mesh in response to a skeletal model underpinning the avatar's mesh structure. The user can therefore modify the mesh of their avatar by making simple parametric adjustments to the so-called 'bones' of the skeletal model. Typically these bones are interlinked so that modification to one bone or set of bones also affects other related bones, so maintaining a harmonious set of physical proportions for the mesh. The user interface provides a hierarchy of adjustments, from whole-face skeletal adjustments (e.g. by race) to partial face skeletal adjustments (e.g. upper face, lower face, cranium) to individual features (e.g. cheek bones).
This allows a quick modification of the avatar without compromising the ability to fine tune the results. Moreover, the user interface allows the modification of bone parameters to provide an asymmetric mesh, as this conveys a more naturalistic appearance for the avatar, as well as providing an additional source of distinctiveness and identity.
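To make the mechanism concrete, the following minimal C++ sketch shows one way an adjustment to a 'bone' might propagate to linked bones and then reposition the mesh vertices weighted to them. It is an illustration only, not the patent's implementation; the structures, the single scalar parameter per bone and the linkage weights are all assumed for clarity.

    #include <cstdio>
    #include <utility>
    #include <vector>

    // Illustrative sketch: one scalar "scale" parameter per bone, with links
    // so that adjusting one bone partially adjusts related bones, keeping the
    // proportions harmonious as the description above suggests.
    struct Bone {
        float scale = 1.0f;                       // user-adjustable parameter
        std::vector<std::pair<int, float>> links; // (linked bone index, influence)
    };

    struct Vertex {
        float x, y, z;
        int bone;      // bone this vertex is weighted to (one bone, for brevity)
        float weight;  // skinning weight
    };

    // Apply a user adjustment to one bone and propagate it to linked bones.
    void adjustBone(std::vector<Bone>& skel, int index, float delta) {
        skel[index].scale += delta;
        for (auto& [linked, influence] : skel[index].links)
            skel[linked].scale += delta * influence;
    }

    // Reconfigure the mesh vertices from the (modified) skeleton.
    void configureMesh(const std::vector<Bone>& skel, std::vector<Vertex>& mesh) {
        for (auto& v : mesh) {
            float s = 1.0f + (skel[v.bone].scale - 1.0f) * v.weight;
            v.x *= s; v.y *= s; v.z *= s;
        }
    }

    int main() {
        // Two linked bones: enlarging the first also widens the second a little.
        std::vector<Bone> skel(2);
        skel[0].links = {{1, 0.3f}};
        std::vector<Vertex> mesh = {{1, 0, 0, 0, 1.0f}, {0, 1, 0, 1, 0.5f}};

        adjustBone(skel, 0, 0.2f);   // user drags a "cheek bone" slider, say
        configureMesh(skel, mesh);
        std::printf("v0.x=%.2f v1.y=%.2f\n", mesh[0].x, mesh[1].y);
    }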
Figure 1 schematically illustrates the overall system architecture of the Sony Playstation 3 entertainment device. A system unit 10 is provided, with various peripheral devices connectable to the system unit.
The system unit 10 comprises: a Cell processor 100; a Rambus dynamic random access memory (XDRAM) unit 500; a Reality Synthesiser graphics unit 200 with a dedicated video random access memory (VRAM) unit 250; and an I/O bridge 700.
The system unit 10 also comprises a Blu Ray Disk BD-ROM optical disk reader 430 for reading from a disk 440 and a removable slot-in hard disk drive (HDD) 400, accessible through the I/O bridge 700. Optionally the system unit also comprises a memory card reader 450 for reading compact flash memory cards, Memory Stick memory cards and the like, which is similarly accessible through the I/O bridge 700.
The I/O bridge 700 also connects to four Universal Serial Bus (USB) 2.0 ports 710; a gigabit Ethernet port 720; an IEEE 802.11b/g wireless network (Wi-Fi) port 730; and a Bluetooth wireless link port 740 capable of supporting up to seven Bluetooth connections.
In operation the I/O bridge 700 handles all wireless, USB and Ethernet data, including data from one or more game controllers 751. For example when a user is playing a game, the I/O bridge 700 receives data from the game controller 751 via a Bluetooth link and directs it to the Cell processor 100, which updates the current state of the game accordingly.
The wireless, USB and Ethernet ports also provide connectivity for other peripheral devices in addition to game controllers 751, such as: a remote control 752; a keyboard 753; a mouse 754; a portable entertainment device 755 such as a Sony Playstation Portable entertainment device; a video camera such as an EyeToy video camera 756; and a microphone headset 757. Such peripheral devices may therefore in principle be connected to the system unit 10 wirelessly; for example the portable entertainment device 755 may communicate via a Wi-Fi ad-hoc connection, whilst the microphone headset 757 may communicate via a Bluetooth link.
The provision of these interfaces means that the Playstation 3 device is also potentially compatible with other peripheral devices such as digital video recorders (DVRs), set-top boxes, digital cameras, portable media players, Voice over IP telephones, mobile telephones, printers and scanners.
In addition, a legacy memory card reader 410 may be connected to the system unit via a USB port 710, enabling the reading of memory cards 420 of the kind used by the Playstation or Playstation 2 devices.
In the present embodiment, the game controller 751 is operable to communicate wirelessly with the system unit 10 via the Bluetooth link. However, the game controller 751 can instead be connected to a USB port, thereby also providing power by which to charge the battery of the game controller 751. In addition to one or more analogue joysticks and conventional control buttons, the game controller is sensitive to motion in 6 degrees of freedom, corresponding to translation and rotation in each axis. Consequently gestures and movements by the user of the game controller may be translated as inputs to a game in addition to or instead of conventional button or joystick commands. Optionally, other wirelessly enabled peripheral devices such as the Playstation Portable device may be used as a controller. In the case of the Playstation Portable device, additional game or control information (for example, control instructions or number of lives) may be provided on the screen of the device. Other alternative or supplementary control devices may also be used, such as a dance mat (not shown), a light gun (not shown), a steering wheel and pedals (not shown) or bespoke controllers, such as a single or several large buttons for a rapid-response quiz game (also not shown).
The remote control 752 is also operable to communicate wirelessly with the system unit 10 via a Bluetooth link. The remote control 752 comprises controls suitable for the operation of the Blu Ray Disk BD-ROM reader 430 and for the navigation of disk content.
The Blu Ray Disk BD-ROM reader 430 is operable to read CD-ROMs compatible with the Playstation and PlayStation 2 devices, in addition to conventional pre-recorded and recordable CDs, and so-called Super Audio CDs. The reader 430 is also operable to read DVD-ROMs compatible with the Playstation 2 and PlayStation 3 devices, in addition to conventional pre-recorded and recordable DVDs. The reader 430 is further operable to read BD-ROMs compatible with the Playstation 3 device, as well as conventional pre-recorded and recordable Blu-Ray Disks.
The system unit 10 is operable to supply audio and video, either generated or decoded by the Playstation 3 device via the Reality Synthesiser graphics unit 200, through audio and video connectors to a display and sound output device 300 such as a monitor or television set having a display 305 and one or more loudspeakers 310. The audio connectors 210 may include conventional analogue and digital outputs whilst the video connectors 220 may variously include component video, S-video, composite video and one or more High Definition Multimedia Interface (HDMI) outputs. Consequently, video output may be in formats such as PAL or NTSC, or in 720p, 1080i or 1080p high definition.
Audio processing (generation, decoding and so on) is performed by the Cell processor 100. The Playstation 3 device's operating system supports Dolby 5.1 surround sound, Dolby Theatre Surround (DTS), and the decoding of 7.1 surround sound from Blu-Ray disks.
In the present embodiment, the video camera 756 comprises a single charge coupled device (CCD), an LED indicator, and hardware-based real-time data compression and encoding apparatus so that compressed video data may be transmitted in an appropriate format such as an intra-image based MPEG (motion picture expert group) standard for decoding by the system unit 10. The camera LED indicator is arranged to illuminate in response to appropriate control data from the system unit 10, for example to signify adverse lighting conditions. Embodiments of the video camera 756 may variously connect to the system unit 10 via a USB, Bluetooth or Wi-Fi communication port. Embodiments of the video camera may include one or more associated microphones and also be capable of transmitting audio data. In embodiments of the video camera, the CCD may have a resolution suitable for high-definition video capture. In use, images captured by the video camera may for example be incorporated within a game or interpreted as game control inputs.
In general, in order for successful data communication to occur with a peripheral device such as a video camera or remote control via one of the communication ports of the system unit 10, an appropriate piece of software such as a device driver should be provided.
Device driver technology is well-known and will not be described in detail here, except to say that the skilled man will be aware that a device driver or similar software interface may be required in the present embodiment described.
Referring now to Figure 2, the Cell processor 100 has an architecture comprising four basic components: external input and output structures comprising a memory controller 160 and a dual bus interface controller 170A,B; a main processor referred to as the Power Processing Element 150; eight co-processors referred to as Synergistic Processing Elements (SPEs) 110A-H; and a circular data bus connecting the above components referred to as the Element Interconnect Bus 180. The total floating point performance of the Cell processor is 218 GFLOPS, compared with the 6.2 GFLOPs of the Playstation 2 device's Emotion Engine.
The Power Processing Element (PPE) 150 is based upon a two-way simultaneous multithreading Power 970 compliant PowerPC core (PPU) 155 running with an internal clock of 3.2 GHz. It comprises a 512 kB level 2 (L2) cache and a 32 kB level 1 (L1) cache. The PPE 150 is capable of eight single precision operations per clock cycle, translating to 25.6 GFLOPs at 3.2 GHz. The primary role of the PPE 150 is to act as a controller for the Synergistic Processing Elements 110A-H, which handle most of the computational workload.
In operation the PPE 150 maintains a job queue, scheduling jobs for the Synergistic Processing Elements 110A-H and monitoring their progress. Consequently each Synergistic Processing Element 110A-H runs a kernel whose role is to fetch a job, execute it and synchronise with the PPE 150.
Each Synergistic Processing Element (SPE) 110A-H comprises a respective Synergistic Processing Unit (SPU) 120A-H, and a respective Memory Flow Controller (MFC) 140A-H comprising in turn a respective Dynamic Memory Access Controller (DMAC) 142A-H, a respective Memory Management Unit (MMU) 144A-H and a bus interface (not shown).
Each SPU 120A-H is a RISC processor clocked at 3.2 GHz and comprising 256 kB local RAM 130A-H, expandable in principle to 4 GB. Each SPE gives a theoretical 25.6 GFLOPS of single precision performance. An SPU can operate on 4 single precision floating point numbers, 4 32-bit numbers, 8 16-bit integers, or 16 8-bit integers in a single clock cycle. In the same clock cycle it can also perform a memory operation. The SPU 120A-H does not directly access the system memory XDRAM 500; the 64-bit addresses formed by the SPU 120A-H are passed to the MFC 140A-H which instructs its DMA controller 142A-H to access memory via the Element Interconnect Bus 180 and the memory controller 160.
The Element Interconnect Bus (EIB) 180 is a logically circular communication bus internal to the Cell processor 100 which connects the above processor elements, namely the PPE 150, the memory controller 160, the dual bus interface 170A,B and the 8 SPEs 110A-H, totalling 12 participants. Participants can simultaneously read and write to the bus at a rate of 8 bytes per clock cycle. As noted previously, each SPE 110A-H comprises a DMAC 142A-H for scheduling longer read or write sequences. The EIB comprises four channels, two each in clockwise and anti-clockwise directions. Consequently for twelve participants, the longest step-wise data-flow between any two participants is six steps in the appropriate direction. The theoretical peak instantaneous EIB bandwidth for 12 slots is therefore 96B per clock, in the event of full utilisation through arbitration between participants. This equates to a theoretical peak bandwidth of 307.2 GB/s (gigabytes per second) at a clock rate of 3.2 GHz.
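The quoted peak figures follow directly from the stated parameters, as this short worked check confirms:

    #include <cstdio>

    // Worked check of the EIB figures quoted above: 12 participants, each
    // able to transfer 8 bytes per clock cycle, at a 3.2 GHz clock.
    int main() {
        const int slots = 12;
        const int bytesPerClock = 8;
        const double clockGHz = 3.2;

        const int peakBytesPerClock = slots * bytesPerClock;   // 96 B per clock
        const double peakGBps = peakBytesPerClock * clockGHz;  // 307.2 GB/s
        std::printf("%d B/clock -> %.1f GB/s\n", peakBytesPerClock, peakGBps);
    }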
The memory controller 160 comprises an XDRAM interface 162, developed by Rambus Incorporated. The memory controller interfaces with the Rambus XDRAM 500 with a theoretical peak bandwidth of 25.6 GB/s.
The dual bus interface 170A,B comprises a Rambus FlexIO system interface 172A,B. The interface is organised into 12 channels each being 8 bits wide, with five paths being inbound and seven outbound. This provides a theoretical peak bandwidth of 62.4 GB/s (36.4 GB/s outbound, 26 GB/s inbound) between the Cell processor and the I/O Bridge 700 via controller 170A and the Reality Simulator graphics unit 200 via controller 170B.
Data sent by the Cell processor 100 to the Reality Simulator graphics unit 200 will typically comprise display lists, being a sequence of commands to draw vertices, apply textures to polygons, specify lighting conditions, and so on.
Referring now to Figure 3, the Reality Simulator graphics (RSX) unit 200 is a video accelerator based upon the NVidia G70/71 architecture that processes and renders lists of commands produced by the Cell processor 100. The RSX unit 200 comprises a host interface 202 operable to communicate with the bus interface controller 170B of the Cell processor 100; a vertex pipeline 204 (VP) comprising eight vertex shaders 205; a pixel pipeline 206 (PP) comprising 24 pixel shaders 207; a render pipeline 208 (RP) comprising eight render output units (ROPs) 209; a memory interface 210; and a video converter 212 for generating a video output. The RSX 200 is complemented by 256 MB double data rate (DDR) video RAM (VRAM) 250, clocked at 600 MHz and operable to interface with the RSX 200 at a theoretical peak bandwidth of 25.6 GB/s. In operation, the VRAM 250 maintains a frame buffer 214 and a texture buffer 216. The texture buffer 216 provides textures to the pixel shaders 207, whilst the frame buffer 214 stores results of the processing pipelines. The RSX can also access the main memory 500 via the EIB 180, for example to load textures into the VRAM 250.
The vertex pipeline 204 primarily processes deformations and transformations of vertices defining polygons within the image to be rendered.
The pixel pipeline 206 primarily processes the application of colour, textures and lighting to these polygons, including any pixel transparency, generating red, green, blue and alpha (transparency) values for each processed pixel. Texture mapping may simply apply a graphic image to a surface, or may include bump-mapping (in which the notional direction of a surface is perturbed in accordance with texture values to create highlights and shade in the lighting model) or displacement mapping (in which the applied texture additionally perturbs vertex positions to generate a deformed surface consistent with the texture).
The render pipeline 208 performs depth comparisons between pixels to determine which should be rendered in the final image. Optionally, if the intervening pixel process will not affect depth values (for example in the absence of transparency or displacement mapping) then the render pipeline and vertex pipeline 204 can communicate depth information between them, thereby enabling the removal of occluded elements prior to pixel processing, and so improving overall rendering efficiency. In addition, the render pipeline 208 also applies subsequent effects such as full-screen anti-aliasing over the resulting image.
Both the vertex shaders 205 and pixel shaders 207 are based on the shader model 3.0 standard. Up to 136 shader operations can be performed per clock cycle, with the combined pipeline therefore capable of 74.8 billion shader operations per second, outputting up to 840 million vertices and 10 billion pixels per second. The total floating point performance of the RSX 200 is 1.8 TFLOPS.
Typically, the RSX 200 operates in close collaboration with the Cell processor 100; for example, when displaying an explosion, or weather effects such as rain or snow, a large number of particles must be tracked, updated and rendered within the scene. In this case, the PPU 155 of the Cell processor may schedule one or more SPEs 110A-H to compute the trajectories of respective batches of particles. Meanwhile, the RSX 200 accesses any texture data (e.g. snowflakes) not currently held in the video RAM 250 from the main system memory 500 via the element interconnect bus 180, the memory controller 160 and a bus interface controller 170B. The or each SPE 110A-H outputs its computed particle properties (typically coordinates and normals, indicating position and attitude) directly to the video RAM 250; the DMA controller 142A-H of the or each SPE 110A-H addresses the video RAM 250 via the bus interface controller 170B. Thus in effect the assigned SPEs become part of the video processing pipeline for the duration of the task.
In general, the PPU 155 can assign tasks in this fashion to six of the eight SPEs available; one SPE is reserved for the operating system, whilst one SPE is effectively disabled. The disabling of one SPE provides a greater level of tolerance during fabrication of the Cell processor, as it allows for one SPE to fail the fabrication process. Alternatively if all eight SPEs are functional, then the eighth SPE provides scope for redundancy in the event of subsequent failure by one of the other SPEs during the life of the Cell processor.
The PPU 155 can assign tasks to SPEs in several ways. For example, SPEs may be chained together to handle each step in a complex operation, such as accessing a DVD, video and audio decoding, and error masking, with each step being assigned to a separate SPE.
Alternatively or in addition, two or more SPEs may be assigned to operate on input data in parallel, as in the particle animation example above.
Software instructions implemented by the Cell processor 100 and/or the RSX 200 may be supplied at manufacture and stored on the HDD 400, and/or may be supplied on a data carrier or storage medium such as an optical disk or solid state memory, or via a transmission medium such as a wired or wireless network or internet connection, or via combinations of these.
The software supplied at manufacture comprises system firmware and the Playstation 3 device's operating system (OS). In operation, the OS provides a user interface enabling a user to select from a variety of functions, including playing a game, listening to music, viewing photographs, or viewing a video. The interface takes the form of a so-called cross media-bar (XMB), with categories of function arranged horizontally. The user navigates by moving through the function icons (representing the functions) horizontally using the game controller 751, remote control 752 or other suitable control device so as to highlight a desired function icon, at which point options pertaining to that function appear as a vertically scrollable list of option icons centred on that function icon, which may be navigated in analogous fashion. However, if a game, audio or movie disk 440 is inserted into the BD-ROM optical disk reader 430, the Playstation 3 device may select appropriate options automatically (for example, by commencing the game), or may provide relevant options (for example, to select between playing an audio disk or compressing its content to the HDD 400).
In addition, the OS provides an on-line capability, including a web browser, an interface with an on-line store from which additional game content, demonstration games (demos) and other media may be downloaded, and a friends management capability, providing on-line communication with other Playstation 3 device users nominated by the user of the current device; for example, by text, audio or video depending on the peripheral devices available. The on-line capability also provides for on-line communication, content download and content purchase during play of a suitably configured game, and for updating the firmware and OS of the Playstation 3 device itself. It will be appreciated that the term "online" does not imply the physical presence of wires, as the term can also apply to wireless connections of various types.
In an embodiment of the present invention, the above-mentioned online capability comprises interaction with a virtual environment populated by avatars (graphical representations) of the user of the PS3 10 and of other PS3 users who are currently online.
The software to enable the virtual interactive environment is typically resident on the HDD 400, and can be upgraded and/or expanded by software that is downloaded, or stored on optical disk 440, or accessed by any other suitable means. Alternatively, the software may reside on a flash memory card 420, optical disk 440 or a central server (not shown).
In an embodiment of the present invention, the virtual interactive environment (hereafter called the 'Home' environment) is selected from the cross-media bar. The Home environment then starts in a conventional manner similar to a 3D video game by loading and executing control software, loading 3D models and textures into video memory 250, and rendering scenes depicting the Home environment. Alternatively or in addition, the Home environment can be initiated by other programs, such as a separate game.
Referring now to Figure 4, which displays a notional map of the Home environment, and Figure 5, which is a schematic diagram of a Home environment online client/server arrangement, the user's avatar is spawned within a lobby zone 1010 by default. However, a user can select among other zones 1010-1060 (detailed below) of the map, causing the selected zone to be loaded and the avatar to be spawned within that zone. In an embodiment of the present invention, the map screen further comprises a sidebar on which the available zones may be listed, together with management tools such as a ranking option, enabling zones to be listed in order of user preference, or such as most recently added and/or A-Z listings. In addition a search interface may allow the user to search for a zone by name. In an embodiment of the present invention, there may be many more zones available than are locally stored on the user's PS3 at any one time; the local availability may be colour coded on the list, or the list may be filtered to only display locally available zones. If the user selects a locally unavailable zone, it can be downloaded from a Home environment Server 2010.
Referring now to Figure 6a, the lobby zone 1010 typically resembles a covered piazza, and may comprise parkland (grass, trees, sculptures etc.), and gathering spaces (such as open areas, single benches or rows of seats etc.) where users can meet through their avatars.
The lobby zone 1010 typically also comprises advertisement hoardings, for displaying either still or moving adverts for games or other content or products. These may be on the walls of the lobby, or may stand alone. The lobby zone 1010 may also include an open-air cinema 1012 showing trailers, high-profile adverts or other content from third-party providers. Such content is typically streamed or downloaded from a Home environment server 2010 to which the PS3 10 connects when the Home environment is loaded, as described in more detail later.
The cinema screen is accompanied by seating for avatars in front of it, such that when an avatar sits down, the camera angle perceived by the user of the avatar also encompasses the screen.
Referring now also to Figure 6b, the lobby zone 1010 may also include general amusements 1014, such as functioning pool tables, bowling alleys, and/or a video arcade.
Games of pool or bowling may be conducted via the avatar, such that the avatar holds the pool cue or bowling ball, and is controlled in a conventional manner for such games. In the video arcade, if an avatar approaches a videogame machine, the home environment may switch to a substantially full-screen representation of the videogame selected. Such games may, for example, be classic arcade or console games such as Space Invaders (RTM) or Pac-Man (RTM), which are comparatively small in terms of memory and processing and can be emulated by the PS3 within the Home environment or run as plug-ins to the Home environment. In this case, typically the user will control the game directly, without representation by the avatar. The game will switch back to the default Home environment view if the user quits the game, or causes the avatar to move away from the videogame machine. In addition to classic arcade games, user-created game content may be featured on one or more of the virtual video game machines. Such content may be the subject of on-line competitions to be featured in such a manner, with new winning content downloaded on a regular basis.
In addition to the lobby zone 1010, other zones (e.g. zones 1020, 1030, 1040, 1050 and 1060, which may be rooms, areas or other constructs) are available. These may be accessed either via a map screen similar in nature to that of Figure 4, or alternatively the user can walk to these other areas by guiding their avatar to various exits 1016 from the lobby.
Typically, an exit 1016 takes the form of a tunnel or corridor (but may equally take the form of an anteroom) to the next area. While the avatar is within the tunnel or anteroom, the next zone is loaded into memory. Both the lobby and the next zone contain identical models of the tunnel or anteroom, or the model is a common resource to both. In either case, the user's avatar is relocated from the lobby-based version to the new zone-based version of the tunnel or anteroom at the same position. In this way the user's avatar can apparently walk seamlessly throughout the Home environment, without the need to retain the whole environment in memory at the same time.
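A minimal sketch of this hand-off technique follows; all type and function names are assumed purely for illustration, not drawn from the patent.

    #include <cstdio>
    #include <string>

    // While the avatar walks through a connecting tunnel that exists
    // identically in both zones, the next zone is loaded; the avatar is then
    // relocated to the matching position in the new zone's copy of the tunnel.
    struct Zone { std::string name; bool loaded = false; };
    struct Avatar { Zone* zone; float tunnelPos; };

    void loadZone(Zone& z)   { z.loaded = true;  /* stream models/textures */ }
    void unloadZone(Zone& z) { z.loaded = false; /* free the old zone */ }

    void traverseTunnel(Avatar& a, Zone& next) {
        loadZone(next);            // load while the avatar is in the tunnel
        Zone* previous = a.zone;
        a.zone = &next;            // relocate to the identical tunnel model,
                                   // at the same position, so the transition
                                   // appears seamless to the user
        unloadZone(*previous);     // the old zone need not stay in memory
        std::printf("now in %s at tunnel position %.1f\n",
                    a.zone->name.c_str(), a.tunnelPos);
    }

    int main() {
        Zone lobby{"lobby", true}, cinema{"cinema"};
        Avatar user{&lobby, 0.5f};
        traverseTunnel(user, cinema);
    }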
Referring now also to Figure 6c, one available zone is a Cinema zone 1020. The Cinema zone 1020 resembles a multiplex cinema, comprising a plurality of screens that may show content such as trailers, movies, TV programmes, or adverts downloaded or streamed from a Home environment server 2010 as noted previously and detailed below, or may show content stored on the HDD 400 or on an optical disk 440, such as a Blu-Ray disk.
Typically, the multiplex cinema will have an entrance area featuring a screen 1022 on which high-profile trailers and adverts may be shown to all visitors, together with poster adverts 1024, typically but not limited to featuring upcoming movies. Specific screens and the selection and display of the trailers and posters can each be restricted according to the age of the user, as registered with the PS3. This age restriction can be applied to any displayed content to which an age restriction tag is associated, in any of the zones within the Home environment.
In addition, in an embodiment of the present invention the multiplex cinema provides a number of screen rooms in which featured content is available, and amongst which the user can select. Within a screen room downloaded, streamed or locally stored media can be played within a virtual cinema environment, in which the screen is set in a room with rows of seats, screen curtains, etc. The cinema is potentially available to all users in the Home environment, and so the avatars of other users may also be visible, for example watching commonly streamed material such as a web broadcast. Alternatively, the user can zoom in so that the screen occupies the full viewing area.
Referring now also to Figure 6d, another type of zone is a developer or publisher zone 1030. Typically, there may be a plurality of such zones available. Optionally, each may have its own exit from the lobby area 1010, or alternatively some or all may share an exit from the lobby and then have separate exits from within a tunnel or ante-room model common to or replicated by each available zone therein. Alternatively they may be selected from a menu, either in the form of a pop-up menu, or from within the Home environment, such as by selecting from a set of signposts. In these latter cases the connecting tunnel or anteroom will appear to link only to the selected developer or publisher zone 1030. Alternatively or in addition, such zones may be selected via the map screen, resulting in the zone being loaded in to memory, and the avatar re-spawning within the selected zone.
Developer or publisher zones 1030 provide additional virtual environments, which may reflect the look and feel of the developer or publisher's products, brands and marks.
The developer or publisher zones 1030 are supplementary software modules to the Home environment and typically comprise additional 3D models and textures to provide the structure and appearance of the zone.
In addition, the software operable to implement the Home environment supports the integration of third party software via an application program interface (API). Therefore, developers can integrate their own functional content within the Home environment of their own zone. This may take the form of any or all of:
i. Downloading / streaming of specific content, such as game trailers or celebrity endorsements;
ii. Changes in avatar appearance, behaviour and/or communication options within the zone;
iii. The provision of one or more games, such as basketball 1032 or a golf range 1034, optionally branded or graphically reminiscent of the developer's or publisher's games;
iv. One or more interactive scenes or vignettes representative of the developer's or publisher's games, enabling the player to experience an aspect of the game, hone a specific skill of the game, or familiarise themselves with the controls of a game;
v. An arena, ring, dojo, court or similar area 1036 in which remotely played games may be represented live by avatars 1038, for spectators to watch.
Thus, for example, a developer's zone resembles a concourse in the developer's signature colours and featuring their logos, onto which open gaming areas such as soccer nets, or a skeet range for shooting. In addition, a booth (not shown) manned by game-specific characters allows the user's avatar to enter and either temporarily change into the lead character of the game, or zoom into a first person perspective, and enter a further room resembling a scene from the featured game. Here the user interacts with other characters from the game, and plays out a key scene. Returning to the concourse, adverts for the game and other content are displayed on the walls. At the end of the zone, the concourse opens up into an arena where a 5-a-side football match is being played, where the positions of the players and the ball correspond to a game currently being played by a popular group, such as a high-ranking game clan, in another country.
In embodiments of the present invention, developer / publisher zones are available to download. Alternatively or in addition, to reduce bandwidth they may be supplied as demo content on magazine disks, or may be installed/upgraded from disk as part of the installation process for a purchased game of the developer or publisher. In the latter two examples, subsequent purchase or registration of the game may result in further zone content being unlocked or downloaded. In any event, further modifications, and timely advert and trailer media, may be downloaded as required.
A similar zone is the commercial zone 1040. Again, there may be a plurality of such commercial zones accessible in similar manner to the developer and publisher zones. Like developer / publisher zones 1030, commercial zones 1040 may comprise representative virtual assets of one or more commercial vendors in the form of 3D models, textures etc., enabling a rendering of their real-world shops, brands and identities, and these may be geographically and/or thematically grouped within zones.
Space within commercial zones may be rented as so-called 'virtual real-estate' by third parties. For example, a retailer may pay to have a rendering of their shop included within a commercial zone 1040 as part of a periodic update of the Home environment supplied via the Home environment server 2010, for example on a monthly or annual renewal basis. A retailer may additionally pay for the commerce facilities described above, either on a periodic basis or per item. In this way they can provide users of the Home environment with a commercial presence.
Again, the commercial zone comprises supplementary software that can integrate with the home environment via an API, to provide additional communication options (shop-specific names, goods, transaction options etc.), and additional functionality, such as accessing an online database of goods and services for purchase, determining current prices, the availability of goods, and delivery options. Such functions may be accessed either via a menu (either as a pop-up or within the Home environment, for example on a wall) or via communication with automated avatars. Communication between avatars is described in more detail later.
It will be appreciated that developers and publishers can also provide stores within commercial zones, and in addition that connecting tunnels between developer / publisher and commercial zones may be provided. For example, a tunnel may link a developer zone to a store that sells the developer's games. Such a tunnel may be of a 'many to one' variety, such that exits from several zones emerge from the same tunnel in-store. In this case, if re-used, typically the tunnel would be arranged to return the user to the previous zone rather than one of the possible others.
In an embodiment of the present invention, the software implementing the Home environment has access to an online-content purchase system provided by the PS3 OS.
Developers, publishers and store owners can use this system via an interface to specify the IP address and query text that facilitates their own on-line transaction. Alternatively, the user can allow their PS3 registration details and credit card details to be used directly, such that by selecting a suitably enabled object, game, advert, trailer or movie anywhere within the Home environment, they can select to purchase that item or service. In particular, the Home environment server 2010 can store and optionally validate the user's credit card and other details so that the details are ready to be used in a transaction without the user having to enter them. In this way the Home environment acts as an intermediary in the transaction.
Alternatively such details can be stored at the PS3 and validated either by the PS3 or by the Home environment server.
Thus, referring now also to Figure 7, in an embodiment of the present invention a method of sale comprises in a step s2102 a user selecting an item (goods or a service) within the Home environment. In step s2104, the PS3 10 transmits identification data corresponding with the object to the Home environment server 2010, which in step s2106 verifies the item's availability from a preferred provider (preferably within the country corresponding to the IP address of the user). If the item is unavailable then in step s2107 it informs the user by transmitting a message to the user's PS3 10. Alternatively, it first checks for availability from one or more secondary providers, and optionally confirms whether supply from one of these providers is acceptable to the user. In step s2108, the Home environment server retrieves from data storage the user's registered payment details and validates them. If there is no valid payment method available, then the Home environment may request that the user enters new details via a secure (i.e. encrypted) connection. Once a valid payment method is available, then in step s2110 the Home environment server requests from the appropriate third party payment provider a transfer of payment from the user's account. Finally, in s2112 the Home environment server places an order for the item with the preferred provider, giving the user's delivery address or IP address as applicable, and transferring appropriate payment to the preferred provider's account.
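As an illustrative sketch only, the flow of Figure 7 maps onto a short sequence of service calls. Every function below is a stub standing in for a server-side operation; none of these names is an API disclosed by the patent.

    #include <cstdio>
    #include <string>

    bool itemAvailable(const std::string& itemId);                        // s2106
    bool validPaymentDetails(const std::string& user);                    // s2108
    bool takePayment(const std::string& user);                            // s2110
    void placeOrder(const std::string& itemId, const std::string& user);  // s2112

    bool purchase(const std::string& itemId, const std::string& user) {
        if (!itemAvailable(itemId)) {           // s2106: check preferred provider
            std::printf("item unavailable\n");  // s2107: inform the user
            return false;
        }
        if (!validPaymentDetails(user))  // s2108: if stored details are invalid,
            return false;                // new details would be requested over a
                                         // secure connection before retrying
        if (!takePayment(user))          // s2110: transfer from the user's account
            return false;
        placeOrder(itemId, user);        // s2112: order from the preferred provider
        return true;
    }

    // Stub implementations so the sketch compiles and runs.
    bool itemAvailable(const std::string&) { return true; }
    bool validPaymentDetails(const std::string&) { return true; }
    bool takePayment(const std::string&) { return true; }
    void placeOrder(const std::string& i, const std::string& u) {
        std::printf("ordered %s for %s\n", i.c_str(), u.c_str());
    }

    int main() { purchase("game-demo-42", "user123"); }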
In this way, commerce is not limited specifically to shops. Similarly, it is not necessary for shops to provide their own commerce applications if the preferred provider for goods or services when displayed within a shop is set to be that shop's owner. Where the goods or service may be digitally provided, then optionally it is downloaded from the preferred provider directly or via a Home environment server 2010.
In addition to the above public zones, there are additional zones that are private to the individual user and may only be accessed by them or by invitation from them. These zones also have exits from the communal lobby area, but when entered by the avatar (or chosen via the map screen), load a respective version of the zone that is private to that user.
Referring to Figure 8a, the first of these zones is an apartment zone 1050. In an embodiment of the present invention, this is a user-customisable zone in which such features 1052 as wallpaper, flooring, pictures, furniture, outside scenery and lighting may be selected and positioned. Some of the furniture is functional furniture 1054, linked to PS3 functionality. For example, a television may be placed in the apartment 1050 on which can be viewed one of several streamed video broadcasts, or media stored on the PS3 HDD 400 or optical disk 440. Similarly, a radio or hi-fi may be selected that contains pre-selected links to internet radio streams. In addition, user artwork or photos may be imported into the room in the form of wall hangings and pictures.
Optionally, the user (represented in Figure 8a by their avatar 1056) may purchase a larger apartment, and/or additional goods such as a larger TV, a pool table, or automated non-player avatars. Other possible items include a gym, swimming pool, or disco area. In these latter cases, additional control software or configuration libraries to provide additional character functionality will integrate with the home environment via the API in a similar fashion to that described for the commercial and developer / publisher zones 1030, 1040 described previously.
Such purchases may be made using credit card details registered with the Home environment server. In return for a payment, the server downloads an authorisation key to unlock the relevant item for use within the user's apartment. Alternatively, the 3D model, textures and any software associated with an item may also be downloaded from the Home environment server or an authorised third-party server, optionally again associated with an authorisation key. The key may, for example, require correspondence with a firmware digital serial number of the PS3 10, thereby preventing unauthorised distribution.
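The patent does not specify how such a key corresponds to the firmware serial number; purely for illustration, the sketch below models the key as a simple keyed hash of the serial number and item ID, so that a key copied to a console with a different serial fails the check. All names and the hash are assumptions.

    #include <cstdint>
    #include <cstdio>
    #include <string>

    // Toy 64-bit FNV-1a hash, standing in for whatever scheme is actually used.
    uint64_t toyHash(const std::string& s) {
        uint64_t h = 14695981039346656037ull;
        for (unsigned char c : s) { h ^= c; h *= 1099511628211ull; }
        return h;
    }

    // Issued by the server in return for payment (assumed, for illustration).
    uint64_t issueKey(uint64_t consoleSerial, uint32_t itemId) {
        return toyHash(std::to_string(consoleSerial) + ":" + std::to_string(itemId));
    }

    // Checked locally before the item is unlocked for use in the apartment.
    bool keyUnlocksItem(uint64_t key, uint64_t consoleSerial, uint32_t itemId) {
        return key == issueKey(consoleSerial, itemId);
    }

    int main() {
        uint64_t key = issueKey(123456789ull, 42);  // bought on console 123456789
        std::printf("same console: %d, other console: %d\n",
                    keyUnlocksItem(key, 123456789ull, 42),
                    keyUnlocksItem(key, 987654321ull, 42));
    }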
A user's apartment can only be accessed by others upon invitation from the respective user. This invitation can take the form of a standing invitation for particular friends from within a friends list, or in the form of a single-session pass conferred on another user, and only valid whilst that user remains in the current Home environment session. Such invitations may take the form of an association maintained by a Home environment server 2010, or a digital key supplied between PS3 devices on a peer-to-peer basis that enables confirmation of status as an invitee.
In an embodiment of the present invention invited users can only enter the apartment when the apartment's user is present within the apartment, and are automatically returned to the lobby if the apartment's user leaves. Whilst within the apartment, all communication between the parties present (both user and positional data) is purely peer-to-peer.
The apartment thus also provides a user with the opportunity to share home-created content such as artwork, slideshows, audio or video with invited guests, and also to interact with friends without potential interference from other users within the public zones.
When invited guests enter a user's apartment, the configuration of the room and the furnishings within it are transmitted in a peer-to-peer fashion between the attendees using ID codes for each object and positional data. Where a room or item is not held in common between the user and a guest, the model, textures and any code required to implement it on the guest's PS3 may also be transmitted, together with a single-use key or similar constraint, such as use only whilst in the user's apartment and whilst the user and guest remain online in this session.
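A compact wire format of this kind might look like the following sketch, in which the field names and widths are assumptions; the point is that each object travels as an ID code plus positional data rather than as full geometry.

    #include <cstdint>
    #include <cstdio>
    #include <vector>

    struct PlacedObject {
        uint32_t objectId;    // identifies a model/texture both peers may hold
        int16_t x, y, z;      // position within the apartment
        uint8_t orientation;  // quantised rotation
    };

    // Pack the room description for peer-to-peer transmission (struct padding
    // included, for simplicity of the sketch).
    std::vector<uint8_t> serialise(const std::vector<PlacedObject>& room) {
        std::vector<uint8_t> out;
        for (const auto& o : room) {
            const uint8_t* p = reinterpret_cast<const uint8_t*>(&o);
            out.insert(out.end(), p, p + sizeof(PlacedObject));
        }
        return out;
    }

    int main() {
        std::vector<PlacedObject> room = {
            {1001, 10, 0, 25, 3},  // e.g. a television (functional furniture)
            {2002, -5, 0, 12, 0},  // e.g. a sofa
        };
        auto bytes = serialise(room);
        std::printf("%zu objects in %zu bytes\n", room.size(), bytes.size());
        // A guest lacking one of these object IDs would then request its model,
        // textures and code, together with a single-use key, as described above.
    }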
Referring to Figure 8b, a further private space that may similarly be accessed only by invitation is the user's Trophy Room 1060. The Trophy Room 1060 provides a space within which trophies 1062 earned during game play may be displayed.
For example, a third-party game comprises seeking a magical crystal. If the player succeeds in finding the crystal, the third party game nominates this as a trophy for the Trophy Room 1060, and places a 3D model and texture representative of the crystal in a file area accessed by the Home environment software when loading the Trophy Room 1060. The software implementing the Home environment can then render the crystal as a trophy within the Trophy Room.
When parties are invited to view a user's trophy room, the models and textures required to temporarily view the trophies are sent from the user's PS3 to those of the other parties on a peer-to-peer basis. This may be done as a background activity following the initial invitation, in anticipation of entering the trophy room, or may occur when parties enter a connecting tunnel / anteroom or select the user's trophy room from the map screen.
Optionally, where another party also has that trophy, they will not download the corresponding trophy from the user they are visiting. Therefore, in an embodiment of the present invention, each trophy comprises an identifying code.
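One way the identifying code could support this de-duplication is sketched below; the exchange and all names are illustrative assumptions. The visitor first reports which trophy IDs it already holds, and only the missing trophies' assets are transferred peer-to-peer.

    #include <cstdint>
    #include <cstdio>
    #include <set>
    #include <vector>

    // Return the IDs whose models/textures actually need sending to the guest.
    std::vector<uint32_t> trophiesToSend(const std::vector<uint32_t>& hostTrophies,
                                         const std::set<uint32_t>& guestAlreadyHas) {
        std::vector<uint32_t> toSend;
        for (uint32_t id : hostTrophies)
            if (guestAlreadyHas.count(id) == 0)
                toSend.push_back(id);  // guest lacks this trophy's assets
        return toSend;
    }

    int main() {
        std::vector<uint32_t> host = {101, 102, 103};  // host's displayed trophies
        std::set<uint32_t> guest = {102};              // guest already won 102
        for (uint32_t id : trophiesToSend(host, guest))
            std::printf("send models/textures for trophy %u\n", id);
    }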
Alternatively or in addition, a trophy room may be shared between members of a group or so-called 'clan', such that a trophy won by any member of the clan is transmitted to other members of the clan on a peer-to-peer basis. Therefore all members of the clan will see a common set of trophies.
Alternatively or in addition, a user can have a standing invitation to all members of the Home environment, allowing anyone to visit their trophy room. As with the commercial and developer/publisher zones, a plurality of rooms is therefore possible, for example a private, a group-based and a public trophy room. This may be managed either by selection from a pop-up menu or signposts within the Home environment as described previously, or by identifying a relevant user by walking up to their avatar, and then selecting to enter their (public) trophy room upon using the trophy room exit from the lobby.
Alternatively or in addition, a public trophy room may be provided. This room may display the trophies of the person in the current instance of the Home environment who has the most trophies or a best overall score according to a trophy value scoring scheme.
Alternatively, it may be an aggregate trophy room, showing the best, or a selection of, trophies from some or all of the users in that instance of the Home environment, together with the ID of the user. Thus, for example, a user could spot a trophy from a game they are having difficulty with, identify who in the Home environment won it, and then go and talk to them about how they won it. Alternatively, a public trophy room could contain the best trophies across a plurality of Home environments, identifying the best gamers within a geographical, age-specific or game-specific group, or even worldwide. Alternatively or in addition, a leader board of the best-scoring gamers can be provided and updated live.
It will be appreciated that potentially a large number of additional third party zones may become available, each comprising additional 3D models, textures and control software.
As a result, a significant amount of space on the HDD 400 may become occupied by Home environment zones.
Consequently, in an embodiment of the present invention the number of third party zones currently associated with a user's Home environment can be limited. In a first instance, a maximum memory allocation can be used to prevent additional third party zones being added until an existing one is deleted. Alternatively or in addition, third party zones may be limited according to geographical relevance or user interests (declared on registration or subsequently via an interface with the Home environment server 2010), such that only third party zones relevant to the user by these criteria are downloaded. Under such a system, if a new third party zone becomes available, its relevance to the user is evaluated according to the above criteria, and if it is more relevant than at least one of those currently stored, it replaces the currently least relevant third party zone stored on the user's PS3.
Other criteria for relevance may include interests or installed zones of nominated friends, or the relevance of zones to games or other media that have been played on the user's PS3.
Further zones may be admitted according to whether the user explicitly installs them, either by download or by disk.
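Purely as an illustrative sketch, such a relevance-limited store might be realised along the following lines; the scoring terms, weights and function names are invented for this example and are not part of the disclosed system.

```python
# Sketch of relevance-limited zone storage: a new zone replaces the
# least relevant stored zone only if it scores higher. Illustrative only.

def relevance(zone, user):
    """Toy relevance score combining the criteria discussed above."""
    score = 1.0 if zone["region"] == user["region"] else 0.0
    score += len(set(zone["tags"]) & set(user["interests"]))
    score += 0.5 * len(set(zone["tags"]) & set(user["friend_interests"]))
    return score

def consider_new_zone(stored_zones, new_zone, user, max_zones=8):
    if len(stored_zones) < max_zones:
        stored_zones.append(new_zone)
        return stored_zones
    least = min(stored_zones, key=lambda z: relevance(z, user))
    if relevance(new_zone, user) > relevance(least, user):
        stored_zones.remove(least)          # evict least relevant zone
        stored_zones.append(new_zone)       # install the newcomer
    return stored_zones

user = {"region": "EU", "interests": ["racing", "music"],
        "friend_interests": ["racing"]}
zones = [{"name": "Z1", "region": "US", "tags": ["sports"]},
         {"name": "Z2", "region": "EU", "tags": ["music"]}]
new = {"name": "Z3", "region": "EU", "tags": ["racing", "music"]}
print([z["name"] for z in consider_new_zone(zones, new, user, max_zones=2)])
```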
As noted above, within the Home environment users are represented by avatars. The software implementing the Home environment enables the customisation of a user's avatar from a selection of pre-set options in a similar manner to the customisation of the user's apartment. The user may select gender and skin tone, and customise the facial features and hair by combining available options for each. The user may also select from a wide range of clothing. To support this facility, a wide range of 3D models and textures for avatars are provided. In an embodiment of the present invention, users may import their own textures to display on their clothing. Typically, the parameters defining the appearance of each avatar only occupy around 40 bytes, enabling fast distribution via the home server when joining a populated Home environment.
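A figure of around 40 bytes is consistent with storing the appearance as small indices and weights rather than geometry. The following packing sketch is hypothetical; the field layout shown is not the actual format used.

```python
import struct

# Hypothetical compact avatar descriptor: gender, skin tone and a handful
# of preset indices/weights pack into a few tens of bytes.
def pack_avatar(gender, skin_tone, face_preset, hair_preset,
                clothing_ids, ethnicity_weights):
    # B = unsigned byte; 6 clothing slots; 6 ethnicity weights as 0-255.
    return struct.pack(
        "<4B6B6B",
        gender, skin_tone, face_preset, hair_preset,
        *clothing_ids, *ethnicity_weights)

blob = pack_avatar(1, 3, 12, 7, [1, 2, 3, 4, 5, 6],
                   [128, 127, 0, 0, 0, 0])
print(len(blob))   # 16 bytes in this toy layout, comfortably under 40
```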
Each avatar in the Home environment can be identified by the user's ID or nickname, displayed in a bubble above the avatar. To limit the proliferation of bubbles, these fade into view when the avatar is close enough that the text they contain could easily be read, or alternatively when the avatar is close enough to interact with and/or is close to the centre of the user's viewpoint.
The avatar is controlled by the user in a conventional third-person gaming manner (e.g. using the game controller 751), allowing them to walk around the Home environment.
Some avatar behaviour is contextual; thus for example the option to sit down will only be available when the avatar is close to a seat. Other avatar behaviour is available at all times, such as for example the expression of a selected emotion or gesture, or certain communication options. Avatar actions are determined by use of the game controller 751, either directly for actions such as movement, or by the selection of actions via a pop-up menu, summoned by pressing an appropriate key on the game controller 751.
Options available via such a menu include further modification of the avatar's appearance and clothing, and the selection of emotions, gestures and movements. For example, the user can select that their avatar smiles, waves and jumps up and down when the user sees someone they know in the Home environment.
Users can also communicate with each other via their avatars using text or speech.
To communicate by text, in an embodiment of the present invention, messages appear in pop-up bubbles above the relevant avatar, replacing their name bubble if necessary.
Referring now also to Figure 9, to generate a message the user can activate a pop-up menu 1070 in which a range of preset messages is provided. These may be complete messages, or alternatively or in addition may take the form of nested menus, the navigation of which generates a message by concatenating selected options.
Alternatively or in addition, a virtual keyboard may be displayed, allowing free generation of text by navigation with the game controller 751. If a real keyboard 753 is connected via Bluetooth, then text may be typed into a bubble directly.
In an embodiment of the present invention, the lobby also provides a chat channel hosted by the Home environment server, enabling conventional chat facilities.
To communicate by speech, a user must have a microphone, such as a Bluetooth headset 757, available. Then in an embodiment of the present invention, either by selection of a speech option by pressing a button on the game controller 751, or by use of a voice activity detector within the software implementing the Home environment, the user can speak within the Home environment. When speaking, a speech icon may appear above the head of the avatar, for example to alert other users to adjust volume settings if necessary.
The speech is sampled by the user's PS3, encoded using a Code Excited Linear Prediction (CELP) codec (or other known VoIP-applicable codec), and transmitted in a peer-to-peer fashion to the eight nearest avatars (optionally provided they are within a preset area within the virtual environment surrounding the user's avatar). Where more than eight other avatars are within the preset area, one or more of the PS3s that received the speech may forward it to other PS3s having respective user avatars within the area that did not receive the speech, in an ad-hoc manner. To co-ordinate this function, in an embodiment of the present invention the PS3 will transmit a speech flag to all PS3s whose avatars are within the preset area, enabling them to place a speech icon above the relevant (speaking) avatar's head (enabling their user to identify the speaker more easily) and also to notify the PS3s of a transmission. Each PS3 can determine from the relative positions of the avatars which ones will not receive the speech, and can elect to forward the speech to the PS3 of whichever avatar they are closest to within the virtual environment. Alternatively, the PS3s within the area can ping each other, and whichever PS3 has the lowest lag with a PS3 that has not received the speech can elect to forward it. It will be appreciated that the limitation to eight is exemplary, and the actual number depends upon such factors as the speech compression ratio and the available bandwidth.
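As an illustrative sketch only, the selection of the eight nearest recipients (and the identification of avatars that must then be reached by forwarding) might proceed as follows; the names, radius and positions below are invented for this example.

```python
import math

# Sketch of the peer-to-peer speech fan-out described above: send to the
# eight nearest avatars in range, then let recipients forward to anyone
# in range who was missed. Structures are illustrative only.

def nearest_in_range(speaker_pos, others, max_peers=8, radius=30.0):
    in_range = [(math.dist(speaker_pos, p), peer)
                for peer, p in others.items()
                if math.dist(speaker_pos, p) <= radius]
    in_range.sort(key=lambda t: t[0])
    direct = [peer for _, peer in in_range[:max_peers]]
    missed = [peer for _, peer in in_range[max_peers:]]
    return direct, missed

positions = {f"ps3_{i}": (float(i), 0.0) for i in range(1, 12)}
direct, missed = nearest_in_range((0.0, 0.0), positions)
print(direct)  # eight nearest peers receive the CELP packets directly
print(missed)  # recipients elect a forwarder for these, e.g. by proximity
```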
In an embodiment of the present invention, such speech can also be relayed to other networks, such as a mobile telephony network, upon specification of a mobile phone number.
This may be achieved either by routing the speech via the Home environment server to a gateway server of the mobile network, or by Bluetooth transmission to the user's own mobile phone. In this latter case, the mobile phone may require middleware (e.g. a Java applet) to interface with the PS3 and route the call.
Thus a user can contact a person on their phone from within the Home environment. In a similar manner, the user can also send a text message to a person on their mobile phone.
In a similar manner to speech, in an embodiment of the present invention users whose PS3s are equipped with a video camera such as the Sony EyeToy video camera can use a video chat mode, for example via a pop-up screen, or via a TV or similar device within the Home environment, such as a Sony Playstation Portable (PSP) held by the avatar. In this case video codecs are used in addition to or instead of the audio codecs.
Optionally, the avatars of users with whom the user has spoken recently can be highlighted, and those with whom they have spoken most may be highlighted more prominently, for example by an icon next to their name, or a level of glow around their avatar.
Referring back to Figure 5, when a user selects to activate the Home environment on their PS3 10, the locally stored software generates the graphical representation of the Home environment, and connects to a Home environment server 2010 that assigns the user to one of a plurality of online Home environments 2021, 2022, 2023, 2024. Only four Home environments are shown for clarity.
It will be understood that potentially many tens of thousands of users may be online at any one time. Consequently, to prevent overcrowding, the Home environment server 2010 will support a large plurality of separate online Home environments. Likewise, there may be many separate Home environment servers, for example in different countries. Once assigned to a Home environment, a PS3 initially uploads information regarding the appearance of the avatar, and then in an ongoing fashion provides the Home environment server with positional data for its own avatar, and receives from the Home environment server the positional data of the other avatars within that online Home environment. In practice this positional update is periodic (for example every 2 seconds) to limit bandwidth, so other PS3s must interpolate movement. Such interpolation of character movement is well-known in on-line games. In addition, each update can provide a series of positions, improving the replication of movement (with some lag), or improving the extrapolation of current movement.
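A minimal sketch of such interpolation between two timestamped position samples is given below; it assumes simple linear interpolation, whereas a real client might also smooth heading and animation state.

```python
# Sketch of interpolating avatar position between periodic (~2 s) server
# updates. Illustrative only.

def interpolate(prev_update, next_update, now):
    """Linear interpolation between two timestamped (x, y) samples."""
    t0, p0 = prev_update
    t1, p1 = next_update
    if t1 <= t0:
        return p1
    a = min(max((now - t0) / (t1 - t0), 0.0), 1.0)  # clamp to [0, 1]
    return tuple(c0 + a * (c1 - c0) for c0, c1 in zip(p0, p1))

# Halfway between updates at t=10 and t=12, the avatar is drawn midway.
print(interpolate((10.0, (0.0, 0.0)), (12.0, (4.0, 2.0)), 11.0))  # (2.0, 1.0)
```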
In addition, the IP addresses of the other PS3s 2031, 2032, 2033 within that Home environment 2024 are shared so that they can transmit other data such as speech in a peer-to-peer fashion between themselves, thereby reducing the required bandwidth of data handled by the Home environment server.
To prevent overcrowding within the Home environments, each will support a maximum of, for example, 64 users.
The selection of a Home environment to which a user will be connected can take account of a number of factors, either supplied by the PS3 and/or known to the Home environment server via a registration process. These include but are not limited to:
i. The geographical location of the PS3;
ii. The user's preferred language;
iii. The user's age;
iv. Whether any users within the current user's 'friends list' are in a particular Home environment already;
v. What game disk is currently within the user's PS3;
vi. What games have recently been played on the user's PS3.
Thus, for example, a Swiss teenager may be connected to a Home environment on a Swiss server, with a maximum user age of 16 and a predominant language of French. In another example, a user with a copy of 'Revolution' mounted in their PS3 may be connected to a Home environment where a predominant number of other users also currently have the same game mounted, thereby facilitating the organisation of multiplayer games. In this latter case, the PS3 10 detects the game loaded within the BD-ROM 430 and informs the Home environment server 2010. The server then chooses a Home environment accordingly.
In a further example, a user is connected to a Home environment in which three users identified on his friends list can be found. In this latter example, the friends list is a list of user names and optionally IP addresses that have been received from other users whom the given user wishes to meet regularly. Where different groups of friends are located on different Home environment servers (e.g. where the current user is the only friend common to both sets) then the user may either be connected to the one with the most friends, or given the option to choose.
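By way of illustration, the server-side selection could be expressed as a scoring function over the criteria listed above; the weights, field names and the capacity value below are invented for this sketch (the soft limit itself is discussed shortly).

```python
# Sketch of server-side assignment: score candidate Home environments
# against the listed criteria and pick the best non-full one.

def score_environment(env, user):
    s = 3.0 * len(env["users"] & user["friends"])         # friends present
    s += 2.0 if user["game_disk"] in env["common_games"] else 0.0
    s += 1.0 if env["language"] == user["language"] else 0.0
    s += 1.0 if env["region"] == user["region"] else 0.0
    return s

def assign(environments, user, soft_limit=58):            # < 64 hard cap
    open_envs = [e for e in environments if len(e["users"]) < soft_limit]
    return max(open_envs, key=lambda e: score_environment(e, user))

envs = [{"users": {"a", "b"}, "common_games": {"Revolution"},
         "language": "fr", "region": "CH"},
        {"users": {"c"}, "common_games": set(),
         "language": "en", "region": "UK"}]
me = {"friends": {"a"}, "game_disk": "Revolution",
      "language": "fr", "region": "CH"}
print(assign(envs, me)["region"])   # -> "CH"
```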
Conversely, a user may invite one or more friends to switch between Home environments and join them. In this case, the user can view their friends list via a pop-up menu or from within the Home environment (for example via a screen on the wall or an information booth) and determine who is on-line. The user may then broadcast an invite to their friends, either using a peer-to-peer connection or, if the friend is within a Home environment or the IP address is unknown, via the Home environment server. The friend can then accept or decline the invitation to join.
To facilitate invitation, generally a Home environment server will assign fewer than the maximum supported number of users to a specific Home environment, thereby allowing such additional user-initiated assignments to occur. This so-called 'soft limit' may, for example, be 90% of capacity, and may be adaptive, for example changing in the early evening or at weekends when people are more likely to meet up with friends on-line.
Where several friends are within the same Home environment, in an embodiment of the present invention the map screen may also highlight those zones in which the friends can currently be found, either by displaying their name on the map or in association with the zone name on the side bar.
Referring now also to Figure 10, in addition, preferences, settings, functions of the Home environment and optionally other functionality may be viewed, adjusted or accessed as appropriate by use of a virtual Sony Playstation Portable (PSP) entertainment device 1072 that can be summoned by use of the game controller 751 to pop up on screen. The user can then access these options, settings and functionality via a PSP cross-media bar 1074 displayed on the virtual PSP. As noted above, the PSP could also be used as an interface for video chat.
When a user wishes to leave the Home environment, in embodiments of the present invention they may do so by selection of an appropriate key on the game controller 751, by selection of an exit option from a pop-up menu, by selection of an exit from within the map screen, by selection of an option via their virtual PSP or by walking through a master exit within the lobby zone.
Typically, exiting the Home environment will cause the PS3 10 to return to the PS3 cross media bar.
Finally, it will be appreciated that additional, separate environments based upon the Home environment software and separately accessible from the PS3 cross-media bar are envisaged. For example, a supermarket may provide a free disk upon which a Supermarket environment, supported in similar fashion by the Home environment servers, is provided.
Upon selection, the user's avatar can browse displayed goods within a virtual rendition of the supermarket (either as 3D models or textures applied to shelves) and click on them to purchase as described above. In this way retailers can provide and update online shopping facilities for their own user base.
In an embodiment of the present invention, the avatar model comprises two aspects: a mesh, or skin, defining the three dimensional surface upon which textures are placed, and a hierarchy of so-called 'bones' used to modify the vertices of the mesh. It should be understood that these bones are typically one-dimensional lines comprising a position, size and orientation, and are associated with vertices or regions of a mesh and/or optionally with other bones. As such, they do not correspond to human bones in the conventional sense.
The mesh typically has a default design, e.g. for male and female avatars. In conventional animation, this mesh can be deformed by so-called 'blend shapes' (or 'morph targets'). Blend shapes are commonly used for facial animation of game characters or avatars using known techniques. For example, during the designing of a game, an artist will typically explicitly design a default mesh describing the head of a game character or avatar together with blend shapes that depict facial expressions of that game character or avatar, such as left eyebrow raised, right eyebrow raised, mouth saying "oo", mouth saying "ee", and the like.
During animation, the animator assigns a blend weight to each blend shape so as to specify the degree to which that blend shape will influence the distortion of the template mesh during the animation. The rendered position of each vertex consequently corresponds to the sum of the blend shape offset positions multiplied by their respective weights plus the vertex position of the template mesh. Therefore, for example, an animator might choose to create a smiling avatar with a raised eyebrow by assigning an appropriate weight to those blend shapes describing those facial expressions.
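Expressed as code, the rule just stated is a weighted sum of per-vertex offsets added to the template mesh; this is a generic sketch of the standard technique rather than the specific implementation disclosed here, and all names are invented.

```python
# Generic blend shape evaluation: rendered vertex = template vertex +
# sum over blend shapes of (blend weight x per-vertex offset).

def blend_vertices(template, blend_shapes, weights):
    """template: list of (x, y, z); blend_shapes: lists of offsets."""
    out = []
    for i, (x, y, z) in enumerate(template):
        dx = sum(w * s[i][0] for s, w in zip(blend_shapes, weights))
        dy = sum(w * s[i][1] for s, w in zip(blend_shapes, weights))
        dz = sum(w * s[i][2] for s, w in zip(blend_shapes, weights))
        out.append((x + dx, y + dy, z + dz))
    return out

template = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0)]
smile    = [(0.0, 0.1, 0.0), (0.0, 0.2, 0.0)]   # per-vertex offsets
brow_up  = [(0.0, 0.0, 0.1), (0.0, 0.0, 0.0)]
print(blend_vertices(template, [smile, brow_up], [0.5, 1.0]))
```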
Alternatively or in addition to blend shapes, animators traditionally also use skeletal animation. In skeletal animation, each bone in the skeleton is associated with one or more vertices of the mesh. Conversely, these vertices can also be associated with one or more bones, each association being determined by a vertex weight. Consequently the displacement of the mesh is determined by the weighted influence of the neighbouring bones.
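This weighted bone influence is conventionally realised as linear blend skinning; the following generic sketch shows the idea in two dimensions for brevity, with invented bone structures that are not taken from the disclosure.

```python
import math

# Generic linear blend skinning sketch in 2D: each vertex is transformed
# by every bone influencing it, and the results are mixed by the vertex
# weights.

def transform(bone, point):
    """Apply a bone's rotation (about the origin) then its translation."""
    c, s = math.cos(bone["angle"]), math.sin(bone["angle"])
    x, y = point
    return (c * x - s * y + bone["tx"], s * x + c * y + bone["ty"])

def skin_vertex(rest_pos, influences):
    """influences: list of (bone, weight) pairs whose weights sum to 1."""
    x = sum(w * transform(b, rest_pos)[0] for b, w in influences)
    y = sum(w * transform(b, rest_pos)[1] for b, w in influences)
    return (x, y)

# A vertex near the knee is influenced equally by thigh and lower leg.
thigh = {"angle": 0.0, "tx": 0.0, "ty": 0.0}
shin  = {"angle": 0.3, "tx": 0.0, "ty": -1.0}
print(skin_vertex((0.0, -1.0), [(thigh, 0.5), (shin, 0.5)]))
```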
Where one bone is linked to another bone, the relationship is described as a 'parent/child' relationship. In this relationship, the positioning and orientation of the mesh node or region associated with the child bone is a product of the positioning, scaling and orientation of both the child and parent bone.
In conventional so-called 'skeletal' animation, this hierarchy simplifies the positioning of a character frame-by-frame since, for example, moving a thigh bone will also move a lower leg bone, resulting in a change in the position of the associated mesh of the whole leg.
In the present embodiment, a skeletal model is used to enable the user to change bone parameters so as to modify, warp and otherwise deform the default mesh of the avatar, independent of whether skeletal animation is subsequently used to move the avatar about.
Likewise, blend shapes can be used to provide global modifications to the mesh of the avatar that can then be adjusted by the skeletal model in a similar fashion, again independently of whether blend shapes are used in subsequent animation.
Referring now to Figure 11, to facilitate a fast customisation of the user's avatar, sets of blend shapes (or morph targets) are pre-determined for different ethnicities such as Caucasian, Native American, African, Middle Eastern, Oriental, and Native Australian. As noted above, a blend shape is a deformed version of a mesh and in the present embodiment is defined with respect to the vertex points that describe the default male or female mesh. For example, the vertices of the blend shape could be defined as the positional offset from the vertices that describe the default, un-deformed mesh. To generate ethnic characteristics, the blend shape associated with the brow of a Native Australian, for example, will be different to that of a Caucasian.
According to the present embodiment, a user interface enables the user to select the percentage of each ethnicity they wish to include in their avatar, thereby providing a weighted average of the different pre-determined blend shapes for each ethnicity. Accordingly, a user can interact with the user interface to adjust the blend weight of each blend shape associated with each ethnic type so as to quickly and straightforwardly modify the avatar's basic appearance. For example, an avatar could be 50% Caucasian and 50% African, or alternatively 40% Native American, 30% Oriental and 30% Native Australian. Optionally predetermined skin tones may be mixed in the same proportions, but preferably these are also independently adjustable.
Therefore, by modifying the blend weight associated with each blend shape that describes each ethnic type, a wide range of familiar face structures can be quickly imposed upon the default mesh of the avatar, providing an initial source of distinctiveness for the user.
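As a minimal sketch, the user-facing percentages can be normalised into the blend weights that drive the per-ethnicity blend shapes (these weights would then feed a weighted-sum deformation such as the one sketched earlier); the function name here is invented.

```python
# Sketch: user-facing ethnicity percentages become blend weights for the
# corresponding pre-determined blend shapes. Illustrative only.

def ethnicity_weights(percentages):
    """percentages: dict like {'Caucasian': 50, 'African': 50}."""
    total = sum(percentages.values())
    if total == 0:
        return {k: 0.0 for k in percentages}
    return {k: v / total for k, v in percentages.items()}

print(ethnicity_weights({"Caucasian": 50, "African": 50}))
print(ethnicity_weights({"Native American": 40, "Oriental": 30,
                         "Native Australian": 30}))
```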
It will be understood that different blend shape sets may be used for male and female avatars as appropriate, and that the available ethnicities are not limited to the above examples, and indeed can extend to fictional races and species.
Weighted blend shapes may be used to modify the appearance of the avatar as to body type or amount of body fat. For example, blend shapes for different neck thicknesses could be used or blend shapes relating to fatter or thinner body types could be used to modify the basic body mesh describing an avatar.
In addition to adjusting ethnicity and body type using weighted blend shapes (where the skeleton is unchanged by altering the blend weights), various bones and groups of bones can be adjusted directly via respective user interfaces so as to modify or further modify the appearance of the avatar. Thus, for example, bones used to control the jaw-line can be adjusted by selecting from a menu of adjustable features the option 'lower jaw-line' and then adjusting the bone parameters using the controller 751. As the controller has at least two sets of directional controls, typically two sets of parameters can be adjusted together. Referring now to Figure 12, these adjustments may comprise, for example, rear jaw width (left-hand horizontal control) and the angle between the chin and the rear of the jaw (left-hand vertical control), and chin prominence (right-hand horizontal control) and degree of double-chin (right-hand vertical control). These adjustments change the position, size and orientation of bones associated with the jaw accordingly, and can also affect any bones coupled to these jaw bones in the skeletal hierarchy; thus for example the cheek, upper jaw, nose, ears and neck may each be affected by changes to the jaw line.
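Purely for illustration, the mapping from the two analogue sticks to the four jaw parameters of the Figure 12 example might look like the following; the parameter names and gain are invented, and in practice each high-level parameter would drive several underlying bones, as the next paragraph explains.

```python
# Sketch of mapping the controller's two analogue sticks to four
# high-level jaw parameters. Names are invented for this example.

def apply_jaw_controls(jaw, left_x, left_y, right_x, right_y, gain=0.05):
    """Stick axes are in [-1, 1]; each nudges one bone-group parameter."""
    jaw["rear_width"]      += gain * left_x    # left stick, horizontal
    jaw["chin_angle"]      += gain * left_y    # left stick, vertical
    jaw["chin_prominence"] += gain * right_x   # right stick, horizontal
    jaw["double_chin"]     += gain * right_y   # right stick, vertical
    return jaw

jaw = {"rear_width": 1.0, "chin_angle": 0.4,
       "chin_prominence": 0.2, "double_chin": 0.0}
print(apply_jaw_controls(jaw, 0.5, 0.0, -1.0, 0.25))
```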
Notably, the position, scale and orientation of each individual bone is not necessarily accessible via the user interface; the parameters input by the user are translated into parameters of the individual bones by the entertainment device. This makes the adjustment of the facial features simple for the user.
Other bone or bone groups can be adjusted in a similar manner to modify facial features, including cheek bones, brow ridge, eye socket size, lateral eye position, vertical eye position, nose position, cranial shape, upper face shape and lower face shape. As noted previously, bones do not correspond to literal bones and so may also be used for such features as lip size and shape, nose profile, vertical ear position and ear shape, as well as for double chins as in the preceding example.
In an embodiment of the present invention, an additional naturalistic effect is achieved by allowing modifications to one or more bones or bone groups to be asymmetric. For example, ears are rarely exactly matched, both in terms of size and vertical position on the head.
Thus, for example, bones used to control the ears can be adjusted by selecting from a menu of adjustable features the option 'ear disparity' or the like and then adjusting the bone parameters using the controller 751. Referring now to Figure 13, for example the relative imbalance in vertical position of the ears can be controlled by the left-hand horizontal control of the controller such that a move to the right raises the right ear and lowers the left, whilst a move to the left raises the left ear and lowers the right. Meanwhile, the relative imbalance in ear size can be controlled for example by the left-hand vertical control of the controller such that a move upward increases the size of the right ear and decreases the size of the left, and a move downward increases the size of the left ear and decreases the size of the right.
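As an illustrative sketch, such an anti-symmetric control trades a quantity between the left and right ears in opposite directions; the structures and gain below are invented.

```python
# Sketch of the 'ear disparity' control: one axis trades vertical
# position between the ears, the other trades size, anti-symmetrically.

def apply_ear_disparity(ears, stick_x, stick_y, gain=0.05):
    ears["right"]["height"] += gain * stick_x   # right up, left down...
    ears["left"]["height"]  -= gain * stick_x   # ...and vice versa
    ears["right"]["size"]   += gain * stick_y
    ears["left"]["size"]    -= gain * stick_y
    return ears

ears = {"left":  {"height": 0.0, "size": 1.0},
        "right": {"height": 0.0, "size": 1.0}}
print(apply_ear_disparity(ears, 1.0, -0.5))
```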
Other bones or bone groups can be adjusted to create asymmetry in a similar manner, such as eye socket size, lateral and vertical eye position, brow tilt, vertical and lateral cheek bone position, nose position and profile, lateral jaw offset and jaw tilt. Several of these features can also be further grouped to provide quick asymmetry adjustments, typically based upon size and angle disparity with respect to the centre line of the face. Referring to Figures 14A and 14B, these may include, for example, upper- and/or lower-face symmetry, head (cranial) symmetry and overall face symmetry. In each case, the user input is very simple (e.g. vertical axis adjustments affect size aspects and horizontal axis adjustments affect balance aspects of the asymmetry), and is coupled to parameters of a relevant subset of bones in the skeletal model by a series of weights or transforms, so that for example eye, ear and nose changes in a single 'face symmetry' adjustment are not necessarily to the same degree, but are proportionate with respect to each other so as to create a plausible human face following the subsequent deformation of the face mesh.
Again, the specific position, scale and orientation of each individual bone is not necessarily accessible via the user interface; rather the user controls the degree of asymmetry with respect to certain parameters of certain bones. Again, this makes the adjustment of the facial features simple for the user when in practice there may be nearly 100 bones within the avatar's face.
It will be appreciated by a person skilled in the art that the input convention outlined above is one option, but that other suitable input conventions and methods are possible; for example, using the EyeToy video camera 756 to adjust one or more bone or bone groups by gesturing with respect to an on-screen depiction of the avatar.
In addition to adjustment of the bones in the avatar (and in particular in the head and face of the avatar), the user interface enables the addition of further meshes that may interact with bones of the avatar or be associated with further bones. These meshes typically provide accessories for the head and face, such as spectacles, hair, hats, and headphones, and for exotic races or amusing modifications, e.g. features such as horns, crests and trunks.
Thus, for example, a spectacles mesh may be associated with ear, cheek and nose bones, so that the position of the spectacles automatically reacts to the structure of the face.
Similarly, hair, hats and headphones may be associated with ear and cranial bones so that they stretch to fit the size of head and line up with the ear position. Other facial features such as beard and moustaches would be associated with the nose and jaw bones in a manner similar to the facial mesh over which they are placed or which they replace.
By using a skeletal model within the avatar (or at least within the avatar's head and face) in this fashion, and enabling bones or bone groups to be parametrically adjusted, and furthermore enabling the asymmetric modification of such bones or bone groups, a highly distinctive and naturalistic avatar can be generated. Moreover, by grouping adjustments in decreasing scales of detail, from overall ethnicity (which may be determined via weighted blend shapes), to large scale adjustments (e.g. upper/lower face), to a selection of intuitive adjustments to certain parameters of certain bone groups and finally bones, distinctive and individual faces can be quickly made for the avatar, whilst keeping its facial features harmonious by virtue of the linkages between bones in the skeletal model and the vertex weighting of the bones to the mesh.
By way of example, Figure 15A shows a default female avatar adjusted by weighted blend shapes to correspond to 50% Caucasian, 50% Native American. Figures 15B and 15C then show this female avatar after various bone parameters have been adjusted in the manner described herein, so as to produce two highly distinctive faces by subsequent manipulation of the skeletal model.
The avatar may be further customised by the application of texture layers and texture effects. Different texture layers adding wrinkles, freckles, moles or blemishes and scars may be added with varying degrees of transparency via the user interface. Likewise, skin texture may be introduced and varied by controlling the degree of bump mapping (or other lighting effect) used in relation to one or more of these textures, or other dedicated bump-mapping textures. Thus scars, pock-marking and stubble may be added to varying degrees.
Optionally one or more texture layers may be alpha-blended, bump-mapped or otherwise included on the avatar according to an 'age' slider, thereby providing a quick way to vary the age of the character. Optionally certain bone parameters may also be coupled to this slider, for example to cause sunken cheekbones, deeper eye sockets, a thinner neck and larger ears as age progresses. Optionally different age sliders or an age profile switch could be provided to give different age effects; for example another age profile could result in the character becoming fatter-faced, red-cheeked and jowly with age, rather than gaunt.
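By way of illustration, a single age value could simultaneously drive texture blending and the coupled bone parameters; the coupling curves and names below are invented for this sketch.

```python
# Sketch of an 'age' slider driving both texture blending and coupled
# bone parameters. The coupling curves here are invented for illustration.

def apply_age(avatar, age):            # age in [0.0, 1.0]
    avatar["wrinkle_alpha"] = age                  # alpha-blend wrinkles
    avatar["stubble_bump"]  = 0.5 * age            # bump-mapping depth
    avatar["cheek_depth"]   = -0.3 * age           # sunken cheekbones
    avatar["eye_socket"]    = -0.2 * age           # deeper eye sockets
    avatar["ear_scale"]     = 1.0 + 0.15 * age     # larger ears with age
    return avatar

print(apply_age({}, 0.8))
```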
Typically, the skeletal model and the mesh deformations responsive to the skeletal model and to the weighted blend shapes are computed by one or more SPEs 110A-H. The resulting modified mesh is used in combination with one or more textures by the RSX unit to render the avatar. The modified mesh is further manipulated to move the avatar, animate the face, perform lip-syncing and display emotions in substantially the same manner as would be done with a conventional or default avatar mesh.
Referring now back again to Figure 5, when the user logs into the home environment, the PS3 entertainment device transmits configuration details of the user's avatar including the modified bone parameters and blend shape weights to the Home environment server 2010, or optionally in a peer-to-peer fashion to the other PS3s 2031, 2032, 2033 in the same instance of the Home environment.
Likewise, it also receives from either the Home environment server or the peer PS3s the respective configuration details of the other avatars within that instance of the Home environment, also including their respective modified bone parameters and blend shape weights.
The PS3 entertainment device 10 then computes the mesh deformation for each avatar responsive to the relevant blend shape weights and modified bone parameters, before rendering it according to the other configuration details received for that avatar.
In an alternative embodiment, the data describing the deformed mesh is transmitted rather than the data describing the blend shape weights and bone parameters, thereby avoiding the need for the recipient PS3 to compute the effect of the blend shape weights and bone parameters on the mesh for each of the other avatars. In this case, the mesh may be transmitted in a conventional manner or as a set of deviations from a default mesh (or a default male or female mesh as applicable).
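A minimal sketch of such delta encoding against a default mesh follows; exact floating-point round-tripping is assumed here for simplicity, whereas a real system might quantise the offsets to save bandwidth.

```python
# Sketch of transmitting a deformed mesh as deviations from a default
# mesh: only per-vertex offsets travel, and the recipient reconstructs.

def encode_deltas(default_mesh, deformed_mesh):
    return [tuple(d - b for d, b in zip(dv, bv))
            for dv, bv in zip(deformed_mesh, default_mesh)]

def decode_deltas(default_mesh, deltas):
    return [tuple(b + d for b, d in zip(bv, dv))
            for bv, dv in zip(default_mesh, deltas)]

default  = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0)]
deformed = [(0.0, 0.1, 0.0), (1.1, 0.0, 0.0)]
deltas = encode_deltas(default, deformed)
assert decode_deltas(default, deltas) == deformed
```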
In an embodiment of the present invention the Home environment server is therefore operable to receive data descriptive of respective blend shape weights and modified avatar skeletons from each remote entertainment device in an instance of the Home environment, and to transmit this data to the respective remaining PS3s. Alternatively it is operable to transmit mesh data for each avatar, either in a conventional mesh format or as deviations from a default mesh.
Thus an on-line system comprising a Home environment server and two or more PS3s allows the users of each PS3 to customise their own avatar by virtue of a skeletal modifier and blend shape weight adjuster coupled to a user interface, and to then distribute these modified avatars within the Home environment via the Home environment server before each PS3 renders the Home environment populated with said modified avatars.
It will be apparent to a person skilled in the art that embodiments of the present invention are not limited to the Home environment, but are also applicable to any multiplayer on-line environment where a plurality of users may encounter each other through avatars, such as for example in an on-line game.
It will be appreciated by a person skilled in the art that the ethnic changes to the avatar's face generated by weighted blend shapes may also be achieved by other deformers such as a lattice deformer or sculpt deformer, or by suitably placed bones with different parameter values, or by a combination of the above.
Referring now to Figure 16, in an embodiment of the present invention a method of user identification in an on-line virtual environment comprises, in a first step s10, selecting a user avatar for use in the on-line virtual environment (for example selecting an initial gender or character class). Then in a second step s20, one or more physical properties of one or more skeletal components of the user avatar are modified via a user interface. In a third step s30, vertices in a three dimensional mesh representing some or all of the user avatar are adjusted in response to the position of one or more skeletal components of the user avatar, such that the placement of such vertices is responsive to one or more bones of the modified skeletal model.
Then in a fourth step s40, the user avatar is rendered responsive to the modified user avatar skeleton.
It will be apparent to a person skilled in the art that variations in the above method corresponding to operation of the various embodiments of the apparatus disclosed herein are considered within the scope of the present invention, including but not limited to:
- transmitting data descriptive of the modified user avatar skeleton to one or more remote entertainment devices;
- receiving data descriptive of respective modified avatar skeletons corresponding to one or more respective remote entertainment devices;
- rendering a plurality of respective avatars corresponding to one or more respective remote entertainment devices, the rendering of each avatar being responsive to its respective modified avatar skeleton;
- modifying one or more physical properties of one or more skeletal components of the user avatar asymmetrically;
- modifying vertex positions of the three dimensional mesh in accordance with one or more blend shapes; and
- selecting additional skeletal components for incorporation into the user avatar.
It will be appreciated by a person skilled in the art that in embodiments of the present invention, elements of the method of user identification in an on-line virtual environment and corresponding skeletal modelling, skeletal modification, blend shape weighting, user input and rendering means of the apparatus may be implemented in any suitable manner.
Thus the adaptation of existing parts of a conventional equivalent entertainment device may be implemented in the form of a computer program product comprising processor implementable instructions stored on a data carrier such as a floppy disk, optical disk, hard disk, PROM, RAM, flash memory or any combination of these or other storage media, or transmitted via data signals on a network such as an Ethernet, a wireless network, the Internet, or any combination of these or other networks, or realised in hardware as an ASIC (application specific integrated circuit) or an FPGA (field programmable gate array) or other configurable circuit suitable to use in adapting the conventional equivalent device.

Claims (1)

    1. An entertainment device, comprising: skeletal modelling means to configure a three dimensional mesh representing some or all of a user avatar in response to at least a first property of one or more skeletal components of the user avatar; skeleton modification means to modify one or more properties of one or more skeletal components of the user avatar via a user interface; and rendering means to render the user avatar in accordance with the three dimensional mesh as configured in response to the modified user avatar skeleton.
    2. An entertainment device according to claim 1, comprising: transmission means to transmit data descriptive of the modified user avatar skeleton to one or more remote entertainment devices; reception means to receive data descriptive of respective modified avatar skeletons corresponding to one or more respective remote entertainment devices; and in which: the rendering means is operable to render a plurality of respective avatars corresponding to one or more respective remote entertainment devices, the rendering of each avatar being responsive to its respective modified avatar skeleton.
    3. An entertainment device according to claim 2, in which transmission to remote entertainment devices and reception from remote entertainment devices is via an on-line server.
    4. An entertainment device according to any one of claims 1 to 3, in which the user interface of the skeletal modification means enables asymmetric modifications of one or more skeletal components of the user avatar.
    5. An entertainment device according to claim 4, in which the skeletal components that may be asymmetrically modified relate to facial features comprising one or more of the following: i. lateral eye position; ii. vertical eye position; iii. vertical ear position; iv. nose position; v. nose profile; vi. upper cranial shape; vii. upper face shape; viii. lower face shape; and ix. jaw line.
    6. An entertainment device according to any one of the preceding claims, in which the user interface of the skeletal modification means enables the selection of additional skeletal components for incorporation into the user avatar.
    7. An entertainment device according to claim 6, in which the additional skeletal components comprise one or more of the following: i. glasses; ii. hair; iii. hats; iv. headphones; v. horns; vi. crests; and vii. trunks.
    8. An entertainment device according to any one of the preceding claims, comprising: texture component adjustment means to adjust one or more parameters of one or more texture layers applied to the three dimensional mesh representing some or all of a user avatar.
    9. An entertainment device according to claim 8, in which an adjustable parameter is texture transparency.
    10. An entertainment device according to claim 8 or claim 9, in which an adjustable parameter is the degree of bump-mapping applied to a texture.
    11. An entertainment device according to any one of the preceding claims, comprising: mesh deformation means operable to alter the positions of vertices of the three dimensional mesh via the user interface, in which: the mesh deformation means is operable to alter the positions of the vertices of the three dimensional mesh in dependence upon the respective vertex positions of at least one of a plurality of predetermined three dimensional meshes each defined with respect to the three dimensional mesh representing some or all of the user avatar; the extent to which the three dimensional mesh is altered by a predetermined three dimensional mesh is dependent upon a blend weight associated with the predetermined three dimensional mesh and adjustable via the user interface; and the rendering means is operable to render the user avatar in accordance with the modified three dimensional mesh as modified by the mesh deformation means.
    12. An entertainment device according to claim 11, in which: each vertex of each of the plurality of predetermined three dimensional meshes is defined as a positional offset with respect to the position of the corresponding vertex of the un-deformed three dimensional mesh; and the mesh deformation means is operable to modify the positions of the vertices of the three dimensional mesh in dependence upon a sum of the positional offsets for each vertex of the plurality of predetermined three dimensional meshes multiplied by their respective blend weights plus the vertex positions of the un-modified three dimensional mesh.
    13. An entertainment device according to claim 11 or claim 12, in which each of the plurality of predetermined three dimensional meshes is associated with an ethnic type.
    14. A server operable to administer a multi-player online virtual environment, the server comprising: reception means to receive data descriptive of respective modified avatar skeletons from a plurality of remote entertainment devices; and transmission means to transmit data descriptive of respective modified avatar skeletons to a plurality of remote entertainment devices.
    15. A server according to claim 14, operable to maintain a plurality of substantially similar on-line virtual environments each comprising a distinct plurality of avatars, wherein data descriptive of each avatar's skeleton is distributed between members of its respective on-line virtual environment.
    16. An on-line system comprising: first and second entertainment devices; an on-line server operable to administer a multi-player online virtual environment, the first entertainment device comprising: skeletal modelling means to configure a three dimensional mesh representing some or all of a user avatar in response to at least a first property of one or more skeletal components of the user avatar; skeleton modification means to modify one or more properties of one or more skeletal components of the user avatar via a user interface; rendering means to render the user avatar in accordance with the three dimensional mesh as configured in response to the modified user avatar skeleton; and transmission means to transmit data descriptive of the modified user avatar skeleton to the second entertainment device, and the second entertainment device comprising: reception means to receive data descriptive of the modified user avatar skeleton from the first entertainment device; and rendering means operable to render the modified avatar of the first entertainment device in accordance with the three dimensional mesh as configured in response to the modified avatar skeleton, the on-line server comprising: reception means to receive data descriptive of the modified user avatar skeleton from the first entertainment device; and transmission means to transmit data descriptive of the modified user avatar skeleton to the second entertainment device.
    17. A method of avatar customisation for an on-line virtual environment comprising the steps of: selecting a user avatar for use in the on-line virtual environment; modifying one or more properties of one or more skeletal components of the user avatar via a user interface; configuring a three dimensional mesh representing some or all of the user avatar in response to at least a first property of one or more skeletal components of the user avatar; and rendering the user avatar in accordance with the three dimensional mesh as configured in response to the modified user avatar skeleton.
    18. A method of avatar customisation according to claim 17, comprising the steps of:
    transmitting data descriptive of the modified user avatar skeleton to one or more remote entertainment devices; receiving data descriptive of respective modified avatar skeletons corresponding to one or more respective remote entertainment devices; and rendering a plurality of respective avatars corresponding to one or more respective remote entertainment devices, the rendering of each avatar being responsive to its respective modified avatar skeleton.
    19. A method of avatar customisation according to claim 17 or claim 18, in which the step of modifying one or more physical properties of one or more skeletal components of the user avatar enables asymmetric modifications of one or more skeletal components of the user avatar.
    20. A method of avatar customisation according to any one of claims 17 to 19, in which the step of modifying one or more physical properties of one or more skeletal components of the user avatar enables the selection of additional skeletal components for incorporation into the user avatar.
    21. A method of avatar customisation according to any one of claims 17 to 20, comprising the step of adjusting one or more parameters of one or more texture layers applied to a three dimensional mesh representing some or all of a user avatar.
    22. A method of avatar customisation according to claim 21, in which an adjustable parameter is texture transparency.
    23. A method of avatar customisation according to claim 21 or claim 22, in which an adjustable parameter is the degree of bump-mapping applied to a texture.
    24. A method of avatar customisation according to any one of claims 17 to 23, comprising the steps of: deforming the positions of vertices of the three dimensional mesh via the user interface in dependence upon the respective vertex positions of at least one of a plurality of predetermined three dimensional meshes each defined with respect to the three dimensional mesh representing some or all of the user avatar, wherein the extent to which the three dimensional mesh is altered by a predetermined three dimensional mesh is dependent upon a blend weight associated with the predetermined three dimensional mesh and adjustable via the user interface; and rendering the user avatar in accordance with the deformed three dimensional mesh as deformed by the step of deforming the positions of the three dimensional mesh.
    25. A method of avatar customisation according to claim 24, in which: each vertex of each of the plurality of predetermined three dimensional meshes is defined as the positional offset with respect to the position of the corresponding vertex of the un-deformed three dimensional mesh; and the step of deforming the positions of the vertices of the three dimensional mesh deforms the positions of the vertices of the three dimensional mesh in dependence upon a sum of the positional offsets for each vertex of the plurality of predetermined three dimensional meshes multiplied by their respective blend weights plus the vertex positions of the un-deformed three dimensional mesh.
    26. A method of administering a multi-player online virtual environment, comprising the steps of: receiving data descriptive of respective modified avatar skeletons from a plurality of remote entertainment devices; and transmitting data descriptive of respective modified avatar skeletons to a plurality of remote entertainment devices.
    27. A method of administering a multi-player online virtual environment according to claim 26, comprising the step of maintaining a plurality of substantially similar on-line virtual environments each comprising a distinct plurality of avatars, wherein data descriptive of each avatar's skeleton is distributed between members of its respective on-line virtual environment.
    28. A data carrier comprising computer readable instructions that, when executed by a computer, cause the computer to operate as an entertainment device according to any one of claims 1 to 13.
    29. A data carrier comprising computer readable instructions that, when executed by a computer, cause the computer to operate as a server according to any one of claims 14 and 15.
    30. A data carrier comprising computer readable instructions that, when executed by a computer, cause the computer to operate as a component of an on-line system according to claim 16.
    31. A data carrier comprising computer readable instructions that, when executed by a computer, cause the computer to carry out the method of any one of claims 17 to 23.
    32. A data carrier comprising computer readable instructions that, when executed by a computer, cause the computer to carry out the method of any one of claims 26 to 27.
    33. A data signal comprising computer readable instructions that, when executed by a computer, cause the computer to operate as an entertainment device according to any one of claims 1 to 13.
    34. A data signal comprising computer readable instructions that, when executed by a computer, cause the computer to operate as a server according to any one of claims 14 and 15.
    35. A data signal comprising computer readable instructions that, when executed by a computer, cause the computer to operate as a component of an on-line system according to claim 16.
    36. A data signal comprising computer readable instructions that, when executed by a computer, cause the computer to carry out the method of any one of claims 17 to 23.
    37. A data signal comprising computer readable instructions that, when executed by a computer, cause the computer to carry out the method of any one of claims 26 to 27.
GB0713186A 2007-07-06 2007-07-06 Avatar customisation, transmission and reception Withdrawn GB2450757A (en)

Priority Applications (5)

Application Number Priority Date Filing Date Title
GB0713186A GB2450757A (en) 2007-07-06 2007-07-06 Avatar customisation, transmission and reception
JP2010514129A JP2010532890A (en) 2007-07-06 2008-07-04 Avatar customization apparatus and method
PCT/GB2008/002321 WO2009007701A1 (en) 2007-07-06 2008-07-04 Apparatus and method of avatar customisation
US12/667,775 US20100203968A1 (en) 2007-07-06 2008-07-04 Apparatus And Method Of Avatar Customisation
EP08762523A EP2175950A1 (en) 2007-07-06 2008-07-04 Apparatus and method of avatar customisation

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
GB0713186A GB2450757A (en) 2007-07-06 2007-07-06 Avatar customisation, transmission and reception

Publications (2)

Publication Number Publication Date
GB0713186D0 GB0713186D0 (en) 2007-08-15
GB2450757A true GB2450757A (en) 2009-01-07

Family

ID=38440552

Family Applications (1)

Application Number Title Priority Date Filing Date
GB0713186A Withdrawn GB2450757A (en) 2007-07-06 2007-07-06 Avatar customisation, transmission and reception

Country Status (5)

Country Link
US (1) US20100203968A1 (en)
EP (1) EP2175950A1 (en)
JP (1) JP2010532890A (en)
GB (1) GB2450757A (en)
WO (1) WO2009007701A1 (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2177251A3 (en) * 2008-10-17 2011-01-26 Kabushiki Kaisha Square Enix (also trading as Square Enix Co., Ltd.) Three-dimensional design support apparatus and three-dimensional model display system
EP2459289A2 (en) * 2009-07-29 2012-06-06 Microsoft Corporation Auto-generating a visual representation

Families Citing this family (229)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
AU2008245444B9 (en) 2007-04-30 2013-11-14 Acres Technology Gaming device with personality
US20120246585A9 (en) * 2008-07-14 2012-09-27 Microsoft Corporation System for editing an avatar
US8446414B2 (en) * 2008-07-14 2013-05-21 Microsoft Corporation Programming APIS for an extensible avatar system
US8384719B2 (en) * 2008-08-01 2013-02-26 Microsoft Corporation Avatar items and animations
US20100035692A1 (en) * 2008-08-08 2010-02-11 Microsoft Corporation Avatar closet/ game awarded avatar
US8473356B2 (en) 2008-08-26 2013-06-25 International Business Machines Corporation System and method for tagging objects for heterogeneous searches
US8704832B2 (en) 2008-09-20 2014-04-22 Mixamo, Inc. Interactive design, synthesis and delivery of 3D character motion data through the web
US8460107B2 (en) 2008-10-09 2013-06-11 Wms Gaming, Inc. Controlling and presenting virtual wagering game environments
US8749556B2 (en) 2008-10-14 2014-06-10 Mixamo, Inc. Data compression for real-time streaming of deformable 3D models for 3D animation
US8584031B2 (en) 2008-11-19 2013-11-12 Apple Inc. Portable touch screen device, method, and graphical user interface for using emoji characters
US8982122B2 (en) 2008-11-24 2015-03-17 Mixamo, Inc. Real time concurrent design of shape, texture, and motion for 3D character animation
US8659596B2 (en) 2008-11-24 2014-02-25 Mixamo, Inc. Real time generation of animation-ready 3D character models
JP2010134798A (en) * 2008-12-05 2010-06-17 Namco Bandai Games Inc Program, information storage medium, game device, and game system
US9105014B2 (en) 2009-02-03 2015-08-11 International Business Machines Corporation Interactive avatar in messaging environment
JP4843060B2 (en) * 2009-02-05 2011-12-21 株式会社スクウェア・エニックス GAME DEVICE, GAME CHARACTER DISPLAY METHOD, GAME PROGRAM, AND RECORDING MEDIUM
US20100259547A1 (en) 2009-02-12 2010-10-14 Mixamo, Inc. Web platform for interactive design, synthesis and delivery of 3d character motion data
US20100231582A1 (en) * 2009-03-10 2010-09-16 Yogurt Bilgi Teknolojileri A.S. Method and system for distributing animation sequences of 3d objects
US8788943B2 (en) * 2009-05-15 2014-07-22 Ganz Unlocking emoticons using feature codes
US20100306685A1 (en) * 2009-05-29 2010-12-02 Microsoft Corporation User movement feedback via on-screen avatars
US8284157B2 (en) 2010-01-15 2012-10-09 Microsoft Corporation Directed performance in motion capture system
TWI439960B (en) 2010-04-07 2014-06-01 Apple Inc Avatar editing environment
US8928672B2 (en) 2010-04-28 2015-01-06 Mixamo, Inc. Real-time automatic concatenation of 3D animation sequences
US8771064B2 (en) 2010-05-26 2014-07-08 Aristocrat Technologies Australia Pty Limited Gaming system and a method of gaming
US9245177B2 (en) * 2010-06-02 2016-01-26 Microsoft Technology Licensing, Llc Limiting avatar gesture display
US20110304629A1 (en) * 2010-06-09 2011-12-15 Microsoft Corporation Real-time animation of facial expressions
US8843585B1 (en) 2010-07-06 2014-09-23 Midnight Studios, Inc. Methods and apparatus for generating a unique virtual item
JP5134653B2 (en) * 2010-07-08 2013-01-30 株式会社バンダイナムコゲームス Program and user terminal
US8797328B2 (en) 2010-07-23 2014-08-05 Mixamo, Inc. Automatic generation of 3D character animation from 3D meshes
US9345973B1 (en) 2010-08-06 2016-05-24 Bally Gaming, Inc. Controlling wagering game system browser areas
US8911294B2 (en) 2010-08-06 2014-12-16 Wms Gaming, Inc. Browser based heterogenous technology ecosystem
JP5620743B2 (en) * 2010-08-16 2014-11-05 株式会社カプコン Facial image editing program, recording medium recording the facial image editing program, and facial image editing system
US20130031475A1 (en) * 2010-10-18 2013-01-31 Scene 53 Inc. Social network based virtual assembly places
US8832284B1 (en) 2011-06-16 2014-09-09 Google Inc. Virtual socializing
US10049482B2 (en) 2011-07-22 2018-08-14 Adobe Systems Incorporated Systems and methods for animation recommendations
US10748325B2 (en) 2011-11-17 2020-08-18 Adobe Inc. System and method for automatic rigging of three dimensional characters for facial animation
US20140364239A1 (en) * 2011-12-20 2014-12-11 Icelero Inc Method and system for creating a virtual social and gaming experience
US9747495B2 (en) 2012-03-06 2017-08-29 Adobe Systems Incorporated Systems and methods for creating and distributing modifiable animated video messages
US10147146B2 (en) * 2012-03-14 2018-12-04 Disney Enterprises, Inc. Tailoring social elements of virtual environments
US20130296046A1 (en) * 2012-03-30 2013-11-07 Marty Mianji System and method for collaborative shopping through social gaming
WO2013152453A1 (en) 2012-04-09 2013-10-17 Intel Corporation Communication using interactive avatars
US10155168B2 (en) 2012-05-08 2018-12-18 Snap Inc. System and method for adaptable avatars
US20140078144A1 (en) * 2012-09-14 2014-03-20 Squee, Inc. Systems and methods for avatar creation
JP2014199536A (en) * 2013-03-29 2014-10-23 株式会社コナミデジタルエンタテインメント Face model generating device, method for controlling face model generating device, and program
CN103218844B (en) * 2013-04-03 2016-04-20 腾讯科技(深圳)有限公司 The collocation method of virtual image, implementation method, client, server and system
US9524582B2 (en) 2014-01-28 2016-12-20 Siemens Healthcare Gmbh Method and system for constructing personalized avatars using a parameterized deformable mesh
US10438631B2 (en) * 2014-02-05 2019-10-08 Snap Inc. Method for real-time video processing involving retouching of an object in the video
US9536138B2 (en) 2014-06-27 2017-01-03 Microsoft Technology Licensing, Llc Dynamic remapping of components of a virtual skeleton
US10564820B1 (en) 2014-08-08 2020-02-18 Amazon Technologies, Inc. Active content in digital media within a media universe
GB2531549A (en) * 2014-10-21 2016-04-27 Sony Computer Entertainment Europe Ltd System and method of watermarking
US9830728B2 (en) 2014-12-23 2017-11-28 Intel Corporation Augmented facial animation
US10116901B2 (en) 2015-03-18 2018-10-30 Avatar Merger Sub II, LLC Background modification in video conferencing
US9940637B2 (en) 2015-06-05 2018-04-10 Apple Inc. User interface for loyalty accounts and private label accounts
US10213688B2 (en) * 2015-08-26 2019-02-26 Warner Bros. Entertainment, Inc. Social and procedural effects for computer-generated environments
US10445425B2 (en) 2015-09-15 2019-10-15 Apple Inc. Emoji and canned responses
US9868059B2 (en) 2015-10-21 2018-01-16 Activision Publishing, Inc. Interactive videogame using game-related physical objects
CN105426039A (en) * 2015-10-30 2016-03-23 Guangzhou Huaduo Network Technology Co., Ltd. Method and apparatus for pushing approach image
US20170173473A1 (en) * 2015-12-16 2017-06-22 Crytek GmbH Apparatus and method for automatically generating scenery
WO2017101094A1 (en) * 2015-12-18 2017-06-22 Intel Corporation Avatar animation system
US10339365B2 (en) 2016-03-31 2019-07-02 Snap Inc. Automated avatar generation
US10474353B2 (en) 2016-05-31 2019-11-12 Snap Inc. Application control using a gesture based trigger
US11580608B2 (en) 2016-06-12 2023-02-14 Apple Inc. Managing contact information for communication applications
EP3475920A4 (en) 2016-06-23 2020-01-15 Loomai, Inc. Systems and methods for generating computer ready animation models of a human head from captured data images
US10559111B2 (en) 2016-06-23 2020-02-11 LoomAi, Inc. Systems and methods for generating computer ready animation models of a human head from captured data images
US10360708B2 (en) 2016-06-30 2019-07-23 Snap Inc. Avatar based ideogram generation
US10348662B2 (en) 2016-07-19 2019-07-09 Snap Inc. Generating customized electronic messaging graphics
DK179978B1 (en) 2016-09-23 2019-11-27 Apple Inc. Image data for enhanced user interactions
US10609036B1 (en) 2016-10-10 2020-03-31 Snap Inc. Social media post subscribe requests for buffer user accounts
US10198626B2 (en) 2016-10-19 2019-02-05 Snap Inc. Neural networks for facial modeling
US10593116B2 (en) 2016-10-24 2020-03-17 Snap Inc. Augmented reality object manipulation
US10432559B2 (en) 2016-10-24 2019-10-01 Snap Inc. Generating and displaying customized avatars in electronic messages
US10769860B2 (en) * 2016-10-31 2020-09-08 DG Holdings, Inc. Transferrable between styles virtual identity systems and methods
US10930086B2 (en) 2016-11-01 2021-02-23 DG Holdings, Inc. Comparative virtual asset adjustment systems and methods
JP6392832B2 (en) * 2016-12-06 2018-09-19 Colopl, Inc. Information processing method, apparatus, and program for causing computer to execute information processing method
US10238968B2 (en) 2016-12-06 2019-03-26 Colopl, Inc. Information processing method, apparatus, and system for executing the information processing method
US10242503B2 (en) 2017-01-09 2019-03-26 Snap Inc. Surface aware lens
US11616745B2 (en) 2017-01-09 2023-03-28 Snap Inc. Contextual generation and selection of customized media content
US10242477B1 (en) 2017-01-16 2019-03-26 Snap Inc. Coded vision system
US10951562B2 (en) 2017-01-18 2021-03-16 Snap Inc. Customized contextual media content item generation
US10454857B1 (en) 2017-01-23 2019-10-22 Snap Inc. Customized digital avatar accessories
US11069103B1 (en) 2017-04-20 2021-07-20 Snap Inc. Customized user interface for electronic communications
US10212541B1 (en) 2017-04-27 2019-02-19 Snap Inc. Selective location-based identity communication
US11893647B2 (en) 2017-04-27 2024-02-06 Snap Inc. Location-based virtual avatars
KR102455041B1 (en) 2017-04-27 2022-10-14 Snap Inc. Location privacy management on map-based social media platforms
DK180007B1 (en) 2017-05-16 2020-01-16 Apple Inc. Recording and sending emoji
DK179948B1 (en) 2017-05-16 2019-10-22 Apple Inc. Recording and sending Emoji
US10679428B1 (en) 2017-05-26 2020-06-09 Snap Inc. Neural network-based image stream modification
US11122094B2 (en) 2017-07-28 2021-09-14 Snap Inc. Software application manager for messaging applications
US10586368B2 (en) 2017-10-26 2020-03-10 Snap Inc. Joint audio-video facial animation system
US10657695B2 (en) 2017-10-30 2020-05-19 Snap Inc. Animated chat presence
US11069112B2 (en) * 2017-11-17 2021-07-20 Sony Interactive Entertainment LLC Systems, methods, and devices for creating a spline-based video animation sequence
US11460974B1 (en) 2017-11-28 2022-10-04 Snap Inc. Content discovery refresh
KR102387861B1 (en) 2017-11-29 2022-04-18 Snap Inc. Graphic rendering for electronic messaging applications
WO2019108700A1 (en) 2017-11-29 2019-06-06 Snap Inc. Group stories in an electronic messaging application
US10949648B1 (en) 2018-01-23 2021-03-16 Snap Inc. Region-based stabilized face tracking
US10467793B2 (en) * 2018-02-08 2019-11-05 King.Com Ltd. Computer implemented method and device
US10726603B1 (en) 2018-02-28 2020-07-28 Snap Inc. Animated expressive icon
US10979752B1 (en) 2018-02-28 2021-04-13 Snap Inc. Generating media content items based on location information
JP7100842B2 (en) * 2018-03-29 2022-07-14 National Institute of Information and Communications Technology Model analysis device, model analysis method, and model analysis program
US11310176B2 (en) 2018-04-13 2022-04-19 Snap Inc. Content suggestion system
WO2019204464A1 (en) 2018-04-18 2019-10-24 Snap Inc. Augmented expression system
DK201870378A1 (en) 2018-05-07 2020-01-13 Apple Inc. Displaying user interfaces associated with physical activities
US10375313B1 (en) 2018-05-07 2019-08-06 Apple Inc. Creative camera
DK201870374A1 (en) 2018-05-07 2019-12-04 Apple Inc. Avatar creation user interface
US11722764B2 (en) 2018-05-07 2023-08-08 Apple Inc. Creative camera
US10198845B1 (en) 2018-05-29 2019-02-05 LoomAi, Inc. Methods and systems for animating facial expressions
US20210166461A1 (en) * 2018-07-04 2021-06-03 Web Assistants GmbH Avatar animation
US11074675B2 (en) 2018-07-31 2021-07-27 Snap Inc. Eye texture inpainting
US11030813B2 (en) 2018-08-30 2021-06-08 Snap Inc. Video clip object tracking
US10896534B1 (en) 2018-09-19 2021-01-19 Snap Inc. Avatar style transformation using neural networks
US10895964B1 (en) 2018-09-25 2021-01-19 Snap Inc. Interface to display shared user groups
US11189070B2 (en) 2018-09-28 2021-11-30 Snap Inc. System and method of generating targeted user lists using customizable avatar characteristics
US11245658B2 (en) 2018-09-28 2022-02-08 Snap Inc. System and method of generating private notifications between users in a communication session
US10698583B2 (en) 2018-09-28 2020-06-30 Snap Inc. Collaborative achievement interface
US10904181B2 (en) 2018-09-28 2021-01-26 Snap Inc. Generating customized graphics having reactions to electronic message content
CA3111498A1 (en) 2018-10-26 2020-04-30 Soul Machines Limited Digital character blending and generation system and method
US11103795B1 (en) 2018-10-31 2021-08-31 Snap Inc. Game drawer
US10872451B2 (en) 2018-10-31 2020-12-22 Snap Inc. 3D avatar rendering
US11176737B2 (en) 2018-11-27 2021-11-16 Snap Inc. Textured mesh building
US10902661B1 (en) 2018-11-28 2021-01-26 Snap Inc. Dynamic composite user identifier
US10861170B1 (en) 2018-11-30 2020-12-08 Snap Inc. Efficient human pose tracking in videos
US11199957B1 (en) 2018-11-30 2021-12-14 Snap Inc. Generating customized avatars based on location information
US11055514B1 (en) 2018-12-14 2021-07-06 Snap Inc. Image face manipulation
US11516173B1 (en) 2018-12-26 2022-11-29 Snap Inc. Message composition interface
US11032670B1 (en) 2019-01-14 2021-06-08 Snap Inc. Destination sharing in location sharing system
US10939246B1 (en) 2019-01-16 2021-03-02 Snap Inc. Location-based context information sharing in a messaging system
US11107261B2 (en) 2019-01-18 2021-08-31 Apple Inc. Virtual avatar animation based on facial feature movement
US11294936B1 (en) 2019-01-30 2022-04-05 Snap Inc. Adaptive spatial density based clustering
US10984575B2 (en) 2019-02-06 2021-04-20 Snap Inc. Body pose estimation
US10656797B1 (en) 2019-02-06 2020-05-19 Snap Inc. Global event-based avatar
US10936066B1 (en) 2019-02-13 2021-03-02 Snap Inc. Sleep detection in a location sharing system
US10964082B2 (en) 2019-02-26 2021-03-30 Snap Inc. Avatar based on weather
US10852918B1 (en) 2019-03-08 2020-12-01 Snap Inc. Contextual information in chat
US11868414B1 (en) 2019-03-14 2024-01-09 Snap Inc. Graph-based prediction for contact suggestion in a location sharing system
US11852554B1 (en) 2019-03-21 2023-12-26 Snap Inc. Barometer calibration in a location sharing system
US11166123B1 (en) 2019-03-28 2021-11-02 Snap Inc. Grouped transmission of location data in a location sharing system
US10674311B1 (en) 2019-03-28 2020-06-02 Snap Inc. Points of interest in a location sharing system
US10992619B2 (en) 2019-04-30 2021-04-27 Snap Inc. Messaging system with avatar generation
DK201970531A1 (en) 2019-05-06 2021-07-09 Apple Inc Avatar integration with multiple applications
CN110111247B (en) * 2019-05-15 2022-06-24 Zhejiang SenseTime Technology Development Co., Ltd. Face deformation processing method, device and equipment
USD916809S1 (en) 2019-05-28 2021-04-20 Snap Inc. Display screen or portion thereof with a transitional graphical user interface
USD916810S1 (en) 2019-05-28 2021-04-20 Snap Inc. Display screen or portion thereof with a graphical user interface
USD916872S1 (en) 2019-05-28 2021-04-20 Snap Inc. Display screen or portion thereof with a graphical user interface
USD916811S1 (en) 2019-05-28 2021-04-20 Snap Inc. Display screen or portion thereof with a transitional graphical user interface
USD916871S1 (en) 2019-05-28 2021-04-20 Snap Inc. Display screen or portion thereof with a transitional graphical user interface
US10893385B1 (en) 2019-06-07 2021-01-12 Snap Inc. Detection of a physical collision between two client devices in a location sharing system
US11676199B2 (en) 2019-06-28 2023-06-13 Snap Inc. Generating customizable avatar outfits
US11188190B2 (en) 2019-06-28 2021-11-30 Snap Inc. Generating animation overlays in a communication session
US11189098B2 (en) 2019-06-28 2021-11-30 Snap Inc. 3D object camera customization system
US11307747B2 (en) 2019-07-11 2022-04-19 Snap Inc. Edge gesture interface with smart interactions
US11551393B2 (en) 2019-07-23 2023-01-10 LoomAi, Inc. Systems and methods for animation generation
US11455081B2 (en) 2019-08-05 2022-09-27 Snap Inc. Message thread prioritization interface
US10911387B1 (en) 2019-08-12 2021-02-02 Snap Inc. Message reminder interface
US11320969B2 (en) 2019-09-16 2022-05-03 Snap Inc. Messaging system with battery level sharing
US11425062B2 (en) 2019-09-27 2022-08-23 Snap Inc. Recommended content viewed by friends
US11080917B2 (en) 2019-09-30 2021-08-03 Snap Inc. Dynamic parameterized user avatar stories
US11218838B2 (en) 2019-10-31 2022-01-04 Snap Inc. Focused map-based context information surfacing
US20230071947A1 (en) * 2019-11-25 2023-03-09 Sony Group Corporation Information processing system, information processing method, program, and user interface
US11063891B2 (en) 2019-12-03 2021-07-13 Snap Inc. Personalized avatar notification
US11128586B2 (en) 2019-12-09 2021-09-21 Snap Inc. Context sensitive avatar captions
US11036989B1 (en) 2019-12-11 2021-06-15 Snap Inc. Skeletal tracking using previous frames
US11263817B1 (en) 2019-12-19 2022-03-01 Snap Inc. 3D captions with face tracking
US11227442B1 (en) 2019-12-19 2022-01-18 Snap Inc. 3D captions with semantic graphical elements
US11128715B1 (en) 2019-12-30 2021-09-21 Snap Inc. Physical friend proximity in chat
US11140515B1 (en) 2019-12-30 2021-10-05 Snap Inc. Interfaces for relative device positioning
US11169658B2 (en) 2019-12-31 2021-11-09 Snap Inc. Combined map icon with action indicator
US11284144B2 (en) 2020-01-30 2022-03-22 Snap Inc. Video generation system to render frames on demand using a fleet of GPUs
US11356720B2 (en) 2020-01-30 2022-06-07 Snap Inc. Video generation system to render frames on demand
WO2021155249A1 (en) 2020-01-30 2021-08-05 Snap Inc. System for generating media content items on demand
US11036781B1 (en) 2020-01-30 2021-06-15 Snap Inc. Video generation system to render frames on demand using a fleet of servers
US11619501B2 (en) 2020-03-11 2023-04-04 Snap Inc. Avatar based on trip
US11217020B2 (en) 2020-03-16 2022-01-04 Snap Inc. 3D cutout image modification
US11818286B2 (en) 2020-03-30 2023-11-14 Snap Inc. Avatar recommendation and reply
US11625873B2 (en) 2020-03-30 2023-04-11 Snap Inc. Personalized media overlay recommendation
US11956190B2 (en) 2020-05-08 2024-04-09 Snap Inc. Messaging system with a carousel of related entities
DK202070625A1 (en) 2020-05-11 2022-01-04 Apple Inc User interfaces related to time
US11921998B2 (en) 2020-05-11 2024-03-05 Apple Inc. Editing features of an avatar
JP6818219B1 (en) * 2020-05-29 2021-01-20 PocketRD Inc. 3D avatar generator, 3D avatar generation method and 3D avatar generation program
US11922010B2 (en) 2020-06-08 2024-03-05 Snap Inc. Providing contextual information with keyboard interface for messaging system
US11543939B2 (en) 2020-06-08 2023-01-03 Snap Inc. Encoded image based messaging system
US11356392B2 (en) 2020-06-10 2022-06-07 Snap Inc. Messaging system including an external-resource dock and drawer
US11580682B1 (en) 2020-06-30 2023-02-14 Snap Inc. Messaging system with augmented reality makeup
CN111899321B (en) * 2020-08-26 2023-09-26 NetEase (Hangzhou) Network Co., Ltd. Method and device for displaying expression of virtual character
US11863513B2 (en) 2020-08-31 2024-01-02 Snap Inc. Media content playback and comments management
US11360733B2 (en) 2020-09-10 2022-06-14 Snap Inc. Colocated shared augmented reality without shared backend
US11452939B2 (en) 2020-09-21 2022-09-27 Snap Inc. Graphical marker generation system for synchronizing users
US11470025B2 (en) 2020-09-21 2022-10-11 Snap Inc. Chats with micro sound clips
US11910269B2 (en) 2020-09-25 2024-02-20 Snap Inc. Augmented reality content items including user avatar to share location
US11660022B2 (en) 2020-10-27 2023-05-30 Snap Inc. Adaptive skeletal joint smoothing
US11615592B2 (en) 2020-10-27 2023-03-28 Snap Inc. Side-by-side character animation from realtime 3D body motion capture
US11688117B2 (en) 2020-11-04 2023-06-27 Sony Group Corporation User expressions in virtual environments
US11748931B2 (en) 2020-11-18 2023-09-05 Snap Inc. Body animation sharing and remixing
US11734894B2 (en) 2020-11-18 2023-08-22 Snap Inc. Real-time motion transfer for prosthetic limbs
US11450051B2 (en) 2020-11-18 2022-09-20 Snap Inc. Personalized avatar real-time motion capture
US11790531B2 (en) 2021-02-24 2023-10-17 Snap Inc. Whole body segmentation
US11562536B2 (en) * 2021-03-15 2023-01-24 Tencent America LLC Methods and systems for personalized 3D head model deformation
US11908243B2 (en) 2021-03-16 2024-02-20 Snap Inc. Menu hierarchy navigation on electronic mirroring devices
US11809633B2 (en) 2021-03-16 2023-11-07 Snap Inc. Mirroring device with pointing based navigation
US11734959B2 (en) 2021-03-16 2023-08-22 Snap Inc. Activating hands-free mode on mirroring device
US11798201B2 (en) 2021-03-16 2023-10-24 Snap Inc. Mirroring device with whole-body outfits
US11544885B2 (en) 2021-03-19 2023-01-03 Snap Inc. Augmented reality experience based on physical items
US11562548B2 (en) 2021-03-22 2023-01-24 Snap Inc. True size eyewear in real time
CN113050795A (en) * 2021-03-24 2021-06-29 Beijing Baidu Netcom Science and Technology Co., Ltd. Virtual image generation method and device
US11636654B2 (en) 2021-05-19 2023-04-25 Snap Inc. AR-based connected portal shopping
US11714536B2 (en) 2021-05-21 2023-08-01 Apple Inc. Avatar sticker editor user interfaces
US11776190B2 (en) 2021-06-04 2023-10-03 Apple Inc. Techniques for managing an avatar on a lock screen
US11941227B2 (en) 2021-06-30 2024-03-26 Snap Inc. Hybrid search system for customizable media
US11854069B2 (en) 2021-07-16 2023-12-26 Snap Inc. Personalized try-on ads
US11908083B2 (en) 2021-08-31 2024-02-20 Snap Inc. Deforming custom mesh based on body mesh
US11670059B2 (en) 2021-09-01 2023-06-06 Snap Inc. Controlling interactive fashion based on body gestures
US11673054B2 (en) 2021-09-07 2023-06-13 Snap Inc. Controlling AR games on fashion items
US11663792B2 (en) 2021-09-08 2023-05-30 Snap Inc. Body fitted accessory with physics simulation
US11900506B2 (en) 2021-09-09 2024-02-13 Snap Inc. Controlling interactive fashion based on facial expressions
US11734866B2 (en) 2021-09-13 2023-08-22 Snap Inc. Controlling interactive fashion based on voice
US11798238B2 (en) 2021-09-14 2023-10-24 Snap Inc. Blending body mesh into external mesh
US11836866B2 (en) 2021-09-20 2023-12-05 Snap Inc. Deforming real-world object using an external mesh
US11636662B2 (en) 2021-09-30 2023-04-25 Snap Inc. Body normal network light and rendering control
US11790614B2 (en) 2021-10-11 2023-10-17 Snap Inc. Inferring intent from pose and speech input
US11836862B2 (en) 2021-10-11 2023-12-05 Snap Inc. External mesh with vertex attributes
US11651572B2 (en) 2021-10-11 2023-05-16 Snap Inc. Light and rendering of garments
US11763481B2 (en) 2021-10-20 2023-09-19 Snap Inc. Mirror-based augmented reality experience
US11748958B2 (en) 2021-12-07 2023-09-05 Snap Inc. Augmented reality unboxing experience
US11880947B2 (en) 2021-12-21 2024-01-23 Snap Inc. Real-time upper-body garment exchange
US11928783B2 (en) 2021-12-30 2024-03-12 Snap Inc. AR position and orientation along a plane
US11887260B2 (en) 2021-12-30 2024-01-30 Snap Inc. AR position indicator
US11823346B2 (en) 2022-01-17 2023-11-21 Snap Inc. AR body part tracking system
US11954762B2 (en) 2022-01-19 2024-04-09 Snap Inc. Object replacement system
US11870745B1 (en) 2022-06-28 2024-01-09 Snap Inc. Media gallery sharing and management
US11893166B1 (en) 2022-11-08 2024-02-06 Snap Inc. User avatar movement control using an augmented reality eyewear device
JP7382112B1 (en) 2023-01-25 2023-11-16 KDDI Corporation Information processing device and information processing method

Family Cites Families (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6219045B1 (en) * 1995-11-13 2001-04-17 Worlds, Inc. Scalable virtual world chat client-server system
GB9626825D0 (en) * 1996-12-24 1997-02-12 Crampton Stephen J Avatar kiosk
US6147692A (en) * 1997-06-25 2000-11-14 Haptek, Inc. Method and apparatus for controlling transformation of two and three-dimensional images
JP3338382B2 (en) * 1997-07-31 2002-10-28 Matsushita Electric Industrial Co., Ltd. Apparatus and method for transmitting and receiving a data stream representing a three-dimensional virtual space
CA2227361A1 (en) * 1998-01-19 1999-07-19 Taarna Studios Inc. Method and apparatus for providing real-time animation utilizing a database of expressions
JP2001216531A (en) * 2000-02-02 2001-08-10 Nippon Telegraph & Telephone Corp (NTT) Method for displaying participant in three-dimensional virtual space and three-dimensional virtual space display device
US6545682B1 (en) * 2000-05-24 2003-04-08 There, Inc. Method and apparatus for creating and customizing avatars using genetic paradigm
KR100327541B1 (en) * 2000-08-10 2002-03-08 Kim Jae-sung; Lee Doo-won 3D facial modeling system and modeling method
WO2005020129A2 (en) * 2003-08-19 2005-03-03 Bandalong Entertainment Customizable avatar and differentiated instant messaging environment
US7388580B2 (en) * 2004-05-07 2008-06-17 Valve Corporation Generating eyes for a character in a virtual environment
US20070268293A1 (en) * 2006-05-19 2007-11-22 Erick Miller Musculo-skeletal shape skinning

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH0934874A (en) * 1995-07-21 1997-02-07 Nippon Telegraph & Telephone Corp (NTT) Distributing coordinate space configuration method and system
EP0804032A2 (en) * 1996-04-25 1997-10-29 Matsushita Electric Industrial Co., Ltd. Transmitter-receiver of three-dimensional skeleton structure motions and method thereof

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2177251A3 (en) * 2008-10-17 2011-01-26 Kabushiki Kaisha Square Enix (also trading as Square Enix Co., Ltd.) Three-dimensional design support apparatus and three-dimensional model display system
EP2386335A3 (en) * 2008-10-17 2011-11-30 Kabushiki Kaisha Square Enix (also trading as Square Enix Co., Ltd.) Three-dimensional design support apparatus and three-dimensional model display system
EP2386336A3 (en) * 2008-10-17 2011-11-30 Kabushiki Kaisha Square Enix (also trading as Square Enix Co., Ltd.) Three-dimensional design support apparatus and three-dimensional model display system
US8941642B2 (en) 2008-10-17 2015-01-27 Kabushiki Kaisha Square Enix System for the creation and editing of three dimensional models
EP2459289A2 (en) * 2009-07-29 2012-06-06 Microsoft Corporation Auto-generating a visual representation
EP2459289A4 (en) * 2009-07-29 2013-11-13 Microsoft Corp Auto-generating a visual representation

Also Published As

Publication number Publication date
GB0713186D0 (en) 2007-08-15
EP2175950A1 (en) 2010-04-21
WO2009007701A1 (en) 2009-01-15
JP2010532890A (en) 2010-10-14
US20100203968A1 (en) 2010-08-12

Similar Documents

Publication Publication Date Title
US20100203968A1 (en) Apparatus And Method Of Avatar Customisation
US9259641B2 (en) Entertainment device and method
EP2131935B1 (en) Apparatus and method of data transfer
EP2131934B1 (en) Entertainment device and method
US20100030660A1 (en) Apparatus and method of on-line transaction
US8771083B2 (en) Apparatus and method of on-line reporting
US8606904B2 (en) Apparatus and method of administering modular online environments
US20130132837A1 (en) Entertainment device and method
EP2153617B1 (en) Apparatus and method of data transfer
WO2008104783A1 (en) Entertainment device and method
GB2461175A (en) A method of transferring real-time multimedia data in a peer to peer network using polling of peer devices

Legal Events

Date Code Title Description
WAP Application withdrawn, taken to be withdrawn or refused ** after publication under section 16(1)