GB2516241A - Avatar creation system and method - Google Patents


Info

Publication number
GB2516241A
Authority
GB
United Kingdom
Prior art keywords
avatar
target
feature definitions
key
system according
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
GB201312640A
Other versions
GB201312640D0 (en)
Inventor
Michael James Levy
Original Assignee
Michael James Levy
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Michael James Levy
Priority to GB201312640A
Publication of GB201312640D0
Publication of GB2516241A
Application status: Pending


Classifications

    • A63F 13/10 Control of the course of the game, e.g. start, progress, end
    • A63F 13/12 Video games involving interaction between a plurality of game devices, e.g. transmission or distribution systems
    • G06T 13/40 3D [Three Dimensional] animation of characters, e.g. humans, animals or virtual beings
    • A63F 2300/5546 Details of game data or player data management using player registration data, e.g. identification, account, preferences, game history
    • A63F 2300/5553 Details of game data or player data management using player registration data, user representation in the game field, e.g. avatar

Abstract

An avatar creation system comprises a database and a conversion tool. The database is operable to store, in relation to a key avatar, a set of key feature definitions representative of the appearance of the key avatar. The conversion tool is operable to extract from the database at least a subset of the key feature definitions, preferably drawn up in a look-up table, and to generate from the extracted key feature definitions a set of target avatar feature definitions for rendering an avatar at a target avatar handling service. By storing key feature definitions for a key "master" avatar, a very detailed description of the appearance of the avatar can be generated once and used to spawn avatars for different platforms, applications and environments at varying levels of detail. In this way, a user need only generate his avatar once at a high level of detail, rather than multiple times for different handling services. A wide variety of feature definitions can be envisaged, including facial features, physical build, clothing, accessories, pets, holidays and events, sports and hobbies, movements and background scenes.

Description

AVATAR CREATION SYSTEM AND METHOD

Field of the Invention

The present invention relates to an avatar creation system and method.

Background of the Invention

Avatars are widely used in games and other online environments to give a user an online presence and personality. The widespread usage of avatars is curtailed by the need to recreate an avatar for a given user for every online experience and game platform. This is sometimes referred to as YAP - Yet Another Protocol. Presently, every single avatar world or new experience has to start with the making - or, more likely, the remaking - of the user's likeness to form a new avatar. Since building an avatar can take up to two hours, this acts as an effective barrier to popular use by app makers, software writers and the user customer.

While players or social chat-room users can be offered various shortcut avatars, these are far less desirable than custom avatars, since the advantages of a likeness with some degree of fidelity to the user make for a more exciting and immersive experience. Research has shown that the mind's reward and pleasure receptors cannot distinguish between receiving virtual rewards and being rewarded in real life. This principle is most true of an avatar which the user has created of themselves.

The company Gravatar, while offering a "Universal avatar" across some platforms, is limited through the use of its own avatar (albeit a static image likeness) and the non-partnership of other avatar sites and platforms. In short, it is not commercially "avatar agnostic".

Another solution to quickly providing an avatar that looks like the user is to use the user's PC camera to capture their face, which is then converted to a quick representation - cartoonising the face as the basis for the avatar. The problem with this is that there is currently no way for this data to be converted for use on other commonly used avatar platforms, e.g. Xbox, Wii, Second Life, PS3, etc.

Summary of the Invention

According to an aspect of the invention, there is provided an avatar creation system, comprising: a database, operable to store, in relation to a key avatar, a set of key feature definitions representative of the appearance of the key avatar; a conversion tool, operable to extract from the database at least a subset of the key feature definitions, and to generate from the extracted key feature definitions a set of target avatar feature definitions for rendering an avatar at a target avatar handling service.

By storing detailed key feature definitions for a key "master" avatar, avatars can be generated for different platforms, applications and environments at varying levels of detail. Each of these "target" avatars has an appearance represented by a set of target avatar feature definitions. A relationship between the master definitions and the target definitions can be known in advance, permitting a conversion tool to operate to extract at least some of the key feature definitions from the database and to generate corresponding target avatar feature definitions which can be processed at a target avatar handling service. In this way, a user need only generate his likeness once at a high level of detail, rather than multiple times for different handling services. Each time he wishes to generate an avatar on a new platform, it can be generated automatically without the user needing to re-enter the same information again. It will be appreciated that the database, in operation, is likely to store data relating to multiple users and multiple key likenesses. A given user may have one, or more than one, key avatar stored in the database. It will be understood that, due to differences between the characteristics used to define avatars in different environments, the avatars will generally not appear identical, but will have noticeable similarities in appearance.

Each feature definition may comprise one or both of a numerical value and a text descriptor. For example, numerical values may be used to specify a metric such as "height" in cm, or to specify an identification number indicative of a particular colour.

Text descriptors may be used to indicate a specific characteristic, such as "Dark Brown" for hair colour or "pale blue" for eye colour.
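As a sketch, a feature definition of this kind might be modelled as a small record holding an optional numerical value and an optional text descriptor. The class and field names here are illustrative assumptions, not taken from the specification:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class FeatureDefinition:
    """One key feature definition: a numerical value, a text descriptor, or both."""
    name: str
    value: Optional[int] = None       # e.g. height in cm, or a colour ID number
    descriptor: Optional[str] = None  # e.g. "Dark Brown" or "pale blue"

# A metric feature and a descriptor-based feature, as in the examples above.
height = FeatureDefinition(name="height", value=180)
hair = FeatureDefinition(name="hair colour", value=2, descriptor="Black with grey")
```

A feature carrying only a metric leaves the descriptor unset, and vice versa.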

One or more of the set of target avatar feature definitions may be generated by referring to a lookup table defining, for each permissible value, range of values or text descriptor for a key feature definition, a corresponding value, range of values or text descriptor for a corresponding target avatar feature definition. Other techniques for mapping from key feature definitions to target feature definitions, such as a mapping function, could also be used.
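A minimal sketch of such a lookup table, using the hair colour values of Table 1 below; the dictionary layout and service identifiers are illustrative assumptions:

```python
# Hypothetical lookup table fragment for the feature "hair colour": each
# permissible key avatar value maps to the nearest value supported by each
# target avatar handling service (values taken from Table 1).
HAIR_COLOUR = {
    1:  {"service1": "J", "service2": "B"},    # Black
    2:  {"service1": "J", "service2": "DG"},   # Black with grey
    98: {"service1": "X", "service2": "BRH"},  # Red and blonde
    99: {"service1": "X", "service2": "BR"},   # Bright red
}

def to_target(key_value: int, service: str) -> str:
    """Convert a key feature value to the target service's feature value."""
    return HAIR_COLOUR[key_value][service]
```

Here the key value 2 ("black with grey") degrades to plain black for service 1 but to dark grey for service 2, matching the approximation behaviour described for Table 1.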

Generally, a key feature definition is likely to take on a greater number of permissible values or text descriptors than a corresponding target avatar feature definition. This is because the key avatar is required to store each feature definition at a sufficient level of detail that a corresponding target feature definition can be generated for any given avatar handling service. However, it will be appreciated that in some cases an avatar handling service may require at least some of its target feature definitions to be provided at the same level of detail as the key feature definitions.

Where a lookup table is used, it may define a corresponding target avatar feature definition for each of a plurality of different target avatar handling services.

This enables the same table to be used to convert key feature definitions into target feature definitions for multiple different target avatar handling services. Alternatively, separate lookup tables may be used for each target avatar handling service.

An avatar creation tool may be provided which is operable to generate a set of key feature definitions representative of the appearance of an avatar and upload the set of key feature definitions to the database. The avatar creation tool can be manipulated by the user to generate the key feature definitions. The database and the conversion tool may be implemented on an avatar creation server. The avatar creation tool may be a website through which the user can enter certain details, or select from various options, in order to arrive at a key avatar which they are happy with.

The conversion tool may be responsive to a request to extract and generate the target feature definitions, the request indicating the target avatar handling service in relation to which the target feature definitions are to be generated. In response to the request, the conversion tool selects the key feature definitions for extraction and generates the appropriate corresponding target feature definitions based on the indication of the target avatar handling service provided by the request.
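The request-driven behaviour described above might be sketched as follows, with a hypothetical in-memory database and per-service conversion tables (all names, identifiers and the stored values are illustrative assumptions):

```python
# Key feature definitions stored per (user ID, avatar ID).
DATABASE = {
    ("user42", "avatar1"): {"hair_colour": 2, "hair_style": 79, "build": 27},
}

# Per-service lookup tables (values as in Table 1) and the subset of key
# features each target avatar handling service actually needs.
LOOKUP = {
    "service1": {"hair_colour": {1: "J", 2: "J", 98: "X", 99: "X"},
                 "hair_style": {1: "D", 79: "F"},
                 "build": {1: "M", 27: "L"}},
}
REQUIRED = {"service1": ["hair_colour", "build"]}

def handle_request(user_id: str, avatar_id: str, handler_id: str) -> dict:
    """Extract only the needed key definitions and convert them for the service."""
    key_defs = DATABASE[(user_id, avatar_id)]
    table = LOOKUP[handler_id]
    return {f: table[f][key_defs[f]] for f in REQUIRED[handler_id]}
```

Note that the handler ID determines both which key feature definitions are extracted and which conversion table is applied.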

The target avatar handling service may be provided by a software application, the software application being operable to request the avatar creation server to provide the set of target avatar feature definitions in relation to an avatar; to receive from the avatar creation server the requested set of target avatar feature definitions; to populate an avatar model using the received set of target avatar feature definitions to form an avatar; and to render a visual representation of the avatar.

The software application permits a user to log in to the avatar creation server via the software application to access their avatar. The software application may be responsive to the log in by the user to obtain the set of target avatar feature definitions from the avatar creation server. This means that the user does not need to separately log into the avatar creation tool and the target avatar handling service.

In an alternative embodiment, the conversion tool is implemented in a software application running the target avatar handling service. In this case, the software application is operable to request the avatar creation server to provide the extracted subset of key feature definitions in relation to an avatar; to generate from the extracted subset of the key feature definitions the set of target avatar feature definitions; to populate an avatar model using the set of target avatar feature definitions to form an avatar; and to render a visual representation of the avatar.

In this case, the software application may permit a user to log in to the avatar creation server via the software application to access their avatar, the software application being responsive to the log in by the user to obtain the subset of the key feature definitions from the database.

It will be appreciated from the above that the database and conversion tool may be implemented together at a server, which is responsive to requests to provide target avatar feature definitions to the requesting software application, or alternatively the conversion tool may be provided separately. In the latter example the target avatar handling service may execute the conversion tool itself.

The avatar creation tool may be operable to generate at least some of the key feature definitions from an image of an individual. In one example, the image may be a photo image of the user captured by a camera. In this way, a user wishing to make an avatar which looks like himself is able to at least partly populate the key avatar profile automatically, using a web cam for example. Certain features such as hair colour and eye colour can be detected automatically and used to set the relevant key avatar feature definitions. In some cases the key avatar profile may be generated based on an image with no user input required.

The avatar creation tool may be operable to partly generate an avatar profile of a user based on an image of the user, the avatar creation tool being responsive to user input to refine the avatar profile to generate the key feature definitions of the avatar for storage in the database. In this way the image of the user gives the user a head-start in populating the key avatar profile by automatically populating certain feature definitions, but the user retains overall control and can amend the key avatar profile as they desire.

While the image could be obtained from a web cam, personal camera, mobile telephone or other personal device, the image could instead be obtained from a social media website. For example, the avatar creation tool may have a function for accessing and analysing the user's social media pages to identify photos or other images stored on their profile, which could be used to assist in the generation of a key avatar profile. In some cases information from the social media website other than images may be used to obtain information about the user for populating the key avatar feature definitions.

The target avatar handling service may be operable to part-populate an avatar profile based on the target avatar feature definitions; to complete the avatar profile based on user input; and to render an avatar based on the completed avatar profile.

The conversion tool may be operable to provide the target feature definitions to the target avatar handling service in a format selected in dependence on the target avatar handling service. This means that not only the substance of the key avatar feature definitions, but also their form can be converted as appropriate to be usable by the target avatar handling service.
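As an illustration of format selection, the same converted definitions might be serialised differently depending on the target service; the service identifiers and both output formats here are assumptions for the sketch:

```python
import json

def format_for_service(target_defs: dict, handler_id: str) -> str:
    """Serialise target feature definitions in the form a given service expects."""
    if handler_id == "json_service":
        # A service consuming JSON documents.
        return json.dumps(target_defs, sort_keys=True)
    if handler_id == "kv_service":
        # A service consuming a simple key=value list.
        return ";".join(f"{k}={v}" for k, v in sorted(target_defs.items()))
    raise ValueError(f"unknown target avatar handling service: {handler_id}")
```

The substance of the definitions is identical in both cases; only the structure delivered to the service differs.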

A wide variety of feature definitions can be envisaged, including facial features, physical build, clothing, accessories, pets, holidays and events, sports and hobbies, movements and background scenes.

The target feature definitions can be rendered to form a visual representation which approximates a visual representation which could be rendered using the key feature definitions.

According to another aspect of the present invention, there is provided an avatar creation method, comprising the steps of: storing, in a database, in relation to a key avatar, a set of key feature definitions representative of the appearance of the key avatar; extracting from the database at least a subset of the key feature definitions; and generating from the extracted key feature definitions a set of target avatar feature definitions for rendering an avatar at a target avatar handling service.
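The three steps of the method - storing, extracting, generating - can be sketched over plain dictionaries (the function names and sample values are illustrative):

```python
def store(db: dict, avatar_id: str, key_defs: dict) -> None:
    """Store a set of key feature definitions in relation to a key avatar."""
    db[avatar_id] = key_defs

def extract(db: dict, avatar_id: str, wanted: list) -> dict:
    """Extract at least a subset of the key feature definitions."""
    return {k: v for k, v in db[avatar_id].items() if k in wanted}

def generate(subset: dict, lookup: dict) -> dict:
    """Generate target avatar feature definitions from the extracted subset."""
    return {k: lookup[k][v] for k, v in subset.items()}

db = {}
store(db, "avatar1", {"hair_colour": 1, "build": 27})
subset = extract(db, "avatar1", ["hair_colour"])
target = generate(subset, {"hair_colour": {1: "J"}})
```

The target definitions produced by the final step are then ready for rendering at the target avatar handling service.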

Further aspects of the present invention include a computer program and a storage medium for storing the computer program.

Brief Description of the Drawings

Embodiments of the present invention will now be described with reference to the following drawings, in which:
Figure 1 schematically illustrates a variety of different avatar types used by different platforms and software programs;
Figure 2 schematically illustrates how a single source avatar can be used to generate platform specific avatars for a plurality of different platforms and software programs;
Figure 3 schematically illustrates an avatar handling system according to a first embodiment;
Figure 4 schematically illustrates an avatar handling system according to a second embodiment;
Figure 5 schematically illustrates the inputs and outputs to an avatar handling database;
Figure 6 schematically illustrates the different data outputs which are provided for different platforms in relation to the same source avatar; and
Figure 7 schematically illustrates a method of creating and handling avatars.

Description of the Example Embodiments

Referring to Figure 1, a variety of different avatar types used by different platforms and software programs are shown. Each of the avatars shown in Figure 1 represents the same individual in the real world. It can be seen that, while not identical, the avatars in Figure 1 are all reminiscent of each other. Some of the avatars are of a cartoon, or caricature, nature, with relatively little detail. Others are much more detailed. Embodiments of the present invention seek to provide an online digital process whereby one key identity avatar (a 3D or 2D representation of a user - animated or non-animated) can be converted into other avatar style platforms or creative styles. The process seeks to maintain the recognisable attributes of the "key identity" from one avatar style to another through the creation of a database for the key identity avatar likeness, but also to interface with and supply all partner avatar styles with their database/software requirements and language.

In some embodiments the online digital process permits a person to create their "core" avatar once in an "avatar creation suite", to be saved in a central database, with the avatar creation process associated with any further participating avatar game/experience, chat room or app being skipped by accessing the central database, which holds their core avatar I.D. and description. The central database can format the core I.D. into whatever configuration the calling experience requires. This removes, or at least reduces, the need for the user to re-create their avatar in the pertinent avatar game experience. The process of creating the avatar means the details of the person's physical characteristics or "core I.D." can be called on by other apps, experiences, games, chat-rooms etc. and be "converted" or transmitted in the desired format or skin, thus negating or reducing the need for another build process. The conversion takes into account any differences in the ratios of head to body, of features on the face, or of avatar to external objects such as a car, bike or tree. The "core I.D." may for example be an average skeleton architecture, and all deviations from that mean/average may be approximated within the required skin's world/architecture.

In embodiments of the present invention, the user visits a website where they will create their avatar key identity using an avatar creation suite. The user's avatar key identity is created digitally as part of an avatar creation process, and stored as data on the central server. From here it is available for access by any participating platforms, avatar creation suites, consumer experiences, app builders and software writers in the participating platform's digital format and language. This enables app builders or designers to offer an avatar style "ready made" - by accessing the central identity database. The user is recognised, and a description of the avatar in whatever language or format the software or app requires can be provided based on the key identity, providing the closest achievable match to the key identity within the constraints of the requested language and format. This also means that software writers can potentially offer multiple avatar styles within their app whilst preserving the individual's key characteristics across the styles. Using this identity database allows for a choice of avatar forms with fidelity to the essential "core" attributes that make an avatar recognisable to the individual, and to others.

By making use of the database, on encountering a new game or other avatar-enabled experience (such as avatar-enabled psychotherapy, or airport security), it is envisaged that a single button press, followed by security (which might be a login and password), may provide the software with the information required to generate a fully built avatar in the style of the game/platform. In particular, the software (either the game/avatar handling software or software operating on the server - both implementations being described below) is responsive to the user logging into the server via the game/avatar handling service software, obtains the set of target feature definitions from the server (database) and uses these to build the avatar. Although the style is that of the platform or app requesting the avatar key identity, it will still have the user's features to the nearest approximation supported by the platform.

The process of holding the key attributes of a person's "look" or likeness in a database, to be accessed and thereby, on request, to release data in an appropriate format and in a multiplicity of styles on various avatar platforms, addresses the problem of the user having to create their avatar again and again, and may also result in greater consistency between their avatars on different platforms. By using the database as a "nexus" for an avatar key identity across a diverse network of experiences, games and chat-rooms, a universe of networked avatar experiences is possible. In particular, many different styles of avatar may be used, facilitating new apps and avatar offerings being formulated, because designers can now offer a single-button "instant you" avatar, and have the choice to pass on multiple skins whereby the user could choose to appear as his/her XBOX avatar, or perhaps something more realistic such as a PS3 avatar.

Without this methodology, avatar experiences will continue in discrete and disparate walled gardens, and the ease of use of a one-button "plug and play" for avatar creation will rely on other, less comprehensive solutions - such as individual avatar makers/platforms offering best-guess avatars from instant PC camera capture. However, this is neither a network solution, nor is it easy to achieve for distinct-style (non-realistic) avatars.

In the example of Figure 1, the physical characteristics of each avatar, while clearly different in architecture/style, each serve to carry the key characteristics of the user. Similarly, referring now to Figure 2, a source avatar is identified, defined at a high level of detail (the level of detail being represented by the 9 code pages next to the source avatar representation - note that while in Figure 2 the source avatar is shown without any detail, in practice the source avatar will generally be closer to photo-realistic than any of the target avatars). The source avatar is used to generate target avatars for each of a plurality of different target platforms or programs, examples of which are shown in Figure 2. In the present case, corresponding target avatars for the Faceyourmanga, Farmville, Weeworld, Unique, XBOX360 and Second Life environments are shown. It can be seen that the respective avatars are defined at various levels of detail - represented by the number of code pages shown to the right of their indicative representation. For example, the Faceyourmanga avatar is defined at a low level of detail (represented here by a single code page), while the Second Life avatar is defined at a much higher (almost photorealistic) level of detail (represented here by six code pages). It will therefore be understood that for any given avatar handling service, not all feature definitions of the key avatar are required. Accordingly, the database need not necessarily provide data corresponding to all feature definitions to the requesting avatar handling service. Instead, only those feature definitions which are actually relevant to the creation of an avatar at the requesting avatar handling service are provided.

It will be appreciated that not only do different avatar handling platforms/services define avatars at different levels of detail, in addition they define avatars in different formats, using different parameters. Accordingly, the feature definitions extracted from the database may need to undergo format conversion into a target format associated with the requesting avatar handling service, and may also need to be mapped from the values used to define the feature definitions in the key avatar to values used to define corresponding feature definitions of the target avatar.

One or more of the set of target avatar feature definitions may be generated by referring to a lookup table. Such a lookup table may define, for each permissible value, range of values or text descriptor for a key feature definition, a corresponding value, range of values or text descriptor for a corresponding target avatar feature definition.

This idea is represented in Table 1, below. Table 1 is a correspondence table (which may be used as a lookup table), which sets out, for each of three different example features (hair colour, hair style and physical build), the possible values which can be taken on for each of the key avatar and avatars associated with two different avatar handling services (labelled "Service 1" and "Service 2").

In relation to the feature "hair colour", it can be seen that 99 different values are available for defining hair colour for the key avatar. Four of these are shown in the table, these being black (feature value=1), black with grey (feature value=2), red and blonde (feature value=98) and bright red (feature value=99). It can be seen that each of service 1 and service 2 is able to match the colour black (feature value=1), but they use the different feature values J and B respectively. However, the colour "black with grey" (feature value=2) of the key avatar is not available in service 1, and therefore the colour black (feature value=J) is to be used again. Service 2 has a closer colour, dark grey, which can be used (feature value=DG), but again is only able to approximate the colour specified in the key feature definition. Similarly, the key avatar feature definition "red and blonde" is not available in service 1, and must be substituted with red (feature value=X), while service 2 is able to provide blonde/red highlights (feature value=BRH).

In relation to the feature "hair style", it can be seen that 79 different values are available in relation to the key avatar. Two of these are shown, these being "bald" (feature value=1) and "dreadlocks" (feature value=79). The "bald" hair style can be seen to be available in each of service 1 and service 2, albeit represented by the different values "D" and "B" respectively. However, the hair style "dreadlocks" (feature value=79) is only available in service 2 (with the feature value "Dr"). In service 1, the best approximation to dreadlocks is "Long/Straight" hair, with a feature value of "F".

Finally, in relation to the feature "physical build", it can be seen that 27 different values are available in relation to the key avatar. Two of these are shown, these being "bodybuilder" (feature value=1) and "gaunt" (feature value=27). The "bodybuilder" physical build maps onto the physical build "heavy" of service 1 (with the feature value "M") and the physical build "bodybuilder" of service 2 (with the feature value "Bb").

From the above, it will be appreciated that the lookup table can be used to convert a given feature definition value of the key avatar to a feature definition value for a corresponding feature definition of a target avatar handling service. It will be appreciated that different avatar handling services may use different names for the same feature definitions. In addition to the key avatar being defined using a wider range of values (than any given avatar handling service), the key avatar may also be defined using a larger number of features than the avatar handling services. Different avatar handling services may use a different number of features. For example, some avatar handling services may provide a small number of fixed outfits, whereas others may allow separate selection of top half (e.g. shirts) and bottom half (e.g. trousers) garments. In this case, a given feature definition value for a target avatar handling service may be selected based on the values of a plurality of different features definitions of the key avatar. For example, an outfit for the target avatar handling service may be selected in dependence on both the top half garment type specified in a first feature definition of the key avatar and the bottom half garment type specified in a second feature definition of the key avatar.
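A many-to-one mapping of this kind - an outfit chosen from two key garment definitions - might be sketched as follows; the garment and outfit names are invented for illustration:

```python
# Hypothetical mapping from (top half, bottom half) key garment definitions
# to one of a target service's small set of fixed outfits.
OUTFITS = {
    ("shirt", "trousers"): "formal",
    ("t-shirt", "jeans"): "casual",
    ("vest", "shorts"): "sports",
}

def select_outfit(top: str, bottom: str) -> str:
    """Pick the target service's outfit nearest to the two key garments."""
    # Fall back to a default outfit when no exact combination exists.
    return OUTFITS.get((top, bottom), "casual")
```

The selection depends on a plurality of key feature definitions at once, rather than on a one-to-one value conversion.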

It will be appreciated that Table 1 is merely exemplary, and shows only a small portion of the type of lookup table which would be required to facilitate the conversion of any given permissible value of any given feature definition of the key avatar into a corresponding value of the corresponding feature definition for a target avatar handling service.

In general, more values will be available to define each feature definition for the key avatar than for an avatar for any avatar handling service. However, in some cases one or more avatar handling services may define a particular characteristic with the same number of values as for the key avatar. In principle it would be possible to have an avatar handling service which uses a feature definition having a greater number of possible values than the corresponding feature definition for the key avatar; however, this would be suboptimal, since it would mean that the key avatar would only be able to provide an approximation of the avatar handling service's feature, rather than the other way around. In Table 1, either a number or a letter combination is used to indicate a particular value for a feature definition. More generally, each feature definition comprises one or both of a numerical value and a text descriptor.

                 Key Avatar             Service 1           Service 2
Hair colour      1   Black              J   Black           B    Black
                 2   Black with grey    J   Black           DG   Dark grey
                 98  Red and blonde     X   Red             BRH  Blonde/Red highlights
                 99  Bright red         X   Red             BR   Bright Red
Hairstyle        1   Bald               D   Bald            B    Bald
                 79  Dreadlocks         F   Long/Straight   Dr   Dreadlocks
Physical build   1   Bodybuilder        M   Heavy           Bb   Bodybuilder
                 27  Gaunt              L   Thin            Ga   Gaunt

Table 1: Correspondence (Lookup) Table

Referring to Figure 3, an example architecture according to a first embodiment is schematically illustrated. In Figure 3, an avatar creation server 101 and a target avatar handling service 102 (for example the XBOX360 avatar handling engine) are provided. The avatar creation server 101 comprises a database 103 for storing, in relation to a key avatar, a set of key feature definitions representative of the appearance of the key avatar, and a conversion tool 104 for extracting from the database 103 at least a subset of the key feature definitions, and for generating from the extracted key feature definitions a set of target avatar feature definitions for rendering an avatar at the target avatar handling service 102. The conversion tool 104 comprises a lookup table 105 which is used to map stored feature definition values of the key avatar to corresponding feature definition values of an avatar which can be rendered at the target avatar handling service 102. In operation, the target avatar handling service 102 issues to the avatar creation server 101 a request 108 for the avatar creation server 101 to provide a set of avatar feature definitions suitable for rendering, at the target avatar handling service 102, an avatar for a particular user. The request includes an indication of one or both of the identity of the user (user ID), and the identity of the desired avatar (avatar ID). The request may also include an indication of the type of the target avatar handling service (handler ID).
In response to the request, the conversion tool 104 is operable to issue a request to the database 103 to provide at least a subset of the feature definitions for an identified user/avatar, the subset being those feature definitions (determined from the handler ID) required to generate the target feature definitions for the requesting target avatar handling service 102. In some embodiments the type of the target avatar handling service may instead be inferred from the origin of the request 106. In response to the request 108, the database provides to the conversion tool 104 the requested subset of feature definitions for the indicated user in a message 109. The conversion tool 104 then uses the lookup table 105 to convert the key feature definitions provided by the database 103 into the corresponding target feature definitions required by the target avatar handling service 102. The conversion tool 104 then formats the generated target feature definitions into a structure appropriate for the target avatar handling service 102, and then sends the resulting formatted target avatar feature definitions to the target avatar handling service 102 via the message 107. The target avatar handling service 102 is then able to populate an avatar model using the target avatar feature definitions and render a visual representation of the avatar based on the populated model.
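The Figure 3 flow might be sketched as follows, with a dict of dicts standing in for the lookup table 105 and for the database 103; all identifiers, service names and values here are illustrative (the value codes echo Table 1), not part of any real service's API.

```python
# Hypothetical lookup table 105: feature -> key value -> per-service target value.
LOOKUP = {
    "hair_colour": {"1": {"service1": "J", "service2": "B"},
                    "99": {"service1": "X", "service2": "BR"}},
    "hairstyle":   {"79": {"service1": "F", "service2": "Dr"}},
}

# Hypothetical database 103: (user ID, avatar ID) -> stored key feature definitions.
DATABASE = {("user42", "avatar1"): {"hair_colour": "99", "hairstyle": "79"}}

def handle_request(user_id: str, avatar_id: str, handler_id: str) -> dict:
    """Serve a request 108: extract the key features, then convert each value
    to the form used by the requesting target avatar handling service."""
    key_features = DATABASE[(user_id, avatar_id)]
    return {feature: LOOKUP[feature][value][handler_id]
            for feature, value in key_features.items()
            if feature in LOOKUP}   # only features the target service uses
```

Calling `handle_request("user42", "avatar1", "service2")` would return the Service 2 codes for bright red dreadlocked hair, ready to be formatted into the structure that service expects.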

Referring to Figure 4, an example architecture according to a second embodiment is schematically illustrated. In Figure 4, an avatar creation server 201 and a target avatar handling service 202 (for example the XBOX360 avatar handling engine) are provided. The avatar creation server 201 comprises a database 203 for storing, in relation to a key avatar, a set of key feature definitions representative of the appearance of the key avatar. The target avatar handling service 202 comprises a conversion tool 204 for extracting from the database 203 at least a subset of the key feature definitions, and for generating from the extracted key feature definitions a set of target avatar feature definitions for rendering an avatar at the target avatar handling service 202.

The conversion tool 204 comprises a lookup table 205 which is used to map stored feature definition values of the key avatar to corresponding feature definition values of an avatar which can be rendered at the target avatar handling service 202. In operation, the target avatar handling service 202 uses the conversion tool 204 to issue to the avatar creation server 201 a request 210 for the avatar creation server 201 to provide a set of avatar feature definitions suitable for rendering, at the target avatar handling service 202, an avatar for a particular user. The request includes an indication of one or both of the identity of the user (user ID), and the identity of the desired avatar (avatar ID). In this case the request does not include an indication of the type of the target avatar handling service, because the request 210 itself specifies the subset of the feature definitions to be extracted. This is possible because the conversion tool 204 is part of the target avatar handling service 202, and is thus dedicated to conversion from the key avatar feature definitions to the feature definitions of the target avatar handling service 202. This is in contrast with the conversion tool 104 of Figure 3, which is required to handle conversion from the key avatar feature definitions to feature definitions for multiple different target avatar handling services. In response to the request 210, the database 203 provides to the conversion tool 204 the requested subset of feature definitions for the indicated user in a message 212. The conversion tool 204 then uses the lookup table 205 to convert the key feature definitions provided by the database 203 into the corresponding target feature definitions required by the target avatar handling service 202.
The conversion tool 204 then formats the generated target feature definitions into a structure appropriate for the target avatar handling service 202, and then makes the resulting formatted target avatar feature definitions available for the target avatar handling service 202 to populate an avatar model and render a visual representation of the avatar.
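The Figure 4 variant might be sketched as follows. Because the conversion tool sits at the target service, its request names exactly the key features it needs, and its lookup table covers only that one service; all names and values here are invented for illustration.

```python
# Hypothetical single-service lookup table 205, held at the target service.
SERVICE_LOOKUP = {"physical_build": {"1": "Bb", "27": "Ga"}}

def fetch_subset(database: dict, user_id: str, wanted: set) -> dict:
    """Stand-in for request 210 / message 212: the database returns only the
    subset of key feature definitions named in the request."""
    return {k: v for k, v in database[user_id].items() if k in wanted}

def convert(key_features: dict) -> dict:
    """The service-side conversion tool maps key values to its own codes."""
    return {k: SERVICE_LOOKUP[k][v] for k, v in key_features.items()}

db = {"user42": {"physical_build": "27", "pets": "cat"}}
subset = fetch_subset(db, "user42", wanted={"physical_build"})
```

Here `convert(subset)` yields the service's own code for a gaunt build; the "pets" feature was never requested, because this particular service makes no use of it.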

It will be appreciated from Figures 3 and 4 as described above that the conversion tool can be located either at the avatar creation server, or at the target avatar handling service. In the former case the conversion tool handles conversion from the key avatar feature definitions into multiple target avatar feature definitions, on request. In the latter case the conversion tool handles conversion from the key avatar feature definitions into the target avatar feature definitions specifically required by the target avatar handling service with which the conversion tool is associated. The arrangement of Figure 3 has the advantage that the avatar creation server has full control of the process, and the target avatar handling service need only issue a request for an avatar definition in a format which it is able to handle. In other words, the target avatar handling service does not need any knowledge of how avatar data is stored at the avatar creation server. The arrangement of Figure 4 has the advantage that the administrator of the target avatar handling service is able to readily modify the lookup table used by the conversion tool to achieve a desired mapping from the key avatar feature definitions to its own avatar feature definitions.

If a new platform joins the central avatar handling system, then in the case of the Figure 3 arrangement a new column can simply be added to the lookup table to permit conversion into the new format. In the case of the Figure 4 arrangement, a lookup table can be issued to the new platform to enable the new platform to implement a conversion tool to obtain and convert key avatar data from the database into a format which can be processed at the new platform.
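With the lookup table keyed by service name, as in the earlier sketch, onboarding a new platform in the Figure 3 arrangement amounts to writing one more entry per row. The service names and values below are hypothetical.

```python
# Lookup table before the new platform joins (values echo Table 1).
lookup = {"hairstyle": {"79": {"service1": "F", "service2": "Dr"}}}

def add_service_column(table: dict, feature: str, key_value: str,
                       service: str, target_value: str) -> None:
    """Add one correspondence for a newly joined target service."""
    table[feature][key_value][service] = target_value

# "service3" joins: its code for key hairstyle 79 (dreadlocks) is registered.
add_service_column(lookup, "hairstyle", "79", "service3", "DREADS")
```

Existing columns are untouched, so conversion for the already-supported services continues to work unchanged.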

Referring to Figure 5, inputs 2 and outputs 4 to/from an avatar handling database 3 as described above are schematically illustrated. In Figure 5, an avatar user X 1 enters various data at an avatar creation server as part of an avatar creation process to generate his key avatar identity as a set of key feature definitions. The data entered in the example of Figure 5 includes that related to 3D architecture, movement articulation, world structure backgrounds, API language protocols, face and features build options, clothing options, sports and hobbies, holidays and events, accessories and pets. The data is stored in the avatar handling database 3 as an avatar key identity profile for the user 1, and is represented as a set of key feature definitions.

Subsequently, the user X 1 wants to use a 3rd party application Y, which calls on the database 3 to provide the necessary avatar feature definition information. In response, the database 3 recognises the API type and so can supply the required data correctly formatted (database search probe diamond), and checks security (database checks security diamond), then provides the requested information (target avatar feature definitions) to the 3rd party application Y. In the example of Figure 5, the information provided by the database to the 3rd party application comprises the face and features build options, clothing options, sports and hobbies, holidays and events, accessories, pets, 3D architecture, movement articulation, world structure backgrounds and API language protocols. In this case all feature definitions are provided, but as described above this may not always be the case. For example, certain platforms may not make use of "holidays and events" or "pets", in which case there will be no purpose in extracting this information from the database. The user X is then able to view his avatar 5 in the style of the 3rd party application.
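The Figure 5 gatekeeping (check security, then supply only the subset the requesting platform actually uses) might be sketched as follows; the credential scheme and field names are invented for illustration.

```python
def serve_request(profiles: dict, tokens: dict, user_id: str,
                  token: str, wanted: list) -> dict:
    """Check credentials, then return only the requested feature subset."""
    if tokens.get(user_id) != token:        # "database checks security" diamond
        raise PermissionError("unauthorised request")
    profile = profiles[user_id]
    # Only extract what the platform asked for; e.g. a platform that makes
    # no use of "pets" simply never requests that feature.
    return {k: profile[k] for k in wanted if k in profile}

profiles = {"X": {"face": "oval", "clothing": "suit", "pets": "cat"}}
tokens = {"X": "t0k3n"}
```

A valid request for `["face", "clothing"]` returns just those two definitions, while a request with a bad token is refused before any data is extracted.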

Referring to Figure 6, the different data outputs which are provided for different platforms in relation to the same source avatar are schematically illustrated. The database recognises the API type of the requesting target avatar handling service, and so can supply the required data in the correct format. In Figure 6, three code portions are shown. The code portion on the left of Figure 6 is formatted for the Second Life platform, and includes a highlighted code portion which provides the code corresponding to the user's face. The code portion in the centre of Figure 6 is formatted for the XBOX360, and includes a highlighted code portion which again provides the code corresponding to the user's face. The code portion on the right of Figure 6 is formatted for the Unique platform, and includes a highlighted code portion which provides the code corresponding to the user's face. It can be seen that the same facial features are expressed differently in each platform's format.

Referring to Figure 7, a method of creating and handling avatars is schematically illustrated. At a step S1, a user creates a key avatar using an online avatar creation suite accessible via a mobile device, tablet computer or desktop computer, for example. The key avatar identity data is stored into a central avatar identity database at a step S2. At a step S3, the user subsequently wishes to access a participating avatar experience and to use their own avatar. This access can again be by way of mobile device, tablet computer or desktop computer, for example. At a step S4, a request for a game/app/experience to gain access to the central avatar identity for the user is made to the central avatar identity database. The database can output a central avatar identity from the creation suite in all participating macro or non-macro formats. It should be noted that the central avatar identity should preferably exceed or equal the output formats in terms of detail. It can therefore be considered that the other formats are simplified versions within their inherent architecture.
Accordingly, at a step S5, in response to the step S4, the central avatar identity database outputs the nearest likeness possible within the specific platform and macro parameters. In Figure 7, (different) outputs are shown to be generated for Second Life, Unique and XBOX360.
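The per-platform output of Figure 6 can be sketched as below: the same converted feature values are emitted in whatever textual format each platform's API expects. The two output formats shown are invented placeholders, not the real Second Life or XBOX360 syntax.

```python
import json

def emit(features: dict, api_type: str) -> str:
    """Serialise the same target feature definitions for different APIs."""
    if api_type == "json_api":
        # Hypothetical platform expecting a JSON object.
        return json.dumps(features, sort_keys=True)
    if api_type == "kv_api":
        # Hypothetical platform expecting semicolon-separated key=value pairs.
        return ";".join(f"{k}={v}" for k, v in sorted(features.items()))
    raise ValueError(f"unknown API type: {api_type}")

face = {"face": "oval", "hair": "Dr"}
```

Calling `emit(face, "kv_api")` and `emit(face, "json_api")` produces two different code portions describing the same face, mirroring the side-by-side outputs of Figure 6.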

Additional formats such as Meez, PS3 and other current or future platforms/formats can also be handled.

It will be appreciated from Figure 7 that the process stores a high-end detailed avatar profile in digital form in a database. The database is then able to send out requested avatar data in a language required by any experience, game, or app. If the experience, game or app is so configured and capable, then the whole range of stored avatar profiles of all participating platforms can be chosen from at will for use with the chosen app or game. The avatar data can be fed out across participating digital platforms, mobile, tablet, PC/Mac or interactive TV, as desired.

In one example, the following steps may be conducted:

1. The user logs into the avatar creation website and is presented with full data capture legals and permissions to authorise the central storage of data associated with the user.

2. The user can then choose between (a) Quick build, or (b) Manual build.

3. If the user opts for the Quick build, then:

a. A picture is taken (for example using a webcam) for comparison, and to send to the user at a later date as a reminder to upgrade the look (for example to get more detail into the camera image).

b. The camera image is digitally disseminated, i.e. skin tones and face shape are mapped against as many points as can be gleaned (facial recognition). In this way at least some of the key feature definitions can be generated automatically from an image of an individual without the individual having to enter data manually.

c. A comparison across other social media sites and the user's social graph is conducted to determine whether photos are available across the user's social network, e.g. facebook, tumblr, etc., and a range of pre-made avatars is offered to the user to pick the closest.

d. The user clicks on the desired pre-made avatar and is invited to upgrade and refine it to give a more accurate physical match (if not, then this prompts an email upgrade request with a live link back to the Suite). This enables the part-populated (by camera) avatar profile to be completed based on user inputs.

e. The user accepts (d), the Quick Avatar build invite, and is placed into the Quick Build exercise in the Creation Suite.

4. If the user opts to use the Manual build, then the user goes to the Avatar Creation Suite, where:

a. A picture is taken for comparison, to steer the experience towards proffering the user optimum speed-customisation-direction, and to send the user at a later date a reminder email to provide more detail.

b. The system creates a rough "working sketch" of the user based on the camera image. It is not intended that this is used (although it can be, and it forms the basis of 3 above); rather, it is intended to provoke a response from the user to create a likeness closer to their own self view.

c. The user finishes their likeness and saves it to the database.

d. The data is compartmentalised into its constituent parts, which include:
   Structure of avatar (mean height, physical characteristic parameters)
   Presentation (clothing, colour palette)
   World (viewing characteristics, three-quarters above, 2D picture, 3D rendering and geography)
   Articulation, if designed for motion (how limbs are connected, pre-ordained moves, options for individual moves controlled by the user, speed of movement)
   Particular Avatar Creation Suite options (language input/output protocols, switches, sliders, blending options, etc.)
   Options (gender, features, generic face, body)
   Data capture methodology (PC cam, picture download by user to database, mouse click (rapidity, accuracy), keyboard usage (mistakes in typing, speed etc.))
   Language/programming (cookies, inputs/outputs aligned, data presentation speeds/loads)

e. The user leaves the website.

f. The user joins an experience/game/platform that is a participant in the avatar network, e.g. its API is available to determine an appropriate format conversion, and it has incorporated a Membership login button for accessing the avatar server.

g. The user presses the Login button, and the platform/game/experience calls on the avatar database for the avatar, within format protocols appropriate to the platform.

h. The avatar database checks security and sends code in the correct format for the platform.

i. The game/experience/platform supplies the experience with an avatar that resembles the user as closely as possible. In some cases the avatar may only be part-populated by the avatar creation website, with the game/experience/platform then completing the avatar profile based on the user's inputs, and rendering the resulting avatar.

j. The usage of the style of avatar and the time spent are logged.

The creation suite may be of sufficient complexity and detail that any call from different avatar platforms will result in less than the maximum amount of data relating to the avatar being passed out to an avatar platform. The database can present the data within the options for that specific avatar platform, within the animation protocols, macros, limitations by options etc. specified for that platform. For example, for a user with dreadlocked brown hair, some avatar styles may offer this exact type, while others may not. An algorithm governing the closest style to the one in the central ID profile may, in cases where exact fidelity is not possible, offer long brown (smooth) hair instead. For example, if in the creation suite the user indicates that he or she has dreadlocks, then in all skins where dreadlocks are an option the database will choose this as suitable. If it is not an option, then hair length, colour and/or an overall style will be substituted. The outputs are therefore a best-in-class rendering of the core ID within the parameters allowed or available. In this way, the nearest approximation to the user can be achieved. This process of approximation can be extended all the way to the most primitive avatar renderings, where the resulting avatar will at least be the closest to the user's identity given the limited options available for that platform.
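The closest-style substitution described above might be sketched as follows. The style catalogue and the secondary attributes (length, colour) used for scoring are invented for illustration; a real implementation could weight attributes differently.

```python
# The user's key avatar hairstyle as stored in the central ID profile.
KEY_STYLE = {"style": "dreadlocks", "length": "long", "colour": "brown"}

def closest_style(platform_styles: list, key: dict = KEY_STYLE) -> dict:
    """Return the platform's exact style if offered, else the nearest match."""
    # An exact style match (e.g. dreadlocks offered as an option) wins outright.
    for s in platform_styles:
        if s["style"] == key["style"]:
            return s
    # Otherwise substitute the style matching the most secondary attributes.
    def score(s):
        return sum(s[a] == key[a] for a in ("length", "colour"))
    return max(platform_styles, key=score)

# A platform without dreadlocks: long smooth brown hair is the best substitute.
styles = [
    {"style": "long_smooth", "length": "long", "colour": "brown"},
    {"style": "crop", "length": "short", "colour": "brown"},
]
```

Here `closest_style(styles)` selects the long smooth brown style, mirroring the dreadlocks example in the text; on a platform that does list dreadlocks, the exact style would be chosen instead.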

Claims (24)

1. An avatar creation system, comprising: a database, operable to store, in relation to a key avatar, a set of key feature definitions representative of the appearance of the key avatar; and a conversion tool, operable to extract from the database at least a subset of the key feature definitions, and to generate from the extracted key feature definitions a set of target avatar feature definitions for rendering an avatar at a target avatar handling service.

2. An avatar creation system according to claim 1, wherein each feature definition comprises one or both of a numerical value and a text descriptor.

3. An avatar creation system according to claim 2, wherein one or more of the set of target avatar feature definitions are generated by referring to a lookup table defining, for each permissible value, range of values or text descriptor for a key feature definition, a corresponding value, range of values or text descriptor for a corresponding target avatar feature definition.

4. An avatar creation system according to claim 3, wherein a key feature definition may take on a greater number of permissible values or text descriptors than a corresponding target avatar feature definition.

5. An avatar creation system according to claim 3 or claim 4, wherein the lookup table defines a corresponding target avatar feature definition for each of a plurality of different target avatar handling services.

6. An avatar creation system according to claim 1, comprising: an avatar creation tool, operable to generate a set of key feature definitions representative of the appearance of an avatar and upload the set of key feature definitions to the database.

7. An avatar creation system according to any preceding claim, wherein the database and the conversion tool are implemented on an avatar creation server.

8. An avatar creation system according to claim 7, wherein the conversion tool is responsive to a request to extract and generate the target feature definitions, the request indicating the target avatar handling service in relation to which the target feature definitions are to be generated, the conversion tool selecting the key feature definitions for extraction and generating the appropriate corresponding target feature definitions based on the indication of the target avatar handling service provided by the request.

9. An avatar creation system according to claim 7 or claim 8, comprising: a software application providing the target avatar handling service, the software application being operable to request the avatar creation server to provide the set of target avatar feature definitions in relation to an avatar; to receive from the avatar creation server the requested set of target avatar feature definitions; to populate an avatar model using the received set of target avatar feature definitions to form an avatar; and to render a visual representation of the avatar.

10. An avatar creation system according to claim 9, wherein the software application permits a user to log in to the avatar creation server via the software application to access their avatar, the software application being responsive to the log in by the user to obtain the set of target avatar feature definitions from the avatar creation server.

11. An avatar creation system according to any one of claims 1 to 6, comprising: a software application providing the target avatar handling service, wherein the conversion tool is implemented in the software application, the software application being operable to request the avatar creation server to provide the extracted subset of key feature definitions in relation to an avatar; to generate from the extracted subset of the key feature definitions the set of target avatar feature definitions; to populate an avatar model using the set of target avatar feature definitions to form an avatar; and to render a visual representation of the avatar.

12. An avatar creation system according to claim 11, wherein the software application permits a user to log in to the avatar creation server via the software application to access their avatar, the software application being responsive to the log in by the user to obtain the subset of the key feature definitions from the database.

13. An avatar creation system according to claim 6, wherein the avatar creation tool is operable to generate at least some of the key feature definitions from an image of an individual.

14. An avatar creation system according to claim 13, wherein the image is a photo image of the user captured by a camera.

15. An avatar creation system according to claim 6, wherein the avatar creation tool is operable to partly generate an avatar profile of a user based on an image of the user, the avatar creation tool being responsive to user input to refine the avatar profile to generate the key feature definitions of the avatar for storage in the database.

16. An avatar creation system according to claim 15, wherein the image is obtained from a social media website.

17. An avatar creation system according to any preceding claim, wherein the target avatar handling service is operable to part populate an avatar profile based on the target avatar feature definitions; to complete the avatar profile based on user input; and to render an avatar based on the completed avatar profile.

18. An avatar creation system according to any preceding claim, wherein the conversion tool is operable to provide the target feature definitions to the target avatar handling service in a format selected in dependence on the target avatar handling service.

19. An avatar creation system according to any preceding claim, wherein the feature definitions relate to one or more of facial features, physical build, clothing, accessories, pets, holidays and events, sports and hobbies, movements and background scenes.

20. An avatar creation system according to any preceding claim, wherein the target feature definitions can be rendered to form a visual representation which approximates a visual representation which could be rendered using the key feature definitions.

21. An avatar creation method, comprising the steps of: storing, in a database, in relation to a key avatar, a set of key feature definitions representative of the appearance of the key avatar; extracting from the database at least a subset of the key feature definitions; and generating from the extracted key feature definitions a set of target avatar feature definitions for rendering an avatar at a target avatar handling service.

22. A computer program which when executed on a computer causes the computer to execute a method according to claim 21.

23. An avatar creation system substantially as hereinbefore described with reference to the accompanying drawings.

24. An avatar creation method substantially as hereinbefore described with reference to the accompanying drawings.
GB201312640A 2013-07-15 2013-07-15 Avatar creation system and method Pending GB2516241A (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
GB201312640A GB2516241A (en) 2013-07-15 2013-07-15 Avatar creation system and method
PCT/GB2014/052128 WO2015008042A1 (en) 2013-07-15 2014-07-11 Avatar creation system and method

Publications (2)

Publication Number Publication Date
GB201312640D0 GB201312640D0 (en) 2013-08-28
GB2516241A true GB2516241A (en) 2015-01-21



Also Published As

Publication Number Publication Date
WO2015008042A1 (en) 2015-01-22
