WO2007007020A1 - System of animated, dynamic, expressive and synchronised non-voice mobile gesturing/messaging - Google Patents


Info

Publication number
WO2007007020A1 (PCT/GB2006/001129)
Authority
WIPO (PCT)
Prior art keywords
character, user, application, message, data
Application number
PCT/GB2006/001129
Other languages
French (fr)
Inventor
Daniel Maclaren
Original Assignee
Minimemobile Ltd
Priority claimed from GB0502174A and GB0520651A
Application filed by Minimemobile Ltd
Publication of WO2007007020A1


Classifications

    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B29: WORKING OF PLASTICS; WORKING OF SUBSTANCES IN A PLASTIC STATE IN GENERAL
    • B29C: SHAPING OR JOINING OF PLASTICS; SHAPING OF MATERIAL IN A PLASTIC STATE, NOT OTHERWISE PROVIDED FOR; AFTER-TREATMENT OF THE SHAPED PRODUCTS, e.g. REPAIRING
    • B29C 65/00: Joining or sealing of preformed parts, e.g. welding of plastics materials; Apparatus therefor
    • B29C 65/78: Means for handling the parts to be joined, e.g. for making containers or hollow articles, e.g. means for handling sheets, plates, web-like materials, tubular articles, hollow articles or elements to be joined therewith; Means for discharging the joined articles from the joining apparatus
    • B29C 65/7802: Positioning the parts to be joined, e.g. aligning, indexing or centring
    • B29C 65/7832: Positioning the parts to be joined by setting the overlap between the parts to be joined, e.g. the overlap between sheets, plates or web-like materials
    • B65: CONVEYING; PACKING; STORING; HANDLING THIN OR FILAMENTARY MATERIAL
    • B65H: HANDLING THIN OR FILAMENTARY MATERIAL, e.g. SHEETS, WEBS, CABLES
    • B65H 27/00: Special constructions, e.g. surface features, of feed or guide rollers for webs
    • D05: SEWING; EMBROIDERING; TUFTING
    • D05B: SEWING
    • D05B 35/00: Work-feeding or -handling elements not otherwise provided for
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04L: TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L 67/00: Network arrangements or protocols for supporting network services or applications
    • H04L 67/01: Protocols
    • H04L 67/04: Protocols specially adapted for terminals or networks with limited capabilities; specially adapted for terminal portability
    • H04L 67/131: Protocols for games, networked simulations or virtual reality
    • H04L 67/50: Network services
    • H04L 67/75: Indicating network or usage conditions on the user display
    • H04L 69/00: Network arrangements, protocols or services independent of the application payload and not provided for in the other groups of this subclass
    • H04L 69/30: Definitions, standards or architectural aspects of layered protocol stacks
    • H04L 69/32: Architecture of open systems interconnection [OSI] 7-layer type protocol stacks, e.g. the interfaces between the data link level and the physical level
    • H04L 69/322: Intralayer communication protocols among peer entities or protocol data unit [PDU] definitions
    • H04L 69/329: Intralayer communication protocols among peer entities or protocol data unit [PDU] definitions in the application layer [OSI layer 7]

Definitions

  • SMS: Short Message Service
  • MMS: Picture and Video messaging
  • the present Invention provides products, processes, apparatus and systems for Users to send and receive graphical, character-based, dynamic, animated, expressive and synchronised non-voice mobile messages using mobile wireless devices 50.
  • the present Invention incorporates an application which allows the creation of these Characters or Avatars to a high specification to resemble the appearance of the User and mimic the behaviour of the user, as illustrated in Figure 4.
  • the main application houses these characters in indoor or outdoor graphical spaces ('Environments') which also contains objects available to the Character to use in its daily activities.
  • the physical appearance changes over time to simulate growth of the avatar and to reflect its mood, which depends on the level of care it has received.
  • the behaviour also changes over time in accordance with the inbuilt artificial intelligence (AI), which reflects whether the avatar is happy with the level of care it has received.
  • a happy avatar displays a clearly happy countenance and performs caring actions to delight its owner, e.g. dancing.
  • An unhappy avatar destroys objects in its environment by setting them on fire or breaking them and displays an unhappy or angry countenance.
  • the present invention in one embodiment supports mobile phones, PDA, desktop computers and the like.
  • using a Java Enabled Mobile Phone to use the character based, dynamic, synchronised, expressive, animated messaging ('Chat' or 'Chat Messaging')
  • a user 'User A'
  • invites a user(s) from their address book 'User B'
  • the request goes to the Gateway server which finds a place on the same Messaging server for all the participants to chat.
  • User B receives a message saying that User A has requested a chat; should they press yes, the chat is initiated, User A's character walks onto the screen to join User B's Character, and User B's Character walks onto the screen of User A's phone to join User A's character.
  • User A or B then selects or types a desired emotion in conjunction with a proposed action and enters the message (as illustrated in Figure 14).
  • This message is data rather than images; it is directed through a database so that the User can be billed the minimum for each message sent and all appropriate data, such as the recipient(s)' details, are recorded.
  • This message data is translated on the recipient's phone to produce the animation, emotion and action.
  • the message will appear on the screen of the recipient's phone and their character will display the desired emotion and perform the action to the recipient's character. This can be observed on both User A's and User B's phones.
  • the textual component of the return message is presented underneath any previous messages and may be scrolled through to see what occurred previously.
  • the present invention overcomes the limitations of existing forms of non-voice mobile communication by allowing mobile phone Users ('Users') to communicate through miniature, artificially intelligent avatars ('Character(s)', 'SimBaby' or 'SimBabies', as applicable) based closely upon their own appearance and personalities.
  • These Characters reside within an application on the Users' handset, in one embodiment in the wallpaper space of the Users' handset. More specifically, the Characters reside in 'Environments', which consist of virtual spaces, i.e. outdoor or indoor settings, that can be designed by the User (or pre-designed by a third party) to be as large or small as the phone's memory will allow, as these environments are split over multiple screens.
  • the invention creates a 'virtual mobile world' through which mobile phone Users can communicate and interact with one another in an animated, dynamic and expressive manner, much as they would in the real and physical world.
  • a User can communicate with other Users through visual, highly personalised, miniature animations based upon his or her own nature and appearance;
  • the animation is dynamic, meaning that movement is not a pre-rendered animation sent by the server; it happens live on the handset, driven by data which manipulates the characters on the handsets;
  • a User can meaningfully communicate mood, movement, gesticulation and action for the first time;
  • a User can simultaneously view a multiple-party exchange of communication (i.e. incoming and outgoing messages from all Users participating in a communication) on a single handset in an apparent 'real-time' capacity;
  • Two or more Users can view a 'synchronised communication'.
  • the term 'synchronised' describes the outcome whereby the animated character of the sender will appear on the handset screen of the recipient alongside the animated character of the recipient, whilst simultaneously appearing on the screen of the sender (i.e. the sender of a message and the recipient can view the same screen on their separate handsets and watch how a particular animated interaction or message is played out).
  • the invention allows Users to communicate in a highly personal, visual way that overcomes the constraints of time and location. In effect, Users can communicate through and participate in a constantly changing and evolving cartoon of their own lives which is played out and simulated on their mobile phones.
  • the invention includes:
  • the invention also includes a feature currently called "MiniMeLove". This builds on the personal data provided by the User in accessing the invention.
  • the feature requires the User to input further data about the qualities of that User's ideal love partner. Once this data has been stored and integrated into the Application, the User can use Bluetooth or Infra Red technologies to see how good or bad a love match another acquiescing User is.
  • Animations and gestures are achieved by sending data which animates the character on the handset
  • the application can be updated. This means image files and code can be sent to the application.
  • the effect is that new environments can be added, characters can grow, characters can change clothing, objects and the data of how they are used can be added and the character can have abilities (actions/gestures) added
  • the character is artificially intelligent, meaning that in one embodiment the character can hold a conversation with a single user, and its behaviour and attitude to objects and its owner can change to reflect how the owner has "treated" it.
  • the character can accept data from third parties to communicate with the user. For instance the character could read out lottery results or dress appropriately for the weather forecast.
  • unique words entered into the users phone dictionary can be incorporated into the AI server so that the character communicates with the user, using words and phrases that the user often employs
  • the Character is a miniature version of the User; however, it is also possible to create a fictional Character. Additionally, the Character concept should be interpreted to extend to miniature pets, aliens, secretaries or any other animatable subject. The invention can also be extended to facilitate Mobile Gambling, and the physical character can be used in third party games as the protagonist.
  • Two or more Users can view a 'synchronised animated communication' as described earlier.
  • a User can imitate and communicate real life gestures and interactions with objects and avatars (for example, a User A's Character "giving" a bouquet of flowers to User B's Character or User A's Character kicking User B's Character with the result of User B's Character exhibiting pain);
  • Figure 1 is a functional block diagram illustrating application of an embodiment of the present invention to telecommunications network/TCP/IP Network infrastructure.
  • Figure 2 is a flow chart diagram illustrating a method allowing for the generation of a graphically expressive synchronised chat message according to an embodiment of the present invention
  • Figure 3 is a flow chart diagram providing a method allowing for the generation of a graphically expressive message according to an embodiment of the present invention
  • Figure 4 is a user flow diagram of the GUI (screenshots) of the Setup Application used to create a Character and initially define its physical and behavioural characteristics.
  • FIG. 5 is a functional block diagram illustrating application of an embodiment of the present invention in a load balanced, fully redundant, clustered network infrastructure. Specifically this diagram focuses on Chat and Message servers.
  • Figure 6A-6D is a flow chart diagram setting forth the overall process flow associated with the Setup Application according to an embodiment of the present invention
  • Figure 7 is an architectural chart diagram setting forth the architecture of the Setup Application.
  • Figure 8 is a diagram setting forth the architecture for the current embodiment of the J2ME Client Application
  • FIG. 9 is a diagram setting forth the architecture for the current embodiment of the J2ME Setup Application
  • Figure 10 is a diagram setting forth the flow of data between classes for the current embodiment of the J2ME Client Application.
  • Figure 11 is a screenshot of the Character Statistics Screen including a close graphic representation of the character and its mood and the behavioural characteristics according to an embodiment of the current invention.
  • Figure 12 is a series of screenshots of a Character in its environment according to an embodiment of the current invention
  • Figure 13 is a series of screenshots illustrating the user sequence for receiving dynamic, expressive animated messages through character transference.
  • Figure 14 is a series of screenshots illustrating the sequence of sending and receiving dynamic, expressive, animated, synchronised chat messages through simultaneous character transference, using the command line and chat screen and includes the text only screen and message details
  • Figure 15 is a flow chart diagram setting forth the overall process flow associated with the receiving, viewing and animating of a message.
  • Figure 16 is a flow chart diagram setting forth the overall process flow associated with the Client Application initialization and launching.
  • FIG. 1 illustrates a network environment including an embodiment of the present invention.
  • the system, Server Side Application 30, is operably connected to the Internet (IP Network 20) in order to send and receive data, via a socket(s) or other appropriate data transport protocol, to and from end systems such as client computers 55 or Client Handsets 50.
  • IP Network 20: Internet
  • the network environment includes wireless network 40.
  • wireless network 40 comprises gateways for EDGE/GPRS/SMS and any other suitable data transport layer.
  • Gateway 12 balances the data load of incoming requests and directs them to the appropriate clustered server to provide redundancy in a scalable environment.
  • SMS Gateway 26 is a failover: should the user not have access to GPRS, it allows for the sending of administration messages or any other data to the handset 50 using SMS. It also allows handsets to send and receive messages to and from each other using SMS rather than GPRS.
  • Client computers 55 are connected to IP Network 20 through any means such as an Internet Service Provider (ISP) and the like.
  • Client computers can be any suitable internet-enabled computing device, e.g. a desktop computer, laptop or PDA, which is capable of sending and/or receiving data, be it in a wireless or fixed-wire capacity.
  • client computer 55 includes instant messenger software as a software container for sending, viewing and receiving animated, dynamic, expressive and synchronised messages.
  • Server side application 30 in one embodiment includes User Details Database 36, Message Server 31, Message Database 38, Chat Server 35, Application Server 32, Application Database 37, Character Database 39, Object Database 34, Environments Database 43, Artificial Intelligence Server 41 and Artificial Intelligence Database 42.
  • Server Side Application, 30 is operative to execute the functionality, described herein, allowing users to send and receive animated, dynamic, expressive and synchronised non-voice messages.
  • Server Side Application 30 packages and distributes the client application to handsets 50 that allows the creation of animated, dynamic, expressive and synchronised non-voice messages.
  • Server Side Application 30 operates in connection with User Detail Database 36, Application Server 32, Application Database 37, Object Database 34, Environments Database 43 and Character Database 39.
  • This system is operative to dynamically package and distribute the jar (J2ME) file to a variety of mobile handsets 50 based on the User's custom character specifications and necessary changes that need to be made to the jar file to be compatible with various handsets 50.
  • the jar file is the Client Side Application which is installed on mobile handset 50 and is detailed further on in this document.
  • Application Server 32 hosts an application (Client Application Compiler) which, when a request for the client application is received, polls User Detail Database 36 to discover the user's phone number, the handset type and model the user has, the physical characteristics the user has requested for their character (e.g. white male, black hair, blue eyes, athletic), the behavioural characteristics the user has weighted for their character (e.g. Intelligence 5, Mischievous 7, Self-reliant 0, Quick-tempered 9, as illustrated in Figure 4; these characteristics affect how the character behaves in its environment) and the environment they have selected to start with (e.g. Apartment, as illustrated in Figure 12).
  • the Character Database 39 is polled for the PNGs (image files) which correspond with the specified physical characteristics; these are sent to the Client Application Compiler hosted on the Application Server 32. PNGs corresponding with the user's requests for objects and/or environments are also sent to the Application Server 32.
  • the Application Database 37 changes variables to affect how a character will behave in its environment according to the user defined behaviours.
  • Once the component parts have been collated on the Application Server 32, they are packaged to form the user-requested jar file (Client Application) and sent to the phone number recorded in the User Detail Database 36, as sketched below.
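The packaging step might look like the following minimal sketch, using the standard java.util.jar API. This is a hedged, illustrative reconstruction: the class name ClientAppPackager, the jar entry paths and the behaviour.properties resource are assumptions, not details from the patent.

```java
import java.io.FileOutputStream;
import java.io.IOException;
import java.util.Map;
import java.util.Properties;
import java.util.jar.Attributes;
import java.util.jar.JarEntry;
import java.util.jar.JarOutputStream;
import java.util.jar.Manifest;

// Hedged sketch of the Client Application Compiler's packaging step: bundle
// the character PNGs and the user-weighted behavioural variables into the
// jar (Client Application) built for the user's handset model.
public class ClientAppPackager {

    public static void packageJar(Map<String, byte[]> characterPngs, // e.g. "head.png" -> bytes from Character Database 39
                                  Properties behaviourVars,          // e.g. intelligence=5, mischievous=7
                                  String outJarPath) throws IOException {
        Manifest manifest = new Manifest();
        manifest.getMainAttributes().put(Attributes.Name.MANIFEST_VERSION, "1.0");

        try (JarOutputStream jar = new JarOutputStream(new FileOutputStream(outJarPath), manifest)) {
            // The handset-specific MIDlet classes would be copied in here (omitted).
            for (Map.Entry<String, byte[]> png : characterPngs.entrySet()) {
                jar.putNextEntry(new JarEntry("images/" + png.getKey()));
                jar.write(png.getValue());
                jar.closeEntry();
            }
            // Store the behavioural variables so the client's AI can read them.
            jar.putNextEntry(new JarEntry("behaviour.properties"));
            behaviourVars.store(jar, "user-weighted behavioural characteristics");
            jar.closeEntry();
        }
        // The finished jar is then sent to the phone number recorded in
        // User Detail Database 36.
    }
}
```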
  • Application Server 32 can store more versions of the Client Application Compiler than just J2ME (Java 2 Mobile Edition) for forming the client application, such as Binary Runtime Environment for Wireless (BREW) or iMode enabled devices.
  • J2ME: Java 2 Mobile Edition
  • Message Database 38 is responsible for storing message data corresponding to messages created by users.
  • each message is stored in a table including, for each message, fields such as the message state (incoming or outgoing), sender name, sender phone number, date and time, receiver name and message text (see the Message class later in this document).
  • the messages are deleted from the Message Database 38.
  • the User Detail Database is polled for a login and password as is common in the art.
  • Gateway 12 identifies if incoming data is a chat communication or a message communication and sends it to the appropriate server.
  • the Message Server 31 is operative to serve message data corresponding to messages created by users.
  • Message Server 31 receives message data from a user and sends it to Message Database 38 for storage.
  • When data is requested from Message Database 38, it is passed to Message Server 31 and sent to handset 50 or computer 55 via the same protocol by which it was received.
  • Message Server may involve a clustered environment with a gateway for load balancing and identification of the type of message being sent/received.
  • Chat server 35 is operative to serve message data corresponding to Chat Messages created by users.
  • the Chat Server 35 itself is in a clustered environment (As illustrated in Figure 5) containing a Gateway Server 12 that provides Network Load Balancing to ensure redundancy by providing a set of hosts that deal with the communications.
  • the network load balancer addresses a gateway web server that is least busy (i.e. Gateway Server A).
  • A background process on the gateway server checks that one of the peer hosts is running and has enough space to handle the traffic for the number of requested chat participants. The user is then redirected to that host; otherwise the gateway looks for another host to try to connect to.
  • a database on the server flags where the chat user has connected and sends the request to the remaining participants, who will be directed straight to the correct server. A sketch of this host-selection logic follows.
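The gateway's host-selection behaviour described above can be pictured as a minimal sketch; ChatHost and its fields are illustrative names under assumed semantics, not classes from the patent.

```java
import java.util.List;

// Hedged sketch of the gateway's host selection: pick the first peer host
// that is up and can seat every requested chat participant.
public class ChatGateway {

    public static class ChatHost {
        String address;
        boolean running;
        int freeSlots; // remaining capacity, in chat participants

        ChatHost(String address, boolean running, int freeSlots) {
            this.address = address;
            this.running = running;
            this.freeSlots = freeSlots;
        }
    }

    /** Returns the host the user (and later the invitees) is redirected to. */
    public static ChatHost selectHost(List<ChatHost> peers, int participants) {
        for (ChatHost host : peers) {
            if (host.running && host.freeSlots >= participants) {
                return host;
            }
        }
        return null; // no capacity: the gateway would keep looking or reject the chat
    }
}
```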
  • Each chat message is stored as text based characters which are translated on the application installed client handset 50.
  • Environments Database 43 is operative to store the graphics of each environment at a screen size which has been optimised for the various target devices on which the client application is installed.
  • the AI Server 41 (Artificial Intelligence Server) is operative to interpret questions and statements put to it and to reply to the user in a coherent and contextually correct manner.
  • the character communicates with the user in conversational terms, e.g. User: "How are you today?" Character: "I am well."
  • ALICE is a preexisting Artificial Intelligence communications program, freely available from www.alice.org, which is installed on the server.
  • the AI Server also polls the User Detail Database 36 for the phone number of the user, the age and the mood of the character and uses this data to access the appropriate vocabulary from the AI Database 42 for the character, when communicating with the user.
  • the AI server can also be used to push messages to a connected user in an effort for the character to start a conversation.
  • the AI Database 42 (Artificial Intelligence Database) is operative to define and store the words and language used by a character when communicating with its user. Data uploaded from the user's custom dictionary is stored in AIML (Artificial Intelligence Markup Language) for retrieval by the ALICE system on the AI Server 41.
  • AIML: Artificial Intelligence Markup Language
  • the Client Application sends all custom words which have been entered into the user's handset dictionary via Communications Layer 814 to AI Database 42 for insertion into the database, which is linked to the User's Profile on the User Details Database 36.
  • Figure 3 sets forth the process flow associated with an embodiment of the present invention for the sending, receiving, viewing and deleting of messages.
  • a user accesses the messaging component on the Client Application, causing the client application to transmit a request to the Message Server 31.
  • when the Message Server 31 receives the request (step 230), it analyses the request to identify the messaging account by the primary key of phone number.
  • the account is identified by user name only which indicates that the request is coming from an instant messaging application and the messages should be sent to an individual desktop computer 55 via HTTP, SOAP or any other appropriate protocol.
  • the messages are retrieved from the Message Database (step 230) and sent to the Client Application Inbox via GPRS for display (step 232). Any messages sent to the client application on the handset 50 or desktop computer 55 are flagged as viewed and are not sent to the client application again.
  • the client application displays the textual component of the message as headlines in bold (step 232).
  • the data is parsed to separate the emotion and action instructions from the textual component of the message.
  • the emotion instruction translates to the appropriate expression PNG (image file), which is transposed onto the face of the character sent in the message.
  • the action instruction translates on the client application installed on handset 50 to move the limbs and body of the sender's character to produce the animation of that action in conjunction with the receiver's character. A sketch of this parsing step follows.
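As a hedged illustration of this client-side parse: the *Emotion* (action) wire format below is an assumption drawn from the command-language examples later in this document, and the class names are illustrative.

```java
import java.util.ArrayList;
import java.util.List;

// Minimal sketch: separate the emotion and action instructions from the
// textual component of an incoming message.
public class MessageParser {

    public static class ParsedMessage {
        public String text = "";
        public List<String> emotions = new ArrayList<>();
        public List<String> actions = new ArrayList<>();
    }

    public static ParsedMessage parse(String raw) {
        ParsedMessage out = new ParsedMessage();
        StringBuilder text = new StringBuilder();
        int i = 0;
        while (i < raw.length()) {
            char c = raw.charAt(i);
            int end;
            if (c == '*' && (end = raw.indexOf('*', i + 1)) > i) {
                out.emotions.add(raw.substring(i + 1, end)); // e.g. "Happy" -> expression PNG
                i = end + 1;
            } else if (c == '(' && (end = raw.indexOf(')', i + 1)) > i) {
                out.actions.add(raw.substring(i + 1, end));  // e.g. "kiss" -> limb animation
                i = end + 1;
            } else {
                text.append(c);
                i++;
            }
        }
        out.text = text.toString().trim();
        return out;
    }
}
```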
  • Figure 3 sets forth the process flow associated with an embodiment of the present invention for the replying to and deleting of messages which occurs on the Client Side Application.
  • the delete function (240) is initiated from under "Options" of the list view of the Inbox displaying the text based headlines (232) and is controlled by the Business Layer 816, specifically the Message and Contacts Management Component 850, which is illustrated in Figure 8.
  • the user is prompted with the warning "Delete Message?"; should yes be pressed, the message is removed from the Client Application.
  • Wireless Network 40 enables communication between wireless device 50 and other systems operably connected thereto, specifically the internet or any other suitable IP Network.
  • Wireless Network 40 can be any suitable digital or analogue network that supports the transfer of text and/or data including, but not limited to Time Division Multiple Access (TDMA) Network, a Global System for Mobile communication (GSM) Network, or a Code-Division Multiple Access Network.
  • TDMA: Time Division Multiple Access
  • GSM: Global System for Mobile communication
  • Wireless Network 40 includes functionality supporting GPRS, a communication protocol enabling wireless devices 50 to access the internet or similar computer network.
  • Wireless network 40 includes functionality supporting SMS to send data to and from Server Side Application 30 and handset 50.
  • Wireless devices 50 are operative to receive data from wireless network 40 and transmit data to wireless network 40 for routing to appropriate devices.
  • Wireless devices 50 in one embodiment, are Internet-enabled devices capable of sending and receiving data from remote servers.
  • the Client Application has the following components
  • a character is created by a Java application installed on the client handset 50 (Setup Application).
  • the architecture of the Setup Application is illustrated in Figure 9.
  • This application is operative to gather information to a high specification on the physical and behavioural characteristics, which are sent to the User Database (Figure 1: 36) in order to form a profile of the current character as detailed previously.
  • the correct Setup Application for the phone type and model is selected by the user by sending an alphanumeric text message to an SMS gateway.
  • the alphanumeric text message is interpreted by Application Server 32 and will send out the Setup Application which is correct for that phone type and model.
  • This method of Java application distribution is common in the art.
  • In order to create a Character on the User handset, the User needs the "Make a Simbaby" component of the Application ('Setup Application', as illustrated in Figure 4). In an embodiment of the current invention, this is a J2ME program, downloadable from the Application Server 32, which is installed on the handset 50.
  • the Setup Application holds all image files (PNGs) and the Business Logic (Figure 7) and displays these image files instantly in accordance with user selections, for instant feedback on changes to the character's physical characteristics.
  • the Setup Application GUI 700 allows users to manipulate parameters to a high physical and behavioural specification.
  • the Parameter Interpreter 702 compresses this data before transferring it to the Communications Module 706 which is responsible for all communication external to the Setup Application.
  • the Setup Application sets the following Character parameters, which are stored in the User Details Database 36 as a User Profile when the character setup is saved or completed.
  • the user's phone number is the primary key under which the user's profile is created. Full details of the contents of a User's Profile are contained in this document.
  • the entered data tells the application the user's age (dd-mm-yyyy).
  • the Character Database 39 selects the corresponding image(s) that make up the character, which are compiled into the final client application when the character setup is complete.
  • the Height of the character can be increased or decreased proportionally on a scale of 1-5. This number is applied to the overall Body Style and stored in the User Profile
  • the Size of the character refers to the width or fatness of the character and can be increased or decreased proportionally on a scale of 1-5. This number is applied to the overall Body Style and stored in the User Profile
  • the length of the character's face can be increased or decreased proportionally on a scale of 1-5. This number is applied to the overall Face Style and stored in the User Profile where it will be referred to by the Character Database 39 when compiling the graphics or image files for this body part(s) into the final client application.
  • Width refers to the width or roundness of the character's face and can be increased or decreased proportionally on a scale of 1-5. This number is applied to the overall Facial Style and stored in the User Profile where it will be referred to by the Character Database 39 when compiling the graphics or image files for this body part(s) into the final client application.
  • the length of the character's hair can be increased or decreased proportionally on a scale of 1-10. This number is stored in the User Profile where it will be referred to by the Character Database 39 when compiling the graphics or image files for this body part(s) into the final client application.
  • Hair Curl (Curl): Curl refers to the curliness of a character's hair and can be increased or decreased proportionally on a scale of 1-5. This number is applied to the overall Hair Style and is stored in the User Profile where it is referred to by the Character Database 39 when compiling the graphics or image files for this body part(s) into the final client application.
  • Color refers to the colour of the character's hair and can be selected on the handset 50.
  • the selected hair colour is given a code and is stored in the User Profile where it is referred to by the Character Database 39 when compiling the graphics or image files for this body part(s) into the final client application.
  • the height of the character's eyes can be increased or decreased proportionally on a scale of 1-5. This number is stored in the User Profile where it will be referred to by the Character Database 39 when compiling the graphics or image files for this body part(s) into the final client application.
  • Width refers to the width of a character's eyes, or the horizontal space they take up on the face, and can be increased or decreased proportionally on a scale of 1-5. This number is applied to the overall Eye Style and is stored in the User Profile where it is referred to by the Character Database 39 when compiling the graphics or image files for this body part(s) into the final client application.
  • Color refers to the colour of the character's eyes and can be selected on the handset 50.
  • the selected eye colour is given a code and is stored in the User Profile where it is referred to by the Character Database 39 when compiling the graphics or image files for this body part(s) into the final client application.
  • Nose Style (Style): There are multiple styles of noses for the characters.
  • a nose style is given a code which will eventually be stored in the Users Profile where it will be referred to by the Character Database 39 when compiling the graphics or image files for this body part(s) into the final client application.
  • the length of the character's nose can be increased or decreased proportionally on a scale of 1-5. This number is stored in the User Profile where it will be referred to by the Character Database 39 when compiling the graphics or image files for this body part(s) into the final client application.
  • Width refers to the width of a character's nose, or the horizontal space it occupies on the character's face, and can be increased or decreased proportionally on a scale of 1-5. This number is applied to the overall Nose Style and is stored in the User Profile where it is referred to by the Character Database 39 when compiling the graphics or image files for this body part(s) into the final client application.
  • the width of the character's mouth, or the horizontal space that it occupies across the character's face, can be increased or decreased proportionally on a scale of 1-5. This number is stored in the User Profile where it will be referred to by the Character Database 39 when compiling the graphics or image files for this body part(s) into the final client application.
  • Width refers to the thickness of the character's lips and can be increased or decreased proportionally on a scale of 1-5. This number is applied to the overall Mouth Style and is stored in the User Profile where it is referred to by the Character Database 39 when compiling the graphics or image files for this body part(s) into the final client application.
  • Lip Colour refers to the colour of the character's lips and can be selected on the handset 50.
  • the selected lip colour is given a code and is stored in the User Profile where it is referred to by the Character Database 39 when compiling the graphics or image files for this body part(s) into the final client application.
  • the following parameters will be referred to by the Application Database 37 to add the correct behavioural variables to the compiled client application.
  • These behavioural variables determine the frequency of specific actions and the likelihood of specific moods when the Character is living in its environment (Tamagotchi Mode).
  • These behavioural parameters change over time in accordance with how the user has looked after or conversely, neglected its character.
  • the Setup Application Behavioural Parameters are paired and traded off against each other so that a Character cannot be set up with "perfect" behaviours. For example, Intelligence and Strength are paired: as Intelligence is increased, Strength is decreased, as sketched below.
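A minimal sketch of this paired trade-off, assuming the 0-10 scale quoted in this section; the class and method names are illustrative, not from the patent.

```java
// Hedged sketch of paired behavioural parameters: raising one side of a pair
// lowers the other, so a "perfect" character cannot be configured.
public class BehaviourPairing {

    private int intelligence = 5;
    private int strength = 5;

    /** Setting Intelligence trades off against Strength. */
    public void setIntelligence(int value) {
        intelligence = clamp(value);
        strength = clamp(10 - intelligence); // paired trade-off
    }

    /** And vice versa. */
    public void setStrength(int value) {
        strength = clamp(value);
        intelligence = clamp(10 - strength);
    }

    private static int clamp(int v) {
        return Math.max(0, Math.min(10, v));
    }
}
```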
  • the following Behavioural parameters are stored in the User Details Database 36 as a User Profile.
  • a highly intelligent character will have access to more actions than an unintelligent character.
  • a Character with a high strength score will be able to inflict more damage on another user's character and destroy an object with fewer blows.
  • a score of 10 means that there is no likelihood of the character destroying or throwing items within its environment.
  • a score of 10 means that there is a guaranteed likelihood of the character demanding attention, playing games or having a conversation
  • a score of 10 means that there is no likelihood of the character demanding attention, playing games or having a conversation
  • the user is prompted to enter the Name of the Character.
  • the Name of the character and the physical and behavioural characteristics are then updated via GPRS or SMS to the User Profile on the User Details Database 36.
  • GUI for the Setup Application is in Flash (swf) and all gathered user, physical and behavioural data is sent to the same databases, through Gateway 12 using HTTP or any other suitable TCP/IP protocol.
  • Application Database 37 selects the appropriate Java base application that has been optimised for the user's phone type and model. It then collects the appropriate image files from Character Database 39 that corresponds with the chosen Physical Characteristics of the user's character from the User Profile on User Details Database 36. Application Database 37 also changes the behavioural variables in accordance with the Behavioural characteristics selected in the Setup Application and stored in the User Profile on User Details Database 36. The Java application is then compiled on Application Server 32 and sent to the target device's Phone number which is stored in the User Details Database 36.
  • the client Application consists of the following components:
  • Chat Client (Figure 9: 904): sets up the chat sessions; sends/receives messages via the appropriate protocol
  • Client Application 800 in one embodiment includes User Key Events 802, Non-Interactive Actions 804, Server Messages 806, Action Event Controller Layer 808, Key Events Listener 810, AI (Artificial Intelligence) Action Generator 812, Communication Layer 814, Business Layer 816, Application State Registry 818, Command Interpreter 820, Message Interpreter 822, Environments Layer 824 (which comprises Bedroom 826, Environment Map 828, Environment Layers 830, Environment Actions 832, Office 834, Kitchen 836, Bathroom 838 and any other environment loaded onto the handset), Items Layer 840, Item 842, Item Layer 844, Item Physics 846, Menu/Message Screen Generator Component 848, Message and Contacts Management Component 850, Chat Component 852 and Data Layer 854.
  • the architecture may change slightly as the art develops, but the idea remains the same
  • the following components are implemented as a collection of functionalities and functional areas.
  • This component traps messages and key events from User 802 input or Server 806 and passes the events to the business layer 816 for execution.
  • This component is divided into 3 sections (Key Events Listener 810, Screensaver Mode (Native Functions and Data), Server Message Listener/Communication Layer 814) and each section is responsible for trapping data coming from various sources and passing them to the business Layer 816 for processing.
  • Application State Registry 818 Is operative to record the state of the Application and what function it is performing at that time, whether it be Tamagotchi Mode, Chat or Messaging
  • AI: Artificial Intelligence
  • Action Generator 812 is operative to calculate the actions of the character in Tamagotchi mode based on its mood and behaviour
  • Command Interpreter 820 Parses instructions typed by the User to the Character to perform actions, change moods or enter other environments
  • Message Interpreter 822 is operative to receive messages from the appropriate Communication modules and identify whether each is a chat communication or a message communication. In another embodiment it parses messages and chat data into actions, message text and emotions and sends them to the appropriate server or gateway 12.
  • Menu/Message Screen Generator Component 848 is operative to generate menus based on Application state
  • Message and Contacts Management Component 850 is operative to manage whether messages have been deleted from the application (RMS) or received from the Message Server 31 and added to the RMS. Likewise the component allows the adding/deletion of contacts, their names and phone numbers from the Client Application (RMS)
  • Chat Component 852 is operative to manage requests to and from the Chat Server 35
  • Data Layer (Figure 8: 854): is operative to provide an interface to the RMS using generic functions to add, update, delete and synchronize records from the Client Application to the Server Side Application 30, specifically synchronizing the Character Logs with the User Details Database 36. The following is stored on the data layer:
  ◦ User profile including phone model, type, identifier and number. This data is stored in case it needs to be polled by the server
  • Environment Layer ( Figure 8: 824): This layer is operative to load the necessary data and images (pngs) to populate and display the Environments.
  • the following data is loaded:
  ◦ Environment Map 828 contains the data of x and y coordinates of the screen for the specific handset 50 make and model. It also contains the x and y coordinates of the images representing items which populate the Environment.
  ◦ Environment Layer 830 contains the layering information. This information determines the z-axis coordinates of the objects based on the Character's position. For example, should a character be required to go from the front of a desk to behind a desk, the z-axis coordinates must swap.
  ◦ Environment Action 832 handles all data and images related to an action/activity that may be performed by a character in a specific Environment.
  • the Communication Layer implements all communications related to the receiving and sending of Chat and Messages to and from the Server via GPRS, Sockets, SMS or any other appropriate protocol. In one embodiment of the present invention it listens to specified ports via a TCP/IP socket connection for incoming requests and responses and passes them on to the Message Interpreter 822 for translation and appropriate handling.
  • a Socket connection is set up (over GPRS) with the Server at the time of initialization of the Client Application; a sketch of this listening loop follows.
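A hedged J2ME sketch of that listening loop, using the MIDP Generic Connection Framework. The port number and the MessageInterpreter hook are illustrative assumptions; the patent says only that "specified ports" are monitored.

```java
import java.io.InputStream;
import javax.microedition.io.Connector;
import javax.microedition.io.ServerSocketConnection;
import javax.microedition.io.StreamConnection;

// Accept incoming TCP/IP connections (over GPRS) and hand each payload to
// the Message Interpreter 822 for translation and handling.
public class CommunicationLayer implements Runnable {

    private static final int PORT = 5000; // assumed for illustration

    public void run() {
        try {
            ServerSocketConnection server =
                    (ServerSocketConnection) Connector.open("socket://:" + PORT);
            while (true) {
                StreamConnection conn = server.acceptAndOpen();
                InputStream in = conn.openInputStream();
                byte[] buf = new byte[512];
                int n = in.read(buf);
                if (n > 0) {
                    MessageInterpreter.interpret(new String(buf, 0, n));
                }
                in.close();
                conn.close();
            }
        } catch (Exception e) {
            // a real MIDlet would surface this through the UI layer
        }
    }
}

// Stub standing in for Message Interpreter 822 (Figure 8); illustrative only.
class MessageInterpreter {
    static void interpret(String payload) {
        // identify chat vs message, then split into text, emotion and action
    }
}
```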
  • Administration Layer (Figure 8: 856): is operative to cater to administrative functionalities such as:
  • Figure 10 illustrates the class diagram for the J2ME embodiment of the current invention (Client Application)
  • the Client Application Class is the MIDlet class and is operative to handle all states of the Client Application from the device (50) Operating System provided by the Java Virtual Machine implementation. This is the interface between the system device and the Java Application (Client Application).
  • the Application Manager Class is operative to be the center controller class of the J2ME Client Application. This Class Registers the state of the Application and prompts the Client Application to perform the necessary actions, traps key events, generates message boxes, generates menus, registers with the communication layer for messages and chat events, draws necessary UI, Character and Environments.
  • PNG Maps referencing images
  • this class brings out the images of that particular character type, assembles the images (body, hands, legs etc) and displays them to the User.
  • the Character class contains the x and y coordinates of all images, which are contained in the frames that, when animated, create actions and emotions; a sketch of this frame data follows.
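A minimal sketch of what that frame data could look like; all names here are illustrative assumptions, not taken from the patent.

```java
// Each frame records x and y coordinates for every body-part image, and
// stepping through the frames in order produces an action or emotion.
public class CharacterFrames {

    /** One animation frame: where each body-part PNG is drawn. */
    public static class Frame {
        public final int[] x; // one entry per body part (head, body, arms, legs...)
        public final int[] y;

        public Frame(int[] x, int[] y) {
            this.x = x;
            this.y = y;
        }
    }

    /** Hook for whatever actually blits the PNGs (e.g. a Canvas in J2ME). */
    public interface Renderer {
        void draw(Frame frame);
    }

    /** Play an action: draw each frame in order to animate limbs and body. */
    public static void play(Frame[] frames, Renderer renderer) {
        for (int i = 0; i < frames.length; i++) {
            renderer.draw(frames[i]); // real code would also pace the frames
        }
    }
}
```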
  • the Moving Objects Class is operative to hold all the interactive objects of the Environment(s). I.e. objects that can move or have animations associated with them eg flame are implemented in this class.
  • This class is also operative to detect collisions between the character and the MovingObjects. It defines the state of the object once a collision has been detected, performs whatever interaction needs to take place based on that collision, updates images and initiates animations to simulate the effect, e.g. throwing a cup and a window breaking. A minimal collision check is sketched below.
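The collision detection attributed to the Moving Objects Class could be as simple as an axis-aligned bounding-box test; the Box fields below are illustrative, not from the patent.

```java
// Hedged sketch: overlap test between the character's bounding box and a
// moving object's box. On a hit, the object's state would be updated and
// the matching animation started (e.g. the window-breaking sequence).
public class CollisionCheck {

    public static class Box {
        int x, y, width, height;

        Box(int x, int y, int width, int height) {
            this.x = x; this.y = y; this.width = width; this.height = height;
        }
    }

    /** True if the character's box overlaps the moving object's box. */
    public static boolean collides(Box character, Box object) {
        return character.x < object.x + object.width
            && character.x + character.width > object.x
            && character.y < object.y + object.height
            && character.y + character.height > object.y;
    }
}
```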
  • Communication Class (Figure 10: 1018): This class is operative to register a port and listen for incoming messages, read/translate messages and send messages.
  • the communication class is used by the Application Manager to check and change states of the Application.
  • This class is a Generic Font Class, which enables developers to implement different fonts by using font images. It implements the Cursor and device key maps of the handset 50. The class can be easily extended to add key maps of different phones.
  • this class is operative to store all the message information for retrieval. Every message in the Inbox / Outbox has an associated message object.
  • Message objects are instances of the Message class, created at the time of arrival or dispatch of a message. These objects store information such as the message state (incoming or outgoing), message sender name, message phone number, date and time, message receiver name and text of the message; a sketch of such a class follows.
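A hedged sketch of the Message class: the field names follow the information listed above and are otherwise illustrative.

```java
import java.util.Date;

// One object per Inbox/Outbox entry, created on arrival or dispatch.
public class Message {
    public static final int INCOMING = 0;
    public static final int OUTGOING = 1;

    public int state;          // incoming or outgoing
    public String senderName;
    public String phoneNumber; // the sender's phone number (the system's primary key)
    public Date dateTime;
    public String receiverName;
    public String text;        // textual component shown as a bold headline in the Inbox
}
```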
  • Every Contact in the contact list has an associated contact object.
  • the RMS (1006) gets the data from the contact object. Data retrieved is contact name, contact phone number and the port number. This data is stored in the contact object.
  • the following is the data structure that stores information related to the user (a hedged serialisation sketch follows the list):
  ◦ <Message Type> - identifying what data is being communicated.
  ◦ <Sender Name> - name of the sender.
  ◦ <Chat Session Id> - indicating the chat session.
  ◦ <SimbabyId> - id of the SimBaby character of the sender that is displayed on screen.
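The patent does not specify the wire encoding of this structure, so the pipe delimiter and field order below are assumptions made purely for illustration.

```java
// Assemble and split the user data structure listed above.
public class WireFormat {

    public static String encode(String messageType, String senderName,
                                String chatSessionId, String simbabyId,
                                String payload) {
        return messageType + "|" + senderName + "|" + chatSessionId + "|"
             + simbabyId + "|" + payload;
    }

    public static String[] decode(String raw) {
        // limit -1 keeps trailing empty fields instead of discarding them
        return raw.split("\\|", -1);
    }
}
```

On receipt, the Message Interpreter could inspect field 0 (<Message Type>) to route the payload to the Chat Component or to the messaging Inbox.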
  • the present invention consists of an Application which broadly enables the following functionality:
  • the Application introduces a novel non-voice method of mobile communication which allows one mobile phone User to communicate and interact with another mobile phone User through a highly personalized, artificially intelligent avatar closely resembling the User's own appearance and personality.
  • the preview on the right is instantly updated in response to the Users' selections.
  • the Client Application also allows Users to monitor the mood, personality traits and appearance of the Character on an ongoing basis, in reaction to the care given to the character and the events which have occurred to it, by accessing the Character Statistics Screen (Figure 11). This screen gives an indication of what behaviour to expect from the avatar, for example whether the character will be happy and dance to amuse its owner or be angry and destroy its environment.
  • One embodiment of the current application also facilitates the design of a virtual home for the Character to "live in".
  • This "home" is known as an 'Environment' in the Application's terms.
  • the User can choose an Environment from a number of pre-designed Environments available on the User handset, purchase a higher specification Environment or design his own. These Environments are multi-roomed residences designed to a high specification as illustrated in Figure 12:
  • the 'rooms' shown in Figure 12 form the 'Apartment', one of the pre-designed Environments available in the current embodiment of the Client Application.
  • Environments are not just limited to indoor "rooms"; they can comprise outdoor settings such as a beach or ski field. Environments are not just a setting for the character to occupy; they hold objects which the avatar can use and also contain code which controls the physics of the environment, e.g. the character can jump through a window but not a door.
  • the Environment contains movable Objects, for example toothbrushes, footballs and coffee cups, for the Character to use when in idle mode. This is the mode when the character is going about its daily business of "living" on the handset 50, performing actions which suit its behavioural characteristics. These items can also be used in a communication.
  • the Application contains a number of standard items for use by the Character in the Environment however the User may purchase additional items/more unusual items such as rocket packs, beds, spud guns on an ongoing basis.
  • Objects also carry their own code and graphics, which tell the application how the object can behave and whether it can be burnt or smashed. They also tell the application how the object should look when it is used, burnt or smashed, or any of the other possibilities associated with the look and behaviour of an object.
  • the User must send this data to the Server via GPRS [and/or HTTP TCP/IP depending upon which one of the above vehicles was used during design stage]. Abstractly, the data is sent to the server.
  • the Server Side Application 30 will integrate the design data and create an image file (png) of the Character recording all of the behavioural and appearance statistics specified by the User together with an extrapolation of the Character's component body parts to enable the Application to generate detailed Character movement on an ongoing basis.
  • the .png file is subsequently installed into the .jar file of the Application (this takes place Server side).
  • This .jar file is then sent to and installed on the User's handset 50 via SMS, GPRS or any other applicable data transfer mechanism.
  • This installation enables the User to access the functions described in this document.
  • the Client Application is installed and integrated into the operating system of the handset in question for example, Symbian.
  • in another embodiment, the Client Application functions, as described, outside of that operating system.
  • the Character will appear in the application installed on the User's handset living in the chosen Environment and will be ready for use.
  • the Character will appear in the wallpaper space of the User's handset.
  • the Application allows the User to perform the following functions:
  • This function allows two or more Users to send and receive messages, actions and gestures to one another. These Messages are animated, dynamic and expressive and are the 'virtual' equivalent of the Users interacting in the real and physical world.
  • Dynamic Animation: the Application sends data to move the animations on the client rather than rendering animations on the server and sending those through to the client. This is a novel approach which uses substantially less data than an MMS, Picture or Video Message.
  • the command language could be any two non-alphanumeric characters used to surround the emotion or action.
  • the purpose is for the parser in the receiver's application to recognise the expression(s)/action(s) and act accordingly.
  • a mixture of expressions and actions such as "*Happy* (kiss) *Sad* (hug)" will be displayed sequentially in logical pairs, so that the Character will wear a happy face whilst giving the kiss and a sad face whilst giving the hug; a sketch of this sequencing follows.
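Continuing the MessageParser sketch shown earlier, the paired playback could look like this; CharacterSprite is a hypothetical rendering hook, not a class named in the patent.

```java
// Replay emotions and actions in logical pairs, so "*Happy* (kiss) *Sad* (hug)"
// renders a happy face during the kiss and a sad face during the hug.
public class PairedPlayback {

    public interface CharacterSprite {
        void setExpression(String emotion); // transpose the expression PNG onto the face
        void performAction(String action);  // drive the limb/body frames for the action
    }

    public static void playSequentially(MessageParser.ParsedMessage msg,
                                        CharacterSprite sender) {
        int pairs = Math.min(msg.emotions.size(), msg.actions.size());
        for (int p = 0; p < pairs; p++) {
            sender.setExpression(msg.emotions.get(p));
            sender.performAction(msg.actions.get(p));
        }
    }
}
```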
  • a User can use the menu system installed on the handset to construct and send a Message.
  • Chat: This is an extension of the Messaging principle described above. This feature allows a User to invite multiple Characters over to its Environment to "play". Once the invitation to Chat has been accepted by each of the invited Users, their Characters will automatically appear in the Environment of the Inviter. In Chat, all participants see the same screen in a synchronised capacity, so all participants will see the Environment of the Inviter.
  • the Characters can exchange actions, emotions and text messages in apparent "real time" and in context with each other. The text is available to be scrolled through on each independent handset; should this be done, the text does not scroll on any other user's handset.
  • Chat can be activated in broadly the same ways as Messaging i.e. through the command line or through the menu system.
  • Figure 14 illustrates the user flow of a Chat communication.
  • the command language for Chat is as follows:
  • the User can kick start or exit Chat through the menu system
  • Entities in the Application are:
  ◦ Setup Application
  ◦ Server Side Application
  ◦ Mobile Service Provider (for SMSC G/W, GPRS, Integration APIs)
  • the Application's Client side has the following components:
  • In order to create a Character on the User handset, the User needs the "Setup Application" program. This is a downloadable module from the Server which is installed on the handset.
  • the Setup Application GUI allows the User to manipulate the parameters and design the Character to a high physical and behavioural specification, as illustrated in Figure 4.
  • the Server has the following three-tier architecture: Database, Application and Presentation.
  • the Server is partially connected to the Client program and has automatic push and pull components to retrieve information about the Client and push data to the client for the AI, e.g. complex conversations from the Character will be performed on the server and sent to the Client Application.
  • the application and server synchronize during periods of high signal availability and pauses in messaging or chat to send data on the Character, so that the Server is always aware of any changes that have happened to the Character and can push images and data to the Client Application to reflect these changes.
  • the Message Server is integrated with the SMSC (Short Message Service Center) G/W of the service provider through the Integration Module.
  • the Message Server contains the Business Logic to control/send the messages from/to the Application.
  • the Message Server integrates with the Database, Application Logic and the Integration Module.
  • the Chat Server has the capability to implement chat sessions between multiple Users.
  • the Chat Server integrates with the Integration Module, Databases and Application Logic. Chat utilizes the GPRS connectivity of the mobile to send and receive chat messages however it can use any protocol to send or receive data, even SMS. Chat messages support text and special characters.
  • the Application Logic is developed using J2ME, however it can be written in any language which is available to a mobile device. It deals with the processing that takes place behind the scenes rather than the presentation logic required to display the data on the screen.
  • the Communication Module handles all of the communication of the Application with external interfaces. Communication Module identifies and handles:
  • Billing Module: Billing transactions are updated on the fly, subtracting from a prepaid amount. The functionality of this module also creates information logs from the raw data about the usage.
  • Figure 16 sets forth a process flow of classes and states associated with the launching and initialization of the Client Application in an embodiment of the current invention
  • the Client Application interprets the message.
  • the message is parsed. Images, emotions and actions are retrieved, displayed and manipulated according to the vector data that has been sent in the message to form the animation contained within the message.


Abstract

The present invention includes a system of animated, dynamic, expressive and synchronised non-voice mobile gesturing. Specifically, it enables the transference of artificially intelligent Characters from mobile phone user to mobile phone user, regardless of mobile network or mobile handset, capturing the mood and movement in a mobile communication for the first time.

Description

SYSTEM OF ANIMATED, DYNAMIC, EXPRESSIVE AND SYNCHRONISED NON-VOICE MOBILE GESTURING/MESSAGING
2. Background
Current methods of non-voice mobile communication suffer from one or more of the following limitations:
(a) The majority of communication is text only;
(b) Data is only ever one-way per exchange;
(c) Communication cannot be a simultaneous ongoing exchange;
(d) There is no context to the singular statements of a multiparty exchange;
(e) Communication is lacking in expression due to the non-visual nature of the medium and is therefore prone to misinterpretation.
SMS (Short Message Service), for example, is limited to pure text or, at its most expressive, the use of ASCII characters arranged in an order that, when read sideways, gives an idea of a feeling or emotion by looking a little like a facial expression, e.g. :) as a smile, :( as a frown. In addition, the recipient of the message only sees the incoming data; there is no previous data to give the exchange context. Equally, although MMS, Picture and Video messaging are visual forms of non-voice communication, they present their own problems. Neither is immediate or convenient enough from a user's perspective, in terms of a process flow, to be useful to convey emotion or gesture within the confines of a text message.
E.g.: To send a picture message:
• the user must produce the text,
• take the photo,
• browse to the photo,
• attach the photo
• send the photo and text.
Taking a video message involves an identical process. MMS, Video Messaging and Picture Messaging are prohibitively expensive, and both the sender and the receiver must have an enabled account and phone to view the media. Likewise, they deny users the ability to view multiple visual and textual exchanges at the same time.
The table below illustrates the limitations of existing forms of non-voice mobile communications against the present invention.
3. Summary of the Invention
The present Invention provides products, processes, apparatus and systems for Users to send and receive graphical, character-based, dynamic, animated, expressive and synchronised non-voice mobile messages using mobile wireless devices 50. The present Invention incorporates an application which allows the creation of these Characters or Avatars to a high specification, so that they resemble the appearance of the User and mimic the behaviour of the User, as illustrated in Figure 4.
The main application ('Client Application') of the present invention houses these characters in indoor or outdoor graphical spaces ('Environments') which also contain objects available to the Character to use in its daily activities. The Character's physical appearance changes over time to simulate growth of the avatar and to reflect its mood and the level of care it has received. Its behaviour also changes over time in accordance with the inbuilt artificial intelligence (AI), which reflects whether the avatar is happy with the level of care it has received. A happy avatar displays a clearly happy countenance and performs caring actions to delight its owner, e.g. dancing. An unhappy avatar destroys objects in its environment by setting them on fire or breaking them and displays an unhappy or angry countenance.
The present invention in one embodiment supports mobile phones, PDAs, desktop computers and the like. In one embodiment, using a Java-enabled mobile phone to use the character-based, dynamic, synchronised, expressive, animated messaging ('Chat' or 'Chat Messaging'), a user ('User A') invites one or more users from their address book ('User B'). The request goes to the Gateway server, which finds a place on the same Messaging server for all the participants to chat. User B receives a message saying that User A has requested a chat; should they press yes, the chat is initiated: User A's character walks onto the screen to join User B's Character, and User B's Character walks onto the screen of User A's phone to join User A's character. In this embodiment User A or B selects or types a desired emotion in conjunction with a proposed action and enters the message (as illustrated in Figure 14). This message is data rather than images, and is directed through a database so that the User can be billed the minimum for each message sent and all appropriate data, such as the recipient(s) details, are recorded. The message data is translated on the recipient's phone to produce the animation, emotion and action. The message appears on the screen of the recipient's phone, and their character displays the desired emotion and performs the action to the recipient's character. This can be observed on both User A's and User B's phones. The textual component of the return message is presented underneath any previous messages and may be scrolled through to see what occurred previously.
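Because each Chat message travels as data rather than images, it can be composed as a short, structured string of fields. The sketch below is illustrative only: the field names follow the Message Formats section later in this document, while the '|' delimiter, class name and method names are assumptions, not the patent's actual encoding.

    // Illustrative sketch (not the patent's code): composing a Chat message
    // as delimited text data, using the fields listed under Message Formats.
    // The '|' delimiter is an assumption.
    public final class ChatMessageComposer {
        public static String compose(String messageType, String senderName,
                                     String senderPortNo, String chatSessionId,
                                     int noOfUsers, String simbabyId,
                                     String textMessage) {
            StringBuffer sb = new StringBuffer(); // StringBuffer is available on J2ME/CLDC
            sb.append(messageType).append('|');
            sb.append(senderName).append('|');
            sb.append(senderPortNo).append('|');
            sb.append(chatSessionId).append('|');
            sb.append(noOfUsers).append('|');
            sb.append(simbabyId).append('|');
            sb.append(textMessage); // free text last, after the final delimiter
            return sb.toString();
        }
    }

Placing the free-text field last is a deliberate choice in this sketch: every field before it has a fixed position, so the receiver can treat everything after the sixth delimiter as the message body.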
In the same embodiment, to use character-based, dynamic, expressive, animated messaging ('Messaging'), the user enters a mood ('Expression'), an action to be performed on the recipient's character and a textual message. The message is then transmitted to a database and stored in the same format as a Chat message, where it can be retrieved by the recipient using an Internet-enabled device such as a mobile phone, PDA or desktop computer.
4. Statement of Invention
The present invention overcomes the limitations of existing forms of non-voice mobile communication by allowing mobile phone Users ('Users') to communicate through miniature, artificially intelligent avatars ('Character(s)', 'SimBaby' or 'SimBabies', as applicable) based closely upon their own appearance and personalities. These Characters reside within an application on the Users' handset, in one embodiment in the wallpaper space of the Users' handset. More specifically, the Characters reside in 'Environments', which consist of virtual spaces, i.e. outdoor or indoor settings, that can be designed by the User (or pre-designed by a third party) to be as large or small as the phone's memory will allow, as these environments are split over multiple screens. In this way, the invention creates a 'virtual mobile world' through which mobile phone Users can communicate and interact with one another in an animated, dynamic and expressive manner, much as they would in the real and physical world.
The uniqueness of this invention lies in the fact that:
(a) A User can communicate with other Users through visual, highly personalised, miniature animations based upon his or her own nature and appearance;
(b) Communication takes place through 'Character Transference', i.e. when a User sends a communication it will appear as though that Character has walked off that User's handset into the frame of the recipient User;
(c) The animation is dynamic, meaning that movement is not a pre-rendered animation sent by the server; it happens live on the handset, driven by data which manipulates the characters on the handsets;
(d) A User can meaningfully communicate mood, movement, gesticulation and action for the first time;
(e) A User can simultaneously view a multiple-party exchange of communication (i.e. incoming and outgoing messages from all Users participating in a communication) on a single handset in an apparent 'real-time' capacity;
(f) Two or more Users can view a 'synchronised communication'. For the purposes of this specification, the term 'synchronised' describes the outcome whereby the animated character of the sender will appear on the handset screen of the recipient alongside the animated character of the recipient, whilst simultaneously appearing on the screen of the sender (i.e. the sender of a message and the recipient can view the same screen on their separate handsets and watch how a particular animated interaction or message is played out).
(g) The sending of a message via GPRS in the guise of an SMS.
(h) The technique used for the animation means that even if the application is written in a different language for a mobile handset/wireless device (eg Flash Lite) or is incorporated into a messaging system (eg MSN Messenger) the messaging data will not be affected and will not have to be resent to be translated correctly by the Client Application.
The invention allows Users to communicate in a highly personal, visual way that overcomes the constraints of time and location. In effect, Users can communicate through and participate in a constantly changing and evolving cartoon of their own lives which is played out and simulated on their mobile phones.
Other Features of the Invention
In addition, the invention includes:
(ii) A feature that allows the User to care for and nurture the Character. This function requires the Character to begin "life" as a "baby" and to grow through its life cycle at an accelerated rate. The User can feed, nurture and play with its Character through an application which is installed on the phone. This aspect of the invention is purely for entertainment and does not require a mobile network signal; and
(iii) The invention also includes a feature currently called "MiniMeLove". This builds on the personal data provided by the User in accessing the invention. The feature requires the User to input further data about the qualities of that User's ideal love partner. Once this data has been stored and integrated into the Application, the User can use Bluetooth or Infra Red technologies to see how good or bad a love match another acquiescing User is.
(iv) Animations and gestures are achieved by sending data which animates the character on the handset.
(v) The application can be updated. This means image files and code can be sent to the application. The effect is that new environments can be added, characters can grow, characters can change clothing, objects and the data of how they are used can be added and the character can have abilities (actions/gestures) added
(vi) The character is artificially intelligent meaning that in one embodiment the character can hold a conversation with a single user and its behavior and attitude to objects and its owner can change to reflect how the owner has "treated" it.
(vii) In one embodiment the character can accept data from third parties to communicate with the user. For instance the character could read out lottery results or dress appropriately for the weather forecast.
(viii) In one embodiment unique words entered into the users phone dictionary can be incorporated into the AI server so that the character communicates with the user, using words and phrases that the user often employs
In the present iteration of the invention, the Character is a miniature version of the User; however, it is also possible to create a fictional Character. Additionally, the Character concept should be interpreted to extend to miniature pets, aliens, secretaries or any other animatable subject. The invention can also be extended to facilitate Mobile Gambling, and the physical character can be used in third party games as the protagonist.
All of the above is made possible through a technical application ('Client Application'), described in detail in the following pages, which is installed onto Users' mobile phone handsets 50. The application often sends and receives data from multiple databases on the server ('Server Side Application'), which is also described in detail in the following pages. The ability to incorporate data from third party feeds ('Feeds') is not detailed.
4. Advantages
The invention has the following advantages:
(a) A User can communicate through visual animations;
(b) A User can communicate sentiment, mood and movement;
(c) A User can simultaneously view a multiple-party exchange of animated communication;
(d) Two or more Users can view a 'synchronised animated communication' as described earlier.
(e) A User can imitate and communicate real life gestures and interactions with objects and avatars (for example, a User A's Character "giving" a bouquet of flowers to User B's Character, or User A's Character kicking User B's Character with the result of User B's Character exhibiting pain);
(f) Community building;
(g) Social responsibility and educational value in as far the User learns the responsibility of raising the Character through its life cycle by ensuring the Character is fed, cared for and educated;
(h) Mobile network agnostic;
(i) Mobile handset agnostic;
(j) Modular, thereby allowing Users to utilize one or more functions of the application;
(k) Ability to collect significant amounts of data personal to the User.
(l) A constantly changing application, due to the updateability of the application;
(m) Offers a significant saving over sending an MMS, Video Message, Picture Message or even an SMS; therefore significant profits can be attained.
5. Description of the Drawings
Figure 1 is a functional block diagram illustrating application of an embodiment of the present invention to telecommunications network/TCP/IP Network infrastructure.
Figure 2 is a flow chart diagram illustrating a method allowing for the generation of a graphically expressive synchronised chat message according to an embodiment of the present invention
Figure 3 is a flow chart diagram providing a method allowing for the generation of a graphically expressive message according to an embodiment of the present invention
Figure 4 is user flow diagram of the GUI (screenshots) of the Setup Application used to Create a Character and initially define its physical and behavioural characteristics.
Figure 5 is a functional block diagram illustrating application of an embodiment of the present invention in a load balanced, fully redundant, clustered network infrastructure. Specifically this diagram focuses on Chat and Message servers.
Figure 6A-6D is a flow chart diagram setting forth the overall process flow associated with the Setup Application according to an embodiment of the present invention
Figure 7 is an architectural chart diagram setting forth the architecture of the Setup Application.
Figure 8 is a diagram setting forth the architecture for the current embodiment of the J2ME Client Application.
Figure 9 is a diagram setting forth the architecture for the current embodiment of the J2ME Setup Application
Figure 10 is a diagram setting forth the flow of data between classes for the current embodiment of the J2ME Client Application.
Figure 11 is a screenshot of the Character Statistics Screen including a close graphic representation of the character and its mood and the behavioural characteristics according to an embodiment of the current invention.
Figure 12 is a series of screenshots of a Character in its environment according to an embodiment of the current invention
Figure 13 is a series of screenshots illustrating the user sequence for receiving dynamic, expressive animated messages through character transference.
Figure 14 is a series of screenshots illustrating the sequence of sending and receiving dynamic, expressive, animated, synchronised chat messages through simultaneous character transference, using the command line and chat screen and includes the text only screen and message details
Figure 15 is a flow chart diagram setting forth the overall process flow associated with the receiving, viewing and animating of a message.
Figure 16 is a flow chart diagram setting forth the overall process flow associated with the Client Application initialization and launching.
6. Description of Preferred Embodiment(s)
Operating Environment
Figure 1 illustrates a network environment including an embodiment of the present invention. The system, Server Side Application 30, is operably connected to the Internet (IP Network 20) in order to send and receive data, via a socket(s) or other appropriate data transport protocol, to and from end systems such as client computers 55 or Client Handsets 50. As illustrated in Figure 1, the network environment includes wireless network 40. This is the means whereby mobile wireless devices connect to the internet through their service provider, allowing the transmission of data between mobile wireless devices and servers. In one embodiment wireless network 40 comprises gateways for EDGE/GPRS/SMS and any other suitable data transport layer. Also illustrated in Figure 1 is Gateway 12. In one embodiment, Gateway 12 balances the data load of incoming requests and directs them to the appropriate clustered server to provide redundancy in a scalable environment.
SMS Gateway 26 is a failover: should the user not have access to GPRS, it allows for the sending of administration messages or any other data to the handset 50 using SMS. It also allows handsets to send and receive messages to and from each other using SMS rather than GPRS.
Client computers 55 are connected to IP Network 20 through any means, such as an Internet Service Provider (ISP) and the like. Client computers can be any suitable internet-enabled computing device, e.g. a desktop computer, laptop or PDA, capable of sending and/or receiving data in a wireless or fixed-wire capacity. In one embodiment client computer 55 includes instant messenger software as a software container for sending, viewing and receiving animated, dynamic, expressive and synchronised messages.
Server Side Application
As figure 1 illustrates, Server side application 30, in one embodiment includes User Details Database 36, Message Server 31, Message Database 38, Chat Server 35, Application Server 32, Application Database 37, Character Database 39, Object Database 34, Environments Database 43, Artificial Intelligence Server 41 and Artificial Intelligence Database 42. Server Side Application, 30 is operative to execute the functionality, described herein, allowing users to send and receive animated, dynamic, expressive and synchronised non-voice messages. In one embodiment of the current invention Server Side Application 30 packages and distributes the client application to handsets 50 that allows the creation of animated, dynamic, expressive and synchronised non-voice messages.
In one embodiment Server Side Application 30 operates in connection with User Detail Database 36, Application Server 32, Application Database 37, Object Database 34, Environments Database 43 and Character Database 39. This system is operative to dynamically package and distribute the jar (J2ME) file to a variety of mobile handsets 50, based on the User's custom character specifications and the changes needed to make the jar file compatible with various handsets 50. The jar file is the Client Side Application which is installed on mobile handset 50 and is detailed further on in this document.
For example, in one embodiment, Application Server 32 hosts an application (the 'Client Application Compiler') which polls User Detail Database 36, when a request for the client application is received, to discover the user's phone number; the handset type and model the user has; the physical characteristics the user has requested for their character (e.g. white male, black hair, blue eyes, athletic etc.); the behavioural characteristics the user has weighted for their character (e.g. Intelligence 5, Mischievous 7, Self-reliant 0, Quick-tempered 9, as illustrated in Figure 4; these characteristics affect how the character behaves in its environment); and the environment they have selected to start with (e.g. Apartment, as illustrated in Figure 12). Once these details have been entered and selected, the Character Database 39 is polled for the pngs (image files) which correspond with the specified physical characteristics, and these are sent to the Client Application Compiler hosted on the Application Server 32. Pngs corresponding with the user's requests for objects and/or environments are also sent to the Application Server 32. The Application Database 37 changes variables to affect how a character will behave in its environment according to the user-defined behaviours. Once the component parts have been collated on the Application Server 32, they are packaged to form the user-requested jar file (Client Application) and sent to the phone number recorded on the User Detail Database 36. Application Server 32 can store more versions of the Client Application Compiler than just J2ME (Java 2 Mobile Edition) for forming the client application, such as versions for Binary Runtime Environment for Wireless (BREW) or iMode enabled devices.
Message Database
Message Database 38 is responsible for storing message data corresponding to messages created by users. In one embodiment each message is stored in a table including the following fields:
• a unique message identifier
• a recipient identifier
• a sender identifier
• a message mood
• a message action
• any objects associated with the message action
• message text
  • a date/time stamp
• a flag indicating whether a message has been viewed.
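For illustration, such a table row could be mirrored in code as a simple value object. The following sketch assumes only the field list above; the class name, field names and types are illustrative assumptions and do not appear in the patent.

    // Illustrative value object mirroring the message table fields above.
    // All names and types are assumptions based on the field list.
    public final class MessageRecord {
        public long messageId;     // unique message identifier
        public String recipientId; // recipient identifier
        public String senderId;    // sender identifier
        public String mood;        // message mood
        public String action;      // message action
        public String[] objects;   // objects associated with the message action
        public String text;        // message text
        public long timestamp;     // date/time stamp
        public boolean viewed;     // flag indicating whether the message has been viewed
    }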
Once the user has logged in to the server and the messages have been sent to their handset 50, the messages are deleted from the Message Database 38. In other embodiments the User Detail Database is polled for a login and password as is common in the art.
Gateway
In one embodiment of the present invention, Gateway 12 identifies if incoming data is a chat communication or a message communication and sends it to the appropriate server.
Message Server
The Message Server 31 is operative to serve message data corresponding to messages created by users. In one embodiment Message Server 31 receives message data from a user and sends it to Message Database 38 for storage. When data is requested from Message Database 38, it is passed to Message Server 31 and sent to handset 50 or computer 55 via the same protocol by which it was received. In one embodiment the Message Server may operate in a clustered environment with a gateway for load balancing and identification of the type of message being sent/received.
Chat Server
Chat server 35 is operative to serve message data corresponding to Chat Messages created by users. In one embodiment the Chat Server 35 itself is in a clustered environment (as illustrated in Figure 5) containing a Gateway Server 12 that provides Network Load Balancing to ensure redundancy by providing a set of hosts that deal with the communications. When a client attempts to connect to the gateway, the network load balancer addresses the gateway web server that is least busy (i.e. Gateway Server A). A background process on the gateway server checks that one of the peer hosts is running and has enough space to handle the traffic for the number of requested chat participants. The user is then redirected to that host; otherwise the gateway looks for another host to try to connect to. Once successful, a database on the server flags where the chat user has connected to and sends the request to the remaining participants, who are directed straight to the correct server.
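The host-selection step described above can be pictured with a short sketch. This is illustrative only: the Host interface and every method on it are assumptions, since the patent describes the behaviour but not the code.

    // Illustrative sketch: pick a running, least-busy peer host with enough
    // free capacity for the requested chat participants. The Host interface
    // and its methods are assumptions, not part of the patent.
    import java.util.Vector;

    interface Host {
        boolean isRunning();
        int freeSlots();   // remaining participant capacity
        int load();        // current busyness; lower is better
    }

    final class ChatGatewaySketch {
        /** Returns a suitable host, or null so the gateway can keep looking. */
        static Host selectHost(Vector peers, int participants) {
            Host best = null;
            for (int i = 0; i < peers.size(); i++) {
                Host h = (Host) peers.elementAt(i);
                if (!h.isRunning() || h.freeSlots() < participants) continue;
                if (best == null || h.load() < best.load()) best = h;
            }
            return best;
        }
    }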
In one embodiment each message is stored in a table including the following fields for each message
• a unique message identifier
• a recipient identifier
• a sender identifier
• a message mood
• a message action
• any objects associated with the message action
• message text
  • a date/time stamp
• a flag indicating whether a message has been viewed.
Each chat message is stored as text-based characters which are translated by the application installed on client handset 50.
Object Database
The Object Database 34 stores the image files (pngs) of objects, together with images of those objects in various forms of degradation through burning, smashing or some other form of destruction. It also stores physics data indicating how an object can be used and with which actions.
In one embodiment each object is stored in a table including the following fields:
• A unique object identifier
• Object title
• Associated actions
• Data describing which directions the object can move in and how far
• Unique identifier for front image
• Front Image
• X size Front Image
• Y Size Front Image
• Z index (which layer the image is on)
  • Unique identifier for side image
  • Side Image
• X size Side Image
• Y Size Side Image
• Z index (which layer the image is on)
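As with the message table, an object row can be sketched as a value object. Everything below is an illustrative assumption based on the field list above, including the idea of holding the png data as raw bytes.

    // Illustrative value object mirroring the object table fields above.
    // Names and types are assumptions; pngs are held here as raw bytes.
    public final class ObjectRecord {
        public int objectId;          // unique object identifier
        public String title;          // object title
        public String[] actions;      // associated actions
        public String movementData;   // directions the object can move in, and how far
        public int frontImageId;      // unique identifier for front image
        public byte[] frontImage;     // front image (png)
        public int frontSizeX, frontSizeY;
        public int frontZIndex;       // which layer the front image is on
        public int sideImageId;       // unique identifier for side image
        public byte[] sideImage;      // side image (png)
        public int sideSizeX, sideSizeY;
        public int sideZIndex;        // which layer the side image is on
    }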
Environments Database 43 is operative to store the graphics of each environment at a screen size which has been optimised for the various target devices on which the client application is installed.
The AI Server 41 (Artificial Intelligence Server) is operative to interpret questions and statements put to it and to reply to the user in a coherent and contextually correct manner. As mentioned below, in one embodiment, the character communicates with the user in conversational terms, e.g. User: "How are you today?" Character: "I am well". This process of understanding and replying to questions on an ongoing basis is handled by ALICE (a pre-existing Artificial Intelligence communications program, freely available from www.alice.org) which is installed on the server. The AI Server also polls the User Detail Database 36 for the phone number of the user, the age and the mood of the character, and uses this data to access the appropriate vocabulary from the AI Database 42 for the character when communicating with the user. The AI server can also be used to push messages to a connected user in an effort for the character to start a conversation.
The AI Database 42 (Artificial Intelligence Database) is operative to define and store the words and language used by a character when communicating with its user. Data uploaded from the user's custom dictionary is stored in AIML (Artificial Intelligence Markup Language) for retrieval by the ALICE system on the AI Server 41. In one embodiment the Client Application sends all custom words which have been entered into the user's handset dictionary, via Communications Layer 814, to AI Database 42 for insertion into the database, which is linked to the User's Profile on the User Details Database 36.
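Since the vocabulary is stored in AIML for the ALICE engine, the exchange quoted above would correspond to an AIML category along the following lines. This snippet is a generic AIML illustration, not taken from the patent; AIML patterns are conventionally written in upper case.

    <category>
      <pattern>HOW ARE YOU TODAY</pattern>
      <template>I am well</template>
    </category>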
Sending, Receiving and Viewing Messages
Figure 3 sets forth the process flow associated with an embodiment of the present invention for the sending, receiving, viewing and deleting of messages. In one embodiment, a user accesses the messaging component on the Client Application, causing the client application to transmit a request to the Message Server 31. When the Message Server 31 receives the request (230), it analyses the request to identify the messaging account by the primary key of phone number. In one embodiment, the account is identified by user name only which indicates that the request is coming from an instant messaging application and the messages should be sent to an individual desktop computer 55 via HTTP, SOAP or any other appropriate protocol. Once the account has been verified, the messages are retrieved from the Message Database (step 230) and sent to the Client Application Inbox via GPRS for display (step 232). Any messages sent to the client application on the handset 50 or desktop computer 55 are flagged as viewed and are not sent to the client application again.
In one embodiment the client application displays the textual component of the message as headlines in bold (step 232). When the message is read, the data is parsed to separate the emotion and action instructions from the textual component of the message. The emotion instruction translates to the appropriate expression png (image file), which is transposed onto the face of the character sent in the message. The action instruction translates, on the client application installed on handset 50, into movements of the limbs and body of the sender's character to produce the animation of that action in conjunction with the receiver's character.
Figure 3 sets forth the process flow associated with an embodiment of the present invention for the replying to and deleting of messages, which occurs on the Client Side Application. The delete function (240) is initiated from under "Options" of the list view of the Inbox displaying the text-based headlines (232) and is controlled by the Business Layer 816, specifically the Message and Contacts Management Component 850, which is illustrated in Figure 8. Once delete has been selected, the user is prompted with the warning "Delete Message?"; should yes be pressed, the message is removed from the Client Application.
Following the viewing of the selected message, the user is given the option through a menu to delete, reply or return to the inbox (238). In one embodiment, deleting a message happens in the same manner described above, by removing the message from the RMS (Figure 10: 1006). Reply is found under the Options Menu and offers the user the ability to compose a new message 238. The back option sends the user back to the inbox where the headlines are displayed. Once a reply message is sent, a request to check for new incoming messages is sent to the Message Database, and new entries are displayed in the inbox. The process flow of viewing a message is illustrated in Figure 13.
Wireless Network 40 enables communication between wireless device 50 and other systems operably connected thereto, specifically the internet or any other suitable IP Network. Wireless Network 40 can be any suitable digital or analogue network that supports the transfer of text and/or data including, but not limited to, a Time Division Multiple Access (TDMA) Network, a Global System for Mobile communication (GSM) Network, or a Code-Division Multiple Access Network. In one embodiment, Wireless Network 40 includes functionality supporting GPRS, a communication protocol enabling wireless devices 50 to access the internet or similar computer network. Wireless network 40 includes functionality supporting SMS to send data to and from Server Side Application 30 and handset 50.
Wireless devices 50 are operative to receive data from wireless network 40 and transmit data to wireless network 40 for routing to appropriate devices. Wireless devices 50, in one embodiment, are Internet-enabled devices capable of sending and receiving data from remote servers.
Client Application
The Client Application has the following components
Creating a Character
In one embodiment of the present invention, a character is created by a Java application installed on the client handset 50 (the 'Setup Application'). The architecture of the Setup Application is illustrated in Figure 9. This application is operative to gather information, to a high specification, on the physical and behavioural characteristics, which are sent to the User Database (Figure 1: 36) in order to form a profile of the current character as detailed previously. In the present invention, the correct Setup Application for the phone type and model is selected by the user by sending an alphanumeric text message to an SMS gateway. The alphanumeric text message is interpreted by Application Server 32, which sends out the Setup Application that is correct for that phone type and model. This method of Java application distribution is common in the art.
The Setup Application
In order to create a Character on the User handset, the User needs the "Make a Simbaby" component of the Application (the 'Setup Application', as illustrated in Figure 4). In an embodiment of the current invention, this is a downloadable J2ME program, from the Application Server 32, which is installed on the handset 50. The Setup Application holds all image files (pngs) and the Business Logic (Figure 7) and displays these image files instantly in accordance with user selections, for instant feedback on changes to the character's physical characteristics. As illustrated in Figure 7, the Setup Application GUI 700 allows users to manipulate parameters to a high physical and behavioural specification. The Parameter Interpreter 702 compresses this data before transferring it to the Communications Module 706, which is responsible for all communication external to the Setup Application. The final graphical resemblance of the character appears under the screen in the current invention called "Character Statistics", as illustrated in Figure 11. The Setup Application sets the following Character parameters, which are stored in the User Details Database 36 as a User Profile when the character setup is saved or completed.
Phone Number of User.
In one embodiment the user's phone number is the primary key under which the user's profile is created. Full details of the contents of a User's Profile are contained in this document.
Type of Character - Figure 4:400
• "As Much Like me as Possible "
This Flags the User's Profile in the User Details Database as being real data that may be used for marketing purposes.
• "A Complete Fantasy"
This Flags the User's Profile in the User Details Database as being false data that should be excluded from any marketing data.
• "This is just a test"
This Flags the User's Profile in the User Details Database as being false data that should be excluded from any marketing data.
User Details - Figure 4:404
• " WJien were you born?"
The entered data tells the application the users age (dd-mm-yyyy).
• " Wliat Sex would you like your Character (SimBaby) to be?" The entered data tells the application the users gender.
In the current embodiment of the invention, the following parameters will be referred to by the Character Database 39 to select the corresponding image(s) that make up the character, which are compiled into the final client application when the character setup is complete.
Character Shape - Figure 4:406 Mannequin Outfit (1 of x)
• Body Style
There are multiple styles of body for each sex of character. These basic shapes, which differ in breast size, hip size and shoulder size, are given a code which will eventually be stored in the Users Profile where it will be referred to by the Character Database 39 when compiling the graphics or image files for this body part(s) into the final client application.
• Height
The Height of the character can be increased or decreased proportionally on a scale of 1-5. This number is applied to the overall Body Style and stored in the User Profile
• Size
The Size of the character refers to the width or fatness of the character and can be increased or decreased proportionally on a scale of 1-5. This number is applied to the overall Body Style and stored in the User Profile
Character Face - Figure 4:408
• Facial Style(Style)
There are multiple styles of face for each sex of character. These basic shapes, are given a code which will eventually be stored in the Users Profile where it will be referred to by the Character Database 39 when compiling the graphics or image files for this body part(s) into the final client application.
• Facial Length (Length)
Refers to the length of the character's face. The length of the character's face can be increased or decreased proportionally on a scale of 1-5. This number is applied to the overall Face Style and stored in the User Profile where it will be referred to by the Character Database 39 when compiling the graphics or image files for this body part(s) into the final client application.
• Facial Width (Width)
Width refers to the width or roundness of the character's face and can be increased or decreased proportionally on a scale of 1-5. This number is applied to the overall Facial Style and stored in the User Profile where it will be referred to by the Character Database 39 when compiling the graphics or image files for this body part(s) into the final client application.
Character Hair - Figure 4:410
• Hair Style(Style)
There are multiple styles of Hair for each sex of character. These basic shapes, are given a code which will eventually be stored in the Users Profile where it will be referred to by the Character Database 39 when compiling the graphics or image files for this body part(s) into the final client application.
• Hair Length (Length)
Refers to the length of the character's hair. The length of the character's hair can be increased or decreased proportionally on a scale of 1-10. This number is stored in the User Profile where it will be referred to by the Character Database 39 when compiling the graphics or image files for this body part(s) into the final client application.
• Hair Curl (Curl)
Curl refers to the curliness of a character's hair and can be increased or decreased proportionally on a scale of 1-5. This number is applied to the overall Hair Style and is stored in the User Profile where it is referred to by the Character Database 39 when compiling the graphics or image files for this body part(s) into the final client application.
• Hair Colour (Color)
Color refers to the colour of the character's hair and can be selected on the handset 50. The selected hair colour is given a code and is stored in the User Profile where it is referred to by the Character Database 39 when compiling the graphics or image files for this body part(s) into the final client application.
Character Eyes Figure 4:412
• Eye Style(Style)
There are multiple styles of Eyes for each sex of character. These 2 shapes (left and right), are given a code which will eventually be stored in the Users Profile where it will be referred to by the Character Database 39 when compiling the graphics or image files for this body part(s) into the final client application.
• Eye Size (Size)
Refers to the height of the character's eyes. The height of the character's eyes can be increased or decreased proportionally on a scale of 1-5. This number is stored in the User Profile where it will be referred to by the Character Database 39 when compiling the graphics or image files for this body part(s) into the final client application.
• Eye Width (Width)
Width refers to the width of a character's eyes, or the horizontal space they take up on the face, and can be increased or decreased proportionally on a scale of 1-5. This number is applied to the overall Eye Style and is stored in the User Profile where it is referred to by the Character Database 39 when compiling the graphics or image files for this body part(s) into the final client application.
• Eye Colour (Color)
Color refers to the colour of the character's eyes and can be selected on the handset 50. The selected eye colour is given a code and is stored in the User Profile where it is referred to by the Character Database 39 when compiling the graphics or image files for this body part(s) into the final client application.
Character Nose Figure 4:414
• Nose Style(Style)
There are multiple styles of Noses for the characters. A nose style is given a code which will eventually be stored in the Users Profile where it will be referred to by the Character Database 39 when compiling the graphics or image files for this body part(s) into the final client application.
• Nose Length (Length)
Refers to the Length of the character's nose. The length of the character's nose can be increased or decreased proportionally on a scale of 1-5. This number is stored in the User Profile where it will be referred to by the Character Database 39 when compiling the graphics or image files for this body part(s) into the final client application.
• Nose Width (Width)
Width refers to the width of a character's nose, or the horizontal space it occupies on the character's face, and can be increased or decreased proportionally on a scale of 1-5. This number is applied to the overall Nose Style and is stored in the User Profile where it is referred to by the Character Database 39 when compiling the graphics or image files for this body part(s) into the final client application.
Character Mouth - Figure 4:416
• Mouth Style(Style)
There are multiple styles of Mouth for each sex of character. These basic shapes, are given a code which will eventually be stored in the Users Profile where it will be referred to by the Character Database 39 when compiling the graphics or image files for this body part(s) into the final client application.
• Mouth Width (Width)
Refers to the width of the character's mouth. The width of the character's mouth or the horizontal space that it occupies across the character's face can be increased or decreased proportionally on a scale of 1-5. This number is stored in the User Profile where it will be referred to by the Character Database 39 when compiling the graphics or image files for this body part(s) into the final client application.
• Lip Width (Lip Width)
Width refers to the thickness of the character's lips and can be increased or decreased proportionally on a scale of 1-5. This number is applied to the overall Mouth Style and is stored in the User Profile where it is referred to by the Character Database 39 when compiling the graphics or image files for this body part(s) into the final client application.
• Lip Colour (Lip Color)
Lip Color refers to the colour of the character's lips and can be selected on the handset 50. The selected lip colour is given a code and is stored in the User Profile where it is referred to by the Character Database 39 when compiling the graphics or image files for this body part(s) into the final client application.
In the current embodiment of the invention, the following parameters will be referred to by the Application Database 37 to add the correct behavioural variables to the compiled client application. These behavioural variables determine the frequency of specific actions and the likelihood of specific moods when the Character is living in its environment (Tamagotchi Mode). These behavioural parameters change over time in accordance with how the user has looked after or, conversely, neglected its character. In the current embodiment of the Setup Application, Behavioural Parameters are paired and traded off against each other so that a Character cannot be set up with "perfect" behaviours. For example, Intelligence and Strength are paired: as Intelligence is increased, Strength is decreased. The following Behavioural parameters are stored in the User Details Database 36 as a User Profile.
Behavioural Characteristics
In the current embodiment of the Setup Application within the current invention, the two behavioural characteristics below are relative to each other: as one increases on a scale of 1-10, the other decreases (Figure 4:420).
• Intelligence
Refers to the intelligence of a character. In one embodiment of the current invention a highly intelligent character will have access to more actions than an unintelligent character.
• Strength
Refers to the strength of a character. A Character with a high strength score will be able to inflict more damage on another user's character and destroy an object with fewer blows.
In the current embodiment of the Setup Application within the current invention, the two behavioural characteristics below are relative to each other: as one increases on a scale of 1-10, the other decreases (Figure 4:422).
• Placid
Refers to the likelihood of the character destroying or throwing items within its environment. A score of 10 means that there is no likelihood of the character destroying or throwing items within its environment.
• Mischievousness
Refers to the likelihood of the character destroying or throwing items within its environment. A score of 10 means that there is a guaranteed likelihood of the character destroying or throwing items within its environment.
In the current embodiment of the Setup Application within the current invention, the two behavioural characteristics below are relative to each other: as one increases on a scale of 1-10, the other decreases (Figure 4:422).
• Talkative
Refers to the likelihood of the character demanding attention, playing games or having a conversation. A score of 10 means that there is a guaranteed likelihood of the character demanding attention, playing games or having a conversation.
• Self Reliant
Refers to the likelihood of the character demanding attention, playing games or having a conversation. A score of 10 means that there is no likelihood of the character demanding attention, playing games or having a conversation.
In the current embodiment of the Setup Application within the current invention, the two behavioural characteristics below are relative to each other: as one increases on a scale of 1-10, the other decreases (Figure 4:422).
• Charismatic
Refers to the likelihood of the character entertaining the user by dancing, showing love and playing. A score of 10 means that there is a guaranteed likelihood of the character dancing, showing love and playing.
• Quick Tempered
Refers to the likelihood of the character entertaining the user by dancing, showing love and playing. A score of 10 means that there is no likelihood of the character dancing, showing love and playing, and that the character will frequently get cross in Chat and punch and kick other participants for no reason.
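The paired trade-off described above (e.g. Intelligence versus Strength) can be sketched as follows. This is illustrative only: the class, the fixed-sum rule and all names are assumptions; the patent states only that raising one score lowers its partner.

    // Illustrative sketch of a paired behavioural trade-off: the two scores
    // are kept complementary so a character cannot have "perfect" behaviours.
    // Assumes the pair sums to a fixed total; the patent does not specify this.
    public final class PairedTrait {
        private static final int TOTAL = 10;
        private int primary = 5;  // e.g. Intelligence
        private int partner = 5;  // e.g. Strength

        /** Raising the primary score automatically lowers its partner. */
        public void setPrimary(int value) {
            if (value < 0 || value > TOTAL) {
                throw new IllegalArgumentException("score out of range");
            }
            primary = value;
            partner = TOTAL - value;
        }
        public int getPrimary() { return primary; }
        public int getPartner() { return partner; }
    }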
Once the Physical and Behavioural characteristics have been selected, the user is prompted to enter the Name of the Character. The Name of the character and the physical and behavioural characteristics are then updated, via GPRS or SMS, to the User Profile on the User Details Database 36.
In another embodiment, for a setup by client computer 55, the GUI for the Setup Application is in Flash (swf) and all gathered user, physical and behavioural data is sent to the same databases, through Gateway 12, using HTTP or any other suitable TCP/IP protocol.
Client Side Application
In an embodiment of the current invention, Application Database 37 selects the appropriate Java base application that has been optimised for the user's phone type and model. It then collects from Character Database 39 the appropriate image files that correspond with the chosen Physical Characteristics of the user's character from the User Profile on User Details Database 36. Application Database 37 also changes the behavioural variables in accordance with the Behavioural Characteristics selected in the Setup Application and stored in the User Profile on User Details Database 36. The Java application is then compiled on Application Server 32 and sent to the target device's phone number, which is stored in the User Details Database 36.
The client Application consists of the following components:
  • Client Application GUI Functionality (Figure 9:900)
    o "Tamagotchi"-style Interaction
    o Menu Structure
    o Interaction with the rest of the blocks
  • Message Client (Figure 9:902)
    o Send/receive messages via the appropriate protocol
  • Chat Client (Figure 9:904)
    o Set up the chat sessions
    o Send/receive messages via the appropriate protocol
  • Application Logic (Figure 9:906)
    o Consists of a small Database and logic to control the Application
    o Logic to pull and push information from/to the Server
  • Client Communication Module (Figure 9:908)
    o Logic for choosing the communication mode
Client Application
As Figure 8 illustrates, Client Application 800, in one embodiment, includes User Key Events 802, Non-Interactive Actions 804, Server Messages 806, Action Event Controller Layer 808, Key Events Listener 810, AI (Artificial Intelligence) Action Generator 812, Communication Layer 814, Business Layer 816, Application State Registry 818, Command Interpreter 820, Message Interpreter 822, Environments Layer 824 (which comprises Bedroom 826, Environment Map 828, Environment Layers 830, Environment Actions 832, Office 834, Kitchen 836, Bathroom 838 and any other environment loaded onto the handset), Items Layer 840, Item 842, Item Layer 844, Item Physics 846, Menu/Message Screen Generator Component 848, Message and Contacts Management Component 850, Chat Component 852 and Data Layer 854. In another embodiment the architecture may change slightly according to the art, but the idea remains the same.
The following components are implemented as a collection of functionalities and functional areas.
Action Event Controller Layer (Figure 8: 808)
• This component traps messages and key events from User 802 input or Server 806 and passes the events to the business layer 816 for execution. This component is divided into 3 sections (Key Events Listener 810, Screensaver Mode (Native Functions and Data), Server Message Listener/Communication Layer 814) and each section is responsible for trapping data coming from various sources and passing them to the business Layer 816 for processing.
  • Business Layer (Figure 8: 816)
The Business Layer holds all the business/application logic of the Client Side Application. The Business Layer is divided into the following 7 subsections.
• Application State Registry 818: Is operative to record the state of the Application and what function it is performing at that time, whether it be Tamagotchi Mode, Chat or Messaging
  • AI (Artificial Intelligence) Action Generator 812: Is operative to calculate the actions of the character in Tamagotchi mode based on its mood and behaviour
• Command Interpreter 820: Parses instructions typed by the User to the Character to perform actions, change moods or enter other environments
  • Message Interpreter 822: Is operative to receive messages from the appropriate Communication modules and identify whether each is a chat communication or a message communication. In another embodiment it parses messages and chat data into actions, message text and emotions and sends them to the appropriate server or gateway 12.
  • Menu/Message Screen Generator Component 848: is operative to generate menus based on Application state
  • Message and Contacts Management Component 850: is operative to manage whether messages have been deleted from the application (RMS) or received from the message server 31 and added to the RMS. Likewise the component allows the adding/deletion of contacts, their names and phone numbers from the Client Application (RMS)
• Chat Component 852: is operative to manage requests to and from the Chat Server 35
  • Data Layer (Figure 8: 854): is operative to provide an interface to the RMS using generic functions to add, update, delete and synchronize records from the Client Application to the Server Side Application 30, specifically synchronizing the Character Logs with the User Details Database 36. The following is stored on the data layer:
Message Logs: the messages that have been sent from the server to the application and their senders information
Chat logs: What date and time the chat was initiated, with whom and for how long
Purchase Logs: What Items, abilities, environments have been purchased and which ones have yet to be delivered
Third Party Data Logs: What Third Party data sources have been accessed and for how long
Data logs: What data has been sent and received from the application and to whom
Character logs: What has happened to the character and how the physical attributes and behavior of the character have changed. This data is synchronized with the Server Side Application 30 and images are uploaded if required
User profile: including Phone model, type, identifier and number. This data is there in case it needs to be polled by the server
  • Environment Layer (Figure 8: 824): This layer is operative to load the necessary data and images (pngs) to populate and display the Environments. The following data is loaded:
Environment Map 828: contains the x and y coordinates of the screen for the specific handset 50 make and model. It also contains the x and y coordinates of the images representing items which populate the Environment.
Environment Layer 830: contains the layering information. This information determines the z-axis coordinates of the objects based on the Character's position. For example, should a character be required to go from the front of a desk to behind it, the z-axis coordinates must swap.
Environment Action 832: Handles all data and images related to an action/activity that may be performed by a character in a specific Environment.
Bedroom 826, Office 834, Kitchen 836, Bathroom 838: These are the names of preloaded environments, but they can be any environments loaded into the Client Application.
Communication Layer (Figure 8: 814): The Communication Layer implements all communications related to the receiving and sending of Chat and Messages to and from the Server via GPRS, Sockets, SMS or any other appropriate protocol. In one embodiment of the present invention it listens on specified ports via a TCP/IP socket connection for incoming requests and responses and passes them on to the Message Interpreter 822 for translation and appropriate handling. A socket connection is set up (over GPRS) with the Server at the time of initialization of the Client Application. In the current embodiment of the present invention, when a Client wants to send a message, a socket send command is initiated. The Server is always listening on the port for incoming data. On receiving data from the Client, the Server puts the received message in the Database and pushes the data to the other Client using the second socket connection, which was established initially. Data is presented to the Client as a Chat message.
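For the J2ME embodiment, opening such a socket over GPRS would use the Generic Connection Framework, roughly as sketched below. The host name, port and class name are illustrative assumptions; only the javax.microedition.io API itself is standard.

    // Illustrative sketch: sending message data over a TCP/IP socket from a
    // J2ME client using the Generic Connection Framework. The server address
    // is a placeholder assumption, not an address from the patent.
    import java.io.OutputStream;
    import javax.microedition.io.Connector;
    import javax.microedition.io.SocketConnection;

    public class SocketSendSketch {
        private static final String URL = "socket://chat.example.com:5000"; // hypothetical

        public static void send(String message) throws java.io.IOException {
            SocketConnection sc = (SocketConnection) Connector.open(URL);
            OutputStream out = sc.openOutputStream();
            out.write(message.getBytes()); // push the delimited message data
            out.flush();
            out.close();
            sc.close();
        }
    }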
Administration Layer (Figure 8: 856): Is operative to cater to administrative functionalities such as:
Preferences: Sounds on/off, Backlight On/Off
Ordering and tracking sent, received Messages and Chat messages
Sequence of Chat Messages
Storing contents of address book
Tracking and addressing purchased items/abilities/environments
Storage of Character Data to be sent to User Detail Database 36 and noting what data has been sent by Server Side Application 30 to the Client Side Application installed on handset 50
Storage of access details to third party data
Storing phone model, type, identifier and number
6.8 Detail Level Design
Figure 10 illustrates the class diagram for the J2ME embodiment of the current invention (Client Application).
Client Application Class (Figure 10:1004)
The Client Application Class is the MIDlet class and is operative to handle all states of the Client Application from the device (50) Operating System, provided by the Java Virtual Machine implementation. This is the interface between the system device and the Java Application (Client Application).
Application Manager Class (Figure 10:1002)
In the current embodiment of the invention, the Application Manager Class is operative to be the central controller class of the J2ME Client Application. This Class registers the state of the Application and prompts the Client Application to perform the necessary actions, traps key events, generates message boxes, generates menus, registers with the communication layer for messages and chat events, and draws the necessary UI, Character and Environments.
Character Class (Figure 10:1024)
Is operative to draw the character, referencing images (PNG Maps) to correspond to the character's physical and behavioral attributes.
During initialization, based on the character parameters, this class brings out the images of that particular character type, assembles the images (body, hands, legs etc.) and displays them to the User. The Character class contains the x and y coordinates of all images, which are contained in the frames that, when animated, create actions and emotions.
Environment Class (Figure 10:1028)
The Environment Class is operative to draw the environments, check the character's position and adjust the z axis of the layers in the environment so that the character can walk behind and in front of objects. In addition, this class is operative to check for the character's interaction with an object and change the state of that object based on the interaction with it, e.g. burning a table.
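The z-axis adjustment described above amounts to deciding draw order from vertical position. The sketch below is an illustrative assumption: it supposes that an object whose base sits lower on the screen is nearer the viewer, which is one common convention rather than anything the patent specifies.

    // Illustrative sketch: decide whether the character is drawn behind an
    // object, assuming a larger base y coordinate means nearer the viewer.
    public final class ZOrderSketch {
        public static boolean characterBehind(int characterBaseY, int objectBaseY) {
            // Character above the object's base line: draw it first (behind).
            return characterBaseY < objectBaseY;
        }
    }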
Moving Objects Class (Figure 10:1026)
In the current embodiment of the invention the Moving Objects Class is operative to hold all the interactive objects of the Environment(s); i.e. objects that can move or have animations associated with them, e.g. a flame, are implemented in this class.
This class is also operative to detect collisions between the character and the MovingObjects. It defines the state of the object once a collision has been detected, performs whatever interaction needs to take place based on that collision, updates images and initiates animations to simulate the effect, e.g. throwing a cup and a window breaking.
Communication Class (Figure 10:1018)
This class is operative to register a port and listen for incoming messages, read/translate messages and send messages. The communication class is used by the Application Manager to check and change states of the Application.
RMS Class (Figure 10:1006)
Is operative to add, delete and update records on the Client Application.
Font Library Class (Figure 10:1014)
This class is a Generic Font Class, which enables developers to implement different fonts by using font images. It implements the Cursor and device key maps of the handset 50. The class can be easily extended to add key maps of different phones.
Messages Class (Figure 10:1010)
In the current embodiment of the invention this class is operative to store all the message information for retrieval. Every message in the Inbox/Outbox has an associated message object. Message objects are instances of the message class, created at the time of arrival or dispatch of the message. These objects store information such as message state (incoming or outgoing), message sender name, message phone number, date and time, message receiver name and text of the message.
Contacts Class (Figure 10:1016)
Is operative to store all the contact information for later use. Every Contact in the contact list has an associated contact object. The RMS (1006) gets the data from the contact object. Data retrieved is contact name, contact phone number and the port number. This data is stored in the contact object.
Character Details Class (Figure 10:1012)
Is operative to store all character information within itself for later use. In all Messages and Chat Messages the Character has an associated CharacterDetails object.
Message UI Class (Figure 10:1008)
Is operative to display the following screens as part of the UI :
Talk To Screen
Alert Messages
Inbox Display
Outbox Display
Database
In one embodiment of the present invention, the following is the data structure that stores information related to the user. The field-level contents of each table appear only as images in the original filing and are therefore listed here by table name only:
User Info
Mobile Info
User Payment Info
User Payment Trans Info
User SimBaby Info
Message Logs Database
Message Type Table
Message Trans Table
SimBaby Type Table
SimBaby Character Table
SimBaby Mood Table
SimBaby Body Type Table
SimBaby Clip Table
SimBaby Canvas Table
Environment Master Table
Environment Detail Table
Object Master Table
Object Clip Table
Object Canvas Table
Message Formats
The following is an example of the message data sent between the Server Side Application and the Client Application:
• Chat message data format.
o Chat Message from Server to Client.
<Message Type> - Identifies what data is being communicated.
<Sender Name> - Name of the sender.
<Sender Port No> - Port number of the sender.
<Chat Session Id> - Indicates the chat session.
<NoOfUsers> - Indicates the number of Users in the chat.
<SimbabyId> - Id of the SimBaby character of the sender that is displayed on screen.
<Text Message> - Text message of the chat.
o Chat Send Invitation message
<Message Type> - Identifies what data is being communicated.
<User Name> - Name of the invitee.
<Contact Name> - Name of the contact being invited.
<Contact Port No> - The port number and the telephone number of the contact.
o Chat Accept / Decline Invitation message
<Message Type> - Identifies what data is being communicated.
<User Name> - Name of the invitee.
<Contact Name> - Name of the contact being invited.
<Contact Port No> - Port number and the telephone number of the contact.
• Message Data format.
o Message from Server to Client.
<Message Type> - Identifies what data is being communicated.
<Sender Name> - Name of the sender.
<Sender Port No> - Port number of the sender.
<Receiver Id> - Indicates the chat session.
<NoOfUsers> - Indicates the number of Users in the chat.
<SimbabyId> - Id of the SimBaby character of the sender that will be displayed on screen.
<Text Message> - Text message of the chat.
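The document does not specify how these fields are delimited on the wire. Purely as an illustration, the sketch below assumes a '|' separator and shows how a J2ME client might encode and decode the chat fields listed above; all names are hypothetical, and the tokenizer is hand-rolled because CLDC offers no String.split.

// Illustrative sketch only. The '|' delimiter is an assumption made
// for illustration; the filing does not publish the wire format.
public class ChatMessageCodecSketch {

    public static String encode(String type, String sender, String port,
                                String sessionId, String users,
                                String simbabyId, String text) {
        return type + "|" + sender + "|" + port + "|" + sessionId
             + "|" + users + "|" + simbabyId + "|" + text;
    }

    // Returns the fields in the order listed in the specification
    // above: type, sender, port, session id, user count, SimBaby id,
    // text. CLDC has no String.split, so tokenize by hand.
    public static String[] decode(String raw) {
        java.util.Vector fields = new java.util.Vector();
        int start = 0;
        for (int i = 0; i <= raw.length(); i++) {
            if (i == raw.length() || raw.charAt(i) == '|') {
                fields.addElement(raw.substring(start, i));
                start = i + 1;
            }
        }
        String[] out = new String[fields.size()];
        fields.copyInto(out);
        return out;
    }
}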
7. Detailed Description
The present invention consists of an Application which broadly enables the following functionality:
• Creation of a miniature, artificially intelligent avatar of the User, or of two Users choosing to make a Character together;
• Definition and updating of the avatar's 'personality' before sending this data to the Server;
• Design of a virtual home or environment for the SimBaby™ to inhabit;
• The care, nurturing and education of a SimBaby™ through its life cycle; and importantly,
• Communication with other Users (regardless of network or handset model) through this Character in an expressive, dynamic, animated and synchronised fashion;
• The ability to create a "Love Match" with another User using Infra Red or Bluetooth technologies or their equivalent; and
• The ability to express oneself through interaction with objects, environments and other Users' avatars.
The Application is described in greater detail in the following pages. Although the application is written in J2ME to work on a mobile phone, the same principles apply regardless of the programming language it is written in, e.g. Flash Lite. In one embodiment of the application, which appears integrated into a messaging application on the desktop, it is written in C++. This does not change the principle of sending or receiving messages or the code used in the sent and received messages.
Characters, Objects and the Environment
Character
As described, the Application introduces a novel non-voice method of mobile communication which allows one mobile phone User to communicate and interact with another mobile phone User through a highly personalized, artificially intelligent avatar closely resembling the User's own appearance and personality.
This Character can be designed:
(a) On the User's handset 50 (through Application software installed on the handset); or
(b) On a designated website.
The Application software installed on the handset allows the User to design the Character to a high specification, as illustrated in Figure 4.
The preview on the right is instantly updated in response to the User's selections. The Client Application also allows Users to monitor the mood, personality traits and appearance of the Character on an ongoing basis, in reaction to the care given to the Character and the events which have occurred to it, by accessing the screen shown in Figure 11. This screen gives an indication of what behaviour to expect from the avatar, for example whether the Character will be happy and dance to amuse its owner, or be angry and destroy its environment.
Environment
One embodiment of the current application also facilitates the design of a virtual home for the Character to "live in". This "home" is known as an 'Environment' in Application terms. The User can choose an Environment from a number of pre-designed Environments available on the User handset, purchase a higher specification Environment or design his own. These Environments are multi-roomed residences designed to a high specification, as illustrated in Figure 12.
Collectively, the 'rooms' shown in Figure 12 form the 'Apartment', one of the pre-designed Environments available in the current embodiment of the Client Application. Environments are not limited to indoor "rooms"; they can comprise outdoor settings such as a beach or ski field. Nor are Environments just a setting for the Character to occupy: they hold objects which the avatar can use, and they contain code which controls the physics of the environment, e.g. the Character can jump through a window but not through a door.
Objects
As mentioned above, the Environment contains movable Objects, for example toothbrushes, footballs and coffee cups, for the Character to use when in idle mode. This is the mode in which the Character goes about its daily business of "living" on the handset 50, performing actions which suit its behavioural characteristics. These items can also be used in a communication. The Application contains a number of standard items for use by the Character in the Environment; however, the User may purchase additional or more unusual items, such as rocket packs, beds and spud guns, on an ongoing basis. Objects also carry their own code and graphics, which tell the application how the object can behave and whether it can be burnt or smashed. They also tell the application how the object should look when it is used, burnt or smashed, or in any of the other states associated with the look and behaviour of an object.
Sound
All of the Characters' actions will be accompanied by background sound effects; however, there will be no associated voice function in the Application.
Initialisation
Once the Character design and Environment design (or selection) is complete, the User must send this data to the Server via GPRS [and/or HTTP TCP/IP, depending upon which one of the above vehicles was used during the design stage]. Abstractly, the data is simply sent to the server.
Once received at the Server, the Server Side Application 30 will integrate the design data and create an image file (png) of the Character, recording all of the behavioural and appearance statistics specified by the User together with an extrapolation of the Character's component body parts to enable the Application to generate detailed Character movement on an ongoing basis. The .png file is subsequently installed into the .jar file of the Application (this takes place Server side). This .jar file is then sent to and installed on the User's handset 50 via SMS, GPRS or any other applicable data transfer mechanism. This installation enables the User to access the functions described in this document. In one embodiment the Client Application is installed and integrated into the operating system of the handset in question, for example Symbian. In another embodiment the Client Application functions as described, outside of that operating system. Upon User initialization, the Character will appear in the application installed on the User's handset, living in the chosen Environment, and will be ready for use. In one embodiment the Character will appear in the wallpaper space of the User's handset.
User Functions
In broad terms, the Application allows the User to perform the following functions:
"Tamagotchi" style interaction
This allows the User to interact with his Character in the Environment; to feed, educate and play with the Character to ensure its growth. This feature can be broken down further into the following two sub-components:
(a) User Interaction Mode, when the User instructs and commands the Character to perform certain activities (for example, read a book, brush teeth or set fire to the kitchen table). The Application provides three ways in which the User can instruct its Character, namely:
By typing natural language into the Command Line on the handset (instructions are interpreted through a 'Parser Dictionary' in the Application);
Through a Menu System; or
Through a 'Talk to' function (where the user can command the character to perform actions through a separate screen, navigated to via the Menu).
(b) Non-User Interaction Mode, when the Character is effectively in idle mode and getting on with its day-to-day life. In this mode, the Character will play out a range of acts based on its current personality, independently of User instruction. This is made possible by the artificially intelligent component of the Application. If the Character has been neglected, it will not wish to please its owner by dancing and being happy; it will look angry and often try to destroy objects in its environment, e.g. throwing a cup through a window or setting fire to objects.
This aspect of the Application is purely for the User's personal entertainment and no network signal is required.
Messaging
This function allows two or more Users to send and receive messages, actions and gestures to one another. These Messages are animated, dynamic and expressive and are the 'virtual' equivalent of the Users interacting in the real and physical world.
The novelty of this function lies in the following:
- Character Transference;
Expressiveness. The ability to capture the mood of the User sending a message ('Sender') through facial expressions and movements in the Character;
Dynamic Animation. The dynamic nature of the Application: the message sends data to move the animations on the client rather than rendering animations on the server and sending those through to the client. This is a novel approach which uses substantially less data than an MMS, Picture or Video Message.
- Synchronisation. The synchronised display of the Characters on the screens of the participating Users. This particular feature means that each Character will appear as if it is walking off the edge of the sender's phone and into the frame of the recipient User(s). Each User will see both Characters interacting on his handset display simultaneously. As the handset is always 'talking to' the Server, the Users can interact through their respective Characters in a dynamic, almost 'real-time' fashion. The Application facilitates the recording and replay of all Messages.
An illustration of the Messaging sequence is provided in Figure 13.
Messaging can be achieved in the following two ways:
(a) Command Language
A User can use the simple "command language" set out below to type instructions/expressions into the Command Line visible on the screen. Expressions are surrounded by "*", for example *Happy*. Actions are surrounded by "( )", for example (kiss).
Neither actions nor expressions are case sensitive. The format for the command language could be any two non-alphanumeric characters used to surround the emotion or action; the purpose is for the parser in the receiver's application to recognise the expression(s)/action(s) and act accordingly.
Users can build multiple expressions or actions or a combination of expressions and actions into the Message. The following examples illustrate this mechanism:
The instruction "*Happy* *Sad*" will result in the Character displaying a happy face followed by a sad face. The expressions will be displayed for between 1 second.
The instruction "(kiss) (hug)" will be displayed in sequentially, in the natural time it takes to perform the action.
A mixture of expressions and actions such as "*Happy* (kiss) *Sad* (hug)" will be displayed sequentially in logical pairs so that the Character will wear a happy face whilst giving the kiss and a sad face whilst giving the hug.
If no expression is specified in conjunction with a specified action, the default expression will be neutral.
If the User wishes to break up expressions and actions with some text, this can be achieved in the following ways:
*expression* text (action) or
*expression* (action) text or
(action) *expression* text
Naturally, other combinations will also be possible (a parser sketch follows this subsection).
(b) Menu System
Alternatively, a User can use the menu system installed on the handset to construct and send a Message.
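Although the patent publishes no source for its 'Parser Dictionary', the receiving application must at minimum tokenize a message into text, *expression* and (action) segments. The sketch below is one illustrative way to do that in J2ME-era Java (CLDC has no regular expressions); every name in it is hypothetical.

// Illustrative sketch only: tokenize a command-language message into
// plain text, *expression* and (action) segments.
public class CommandParserSketch {

    public static final int TEXT = 0, EXPRESSION = 1, ACTION = 2;

    public static class Token {
        public final int kind;
        public final String value;
        Token(int kind, String value) { this.kind = kind; this.value = value; }
    }

    public static java.util.Vector parse(String input) {
        java.util.Vector tokens = new java.util.Vector();
        int i = 0;
        while (i < input.length()) {
            char c = input.charAt(i);
            if (c == '*' || c == '(') {
                char close = (c == '*') ? '*' : ')';
                int end = input.indexOf(close, i + 1);
                if (end > i) {
                    int kind = (c == '*') ? EXPRESSION : ACTION;
                    // expressions and actions are not case sensitive
                    tokens.addElement(new Token(kind,
                            input.substring(i + 1, end).toLowerCase()));
                    i = end + 1;
                    continue;
                }
            }
            // accumulate plain text up to the next marker
            int next = i + 1;
            while (next < input.length()
                   && input.charAt(next) != '*'
                   && input.charAt(next) != '(') next++;
            tokens.addElement(new Token(TEXT, input.substring(i, next)));
            i = next;
        }
        return tokens;
    }
}

Parsing "*Happy* (kiss)" with this sketch yields an expression token "happy", a text token for the intervening space and an action token "kiss", which the Application could then map onto the corresponding animation frames.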
Chat
This is an extension of the Messaging principle described above. This feature allows a User to invite multiple Characters over to its Environment to "play". Once the invitation to Chat has been accepted by each of the invited Users, their Characters will automatically appear in the Environment of the Inviter. In Chat, all participants see the same screen in a synchronised capacity, so all participants will see the Environment of the Inviter. The Characters can exchange actions, emotions and text messages in apparent "real time" and in context with each other. The text is available to be scrolled through on each independent handset; should this be done, the text does not scroll on any other user's handset.
Chat can be activated in broadly the same ways as Messaging i.e. through the command line or through the menu system. Figure 14 illustrates the user flow of a Chat communication.
(a) Command Language
The command language for Chat is as follows:
In order to initiate Chat, the User need only type "invite.." followed by the name of the desired User(s).
In order to reply to a Chat message, the User must type "reply.." followed by the message. In order to exit Chat, the User must type "exit.." into the command line.
(b) Menu System
Alternatively, the User can kick-start or exit Chat through the menu system.
Entities in the Application are:
□ Setup Application
□ Server Side Application
□ Mobile Service Provider (for SMSC G/W, GPRS, Integration APIs)
□ Client Application
Application Client Side System
The Application's Client side has the following components:
Make a SimBaby (Setup Application)
In order to create a Character on the User handset, the User needs the "Setup Application" program. This is a downloadable module from the Server which is installed on the handset. The Setup Application GUI allows the User to manipulate the parameters and design the Character to a high physical and behavioural specification, as illustrated in Figure 4.
Server Side Application
The Server has the following three-tier architecture: Database, Application and Presentation. The Server is partially connected to the Client program and has automatic push and pull components to retrieve information about the Client and to push data to the client for the AI; e.g. complex conversations from the Character will be performed on the server and sent to the Client Application. In addition, the application and server synchronise during high signal availability and during pauses in messaging or chat to send data on the Character, so that the Server is always aware of any changes that have happened to the Character and can push images and data to the Client Application to reflect these changes.
Client Application
• Message Server
The Message Server is integrated with the SMSC (Short Message Service Center) G/W of the service provider through the Integration Module. The Message Server contains the Business Logic to control/send the messages from/to the Application. The Message Server integrates with the Database, Application Logic and the Integration Module.
• Chat Server
The Chat Server has the capability to implement chat sessions between multiple Users. The Chat Server integrates with the Integration Module, Databases and Application Logic. Chat utilizes the GPRS connectivity of the mobile to send and receive chat messages; however, it can use any protocol to send or receive data, even SMS. Chat messages support text and special characters.
• Application Logic
The Application Logic is developed using J2ME; however, it can be written in any language which is available to a mobile device. It deals with the processing that takes place behind the scenes, rather than the presentation logic required to display the data on the screen.
• Communication Module
The Communication Module handles all of the communication of the Application with external interfaces. The Communication Module identifies and handles:
■ Chat messages
■ SMS messages
■ Control messages
■ Data messages
• Billing Module
Billing transactions are updated on the fly, subtracting from a prepaid amount. The functionality of this module also creates information logs from the raw data about usage.
Character/Attribute generation at runtime at Application Launch
Figure 16 sets forth a process flow of classes and states associated with the launching and initialization of the Client Application in an embodiment of the current invention.
Attribute generation of how a character carries a message to a remote terminal.
As illustrated in Figure 15, all images for the character are stored in the Client Application and can be reused in innumerable permutations. When a Message is received it is parsed to identify the emotions/actions for the Character, the text of the message is displayed and the animation is formed dynamically.
Character/Attribute generation of character at remote terminal
As illustrated in Figure 2, on receiving the text data the Client Application interprets the message. On accepting the message, the message is parsed. Images/emotions/actions are retrieved and manipulated according to the vector data that has been sent in the message, to form the animation contained within the message.
Definitions
(The definitions table is reproduced as an image in the original filing.)

Claims

8. Claims
What is claimed is:
A system enabling a character-based, animated, dynamic, expressive and synchronised non-voice mobile messaging service comprising:
1. The Application Client for storing, for at least one character, a set of image files which can be manipulated by transferred data to animate body parts and expressions;
2. An application server operative to collate image files, assemble code and compile an application client in accordance with selections made during the setup process;
3. An application for the presentation and composition of animated, dynamic, expressive messages;
4. A setup application with instant visual update for the recording, entering and selecting of physical and behavioural attributes which simulate the user;
5. Wherein the chat server/message server is operative to transmit to the recipient a message which is translated on the handset to display mood, expressions, text and actions
6. The system of claim 1 wherein the user interface allows Character Transference whereby a character can walk onto the screen of another users phone;
7. The system of claim 1 wherein the user interface allows the synchronised display of the Characters and their messages on the screens of the participating Users.
8. The system of claim 1 wherein the user interface allows the ability to capture the mood of the User sending a message ('Sender') through facial expressions and movements in the Character;
9. The system of claim 1 wherein the user interface allows the translation of text based data to result in dynamic animation
10. The system of claim 1 wherein the user interface allows the translation of text based data to result in dynamic animation
11. The system of claim 1 wherein a message is sent via GPRS in the guise of an SMS.
12. The system of claim 1 allowing for control of the client application and its interaction with the server to be text based through the Command Language
13. The system of claim 1 whereby a User can communicate with other Users through visual, highly personalised, miniature animations based upon his or her own nature and appearance;
14. A system whereby a User can simultaneously view a multiple-party exchange of communication (i.e. incoming and outgoing messages from all Users participating in a communication) on a single handset in an apparent 'real-time' capacity;
15. A system that allows for the User to care for and nurture the Character. This function requires the Character to begin "life" as a "baby" and to grow through its life cycle at an accelerated rate. The User can interact with its Character to feed/nurture/play with its Character through an application which is installed on the phone. This aspect of the invention is purely for entertainment and does not require a mobile network signal; and
16. A system called "MiniMeLove". This builds on the personal data provided by the User in accessing the invention. The feature requires the User to input further data about the qualities of that User's ideal love partner. Once this data has been stored and integrated into the Application, the User can use Bluetooth or Infra Red technologies to see how good or bad a love match another acquiescing User is.
17. The system of claim 1 whereby the client application can be updated. This means image files and code can be sent to the application.
18. A system whereby unique words entered into the user's phone dictionary can be incorporated into a remote server;
19. The system of claim 1 referred to as Dynamic Animation, wherein the Application message sends data to move the animations on the client rather than rendering animations on the server and sending those through to the client. This is a novel approach which uses substantially less data than an MMS, Picture or Video Message.
PCT/GB2006/001129 2005-02-02 2006-03-29 System of animated, dynamic, expresssive and synchronised non-voice mobile gesturing/messaging WO2007007020A1 (en)

Applications Claiming Priority (5)

Application Number Priority Date Filing Date Title
GB0502174A GB0502174D0 (en) 2005-02-02 2005-02-02 A wheel for providing axial and radial movement
GB0514051A GB0514051D0 (en) 2005-02-02 2005-07-07 Improvements in the movement of materials
GB0514051.2 2005-07-07
GB0520651A GB0520651D0 (en) 2005-10-11 2005-10-11 Machine configuration concept
GB0520651.1 2005-10-11

Publications (1)

Publication Number Publication Date
WO2007007020A1 true WO2007007020A1 (en) 2007-01-18

Family

ID=36337572

Family Applications (2)

Application Number Title Priority Date Filing Date
PCT/GB2006/000259 WO2006082369A2 (en) 2005-02-02 2006-01-26 Manipulator apparatus and drive elements therefor
PCT/GB2006/001129 WO2007007020A1 (en) 2005-02-02 2006-03-29 System of animated, dynamic, expresssive and synchronised non-voice mobile gesturing/messaging

Family Applications Before (1)

Application Number Title Priority Date Filing Date
PCT/GB2006/000259 WO2006082369A2 (en) 2005-02-02 2006-01-26 Manipulator apparatus and drive elements therefor

Country Status (1)

Country Link
WO (2) WO2006082369A2 (en)


Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1073243A1 (en) * 1999-07-14 2001-01-31 Realtime ApS Short message service communication system, message service communication system and entertainment platform
US20020090985A1 (en) * 2000-09-07 2002-07-11 Ilan Tochner Coexistent interaction between a virtual character and the real world
US20020116263A1 (en) * 2000-02-23 2002-08-22 Paul Gouge Data processing system, method and computer program, computer program and business method
WO2002078284A2 (en) * 2001-03-26 2002-10-03 K Technologies Limited Peer to peer data transfer between wireless information devices
EP1258827A2 (en) * 2001-05-18 2002-11-20 Aruze Co., Ltd. Game method using network, server executing the game method, and storage medium storing program executing the game method
EP1462945A1 (en) * 2001-12-05 2004-09-29 Cybird Co., Ltd Communication information sharing system, communication information sharing method, communication information sharing program



Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090199110A1 (en) * 2008-02-05 2009-08-06 Samsung Electronics Co., Ltd. Apparatus and method for transmitting animation-based message
US10250537B2 (en) 2014-02-12 2019-04-02 Mark H. Young Methods and apparatuses for animated messaging between messaging participants represented by avatar
US10979375B2 (en) 2014-02-12 2021-04-13 Mark H. Young Methods and apparatuses for animated messaging between messaging participants represented by avatar

Also Published As

Publication number Publication date
WO2006082369A2 (en) 2006-08-10
WO2006082369A3 (en) 2007-01-18


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application
NENP Non-entry into the national phase

Ref country code: DE

WWW Wipo information: withdrawn in national office

Country of ref document: DE

32PN Ep: public notification in the ep bulletin as address of the addressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A) SENT 06.06.2008

122 Ep: pct application non-entry in european phase

Ref document number: 06726538

Country of ref document: EP

Kind code of ref document: A1