WO2004017596A1 - Methods and device for transmitting emotion within a wireless environment - Google Patents


Info

Publication number
WO2004017596A1
Authority
WO
WIPO (PCT)
Prior art keywords
telecommunications
communication
toy
image
identifier
Prior art date
Application number
PCT/GB2003/003560
Other languages
French (fr)
Inventor
Toby Moores
Joanne Elizabeth Allen
Mark Anthony Hilton
Stewart Burnett Jones
Benjamin James Last
Original Assignee
Sleepydog Limited
Priority date
Filing date
Publication date
Priority to GB0218927.2
Priority to GB0224206.3
Application filed by Sleepydog Limited filed Critical Sleepydog Limited
Publication of WO2004017596A1

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04W WIRELESS COMMUNICATION NETWORKS
    • H04W 4/00 Services specially adapted for wireless communication networks; Facilities therefor
    • H04W 4/12 Messaging; Mailboxes; Announcements
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L 29/00 Arrangements, apparatus, circuits or systems, not covered by a single one of groups H04L 1/00 - H04L 27/00
    • H04L 29/02 Communication control; Communication processing
    • H04L 29/06 Communication control; Communication processing characterised by a protocol
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L 67/00 Network-specific arrangements or communication protocols supporting networked applications
    • H04L 67/06 Network-specific arrangements or communication protocols supporting networked applications adapted for file transfer, e.g. file transfer protocol [FTP]
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L 67/00 Network-specific arrangements or communication protocols supporting networked applications
    • H04L 67/34 Network-specific arrangements or communication protocols supporting networked applications involving the movement of software or configuration parameters
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04M TELEPHONIC COMMUNICATION
    • H04M 1/00 Substation equipment, e.g. for use by subscribers; Analogous equipment at exchanges
    • H04M 1/57 Arrangements for indicating or recording the number of the calling subscriber at the called subscriber's set
    • H04M 1/575 Means for retrieving and displaying personal data about calling party
    • H04M 1/576 Means for retrieving and displaying personal data about calling party associated with a pictorial or graphical representation
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04M TELEPHONIC COMMUNICATION
    • H04M 1/00 Substation equipment, e.g. for use by subscribers; Analogous equipment at exchanges
    • H04M 1/72 Substation extension arrangements; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selecting
    • H04M 1/725 Cordless telephones
    • H04M 1/72519 Portable communication terminals with improved user interface to control a main telephone operation mode or to indicate the communication status
    • H04M 1/72522 With means for supporting locally a plurality of applications to increase the functionality
    • H04M 1/72544 With means for supporting locally a plurality of applications to increase the functionality for supporting a game or graphical animation
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04M TELEPHONIC COMMUNICATION
    • H04M 1/00 Substation equipment, e.g. for use by subscribers; Analogous equipment at exchanges
    • H04M 1/72 Substation extension arrangements; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selecting
    • H04M 1/725 Cordless telephones
    • H04M 1/72519 Portable communication terminals with improved user interface to control a main telephone operation mode or to indicate the communication status
    • H04M 1/72522 With means for supporting locally a plurality of applications to increase the functionality
    • H04M 1/72547 With means for supporting locally a plurality of applications to increase the functionality with interactive input/output means for internally managing multimedia messages
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L 69/00 Application independent communication protocol aspects or techniques in packet data networks
    • H04L 69/30 Definitions, standards or architectural aspects of layered protocol stacks
    • H04L 69/32 High level architectural aspects of 7-layer open systems interconnection [OSI] type protocol stacks
    • H04L 69/322 Aspects of intra-layer communication protocols among peer entities or protocol data unit [PDU] definitions
    • H04L 69/329 Aspects of intra-layer communication protocols among peer entities or protocol data unit [PDU] definitions in the application layer, i.e. layer seven
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04M TELEPHONIC COMMUNICATION
    • H04M 2203/00 Aspects of automatic or semi-automatic exchanges
    • H04M 2203/10 Aspects of automatic or semi-automatic exchanges related to the purpose or context of the telephonic communication
    • H04M 2203/1016 Telecontrol
    • H04M 2203/1025 Telecontrol of avatars
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04M TELEPHONIC COMMUNICATION
    • H04M 2250/00 Details of telephonic subscriber devices
    • H04M 2250/02 Details of telephonic subscriber devices including a Bluetooth interface

Abstract

A method of communicating non-verbal messages between first and second individuals using respective first and second telecommunications devices is described. The method comprises: storing a plurality of sets of predetermined graphical images on the first telecommunications device, each graphical image being assigned an image identifier and the graphical images within each set representing different non-verbal messages; linking one of the graphical image sets with a device identifier of the second telecommunications device; receiving a communication for conveying a non-verbal message from the second telecommunications device, the communication specifying one of the image identifiers and the device identifier of the second telecommunications device; retrieving the graphical image specified in the communication using the received image and device identifiers; and displaying the retrieved graphical image, thereby conveying the non-verbal message of the communication to the first individual.

Description

METHODS AND DEVICE FOR TRANSMITTING EMOTION WITHIN A WIRELESS ENVIRONMENT

Field of the Invention

The present invention concerns improvements relating to communication where individuals are remote from one another and provides, more specifically, a method of and apparatus for communicating non-verbal messages using telecommunications devices.

Background of the Invention

Human communications can be categorised into two types: those made without physical aids (for example by talking, through physical contact or body language) and those made with physical aids (for example by writing a letter, holding a telephone conversation or e-mailing using a personal computer). Physical communication aids were revolutionised by technology over the course of the last century. In the early 1900s, long distance communication was still effected primarily by postal services, but by mid-century telephones were installed in many homes; the end of the century saw individuals with a whole host of communication aids at their disposal, such as fax machines, personal computers, laptops and mobile phones.

However, despite the ready availability of such technology, be it a telephone in the lounge, a personal computer in the study or a mobile phone in our bag, we are, at times, reluctant to use it. Increased working hours and commuting have undoubtedly had a negative effect on our personal lives. The seemingly small efforts required to keep in contact with friends and family, such as the writing of letters, the sending of e-mails or the making of telephone calls, become tasks which cannot be faced at the end of a long working day. The inclination to communicate is not missing; people are usually very aware of not having been in contact with friends or family members. Rather, it is the types of communication which are off-putting, most requiring a significant investment of time and effort. The types of communication listed above can all too easily become protracted activities which, in our time-pressured personal lives, we seek to avoid. With hindsight, then, it is perhaps not so surprising that the most popular communications phenomenon of the new millennium, namely the Short Messaging Service (SMS), is one which fits readily with the pace of modern life.

The Short Messaging Service was first introduced in the early 1990s, when digital technology was implemented to increase the capacity of mobile phone networks. The radio spectrum, under standards governing wireless communication, is divided into channels which occupy specific frequency ranges (different frequencies being utilised by different networks). There are two types of channel: communication channels carry voice signals, whilst control channels handle signalling inputs and outputs from the communication channels and manage network transmission tasks. The Short Messaging Service operates within the control channels, taking advantage of spare frequency capacity. As the name suggests, the service enables short messages, of no more than 160 alphanumeric characters in length, to be sent across mobile networks, and the messages are delivered almost instantaneously.

The SMS facility was initially used for voicemail notification, whereby a user is alerted to messages stored by the network, on their behalf, in a central network repository. Typically, when a user reconnects their handset to a network, a message such as "3 NEW MESSAGES" is transmitted to the mobile handset display screen. In due course, the service was adapted so that messages could originate from the mobile phones themselves and be sent to the network or to other mobile phones on which the service was enabled. More recently, a Multimedia Messaging Service (MMS) has been developed to take advantage of newly available broadband frequency ranges, allowing digital graphical image data (such as digital photos or video clips) to be included within messages. To the surprise of the telecommunications industry, messaging services have proved immensely popular with mobile phone users - by the end of 2002 around 30 billion messages were being sent throughout the world per month (source: GSM Association).

Messaging services have proved popular for a variety of reasons, but a key aspect is their unobtrusiveness. Users do not have to engage in conversation; the services provide a forum in which brevity of messages is socially acceptable; and recipients can read and respond to messages at their own convenience. Accordingly, users often elect to use the messaging function on their mobile phone in preference to the voice function. The present inventors have realised that, as well as looking at new ways to expand the processing capabilities of mobile phones, enabling them to handle ever larger quantities of data at ever increasing speeds, attention should also be directed to the behavioural traits of users: determining the types of communications which users want to send and developing technical solutions which facilitate the sending of those messages.

Not all communications between humans are time consuming, particularly those which are non-verbal and do not involve speaking or writing. In particular, non-verbal communications giving signs of affection and reassurance have been shown to contribute significantly to our psychological well-being; these communications have been termed "warm fuzzies" (referred to as WFs) by the distinguished clinical psychologist and transactional analyst Claude Steiner.

The need to make non-verbal as well as verbal communications has been evidenced, in part, by the widespread adoption of so-called "emoticons" in electronic messages. An emoticon is a string of text characters which, when viewed sideways, depicts a face expressing a particular emotion. Common emoticons include " :-) " or " :) " to indicate smiling at a joke or happiness, " ;-) " to indicate a jovial wink, and " :-o " to indicate surprise. In the absence of body language in the electronic environment, emoticons are typically used after text to supplement its meaning. Emoticons are, however, fairly crude, unsophisticated and one-dimensional in nature. Various efforts to enhance emoticons have been made, including developing software to recognise emoticon characters received in a text message and convert them into graphical icons [e.g. " :-) " would be converted into the icon " ☺ "]. In addition, GB-A-2,376,379 discloses a method for a phone system which converts emoticons within text messages into corresponding audio messages.

The communication of non-verbal messages has yet to be properly addressed by modern technology. Whilst multimedia messaging services allow the transmission of digital photographs and video clips from telecommunications devices, taken using on-board cameras, these messages are at present expensive in terms of cost and bandwidth to send and receive. The creation of both types of self-image requires significant input from the device user, with video clips in particular generally requiring speech. The meaning of a digital self-image, intended to convey a non-verbal message, may also not be immediately discernible to the recipient - explanatory text may be required. In addition, many people dislike photographic images of themselves and so would be reluctant to use technology in this way. Furthermore, the uptake of multimedia messaging mobile phones in particular has been slower than expected, and so only limited numbers are presently in use.

US-A-2002/0196262 attempts to expand messaging functionality and the available options for personal expression. One intention of this prior art document is for a user of a wireless terminal to be able to "send a package of content and functionality, called an entity, to another user who may display and invoke the entity at the receiving end. This entity may take on characteristics that have been programmed into it, and may, for example, appear on a wireless terminal display as an animated character. The animated character may include sounds and expressions that make the entity seem life-like... an entity may even be programmed to have personality and emotion". However, the requirement to send the entity itself is disadvantageous, as it makes communications slow and takes up precious bandwidth. Also, the sender controls how the recipient sees the entity, which is disadvantageous in that the recipient is unable to personalise the way in which their communications device handles the message. This document is cumbersome and difficult to follow, with the entity concept in particular possibly not being explained in an enabling manner.

In WO-A-01/27879, a method for remote communication is described, which involves a visual representation of a user being provided to a recipient. A set of behavioural characteristics of the visual representation, which include personality and mood intensity settings, is provided to the user. The user selects a behavioural characteristic and inputs data to be communicated to the recipient, along with any specific behavioural commands. Data is communicated to the recipient concurrently with a behavioural movement of the visual representation associated with the selected behavioural characteristic, wherein the behavioural movement provides context to the recipient for interpreting the communicated data. However, this system is time consuming to use and is inefficient in that the user has to send a message as well as the behavioural movement of the visual representation.

Accordingly, it is desired to overcome or substantially reduce at least some of the abovementioned problems. More specifically, it is desired to provide a method for conveying non-verbal messages which can be implemented on many telecommunications devices presently in use, the messages being quick to compose and inexpensive to send and receive.

Summary of the Invention

According to one aspect of the present invention there is provided a method of communicating non-verbal messages between first and second individuals using respective first and second telecommunications devices, the method comprising: storing a plurality of sets of predetermined graphical images on the first telecommunications device, each graphical image being assigned an image identifier and the graphical images within each set representing different non-verbal messages; linking one of the graphical image sets with a device identifier of the second telecommunications device; receiving a communication for conveying a non-verbal message from the second telecommunications device, the communication specifying one of the image identifiers and the device identifier of the second telecommunications device; retrieving the graphical image specified in the communication using the received image and device identifiers; and displaying the retrieved graphical image, thereby conveying the non-verbal message of the communication to the first individual.
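The steps of the claimed method can be sketched in code. This is a minimal illustration, not the patent's implementation; all class, method and data names (including the example phone number and image descriptions) are hypothetical:

```python
# Sketch of the receiving-side method: image sets are stored locally and
# keyed by image identifier; a sender's device identifier is linked
# ("bonded") to one set; an incoming communication carries only the two
# identifiers, never the image itself.

class NonVerbalMessenger:
    """Illustrative receiving device for the claimed method."""

    def __init__(self):
        self.image_sets = {}   # set name -> {image identifier: image data}
        self.links = {}        # device identifier -> set name

    def store_set(self, name, images):
        # Storing step: each image within a set is assigned an image identifier.
        self.image_sets[name] = dict(images)

    def link(self, device_id, set_name):
        # Linking step: bond the sender's device identifier to one image set.
        self.links[device_id] = set_name

    def receive(self, image_id, device_id):
        # Receiving and retrieving steps: look up the linked set, then the
        # image, using the two identifiers carried by the communication.
        set_name = self.links[device_id]
        image = self.image_sets[set_name][image_id]
        return self.display(image)

    def display(self, image):
        # Displaying step: stand-in for rendering on the device screen.
        return image
```

Because only short identifiers cross the network, such a scheme could ride on an ordinary SMS message while the (comparatively large) images stay on the handset.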

The term 'non-verbal message' is intended to have a specific meaning in the context of the present invention, namely that the message is not spoken or written by the sender of the message. Non-verbal messages include messages which convey emotions such as WFs. These types of messages may be difficult to express in a written manner and would in any case take up more time in message construction than with the present invention.

The advantage of the present invention is that communication between two parties of sometimes complex messages can be simplified and made easier. This is because the communicating entities are bonded (linked) together in a way which enables the recipient of a Floof message to determine how that message will be conveyed to them, and also whether or not they wish to be able to determine the identity of the sender just from the type of non-verbal message that is presented to them. The bonding also enables the present invention to operate without modifying existing messaging structures such as SMS, MMS and EMS, but rather to use them, the only requirement being to provide an emotion identifier in the message which can be detected and interpreted by an application on the recipient's phone. The emotion identifier can be a simple code, thereby minimising the data requirements of the present invention.

The present invention is fully compatible with existing mobile phone technology in that it can be embodied as a simple downloadable application which provides an additional application to those already provided on the mobile phone.

The linking step may comprise linking one of the graphical image sets with a communications address of the second telecommunications device, and more preferably it comprises linking one of the graphical image sets with a telephone number of a mobile phone. With many messaging systems, the communications address of the sender is automatically provided with the message itself, and as such the required device identifier of the second communications device is provided automatically. This advantageously means that only the emotion identifier needs to be provided in the message, in addition to a Floofs application identifier, thereby making the message more efficient (shorter).
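A possible over-the-wire format following this idea: the message body carries only an application tag plus a short emotion identifier, while the sender's address arrives for free in the message envelope. The tag value `"FLF"` and the helper names below are illustrative assumptions, not taken from the patent:

```python
FLOOF_TAG = "FLF"   # hypothetical application identifier prefix

def encode_payload(emotion_id):
    # The body need only carry the application tag and a short emotion code;
    # the device identifier (sender's number) travels in the SMS envelope.
    return f"{FLOOF_TAG}:{emotion_id}"

def decode_payload(body, sender_address):
    # Returns (emotion identifier, device identifier) for Floof messages,
    # or None so that ordinary texts fall through to normal SMS handling.
    tag, _, emotion_id = body.partition(":")
    if tag != FLOOF_TAG:
        return None
    return emotion_id, sender_address
```

A payload like `"FLF:HUG"` is a handful of bytes, comfortably inside the 160-character SMS limit mentioned above.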

The storing step preferably comprises storing a plurality of sets of predetermined graphical character images in which the graphical images each convey different facial expressions of that character. By using characters and facial expressions of the character most emotions can be conveyed quickly and easily in the same way that we can determine the state of mind of a person by simply looking at the expression on their face. Also this enables different characters to represent different people such that the identity of the sender is automatically determined on seeing their character image. This removes the requirement for the receiving device to explicitly show the sender's name which is advantageous given the often limited amount of graphical space present in most mobile phone displays.

The storing step may comprise storing sets of predetermined video clips or sets of predetermined animated graphical images. Non-verbal messages can be presented to the recipient in many different ways. By providing video clips and/or animated graphics, there can be a better chance of actually conveying the message to the recipient correctly. Graphical images do provide a good messaging medium, but moving images can contain much more information to convey the desired message quickly.

The method may further comprise: associating one or more pre-programmed actions, to be performed by the first telecommunications device, with a stored graphical image; and executing the one or more pre-programmed actions when the displaying of that graphical image occurs, the one or more actions of the first telecommunications device enhancing the non-verbal message conveyed to the first individual. Again, these pre-programmed actions can help to convey the sometimes complex message to the recipient correctly. For example, the executing step may comprise vibrating the first telecommunications device to enhance the message conveyed. Alternatively or in addition, the executing step may comprise playing prerecorded audio data to enhance the message conveyed. A non-verbal message which uses these different pre-programmed actions presents the message to two or three of the recipient's senses, helping to convey the message.
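Such an association might be held as a simple table keyed by image identifier, consulted at display time. The identifiers, action names and audio file names below are invented for illustration:

```python
# Hypothetical table associating image identifiers with pre-programmed
# device actions, executed when the matching image is displayed.
ACTIONS = {
    "HUG":   ["vibrate", "play:purr.wav"],
    "LAUGH": ["play:giggle.wav"],
}

def execute_actions(image_id, device):
    # Carry out every action associated with the displayed image; images
    # with no entry simply display silently.
    for action in ACTIONS.get(image_id, []):
        if action == "vibrate":
            device.vibrate()
        elif action.startswith("play:"):
            device.play_audio(action.split(":", 1)[1])
```

Here `device` stands in for whatever vibration/audio API the handset platform exposes.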

A descriptive textual identifier may be assigned to each graphical image. This can advantageously help individuals new to this way of messaging get used to the different graphical images (such as icons) used to represent different non-verbal messages.

The method may further comprise selecting one of the sets of graphical images for linking with the device identifier of the second telecommunications device in response to inputs made by the first individual into the first telecommunications device. In this way, the first individual can determine which sets of graphical images are linked to the second individual, thereby advantageously giving them control over how messages are displayed on their communications device.

The present invention provides a new way of messaging which is hereinafter termed 'Floof Messaging'. The present invention can be embodied wholly in software within a mobile telephone as a Floof application. Alternatively, the present invention can be embodied in a combination of a mobile phone and a Floof toy. In the latter case, the displaying step may comprise displaying the retrieved graphical image on a graphical display housed within an electrical toy, the graphical image depicting a facial expression on the electrical Floof toy. This can advantageously provide a more human interface with the recipient and also be more acceptable to a small child for example.

When the present invention is implemented using a Floof toy, the method may further comprise connecting the electrical toy to the first telecommunications device using a wireless communications protocol. This advantageously minimises danger to young children, for example, and makes interfacing relatively easy with the availability of automatic local data transfer protocols such as Bluetooth. Alternatively, the method may comprise housing the first telecommunications device within the electrical toy itself. This advantageously avoids the need for there to be two devices at the first individual and makes the Floof toy the communication device itself.

In either case, the method may further comprise arranging the electrical toy to perform physical actions when displaying one of the graphical images, to enhance the non-verbal message received from the second telecommunications device. Again, such physical actions, in combination with graphical images displayed on the toy's facial display for example, serve to enhance the ability of the non-verbal message to be correctly and quickly conveyed to the recipient.

The method, whether or not a toy is employed, may further comprise: receiving an input specifying one of the image identifiers; creating another communication for conveying a non-verbal message comprised of the specified image identifier and the device identifier of the second telecommunications device; and sending that communication to the second telecommunications device. This feature enables the first individual to compose a non-verbal message and transmit it to the second or other individual. Thus two-way communication of non-verbal messages is possible.

Composing such a non-verbal message is made relatively simple by allowing the first individual simply to choose the desired message from a list of images representing possible non-verbal messages. In this case, the method may further comprise listing a plurality of the image identifiers and receiving an input specifying one of the listed image identifiers.

The composition of such non-verbal messages is made even easier if the method comprises associating groups of the image identifiers with different input actuators of the first telecommunications device and listing a group of image identifiers in response to receiving an input from a particular one of the actuators. The number of actions which the first individual has to go through to compose a message can thus be minimised, making the method easy and quick to use in practice.
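This actuator grouping can be pictured as a keypad menu: one key press lists a short, themed subset of identifiers rather than the whole catalogue. The key labels, group themes and identifiers below are illustrative only:

```python
# Hypothetical grouping of image identifiers under keypad actuators.
ACTUATOR_GROUPS = {
    "1": ["HAPPY", "LAUGH", "WINK"],      # e.g. cheerful messages
    "2": ["SAD", "CONFUSED", "TIRED"],    # e.g. downbeat messages
}

def list_group(key):
    # Listing step: show only the identifiers grouped under this actuator;
    # an unmapped key lists nothing.
    return ACTUATOR_GROUPS.get(key, [])
```

Selecting from a three-item menu takes far fewer key presses than typing even a short text message, which is the usability point being made above.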

Preferably, a Floof message sent identifies a single emotion. This is advantageous, as the second individual (sender) can simply select an emotion, address it, and send it to convey their non-verbal message to the first individual (recipient). In this regard, the message would be selected for sending in the simplest of manners.

The method may further comprise associating the device identifier of the second telecommunications device with a particular one of the actuators of the first telecommunications device and inserting that device identifier into the other communication in response to receiving an input from that particular actuator. Again, this minimises the number of actions which the first individual has to go through to compose a message. Also, if a set of graphical images has been associated with a particular individual, selection of that set of graphical images can avoid the need for the first user to actually input the address details for the intended recipient as this has already been determined by the specific selection of the set of graphical images.

The image identifiers may be unique within any one of the plurality of sets of graphical images. Also at least one image identifier may be used in more than one of the plurality of sets of graphical images. This uniqueness within a single set is a minimum requirement and enables the image identifiers to be reused in a plurality of different sets. They may even be identical in each of the other sets. This not only minimises the number of different image identifiers but also if the identifiers actually help describe the non-verbal message then this provides a good way for the first individual to be able to learn how the same message is represented in each different graphical image set. Also, it certainly helps the first and second individuals learn the identifiers used in conveying different non-verbal messages. For example, an image identifier may be the word 'CONFUSED' which could represent a single image in each of the plurality of image sets. Alternatively, words may not be used at all but rather codes representing the desired non-verbal message to be conveyed to the recipient.
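The reuse of identifiers across sets can be shown with a small data sketch; the set names, identifiers and image descriptions are invented for illustration. One learned code (such as 'CONFUSED') selects a different rendering in each character set:

```python
# Illustrative image sets: the same image identifiers recur in every set,
# so identifiers are unique within a set but shared across sets.
IMAGE_SETS = {
    "dog":   {"CONFUSED": "dog tilting its head", "HAPPY": "dog wagging tail"},
    "robot": {"CONFUSED": "robot showing '?'",    "HAPPY": "robot flashing lights"},
}

def render(set_name, image_id):
    # The same identifier resolves to a set-specific image.
    return IMAGE_SETS[set_name][image_id]
```

Because both sets answer to the same identifiers, a sender never needs to know which character set the recipient has linked to them.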

Using the present invention, where a third individual using a third telecommunications device can communicate non-verbal messages to the first individual, the method may further comprise: designating one of the sets of graphical images as a default image set; receiving a further communication for conveying a non-verbal message from the third telecommunications device, the further communication specifying one of the image identifiers and the device identifier of the third telecommunications device; determining that none of the sets of graphical images stored on the first telecommunications device is linked with the device identifier of the third telecommunications device; retrieving the default graphical image specified in the further communication using the image identifier received in the further communication; and displaying the retrieved default graphical image, thereby conveying the non-verbal message of the further communication to the first individual. This advantageously enables the method to handle non-verbal messages received from non-bonded individuals, namely individuals communicating with the first individual for the first time who have therefore not been linked with any particular set of graphical images.
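The default-set fallback reduces to one lookup with a fallback key. The function, set names and sample data below are hypothetical; the patent only specifies the behaviour:

```python
def retrieve_image(image_sets, links, image_id, device_id, default="default"):
    # A sender whose device identifier has no link stored (a non-bonded,
    # first-time sender) is served from the designated default image set.
    set_name = links.get(device_id, default)
    return image_sets[set_name][image_id]
```

A bonded sender's message resolves through their linked set, while an unknown number still produces a sensible image instead of an error.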

According to another aspect of the present invention there is provided a telecommunications device for communicating non-verbal messages between a first individual using the telecommunications device and a second individual using a communications device, the telecommunications device comprising: a database for storing a plurality of sets of predetermined graphical images on the telecommunications device, the graphical images within each set representing different non-verbal messages and being assigned image identifiers; linking means for linking one of the graphical image sets with a device identifier of the communications device; receiving means for receiving a communication for conveying a non-verbal message from the communications device, the communication specifying one of the image identifiers and the device identifier of the communications device; retrieving means for retrieving the graphical image specified in the communication using the received image and device identifiers; and a display for displaying the retrieved graphical image, thereby conveying the non-verbal message of the communication to the first individual.

It is not necessary, when the present invention is used in a toy, for the non-verbal messages to be conveyed to the recipient on a graphical display. Rather, physical actions (including non-verbal sounds) can be generated by the toy to convey the non-verbal message. More specifically, according to a different aspect of the present invention there is provided a method of communicating non-verbal messages between first and second individuals using respective first and second telecommunications devices, the first telecommunications device being operatively connected to a respective first toy, the method comprising: storing a plurality of predetermined sets of physical actions at the first toy, each physical action being assigned an action identifier and the physical actions within each set representing different non-verbal messages; linking one of the physical action sets with a device identifier of the second telecommunications device; receiving a communication for conveying a non-verbal message from the second communications device, the communication specifying one of the action identifiers and the device identifier of the second telecommunications device; retrieving the physical action specified in the communication using the received action and device identifiers; and carrying out the physical action on the first toy, thereby conveying the non-verbal message of the communication to the first individual.

The advantage of such a device is that the toy itself can appear to be conveying the message to the recipient by way of its physical actions, for example by waving goodbye. These physical actions can sometimes be more readily understood by the recipient than images displayed on a screen and certainly are more readily understood by younger children.

According to another aspect of the present invention there is provided a toy apparatus for communicating non-verbal messages between first and second individuals, the toy apparatus comprising: a first telecommunications device associated with the first individual, the first telecommunications device being operably connectable to a second telecommunications device associated with the second individual; a toy, the toy being operatively connectable to the first telecommunications device; a data store for storing a plurality of predetermined sets of physical actions, the physical actions within each set representing different non-verbal messages; a plurality of action identifiers, each action identifier identifying one of the physical actions; means for linking one of the physical action sets with a device identifier of the second telecommunications device; communication means for receiving a communication for conveying a non-verbal message from the second telecommunications device via the first telecommunications device, the communication specifying one of the action identifiers and the device identifier of the second telecommunications device; means for retrieving the physical action specified in the communication using the received action and device identifiers; and processing means for interpreting the physical action and carrying out the physical action on the toy, thereby conveying the non-verbal message of the communication to the first individual.

According to a further aspect of the present invention there is provided a toy for communicating non-verbal messages between first and second individuals, the toy being associated with the first individual and comprising: a data store for storing a plurality of predetermined sets of physical actions, the physical actions within each set representing different non-verbal messages; a plurality of action identifiers, each action identifier identifying one of the physical actions; means for linking one of the physical action sets with a device identifier of a second telecommunications device associated with the second individual; communication means for coupling the toy to a first telecommunications device which is operably connectable to the second telecommunications device, the communication means being arranged to receive a communication for conveying a non-verbal message from the second telecommunications device via the first telecommunications device, the communication specifying one of the action identifiers and the device identifier of the second telecommunications device; means for retrieving the physical action specified in the communication using the received action and device identifiers; and processing means for interpreting the physical action and carrying out the physical action on the toy, thereby conveying the non-verbal message of the communication to the first individual.
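The toy-side lookup described in these aspects can be pictured with a minimal Python sketch. This is illustrative only: the action set names, action identifiers and device identifiers below are assumptions, not taken from the patent.

```python
# Hypothetical store of physical action sets: each set maps an action
# identifier to a physical action the toy can perform (wave, nod, sound).
action_sets = {
    "boy_set": {1: "wave_goodbye", 2: "nod", 3: "giggle_sound"},
    "default_set": {1: "wave_goodbye", 2: "nod", 3: "giggle_sound"},
}

# Hypothetical linking of an action set to the device identifier of the
# second telecommunications device (the sender's device).
bonded_set_for_device = {"device-42": "boy_set"}

def action_for(device_id: str, action_id: int) -> str:
    """Retrieve the physical action specified by a received communication,
    using the received action and device identifiers; unknown devices fall
    back to a default action set."""
    set_id = bonded_set_for_device.get(device_id, "default_set")
    return action_sets[set_id][action_id]

assert action_for("device-42", 1) == "wave_goodbye"
assert action_for("unknown-device", 2) == "nod"
```

The retrieved action name would then be handed to the toy's processing means, which drives the corresponding actuator or sound output.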

Brief Description of the Figures

Methods and apparatus according to preferred embodiments of the method of communicating non-verbal messages using telecommunications devices will now be described by way of example, with reference to the accompanying drawings in which:

Figure 1 is a schematic block diagram showing a communications system over which messages are sent in accordance with embodiments of the present invention;

Figure 2 is a schematic block diagram showing software modules and the data stores which are provided in a mobile phone for implementing the first embodiment of the present invention;

Figure 3 is a schematic block diagram showing the tabulated contents of a downloaded database of Figure 2 accessed by a downloaded Floofs messaging application;

Figures 4a and 4b are schematic screen shot diagrams of the display of the mobile phone showing how the new messaging application Floofs is accessed by the user;

Figures 5a to 5f are schematic screen shot diagrams of the display of the mobile phone showing how a new entry is added to a user's contacts list and how a Floof is bonded;

Figure 6 is a flow diagram showing the Floof bonding process of Figures 5a to 5f;

Figures 7a to 7c are schematic screen shot diagrams of the display of the mobile phone showing how an incoming Floof message is received;

Figure 8a is a flow diagram showing the phone operation when an application-specific message is received and how messages for those specific applications are routed;

Figure 8b is a flow diagram showing the specific phone operation and message routing when a Floof message is received;

Figure 8c is a flow diagram showing the mobile phone interactions of the user and the consequential Floofs application operation after a Floof message has been received;

Figures 9a to 9f are schematic screen shot diagrams of the display of the mobile phone showing how a new Floof message is composed and sent by the user;

Figure 10 is a schematic screen shot diagram of the display of a mobile phone showing how the message sent in Figures 9a to 9f is seen on a recipient's mobile phone;

Figure 11 is a flow diagram showing the phone operation when a Floof is to be composed and sent by the mobile phone user;

Figure 12 is a schematic diagram showing a second embodiment of the present invention where the Floof is embodied in a toy with a screen;

Figure 13 is a schematic block diagram of the component parts of the second embodiment shown in Figure 12;

Figure 14 is a schematic block diagram of the component parts of third and fourth embodiments of the present invention in which the telephone capability is built into the Floof toy; and

Figure 15 is a schematic diagram showing a fifth embodiment of the present invention where the Floof is embodied in a toy without a screen.

Detailed Description of Preferred Embodiments of the Present Invention

With reference to Figure 1, a communications system 10 for implementing presently preferred embodiments of the present invention is now described. The communications system 10 facilitates the sending of data messages such as SMS, EMS, MMS and e-mail between individuals using mobile phones 12 via data messaging gateways 14 and a telecommunications network 16. Data messages can either be sent directly to the other individual's mobile phone 12 or else can be redirected to an e-mail account for that person on the Internet 18. The exact type of messaging system used is not important so long as the data within it can be recognised as being so-called Floofs data. In this regard, each message sent has a unique Floofs identifier (not shown) provided within it to distinguish its data from that for other types of applications.

The messages are generated and handled by Floofs applications 20 which reside on both the sender and recipient's mobile phones 12. These Floofs applications 20 can be provided on the phone as built in software but in this embodiment, they are downloaded from a Floofs server 22 via the Internet 18 and a GPRS connection to the mobile phone 12. The downloaded information is in the form of a JAVA application. The Floofs server 22 accesses a data store 24 in which the Floofs application 20 resides and also downloads a Floofs database 26 of information for use with the application 20.
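The role of the unique Floofs identifier can be sketched in a few lines of Python. The identifier string and helper names below are assumptions for illustration; the patent does not specify a wire format.

```python
# Hypothetical unique Floofs application identifier embedded in each message
# so a receiving phone can distinguish Floofs data from other message types.
FLOOFS_ID = "FLOOF/1"

def make_floof_payload(emotion_id: int) -> str:
    """Build a text payload carrying the Floofs identifier and an Emotion ID."""
    return f"{FLOOFS_ID}:{emotion_id}"

def is_floof_message(payload: str) -> bool:
    """A message is recognised as Floofs data iff it carries the identifier."""
    return payload.startswith(FLOOFS_ID + ":")

payload = make_floof_payload(7)
assert is_floof_message(payload)
assert not is_floof_message("Hello, ordinary SMS")
```

Any of the transports mentioned (SMS, EMS, MMS or e-mail) could carry such a tagged payload, which is why the exact messaging system is unimportant.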

A user (called Jo in this embodiment) needs to access the Floofs server 22 and download the Floofs application 20 and database 26 in order to run the improved messaging system of the present embodiment. This is done in a similar way to the current process by which a user downloads a new JAVA application, such as a new game, to their mobile phone via a GPRS connection. The application then resides on the mobile phone and enables Floofs data messages to be generated, sent, received, recognised (because of their unique identifiers) and also processed in a user-determined way as will be described later.

Referring now to Figure 2, the construction of the mobile phone 12 in terms of some of its software and related hardware modules is now described with respect to the downloading and running of the Floofs application 20. Whilst a mobile phone has many different functions only the ones relevant to the messaging and download capabilities are referred to in the present embodiment.

The mobile phone 12 comprises a keypad and display 30 which provides the main interface with Jo (the user) and a network manager module 32 which controls the data communications with the outside world. Between the display and keypad 30 and the network manager module 32, software modules and data stores are provided in either a user-interface environment 34 or a management and control environment 36. The software modules and data stores provided in these environments effect the downloading of a specific application, the running of that application and standard data messaging. The software modules/data stores for carrying out each of these functions are now described in greater detail.

The downloading of a Floofs application is handled by a downloads function 38 in the user-interface environment 34, and a downloads manager 40 and a downloads store 42 both provided in the management and control environment 36. A Floofs application download is instigated by Jo selecting the downloads function 38, which takes the user-specified parameters and passes these to the downloads manager 40. The downloads manager 40 uses this information to get the network manager 32 to send a message to the Floofs server 22 either by SMS, EMS or MMS or even by e-mail. In response to the request, the Floofs server 22 sends the Floofs application 20 and its corresponding Floofs database 26 to the mobile phone 12. These are passed back to the downloads manager 40 via the network manager 32 and then stored in the downloads store 42. At the same time, the downloads manager 40 executes a code installation procedure provided as part of the Floofs application 20, which causes the downloads manager 40 to load configuration codes 44 on a virtual machine manager 46 and on a messaging manager 48, both again provided in the management and control environment 36. These configuration codes 44 enable the virtual machine manager 46 and the messaging manager 48 to know that the Floofs application 20 is available in the download store 42 and notify the messaging manager 48 of the unique Floofs identifier (mentioned earlier) which identifies any Floof message (described in greater detail later).

Messaging is handled by a messaging function 50 provided in the user-interface environment 34, and a messaging manager 48 and a message store 52 both provided in the management and control environment 36. The messaging function 50 derives the user-selected data and options for non application-specific messaging (i.e. non-Floofs messaging) via the keypad and display 30 and conveys these to the messaging manager 48. Also, received non application-specific messages can be selected and viewed via the messaging function 50. The messaging manager 48 is responsible for sending and receiving data messages and also for storing the messages in the message store 52. For received messages which are not conventional, namely which are application-specific (such as a Floof message), the messaging manager 48 identifies such a message in view of the configuration data 44 provided to it and, on recognition, simply routes the Floof message to the virtual machine manager 46. The virtual machine manager 46 is part of a virtual machine environment 54, namely an environment which is configured by a downloaded application. In the case of a recognised Floof message being received, the virtual machine manager 46 first determines the relevant application for the message. This is determined by the configuration data 44 which has already been loaded into the virtual machine manager 46. Once the Floofs application 20 has been identified as the appropriate application for the received message, the virtual machine manager 46 checks to see if it is running. If it is not running, the virtual machine manager 46 fetches the Floofs application 20 from the download store 42 and runs it. This causes a Floofs executable 56 to be run within both the virtual machine environment 54 and the user-interface environment 34, such that the received message can be displayed to Jo via the display 30.
Conversely, if on receipt of the Floof message, the Floofs application is running, the message is simply conveyed to Jo via the display 30.

The Floofs executable 56 is also responsible for the user-composition of Floof messages, which are created and sent via the virtual machine manager 46 and the messaging manager 48 as will be described in detail later. It is to be appreciated that most of the elements shown in Figure 2 are provided as part of the mobile phone's conventional elements (shown with solid lines), namely as part of its inherent capability. This can be considered to be like the operating system of the phone. The only new elements required to implement the Floofs messaging capability (shown in dashed lines) are the downloaded Floofs application 20, the Floofs database 26, the configuration codes 44 and the Floofs executable 56. This shows how the present embodiment is simple to install and use and how easily it integrates into the existing phone structure, thereby minimising implementation cost and increasing compatibility with many different types of phones.

The downloaded Floofs database 26, as shown in detail in Figure 3, comprises a Floof table 60, an Emotion table 62 and a Floofs Contact List table 64. Each of these is described in detail below. However, the structure of a Floof is now described. Each Floof has a name (which helps user identification), its own unique identifier (which helps the Floofs application 20 handle it more efficiently) and a set of images associated with it (for actually conveying the non-verbal message to the user). Each image within the set represents a different emotion and has a unique emotion identifier (descriptor) within the set. However, such emotion identifiers are not, within the current embodiment, unique over all the sets as different Floofs are able to convey the same emotion (non-verbal message).

Referring more specifically to Figure 3, the Floof table 60 comprises three columns: a column of Floof names 66; a column of Floof IDs 68; and a column of Default Emotions 70. It is necessary to have the default emotions 70 specified because, as will be elaborated on later, when Jo is looking through different Floofs to send a message, a default image needs to be shown which is selected on the basis of the stored default emotion 70 in the Floof table 60.

The Emotion table 62 also comprises three columns: a column of Floof IDs 68; a column of Emotion IDs 72; and a column of Floof images 74. In this way for each Floof ID 68, the corresponding set of Floof images are defined and individually labelled.

The Floof Contact List table 64 is a user-configurable table which has three columns: a column of contact names 76, a column of contact numbers 78 and a column of bonded Floof IDs 80. When downloaded, the Floof Contact List table 64 is empty because it has to be populated by Jo. The population of this table 64 is the result of a bonding process which causes different sets of Floofs to be linked to different contacts, such that messages from a contact can be conveyed to Jo using the Floof images 74 associated with that particular Floof. The bonding process is also described in detail later. In the table, it can be seen that the male contact Bob is bonded with the 'Boy' Floof having a Floof ID of 2 and that the female contacts Alice and Eve are both bonded to the Default Floof. The Default Floof is provided to handle Floof messages from contacts which have not been bonded yet, from contacts which have not been put in the contacts list and from contacts for which Jo requires no unique Floof identification of the sender. In these cases, the Default Floof still enables the non-verbal message to be conveyed to Jo but does not uniquely identify the actual person sending the Floof; it could be one of many or an unknown sender.
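The three tables of Figure 3 can be modelled as simple Python structures. This is a minimal sketch: the image resource names, telephone numbers and ID values are assumptions chosen to mirror the example contacts in the text, not contents of the actual database.

```python
# Floof table 60: Floof ID -> (Floof name, default Emotion ID).
floof_table = {
    1: ("Default", 1),
    2: ("Boy", 1),
    3: ("Girl", 1),
}

# Emotion table 62: (Floof ID, Emotion ID) -> Floof image resource.
emotion_table = {
    (1, 1): "default_happy.png", (1, 2): "default_crazy.png",
    (2, 1): "boy_happy.png",     (2, 2): "boy_crazy.png",
    (3, 1): "girl_happy.png",    (3, 2): "girl_crazy.png",
}

# Floof Contact List table 64: contact number -> (name, bonded Floof ID).
contact_list = {
    "07700900001": ("Bob", 2),    # bonded to the Boy Floof
    "07700900002": ("Alice", 1),  # bonded to the Default Floof
    "07700900003": ("Eve", 1),    # bonded to the Default Floof
}

def default_image(floof_id: int) -> str:
    """Look up the image shown when browsing a Floof: the image of its
    default Emotion, as recorded in the Floof table."""
    _, default_emotion = floof_table[floof_id]
    return emotion_table[(floof_id, default_emotion)]

assert default_image(2) == "boy_happy.png"
```

The `default_image` lookup corresponds to the use of the Default Emotions column 70 when Jo scrolls through Floofs during bonding or sending.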

The user interaction with the mobile phone is now described with reference to the acts of running the application, bonding a Floof to a particular person, receiving a Floof message and sending a Floof message. The way in which a Floof is bonded to a contact is now described with reference to an example in which Jo wishes to bond a Floof to Bob. This process has the effect of populating the Floof Contact List table 64 (shown in its populated state in Figure 3). Referring to Figures 4a and 4b, the Floofs application 20 appears as an option 90 on the main menu 92 displayed to Jo on her mobile phone 12. User options are provided, namely 'SELECT' 94 and 'QUIT' 96, which are chosen by Jo selecting the appropriate keypad button 98. In the present embodiment, the Floofs option 90 is chosen which leads to the display of a Floofs sub menu 100 as shown in Figure 4b. Here the Address Book sub option 102 is selected in order to effect bonding of a Floof as is described now with reference to Figures 5a to 5f and 6.

The selection of the Address Book sub option 102 in Figure 4b leads to the display of the Address Book sub menu 110 shown in Figure 5a. Here different contacts 112 already present in the address book are listed (such as Alice and Eve) and an option 114 to enter a new entry is displayed as well as an option 116 to show more names in the address book. Assuming the New Entry option 114 is selected by Jo, because she wants to add her friend Bob as a new contact, the new entry screen 120 is generated as shown in Figure 5b. This screen has an entry 122 for the name of the contact, an entry 124 for the telephone number of the contact and an option 126 for setting the Floof for the contact. Jo then enters Bob's name and number as appropriate and this converts the new entry screen 120 into an address list entry screen 130 for Bob as shown in Figure 5c. Jo can then select the SET FLOOF option 126 from this screen, which is described below.

Alternatively, if Bob had been an existing entry, Jo could have selected Bob's entry at the Address Book screen 110 and this would have led straight to the address list entry screen 130 for Bob as shown in Figure 5c.

Assuming now that a Floof is to be bonded to the contact Bob by selection of the SET FLOOF option 126, the bonding process 200 as described in Figure 6 is carried out. The bonding process 200 commences at Step 202 with the details of the Address Book entry being displayed as shown in Figure 5c and discussed above. Whilst no command has been entered by Jo, checked at Step 204, the screen shown in Figure 5c continues to be displayed. However, when a command is entered by Jo and it is a SET FLOOF command, as checked at Step 206, then the steps leading to bonding of the selected entry in the Address Book are started. If, however, the entered command is not a SET FLOOF command, then at Step 208 Jo is taken back up to the main Floofs menu 100 (as shown in Figure 4b) and the bonding process 200 effectively comes to an end.

The next step in the bonding process 200, when the SET FLOOF option 126 has been selected at Step 206, is for a check to be made at Step 210 by the Floofs executable 56 of the contents of the Floofs Contact List table 64. If this table shows that the contact has already been bonded to a Floof, then the image 74 of the default Emotion 70 of the bonded Floof is displayed at Step 212. If however the selected contact has not been bonded to a Floof, then the image 74 of the default Emotion 70 of the default Floof 140 is displayed at Step 214, as derived from the Floof table 60. The display of the default Floof 140 is shown in Figure 5d as it appears on the SET FLOOF screen 142. Here Jo is presented with four different options: SELECT 144, BACK 146, SCROLL UP 148 and SCROLL DOWN 150. The selection is made using the appropriate keypad buttons 98.

Returning to Figure 6, the bonding process 200 continues with Jo making a selection at Step 216 of one of the presented options. If Jo selects the SCROLL DOWN option 150, indicating that the next Floof should be shown as determined at Step 218, then the next Floof in the Floof table 60 is presented and the image 74 of the default Emotion 70 of that next Floof is displayed at Step 220. If it is the last Floof in the table 60, then the first Floof is presented. If Jo selects the SCROLL UP option 148, indicating that the previous Floof should be shown as determined at Step 222, then the previous Floof in the Floof table 60 is presented and the image 74 of the default Emotion 70 of that previous Floof is displayed at Step 224. If it is the first Floof in the table 60, then the last Floof is presented. In both these cases, the process 200 is taken back to Step 216 awaiting a user command.

If however, Jo selects the BACK option 146 as determined at Step 226, the process 200 is taken back to step 202 with the display of the current details of the BOB entry (Figure 5c).

Similarly, if the SELECT option 144 is selected, as determined at Step 228, the process is returned back to the display of the current details of the BOB entry (Figure 5c) at Step 202. However, in this latter case, the Floofs Contact List table 64 is first updated at Step 230 with the selected Floof ID 68, thereby effecting the bonding of the Floof to the contact. This also means that the current details for the Bob entry which are shown at Step 202 are updated to include the bonded Floof information.

In this example, Jo selects the SCROLL UP option 148 in Figure 5d which results in the display of the Boy Floof 160 in the SET FLOOF screen 142 as shown in Figure 5e. Jo then chooses the SELECT option 144 and the contact Bob is bonded with the Boy Floof 160. This is reflected in the updating of the Floof Contact List table 64 to show the Floof ID of '2' next to the entry for Bob. Figure 5f shows the resultant address list entry screen 130 for Bob which now shows the name of the bonded Floof for Bob.
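The wrap-around scrolling behaviour of the bonding process (last Floof to first, and first to last) can be sketched in Python. The list of Floof IDs and the function name are assumptions for illustration.

```python
# Hypothetical ordered list of Floof IDs as they appear in the Floof table.
floof_ids = [1, 2, 3]

def scroll(current_index: int, direction: str) -> int:
    """Return the index of the next Floof to preview. SCROLL DOWN moves
    forward, SCROLL UP moves backward; both wrap at the ends of the table."""
    step = 1 if direction == "down" else -1
    return (current_index + step) % len(floof_ids)

idx = 0                   # start at the first (Default) Floof
idx = scroll(idx, "up")   # previous from the first wraps to the last
assert floof_ids[idx] == 3
idx = scroll(idx, "down") # next from the last wraps back to the first
assert floof_ids[idx] == 1
```

At each new index, the image of that Floof's default Emotion would be fetched and displayed on the SET FLOOF screen, exactly as at Steps 220 and 224.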

The process of receiving a Floof message is now described with reference to Figures 7a to 7c and 8a to 8c. The process is relatively simple for Jo, as is illustrated in the screen shots shown in Figures 7a to 7c which depict an example of a Floof message being received from Eve and how Jo can access the message. Figure 7a shows a message alert screen 250 which indicates that there is an incoming Floof message. The screen 250 may be generated in conjunction with a phone vibration and/or an audible sound to help alert the user to the message. On being alerted to the incoming message, Jo uses the keypad buttons 98 of the mobile phone 12 to choose the SELECT option 252 provided on the screen 250. This takes Jo to the inbox screen 260 shown in Figure 7b. Here each of the previous messages 262 is shown and the new Floof message 264 from Eve is highlighted to show it has not yet been read by Jo. Jo then uses the keypad buttons 98 to select Eve's message 264. Figure 7c shows the Incoming Floof screen 270 which displays the Floof message from Eve. It can be seen that in this example Eve has sent the 'crazy' Emotion to indicate her non-verbal message. This has been presented as an expression of the default Floof as the Floof Contact List table 64 shows that Eve is bonded to the default Floof. Also the name of the sender is displayed 272 as is the date 274 of the message.

The process 280 of receiving an incoming Floof message is described more generally with reference to Figures 8a to 8c. Figure 8a shows the flow diagram for the phone operation when a Floof message is received and how messages for applications in general are routed to the applications themselves. More specifically, the receiving process 280 commences with a data communications message being received at Step 282. The message is assessed at Step 284 to determine whether it is a standard message which is suitable for the mobile phone's general message inbox (not shown but provided in the message store 52). Examples of such messages are non application-specific e-mail, SMS, EMS and MMS messages. If such a conventional message is recognised at Step 284, then the message is simply routed at Step 286 to the appropriate general inbox (this may be a set of inboxes, one for each type of different message that is received). Then the receiving process 280 comes to an end at Step 288.

If however the message is determined at Step 284 to be non-conventional, then the process 280 determines at Step 290 whether or not the data message is for an application resident on the mobile telephone 12. Such application-specific messages can be identified by recognition of a special application identifier code, for example. On determination of the message being an application-specific message, a check is made at Step 292 to determine whether the application is actually running in the virtual machine environment 54 of the mobile phone. If the application is not running, it is fetched from the download store 42 by the virtual machine manager 46 and executed at Step 294. Then the message is routed at Step 296 to the application. Alternatively, if the check at Step 292 determines the application to be running, then the message is routed directly to the application at Step 296. Subsequently, the receiving process 280 comes to an end at Step 288.

If it is determined at Step 290 that the data message is not for an application resident on the mobile telephone 12, then the message is an internal data message (such as a service message for updating some register on the phone) and is handled internally at Step 298. Following this, the process 280 ends at Step 288.
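The three-way routing of Figure 8a can be sketched as a small Python dispatcher. This is illustrative only: the payload prefixes, store names and the way an application is "started" here are assumptions standing in for the messaging manager and virtual machine manager behaviour described above.

```python
general_inbox = []   # stand-in for the general message inbox
internal_log = []    # stand-in for internal service-message handling
running_apps = {}    # app identifier -> list acting as that app's inbox

def route_message(payload: str) -> str:
    """Route a received data message as in receiving process 280."""
    if payload.startswith("SMS:"):            # Step 284: conventional message
        general_inbox.append(payload)
        return "general"
    if payload.startswith("APP:"):            # Step 290: application-specific
        app_id = payload.split(":")[1]
        if app_id not in running_apps:        # Steps 292/294: start the app
            running_apps[app_id] = []
        running_apps[app_id].append(payload)  # Step 296: route to the app
        return "app"
    internal_log.append(payload)              # Step 298: handled internally
    return "internal"

assert route_message("SMS:hello") == "general"
assert route_message("APP:FLOOF:7") == "app"
assert route_message("update-register") == "internal"
```

Note that starting the application lazily at Step 294 and then routing is what lets a Floof message be delivered even when the Floofs application was not already running.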

Figure 8b shows the process of how the Floofs executable 56 running the Floofs application 20 handles the message once it has been routed to it at Step 296. The handling process 300 commences with the message being received at Step 302 by the Floofs application (the running Floofs executable 56). The application places the Floof message at Step 304 in the Floof inbox (not shown) which is provided in the message store 52. Following this, the user is alerted at Step 306, via the display, a sound and/or vibration, to the presence of an incoming Floof message (as shown for example in Figure 7a). However, it is to be appreciated that none of these actions actually conveys the message; rather they only alert the user to the presence of an incoming message, which the user can then decide whether or not to view. Subsequently, the process 300 ends at Step 308.

A process 310 by which a user opens the Floof message to actually see the non-verbal message which has been sent is set out in Figure 8c. The process 310 commences at Step 312 with the user selecting a message in the Floof inbox (as also shown in Figure 7b). The Floofs application then extracts the sender telephone number and the Floof Emotion ID from the message at Step 314. The sender telephone number is then checked at Step 316 against the contact numbers 78 in the Floof Contact List table 64. If no match is found at Step 316, then the appropriate image (as determined by the Emotion ID which accompanies the message) of the default Floof is fetched at Step 318 from the Emotion table 62 for display to the user. However, if a match is found at Step 316, the process 310 determines at Step 320 whether the sender telephone number has a bonded Floof, namely is there a Floof ID 80 listed in the Floof Contact List table 64 against the sender number? If not then again the appropriate image (as determined by the received Emotion ID) of the default Floof is fetched at Step 318 from the Emotion table 62 for display to the user. However, if there is a bonded Floof as determined at Step 320, then the appropriate image (as determined by the extracted Emotion ID) of the bonded Floof is fetched at Step 318 from the Emotion table 62 for display to the user.
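The resolution performed by process 310 can be sketched in Python: the sender's number and the received Emotion ID together select an image, falling back to the default Floof when the sender is unknown or has no bonded Floof. All table contents below are assumptions for illustration.

```python
DEFAULT_FLOOF = 1  # hypothetical Floof ID of the Default Floof

# Fragment of the Emotion table: (Floof ID, Emotion ID) -> image resource.
emotion_table = {(1, 2): "default_crazy.png", (2, 2): "boy_crazy.png"}

# Fragment of the Floof Contact List: number -> bonded Floof ID (None = not bonded).
contact_list = {"07700900001": 2, "07700900003": None}

def image_for_incoming(sender: str, emotion_id: int) -> str:
    """Resolve the image to display for an incoming Floof message, as at
    Steps 316 and 320: unmatched or unbonded senders get the default Floof."""
    floof_id = contact_list.get(sender)
    if floof_id is None:
        floof_id = DEFAULT_FLOOF
    return emotion_table[(floof_id, emotion_id)]

assert image_for_incoming("07700900001", 2) == "boy_crazy.png"      # bonded: Boy
assert image_for_incoming("07700900003", 2) == "default_crazy.png"  # not bonded
assert image_for_incoming("07700999999", 2) == "default_crazy.png"  # unknown
```

This matches the example of Figure 7c, where Eve's 'crazy' Emotion is rendered using the default Floof because that is what Jo bonded to her.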

The process of sending a Floof message is now described with reference to Figures 9a to 9f, 10 and 11. The process is again relatively simple for Jo, as is illustrated in the screen shots shown in Figures 9a to 9f which depict an example of a Floof message being sent to Bob and how Jo creates the message. Figure 9a shows the Floof sub menu screen 100 which lists all of the available Floofs options. Jo uses the keypad buttons 98 of the mobile phone 12 to select the Send option 330 provided by the sub menu 100. This brings up the address book screen 110. Here the different contacts 112 already present in the address book are listed (such as Alice and Bob) and an option 114 to enter a new entry is displayed as well as an option 116 to show more names in the address book. Jo selects Bob as the intended recipient of the message, and Bob's default emotion image 332 of his Floof is fetched and displayed as shown in Figure 9c. On display of the default image, Jo navigates through all of the different images associated with the available Emotions for Bob's Floof. Once an appropriate Emotion image 334 has been found, this is selected for sending as shown in Figure 9d. It can be seen that in this example Jo has selected an angry Emotion to send to Bob. When Jo selects the send option 336, the message is sent and a notification screen 338 confirms this as shown in Figure 9e. Finally, Jo is taken back to the Floofs sub menu screen 100 as shown in Figure 9f.

Figure 10 shows how the sent message is displayed to Bob when it is received. As can clearly be seen, Jo's identity is recognised and, as she has been bonded to the Girl Floof in Bob's mobile phone, the Floof image is of the angry Girl Floof 340. This serves to illustrate the point that the recipient, rather than the sender, defines how incoming messages will appear. Nevertheless, the emotion sent by Jo is still conveyed to Bob.

The process 350 of sending a Floof message is described more generally with reference to Figure 11. Figure 11 shows a flow diagram for the mobile phone operation when a Floof message is to be composed and sent. More specifically, the sending process 350 commences with the selection of the Send option 330 (see Figure 9a) which leads to generation at Step 352 of the address book screen 110. A recipient is chosen from the list at Step 354 by use of the keypad keys 98 and this leads to the process checking at Step 356 to see whether the chosen intended recipient has a bonded Floof. The check is made by determining whether there is any bonded Floof entry in the Floof Contact List table 64 for that intended recipient. If a Floof has not been bonded to this intended recipient, then the Floofs application 20 fetches at Step 358 the default image of the default Floof from the Emotion table 62. Conversely, if there is a bonded Floof for this intended recipient, then the Floofs application fetches at Step 360 the default image of the bonded Floof from the Emotion table 62. The default image in both cases is determined by checking the Floof table 60.

The default image fetched at either of Steps 358 or 360 is then displayed on the mobile phone at Step 362 (see for example Figure 9c). However, the default emotion represented by the image displayed at Step 362 may not be the one which the user wishes to send. Accordingly, the user can scroll through the different available set of images for that Floof stored in the Emotion table 62. In the present embodiment three different images representing three different emotions are provided. However, in other embodiments there may well be many more, typically thirty, emotions in any given Floof set. So the option to scroll through different images is provided, as are the options to select an emotion (a given Floof image) and even to return to the address book screen 110 for reselection of an intended recipient.

The process therefore receives a user command at Step 364 and checks at Step 366 whether it is a scroll up/down command. If it is, then this indicates that the emotion has not yet been selected and the next/previous respective emotion image 74 in the set of images for that Floof is fetched at Step 368 from the Emotion table 62. This new emotion image 334 is then displayed for selection on the display 30 at Step 362. However, if the user command is not a scroll up/down command as determined at Step 366, then the process 350 continues with a check being made at Step 370 to determine whether it is a send Floof command. If the send command has been selected, then a data message is generated by the application at Step 372 for the selected emotion. The data message, as explained previously, can be an SMS, e-mail, EMS or MMS message. In this embodiment a simple SMS is sent. The important point to note is that the data message contains a Floofs application identifier which will be recognised by a Floofs application 20 running on the recipient's mobile phone. In addition, the data message has to contain the Emotion ID 72 such that the Floofs application at the intended recipient knows which emotion is being communicated by the sender and knows how to display it to the recipient, as is shown in Figure 10. The thus composed non-verbal message is then addressed and sent at Step 374 to the contact number of the intended recipient, which has been obtained from the Floofs Contact List table 64. Following this, the process 350 ends.
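The data message generated at Step 372 carries two things: an application identifier so the recipient's Floofs application can recognise it, and the Emotion ID 72. A minimal sketch of composing and recognising such a payload, assuming a simple `FLOOF:` text tag as the application identifier (the actual wire format is not specified in the text):

```python
# Assumed application identifier; the real tag format is not given in the
# description, only that the SMS carries an identifier plus the Emotion ID.
APP_IDENTIFIER = "FLOOF"

def compose_floof_sms(emotion_id):
    """Step 372 sketch: build an SMS body carrying the application
    identifier and the Emotion ID (the sender's identity travels in the
    SMS originating address, not the body)."""
    return f"{APP_IDENTIFIER}:{emotion_id}"

def parse_floof_sms(body):
    """Recipient-side sketch: return the Emotion ID if this SMS is a
    Floofs application message, otherwise None."""
    tag, _, emotion_id = body.partition(":")
    return emotion_id if tag == APP_IDENTIFIER else None
```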

If the received user command is not a send command as determined at Step 370, then a check is made at Step 378 to determine whether it is a Back command. If not, then a command has been entered in error and the process 350 continues with the presently displayed Floof image continuing to be displayed at Step 362. However, if the command is a Back command, the process 350 returns to the address book screen 110 at Step 352 to enable the user to select another intended recipient from the address book.

The second embodiment is very similar to the first and accordingly only the differences are described hereinafter. Referring now to Figure 12, the main difference is that both the sending and receiving mobile communications devices 400 do not convey the Floof message to the user but rather relay the received message to a respective toy 402 (called a Floof Toy) which conveys the non-verbal message to the recipient. Whilst it is possible to keep the user interface the same, with the phone keypad being accessible to the users, in this embodiment the user interface becomes the toy 402 itself, such that in order to send a non-verbal message or WF, the sender interacts with their respective toy 402. The toy 402 in turn generates a message that is sent via the sender's mobile telephone 400 to the receiver's mobile telephone 400 and thence to the receiver's toy 402. The decoded received message at the receiving Floof toy 402 is then interpreted by the Floof toy 402 to convey the non-verbal message to the recipient by physical activity of the Floof toy itself. In this embodiment the toys 402 and the respective telecommunications devices 400 are connected together by a local wireless data communications link 404, such as a Bluetooth local radio link.

Examples of user interaction with the Floof toys 402 acting as the user interface and their effects might include:

• Stroking the originating Floof. The local and remote Floof would respond by emitting a pleasing sound.

• Squeezing the originating Floof. The local and remote Floof would respond by vibrating as in wriggling with pleasure.

• Pressing a button on the originating Floof to select a given facial expression and sound. The local and remote Floof would respond with the given expression and sound.
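The interaction-to-response pairs above can be modelled as a small lookup table mapping each user interaction to the behaviour performed by both the local and the remote Floof. The action and response names in this sketch are invented for illustration:

```python
# Hypothetical mapping of user interactions with the originating Floof toy
# to the behaviour performed by both the local and the remote Floof.
RESPONSES = {
    "stroke": {"sound": "pleasing_chirp", "motion": None},
    "squeeze": {"sound": None, "motion": "wriggle"},
    "button": {"sound": "selected_sound", "motion": "show_expression"},
}

def respond_to(interaction):
    """Return the behaviour both Floofs perform, or None if unrecognised."""
    return RESPONSES.get(interaction)
```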

In order that the Floof toy 402 is aware of the destination address (telephone number) to which Floof messages should be sent, Floof toys 402 are "bonded" as has been described in relation to the first embodiment. This generally comprises recording the telephone number of a handset 400 to which a Floof toy (Floof A) is connected, in another Floof toy (Floof B) and also the reverse bond: recording the telephone number of Floof B in Floof A. This may be done by entering the number into a Floof equipped with some method of data entry and display, but in the present embodiment it is carried out preferentially by use of the Bluetooth radio link 404 to effect local data communication between the Floof toys 402 (whereby bringing two Floof toys 402 together can effect the bonding).
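The mutual bond described above — each toy recording the other's handset number — is a symmetric exchange. A sketch of the state involved, where the class and attribute names are assumptions for illustration only:

```python
class FloofToy:
    """Minimal model of the bonding state a Floof toy keeps in its
    non-volatile memory."""
    def __init__(self, own_number):
        self.own_number = own_number   # number of the handset this toy uses
        self.bonded_number = None      # destination for outgoing Floof messages

def bond(floof_a, floof_b):
    """Record each toy's handset number in the other toy, as happens when
    two Floof toys are brought together over the Bluetooth link."""
    floof_a.bonded_number = floof_b.own_number
    floof_b.bonded_number = floof_a.own_number
```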

The behaviour may be a physical action such as moving the whole toy, moving an extremity of the toy, moving a part of the toy such as the eyes, vibrating or wriggling. The behaviour can also be a change in an electronically controlled non-moving aspect of the toy's appearance, such as an LCD or other display on the toy or the colours of illuminated parts of the toy. Furthermore, the behaviour may include replay, from the toy, of non-verbal audio contained in a multimedia message.

The second embodiment is described more specifically now with reference to Figure 13, which shows a block-level diagram of the functional components of the Floof toy 402. The Floof toy 402 comprises a CPU 410 which administers all operations of the Floof toy 402, a button 412 for controlling basic functions of the Floof toy 402 and an LCD 414 used to provide a facial expression for the Floof by depicting the eyes, nose and mouth. A variety of pre-programmed expressions (which may be animated) may be selected for display. Furthermore, the Floof toy 402 comprises a Bluetooth chipset 416, a set of motors 418 for physically moving different parts of the Floof and a non-volatile memory 420 for storing Floof images and expressions as well as telephone numbers of bonded Floof toys 402.

To bond the Floof toy 402 of this embodiment to a mobile telephone 400 so that it will use that telephone for reception and transmission of messages, the following is carried out:

1. The Floof toy 402 and the telephone handset 400 are brought into close proximity such that they can exchange information via the Bluetooth local radio link 404.

2. The CPU 410 is informed of the presence of the phone 400 by the Bluetooth chipset 416.

3. The CPU 410 responds by causing the LCD 414 to display a facial expression of surprise and curiosity.

4. The button 412 on the Floof toy 402 is depressed and held down.

5. The CPU 410 detects that the button 412 is depressed and starts a timer (not shown) counting down from two seconds.

6. The timer expires and the button 412 is still held down. The CPU 410 requests the handset's telephone number using the Bluetooth chipset 416. The CPU 410 stores the new originating number of the handset 400 in the non-volatile memory 420.

7. The CPU 410 causes the LCD 414 to display an expression signifying contentment.

8. The button 412 may now be released, whereupon the CPU 410 causes the LCD 414 to display the default facial expression.
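Steps 4 to 6 of the procedure above amount to a press-and-hold guard: the number exchange only proceeds if the button is still down when the two-second countdown expires. A sketch of that logic, with timestamps supplied explicitly rather than read from a hardware timer, and with `request_number` standing in for the Bluetooth query to the handset:

```python
HOLD_SECONDS = 2.0  # the two-second countdown of step 5

def hold_confirmed(press_time, release_time):
    """True when the button stayed down for the full countdown, i.e.
    the exchange of step 6 should proceed."""
    return (release_time - press_time) >= HOLD_SECONDS

def try_bond(press_time, release_time, request_number):
    """Step 5-6 sketch: request and store the handset number only when
    the hold was long enough; otherwise the bonding attempt is abandoned."""
    if hold_confirmed(press_time, release_time):
        return request_number()
    return None
```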

The procedure used to bond two Floofs together is carried out as described below:

9. The two Floof toys 402 are brought into close proximity such that they can exchange information via the Bluetooth local radio link 404.

10. The CPUs 410 in both Floofs are informed of the presence of another Floof by the respective Bluetooth chipsets 416.

11. The CPUs 410 in both Floof toys 402 respond by causing the LCDs 414 to display a facial expression of surprise and curiosity.

12. The respective buttons 412 on both Floof toys 402 are depressed and held down.

13. The CPUs 410 in both Floof toys detect that the buttons 412 are depressed and start respective timers (not shown) each counting down from two seconds.

14. The timers each expire and the buttons 412 are still held down. The CPUs 410 in both Floof toys exchange telephone numbers using the Bluetooth chipsets 416. The CPU 410 of each Floof toy 402 stores the new destination number of the remote Floof in its non-volatile memory 420.

15. The CPUs 410 in both Floof toys 402 cause the LCDs 414 to display an expression signifying love and affection.

16. The buttons 412 may now be released, whereupon the CPU 410 in each Floof toy 402 causes the respective LCD 414 to display the default facial expression.

Replacement of the Bluetooth capability for one Floof to detect the presence of other Floofs by infrared or other wireless data exchange, or by the use of direct contacts between Floofs is an obvious and trivial change to the method described above.

The procedure used to send a non-verbal message using the Floof toy 402 is described below:

1. The button 412 is depressed by the user. It may be depressed a number of times to indicate a particular action. It will be understood that alternative methods of activating the Floof are possible, such as motion sensors or contacts that can determine that the Floof is being squeezed. For this example, it is assumed that pressing the button 412 once signals that the user wishes to send a "smile" message to the remote Floof toy 402.

2. The CPU 410 detects that the button 412 has been depressed and first provides feedback for the user by causing the LCD 414 to show a smiling face. This indicates that the button press has been registered and that the action will be sent.

3. The CPU 410 retrieves the telephone number of the handset 400 with which the remote Floof toy is associated (bonded) from the non-volatile memory 420.

4. The CPU 410 creates an SMS message that instructs the remote Floof toy 402 to smile. The message is in a form that conveys an appropriate meaning if read by a human. In this case the message is "Floof smile from <originating telephone number>".

5. The CPU 410 uses the Bluetooth chipset 416 to communicate with the user's mobile telephone 400. On successful connection, the SMS message is sent using the phone 400 to the target number.

6. The CPU 410 causes the LCD 414 to display the default facial expression.
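The message created at step 4 is deliberately human-readable. A sketch of composing it, following the exact wording given in the text with the originating number filled in:

```python
def compose_toy_sms(action, originating_number):
    """Build the human-readable command SMS of step 4, e.g.
    'Floof smile from +447700900001'. The remote toy recognises the
    leading 'Floof' keyword and the action word that follows."""
    return f"Floof {action} from {originating_number}"
```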

The recipient's Floof, for which Figure 13 is also a block-level diagram, operates in response to the sending of a Floof message as follows:

1. The CPU 410 uses the Bluetooth chipset 416 to communicate with the user's mobile telephone 400. Periodically, it checks for incoming messages. On receipt of the SMS message, the CPU 410 identifies that it is a message intended for the Floof toy 402 by detecting the keyword "Floof" at the start of the message and checking that the following word (in this case "smile") is an appropriate action for a Floof.

2. The CPU 410 determines the pre-programmed response to the "smile" message, which in this case is to cause the LCD 414 to display a smiling face and to activate the motors 418 to cause the Floof toy 402 to vibrate. These actions are carried out.

3. The CPU 410 stops the motors 418 running after a short period of time but leaves the LCD 414 displaying the smiling face.

4. The user depresses the button 412 to acknowledge the message. The CPU 410 causes the LCD 414 to display the default facial expression.
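The receiving steps above amount to keyword detection followed by a dispatch-table lookup. A sketch of that logic; only the "smile" response is described in the text, so the second table entry and the response field names are invented for illustration:

```python
# Pre-programmed responses keyed by action word; "smile" follows the text,
# "wave" is an invented extra entry to show the table shape.
ACTIONS = {
    "smile": {"lcd": "smiling_face", "motors": "vibrate"},
    "wave": {"lcd": "happy_face", "motors": "move_arm"},
}

def handle_incoming(sms_body):
    """Step 1 sketch: recognise a toy message by the leading keyword
    'Floof' followed by a known action word; return the pre-programmed
    response of step 2, or None for messages not meant for the toy."""
    parts = sms_body.split()
    if len(parts) < 2 or parts[0] != "Floof":
        return None
    return ACTIONS.get(parts[1])
```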

Figure 14 shows a third embodiment of the present invention. This embodiment is very similar to the second embodiment and accordingly, only the differences are described herein. The main difference is that both the sending and receiving mobile communications devices (the telephone capability) are provided within respective Floof toys 500. This is possible because the continuing miniaturisation and commoditisation of the electronic components of mobile phones have resulted in the phone circuitry and antenna being small enough to be contained within the body of the Floof toy 500. As has been mentioned, no connection to an external phone is necessary, but it will be appreciated that the principle of this third embodiment of the invention is the same as that of the second embodiment, in which separate phones are used, in that message exchange takes place over the telephone network.

The components of the Floof toy 500 of the third embodiment are similar to those of the second embodiment in that a CPU 510, a button 512, an LCD display 514, a non-volatile memory 518 and a set of motors 520 are provided. However, the Bluetooth chipset 416 is now replaced by a GSM/GPRS/UMTS chipset 516 and a message store 522. These two components function to provide the telephony capability to the Floof toy 500. (GSM is the Global System for Mobile Communications and provides circuit-switched data calls and simple messaging; GPRS is the General Packet Radio Service; UMTS is the Universal Mobile Telecommunications System, a standard that provides GPRS functions.) It will be understood by those skilled in the arts of mobile telephony that GSM, GPRS and UMTS represent currently popular systems for mobile telephony and that the same principles apply to systems yet to be developed insofar as they provide for messages sent over a mobile telephone network in the same manner as SMS, MMS and EMS messages are currently sent over mobile telephone networks.

The operation of this embodiment of the present invention is the same as the second embodiment except that: in step 4 of sending a message, the created message is placed in the message store; in step 5 of sending a message, the message is sent directly to the telephone network using the built-in mobile telephone capability; and in step 1 of receiving a message, received messages are delivered directly to the Floof using the built-in mobile telephone capability and placed in the message store from which the CPU can read them.

In a fourth embodiment of the present invention, also described by Figure 14, the use of SMS, EMS or MMS messages may be replaced with other simple datagram network protocols such as UDP (User Datagram Protocol) over any available packet-based connection such as those provided by GPRS.
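Carrying the same payload over UDP instead of SMS is a small change: the message body becomes the datagram payload. This sketch stays self-contained by sending the datagram to itself over the loopback interface; in the fourth embodiment the destination would instead be the remote toy's address on the packet-based connection:

```python
import socket

def send_and_receive_datagram(payload):
    """Send a Floof message as a UDP datagram and receive it locally.
    The OS picks a free port; the loopback round trip stands in for
    delivery to the remote toy over a GPRS packet connection."""
    receiver = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    receiver.bind(("127.0.0.1", 0))          # let the OS choose a free port
    receiver.settimeout(5.0)
    port = receiver.getsockname()[1]
    sender = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sender.sendto(payload.encode("utf-8"), ("127.0.0.1", port))
    data, _ = receiver.recvfrom(1024)
    sender.close()
    receiver.close()
    return data.decode("utf-8")
```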

In alternate embodiments of the present invention, the set of motors 420, 520 of Figures 13 and 14 may be replaced or augmented with a speaker for emitting pre-recorded non-verbal sounds under the control of the CPU or illuminated LEDs to produce light patterns.

A fifth embodiment of the present invention is now described with reference to Figure 15. This embodiment is very similar to the third embodiment and accordingly, only the differences are described herein. The main difference is that there is no screen provided in the Floof toy 600. Rather, all communications are carried out simply by physical user interactions with the Floof toy itself.

In this embodiment, Floof expression images are not displayed but rather the Floof toy 600 utilises its plurality of motors operatively controlled by the CPU to cause movement of the toy in different ways according to the selected physical action. Furthermore, the CPU comprises a sound generator (not shown) arranged to emit non-verbal sounds in response to a selected physical action. The provision of a sound generator is important as there is no visual display provided for assisting in conveying the non-verbal message to the recipient.

Having described particular preferred embodiments of the present invention, it is to be appreciated that the embodiments in question are exemplary only, and that variations and modifications, such as those that will occur to those possessed of the appropriate knowledge and skills, may be made without departure from the spirit and scope of the invention as set forth in the appended claims. For example, rather than employing a Bluetooth radio connection in the second embodiment, the connection between the Floof toy 402 and the telephone handset 400 may be a physical cable connection such as serial, Universal Serial Bus, or IEEE1394/iLink/FireWire. Alternatively, the connection can be a physical direct connection, not involving a cable, to a data port on the telephone handset 400. The connection between the toy and the telephone handset can also be a wireless infrared data connection.

Also, if the downloaded application is not a JAVA application, then the Floofs application 20 could liaise with an existing contacts list on the mobile phone 12, adding a new column regarding the bonded Floof rather than needing to download and populate a new Floofs Contact List table 64.

It is also possible when downloading data from the Floofs server 22, that information regarding the phone model of the mobile phone can be provided to the server. This information can be used by the server to select the most appropriate resolution of data to send to the mobile phone to take advantage of its particular capabilities, for example Floofs images in an optimised resolution. Also use can be made of other functional capabilities of the phone such as polyphonic ringtones, colour screen and vibration capabilities.
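Selecting the most appropriate resolution for a given phone model is a server-side capability lookup. The model names, capability table and available widths in this sketch are invented for illustration:

```python
# Hypothetical server-side capability table: phone model -> display limits.
PHONE_CAPABILITIES = {
    "ModelA": {"max_width": 128, "colour": True},
    "ModelB": {"max_width": 96, "colour": False},
}
AVAILABLE_WIDTHS = [64, 96, 128, 176]  # image widths the server can serve

def best_resolution(model):
    """Pick the largest available image width the phone can display,
    falling back to the smallest width for an unknown model."""
    caps = PHONE_CAPABILITIES.get(model)
    if caps is None:
        return min(AVAILABLE_WIDTHS)
    return max(w for w in AVAILABLE_WIDTHS if w <= caps["max_width"])
```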

At present in mobile phones there is a requirement to keep JAVA applications separate from phone applications. This is done in the present embodiments. However, in due course this problem may be overcome and it may be possible to integrate the application with the phone's operating system, i.e. messages could be received directly into the normal inbox rather than having to go to the Floofs inbox.

CLAIMS:
1. A method of communicating non-verbal messages between first and second individuals using respective first and second telecommunications devices, the method comprising: storing a plurality of sets of predetermined graphical images on the first telecommunications device, each graphical image being assigned an image identifier and the graphical images within each set representing different non-verbal messages; linking one of the graphical image sets with a device identifier of the second telecommunications device; receiving a communication for conveying a non-verbal message from the second telecommunications device, the communication specifying one of the image identifiers and the device identifier of the second telecommunications device; retrieving the graphical image specified by the communication using the received image and device identifiers; and displaying the retrieved graphical image, thereby conveying the non-verbal message of the communication to the first individual.
2. A method according to Claim 1, wherein the linking step comprises linking one of the graphical image sets with a communications address of the second telecommunications device.
3. A method according to Claim 2, wherein the linking step comprises linking one of the graphical image sets with a telephone number of a mobile phone.
4. A method according to any preceding claim, wherein the storing step comprises storing a plurality of sets of predetermined graphical character images in which the graphical images each convey different facial expressions of that character.
5. A method according to any preceding claim, wherein the storing step comprises storing sets of predetermined video clips.
6. A method according to Claim 5, wherein the storing step comprises storing sets of predetermined animated graphical images.
7. A method according to any preceding claim, further comprising: associating one or more pre-programmed actions, to be performed by the first telecommunications device, with a stored graphical image; and executing the one or more pre-programmed actions when the displaying of that graphical image occurs, the one or more actions of the first telecommunications device enhancing the non-verbal message conveyed to the first individual.
8. A method according to Claim 7, wherein the executing step comprises vibrating the first telecommunications device to enhance the message conveyed.
9. A method according to Claim 7, wherein the executing step comprises playing pre-recorded audio data to enhance the message conveyed.
10. A method according to any preceding claim, wherein a descriptive textual identifier is assigned to each graphical image.
11. A method according to any preceding claim, further comprising selecting one of the sets of graphical images for linking with the device identifier of the second telecommunications device in response to inputs made by the first individual into the first telecommunications device.
12. A method according to any preceding claim, wherein the linking step comprises using the device identifier of the second telecommunications device as a key to locate the set of graphical images in a look-up table.
13. A method according to any preceding claim, wherein the receiving step comprises receiving a short message of the SMS, EMS or MMS variety.
14. A method according to any preceding claim, wherein the receiving step comprises receiving an e-mail.
15. A method according to any preceding claim, wherein the displaying step comprises displaying the retrieved graphical image on a graphical display of the first telecommunications device.
16. A method according to any preceding claim, wherein the displaying step comprises displaying the retrieved graphical image on a graphical display housed within an electrical toy, the graphical image depicting a facial expression on the electrical toy.
17. A method according to Claim 16, further comprising connecting the electrical toy to the first telecommunications device using a wireless communications protocol.
18. A method according to Claim 16, further comprising housing the first telecommunications device within the electrical toy.
19. A method according to any of Claims 16 to 18, further comprising arranging the electrical toy to perform physical actions when displaying one of the graphical images, to enhance the non-verbal message received from the second telecommunications device.
20. A method according to any preceding claim, further comprising: receiving an input specifying one of the image identifiers; creating an other communication for conveying a non-verbal message comprised of the specified image identifier and the device identifier of the second telecommunications device; sending the other communication to the second telecommunications device.
21. A method according to Claim 20, further comprising listing a plurality of the image identifiers and receiving an input specifying one of the listed image identifiers.
22. A method according to Claim 20, further comprising associating groups of the image identifiers with different input actuators of the first telecommunications device and listing a group of image identifiers in response to receiving an input from a particular one of the actuators.
23. A method according to Claim 20, further comprising outputting one or more of the graphical images and receiving an input specifying one of the image identifiers for one of the graphical images which has been output.
24. A method according to Claim 21, further comprising associating groups of the graphical images with different input actuators of the first telecommunications device and outputting a group of the graphical images in response to receiving the input from a particular one of the actuators.
25. A method according to any of Claims 20 to 24, further comprising associating the device identifier of the second telecommunications device with a particular one of the actuators of the first telecommunications device and inserting that device identifier into the other communication in response to receiving an input from that particular actuator.
26. A method according to any of Claims 20 to 25, further comprising outputting a graphical image conveying the non-verbal message of the other communication on the first telecommunications device for approval prior to carrying out the sending step.
27. A method according to any preceding claim, wherein the image identifiers are unique within any one of the plurality of sets of graphical images.
28. A method according to Claim 27, wherein at least one image identifier is used in more than one of the plurality of sets of graphical images.
29. A method according to Claim 28, wherein a third individual using a third telecommunications device can communicate non-verbal messages to the first individual, the method further comprising: designating one of the sets of graphical images as a default images set; receiving a further communication for conveying a non-verbal message from the third telecommunications device, the further communication specifying one of the image identifiers and the device identifier of the third telecommunications device; determining that none of the sets of graphical images stored on the first telecommunications device are linked with the device identifier of the third telecommunications device; retrieving the default graphical image specified in the further communication using the image identifier received in the further communication; and displaying the retrieved default graphical image, thereby conveying the non-verbal message of the further communication to the first individual.
30. A method according to Claim 27 or 28, wherein an identical set of identifiers is used for each one of the plurality of sets of graphical images to distinguish the graphical images within each set of the graphical images.
31. A telecommunications device for communicating non-verbal messages between a first individual using the telecommunications device and a second individual using a communications device, the telecommunications device comprising: a database for storing a plurality of sets of predetermined graphical images on the telecommunications device, the graphical images within each set representing different non-verbal messages and being assigned image identifiers; linking means for linking one of the graphical image sets with a device identifier of the communications device; receiving means for receiving a communication for conveying a non-verbal message from the communications device, the communication specifying one of the image identifiers and the device identifier of the communications device; retrieving means for retrieving the graphical image specified by the communication using the received image and device identifiers; and a display for displaying the retrieved graphical image, thereby conveying the non-verbal message of the communication to the first individual.
32. A processing program for a programmable telecommunications device, the processing program being arranged to configure the telecommunications device to implement a method according to any preceding claim.
33. A processing program according to Claim 32, carried on an electrical carrier signal.
34. A method of communicating non-verbal messages between first and second individuals using respective first and second telecommunications devices, the first telecommunications device being operatively connected to a respective first toy, the method comprising: storing a plurality of predetermined sets of physical actions at the first toy, the physical actions within each set representing different non-verbal messages; assigning an action identifier to each physical action; linking one of the physical action sets with a device identifier of the second telecommunications device; receiving a communication for conveying a non-verbal message from the second communications device, the communication specifying one of the action identifiers and the device identifier of the second telecommunications device; retrieving the physical action specified in the communication using the received action and device identifiers; and carrying out the physical action on the first toy, thereby conveying the non-verbal message of the communication to the first individual.
35. A toy apparatus for communicating non-verbal messages between first and second individuals, the toy apparatus comprising: a first communications device associated with the first individual, the first communications device being operably connectable to a second telecommunications device associated with the second individual; a toy, the toy being operatively connectable to the first telecommunications device; a data store for storing a plurality of predetermined sets of physical actions, the physical actions within each set representing different non-verbal messages; a plurality of action identifiers, each action identifier identifying one of the physical actions; means for linking one of the physical action sets with a device identifier of the second telecommunications device; communication means for receiving a communication for conveying a non-verbal message from the second communications device via the first communications device, the communication specifying one of the action identifiers and the device identifier of the second telecommunications device; means for retrieving the physical action specified in the communication using the received action and device identifiers; and processing means for interpreting the physical action and carrying out the physical action on the toy, thereby conveying the non-verbal message of the communication to the first individual.
36. A toy apparatus according to Claim 35, wherein the first telecommunications device is provided within the toy itself.
37. A toy apparatus according to Claim 35 or 36, wherein the toy and the first telecommunications device are connected together by a local wireless data communications link.
38. A toy apparatus according to any of Claims 35 to 37, wherein the toy comprises a plurality of motors operatively controlled by the processing means for causing movement of the toy in different ways according to the selected physical action.
39. A toy apparatus according to any of Claims 35 to 38, wherein the processing means comprises a sound generator arranged to emit non-verbal sounds in response to a selected physical action.
40. A toy for communicating non-verbal messages between first and second individuals, the toy being associated with the first individual and comprising: a data store for storing a plurality of predetermined sets of physical actions, the physical actions within each set representing different non-verbal messages; a plurality of action identifiers, each action identifier identifying one of the physical actions; means for linking one of the physical action sets with a device identifier of a second telecommunications device associated with the second individual; communication means for coupling the toy to a first telecommunications device which is operably connectable to the second telecommunications device, the communication means being arranged to receive a communication for conveying a non-verbal message from the second communications device via the first communications device, the communication specifying one of the action identifiers and the device identifier of the second telecommunications device; means for retrieving the physical action specified in the communication using the received action and device identifiers; and processing means for interpreting the physical action and carrying out the physical action on the toy, thereby conveying the non-verbal message of the communication to the first individual.
PCT/GB2003/003560 2002-08-14 2003-08-14 Methods and device for transmitting emotion within a wireless environment WO2004017596A1 (en)

Priority Applications (4)

Application Number Priority Date Filing Date Title
GB0218927.2 2002-08-14
GB0218927A GB0218927D0 (en) 2002-08-14 2002-08-14 Improvements relating to communication
GB0224206.3 2002-10-17
GB0224206A GB0224206D0 (en) 2002-08-14 2002-10-17 Improvements relating to communications

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
AU2003255788A AU2003255788A1 (en) 2002-08-14 2003-08-14 Methods and device for transmitting emotion within a wireless environment

Publications (1)

Publication Number Publication Date
WO2004017596A1 true WO2004017596A1 (en) 2004-02-26

Family

ID=31889677

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/GB2003/003560 WO2004017596A1 (en) 2002-08-14 2003-08-14 Methods and device for transmitting emotion within a wireless environment

Country Status (2)

Country Link
AU (1) AU2003255788A1 (en)
WO (1) WO2004017596A1 (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5073927A (en) * 1989-08-29 1991-12-17 Motorola, Inc. Imaging identification method for a communication system
GB2348082A (en) * 1999-03-18 2000-09-20 Nokia Mobile Phones Ltd Communication terminal handling messages including graphics
US6289085B1 (en) * 1997-07-10 2001-09-11 International Business Machines Corporation Voice mail system, voice synthesizing device and method therefor
US20020049836A1 (en) * 2000-10-20 2002-04-25 Atsushi Shibuya Communication system, terminal device used in communication system, and communication method of displaying informations

Cited By (20)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2005107193A1 (en) * 2004-04-29 2005-11-10 Siemens Aktiengesellschaft Method for reproducing emotion information and mobile communication terminal therefor
WO2006052303A1 (en) * 2004-11-09 2006-05-18 Sony Ericsson Mobile Communications Ab A method and apparatus for providing call-related personal images responsive to supplied mood data
EP1701339A2 (en) * 2005-03-11 2006-09-13 Samsung Electronics Co., Ltd. Method for controlling emotion information in wireless terminal
EP1701339A3 (en) * 2005-03-11 2007-05-09 Samsung Electronics Co., Ltd. Method for controlling emotion information in wireless terminal
WO2006107463A1 (en) * 2005-03-31 2006-10-12 Motorola, Inc. Method and apparatus for representing communication attributes
FR2901904A1 (en) * 2006-06-06 2007-12-07 Bruno Vonesch Wearable accessory, e.g. a digital watch, for exchanging knowledge between persons, having a light-emitting surface with display and signalling units that present visual configurations conveying information about the wearer's state of mind
EP1956530A1 (en) * 2007-02-06 2008-08-13 Research In Motion Limited System and method for image inclusion in e-mail messages
US8489684B2 (en) 2007-02-06 2013-07-16 Research In Motion Limited System and method for image inclusion in e-mail messages
WO2010034362A1 (en) * 2008-09-23 2010-04-01 Sony Ericsson Mobile Communications Ab Methods and devices for controlling a presentation of an object
US9386139B2 (en) 2009-03-20 2016-07-05 Nokia Technologies Oy Method and apparatus for providing an emotion-based user interface
WO2010106217A1 (en) * 2009-03-20 2010-09-23 Nokia Corporation Method and apparatus for providing an emotion-based user interface
WO2011143523A3 (en) * 2010-05-13 2012-04-19 Alexander Poltorak Electronic personal interactive device
US9634855B2 (en) 2010-05-13 2017-04-25 Alexander Poltorak Electronic personal interactive device that determines topics of interest using a conversational agent
EP2562995A1 (en) * 2011-08-23 2013-02-27 Research In Motion Limited Variable incoming communication indicators
US8798601B2 (en) 2011-08-23 2014-08-05 Blackberry Limited Variable incoming communication indicators
US9380433B2 (en) 2012-05-22 2016-06-28 Lg Electronics Inc. Mobile terminal and control method thereof
EP2667339A1 (en) * 2012-05-22 2013-11-27 LG Electronics, Inc. Mobile terminal and control method thereof
US20130316695A1 (en) * 2012-05-22 2013-11-28 Lg Electronics Inc. Mobile terminal and control method thereof
WO2014071375A1 (en) * 2012-11-05 2014-05-08 Brilliant Mobile L.L.C. Media messaging methods, systems, and devices
US9565149B2 (en) 2012-11-05 2017-02-07 Phoji, Llc Media messaging methods, systems, and devices

Also Published As

Publication number Publication date
AU2003255788A1 (en) 2004-03-03

Similar Documents

Publication Publication Date Title
US9948772B2 (en) Configurable phone with interactive voice response engine
USRE45982E1 (en) Method and device for speeding up and simplifying information transfer between electronic devices
CN105594163B (en) Voice communication with real-time status notification
US20160042547A1 (en) Mobile communication terminal and data input method
US8989786B2 (en) System and method for graphical expression during text messaging communications
US8509743B2 (en) Mood-based messaging
EP2210214B1 (en) Automatic identifying
EP1417775B1 (en) Personalizing electronic devices and smart covering
KR100751184B1 (en) Method for changing graphical data like avatars by mobile telecommunications terminals
US7035803B1 (en) Method for sending multi-media messages using customizable background images
JP5466420B2 (en) Emotion recognition message system, mobile communication terminal and message storage server
AU2007346312B2 (en) A communication network and devices for text to speech and text to facial animation conversion
RU2299514C2 (en) Multimedia editor and method for wireless communication devices
US7248677B2 (en) Method of and apparatus for communicating user related information using a wireless information device
CN101297541B (en) Communications between devices having different communication modes
US9026710B2 (en) Customized settings for docking station for mobile device
EP1587286B1 (en) Portable terminal for transmitting a call response message.
AU2002244511B2 (en) A system and method for customising call alerts
EP1542439B1 (en) Method of raising schedule alarm with avatars in wireless telephone
US8549074B2 (en) Adjunct use of instant messenger software to enable communications to or between chatterbots or other software agents
KR100720133B1 (en) Method for processing message using avatar in wireless phone
EP1982549B1 (en) Personalization content sharing system and method
TW578432B (en) Method and apparatus for presenting script
AU2003215430B2 (en) Animated messaging
EP1592212B1 (en) Method for displaying a screen image on a mobile terminal

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A1

Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NI NO NZ OM PG PH PL PT RO RU SC SD SE SG SK SL SY TJ TM TN TR TT TZ UA UG US UZ VC VN YU ZA ZM ZW

AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): GH GM KE LS MW MZ SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IT LU MC NL PT RO SE SI SK TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG

121 EP: The EPO has been informed by WIPO that EP was designated in this application
122 EP: PCT application not entered into the European phase
NENP Non-entry into the national phase in:

Ref country code: JP

WWW WIPO information: withdrawn in national office

Country of ref document: JP