WO2006013363A1 - Haptic input and haptic output in a communications networks - Google Patents

Haptic input and haptic output in a communications networks

Info

Publication number
WO2006013363A1
WO2006013363A1 PCT/GB2005/003046
Authority
WO
WIPO (PCT)
Prior art keywords
message
network
terminal
output
touch input
Prior art date
Application number
PCT/GB2005/003046
Other languages
French (fr)
Inventor
Phil Gosset
Original Assignee
Vodafone Group Plc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority to GB0417469.4 priority Critical
Priority to GB0417469A priority patent/GB2416962B/en
Application filed by Vodafone Group Plc filed Critical Vodafone Group Plc
Publication of WO2006013363A1 publication Critical patent/WO2006013363A1/en

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/016Input arrangements with force or tactile feedback as computer generated output to the user
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04MTELEPHONIC COMMUNICATION
    • H04M1/00Substation equipment, e.g. for use by subscribers; Analogous equipment at exchanges
    • H04M1/72Substation extension arrangements; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selecting
    • H04M1/725Cordless telephones
    • H04M1/72519Portable communication terminals with improved user interface to control a main telephone operation mode or to indicate the communication status
    • H04M1/72522With means for supporting locally a plurality of applications to increase the functionality
    • H04M1/72547With means for supporting locally a plurality of applications to increase the functionality with interactive input/output means for internally managing multimedia messages
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04MTELEPHONIC COMMUNICATION
    • H04M2250/00Details of telephonic subscriber devices
    • H04M2250/12Details of telephonic subscriber devices including a sensor for measuring a physical value, e.g. temperature or motion

Abstract

A mobile telecommunications network may be provided with mobile terminals including motion sensors (21) and pressure sensors (23). Movement and/or pressure applied to a mobile terminal can be sensed by these sensors (21,23) and converted into an electrical signal. This electrical signal is encoded and transmitted to the network core. The network core generates a message to a recipient mobile terminal having a movement enabler (25) and/or pressure enabler (26). When the message is received by the receiving mobile terminal, the movement and/or pressure enabler generates an approximation of the touch input generated at the sending mobile terminal. The users of mobile terminals are therefore able to communicate with each other by the sense of touch.

Description

HAPTIC INPUT AND HAPTIC OUTPUT IN A COMMUNICATION NETWORK

Background to the Invention

The present invention relates to a mobile telecommunications network, to terminals for use with a mobile telecommunications network and to a method of operating a mobile telecommunications network, and in particular to an arrangement in which communications can be enhanced by using the sense of touch.

Brief description of the prior art

In addition to voice calls between mobile terminals registered with a mobile telecommunications network, it is possible to send alphanumeric data in the form of an SMS or text message. Recent developments allow the transmission of multimedia messages (MMS) such as drawings, photographs, music and video clips.

GB-A-2308523 and US-A-2002/180698 disclose mobile terminals that have a touch-sensitive screen that allows alphanumeric data to be captured by moving a finger or stylus across the screen.

Brief Summary of the invention

The present invention seeks to provide an additional type of communication using a mobile telecommunications network.

According to a first aspect of the present invention, there is provided a mobile telecommunications network including a plurality of terminals registered with the network, wherein a first of said terminals includes means for sensing the variation of a touch input with respect to time and/or the intensity of the touch input and generating a signal representative thereof; and including means for generating a message representative of said signal; and means for transmitting the message to a second of said terminals for generating an output in response to the message that is indicative of the touch input.

The touch input may also be referred to as a haptic input.

The intensity of motion or pressure may be sensed.

The message may be generated by the first terminal, the mobile telecommunications network or a separate entity.

The output may be a simulation or approximation of the input - it may stimulate the touch (haptic) sense of the user of the second terminal to convey the touch (haptic) input by the user of the first terminal, allowing the users to communicate by the sense of touch (haptics).

Alternatively, the output may be indicative of the touch input but may stimulate a different sense of the user of the second terminal - such as sight or hearing. This arrangement may be used when the second terminal is not capable of generating a touch output.

Particular touch inputs generate particular outputs (whether stimulating touch, sight or hearing). Particular inputs may be mapped to particular outputs. For example, shaking the first terminal may generate a particular audible message at the second terminal. This mapping may be pre-set and/or may be set by the users of the terminals or by a third party.

According to a second aspect of the present invention, there is provided a mobile telecommunications network, including means for receiving a message from a first terminal for delivery to a second terminal and for converting the message from a first type to a second type, wherein at least the first type of message includes data enabling the reproduction of an output by a terminal, which output is detectable by the sense of touch and varies with respect to time and/or varies in intensity in dependence upon the content of the message.

The data in the first type of message enabling the reproduction of an output by a terminal may be data input from a terminal (which will subsequently be reproduced in some form) or data for generating an output from a terminal.

According to a third aspect of the present invention, there is provided a terminal for use with a mobile telecommunications network, the terminal including means for sensing a variation of a touch input with respect to time and/or the intensity of the touch input and generating a signal representative thereof, a representation of which is for inclusion in a message for transmission to another terminal for generating an output in response to the message that is indicative of the touch input.

According to a fourth aspect of the present invention, there is provided a terminal for use with a mobile telecommunications network, the terminal including means for receiving a message representative of an input signal generated by another terminal, means for generating an output detectable by the sense of touch and which varies with respect to time and/or in intensity in dependence upon the content of the message.

According to a fifth aspect of the present invention, there is provided a method of operating a mobile telecommunications network having a plurality of terminals, the method including generating at a first of said terminals a signal representative of how a touch input varies with respect to time and/or the intensity of the touch input, generating a message representative of the signal, transmitting the message to a second of said terminals, and generating an output at the said second terminal in response to the message that is indicative of the touch input.

According to a sixth aspect of the present invention, there is provided a mobile telecommunication network including means for generating a message containing information for use by the user of a terminal, means for transmitting the message to said mobile terminal, and means for conveying the information to the user of said mobile terminal by stimulating the user's sense of touch.

Humans have five senses: sight, hearing, touch, smell and taste.

Known mobile terminals can detect visual and audible stimuli, convert these into a message and transmit this to another terminal via a mobile telecommunications network. For example, a user of a first mobile terminal may record a video clip (comprising sound and moving pictures) of themselves and transmit this as a message to a second mobile terminal, where it is reproduced, stimulating the sight and hearing sense of the recipient. The present invention provides an enhancement to communications by allowing a touch input to be sensed by a mobile terminal and/or a touch output to be reproduced on a mobile terminal.

In the specification the term "touch" means anything that is detectable by the human sense of touch, and includes such stimuli as heating and vibration.

Some mobile terminals have the facility to vibrate in order to alert the user of an incoming call or message. However, the vibration is simply triggered by a mobile terminal on receipt of an incoming call or message. The vibration does not convey the content of the message. The signal that generates the vibration is not generated in dependence upon the content of a message received from another terminal.

Conventional mobile telephones are capable of receiving a touch input in the sense that they have buttons and other controls which are depressed. However, the nature or intensity of the touch input is not recorded and conveyed as part of a message. The depression of the key may result in data being recorded in a message (for example a letter in an SMS message), but the nature or intensity of the touching of the key (or how it varies with time) is not recorded or conveyed.

According to one aspect of the present invention the variation of a touch input with respect to time is sensed. The intensity of the touch input may be sensed.

The term mobile telecommunications "network" used in this specification does not necessarily refer to a single network operated by a particular (legal) entity.

The network might comprise a plurality of separately operated networks, or a part of one of such networks.

Brief description of the drawings

For a better understanding of the present invention, an embodiment will now be described by way of example, with reference to the accompanying drawings, in which:-

Figure 1 shows schematically principal elements of a mobile telecommunications network; and Figure 2 shows schematically additional components provided to a mobile telecommunications terminal in accordance with an embodiment of the invention.

Exemplary embodiment of the invention

As shown in Figure 1, mobile terminals 1,3,5 and 7 are registered with a GSM or UMTS (3G) mobile or cellular telecommunications network 9. The mobile terminals may be hand held mobile telephones, personal digital assistants (PDAs) or laptop computers equipped with a datacard (or, of course, any combination of these). The mobile terminals communicate wirelessly with the mobile telecommunications network 9 via a radio access network comprising base transceiver stations (BTSs) and base station controllers (BSCs). Communications between the mobile terminals and the network 9 are routed from the radio access network via mobile switching centres (MSCs), which may be connected by a fixed (cable) link to the network 9. Of course, in practice, a typical mobile telecommunications network 9 will have many thousands of subscribers, each with one or more terminals.

Each of the mobile terminals 1,3,5,7 is provided with a respective subscriber identity module (SIM). During the manufacturing process of each SIM authentication information is stored thereon under control of the network 9. The network 9 itself stores details of each of the SIMs issued under its control. In operation of the network 9, a mobile terminal is authenticated (for example, when the user activates the terminal in the network with a view to making and receiving calls) by the network sending a challenge to the mobile terminal. The received challenge is passed to the SIM associated with the mobile terminal and the SIM calculates a reply (dependent on predetermined information held on the SIM - typically an authentication algorithm and a unique key Ki) and transmits it back to the network 9. The network receives the reply from the mobile terminal. Using information pre-stored concerning the content of the relevant SIM and the nature of the challenge sent to the mobile terminal, the network calculates the expected value of the reply from the mobile terminal. If the reply received matches the expected calculated reply, the SIM and the associated mobile terminal are considered to be authenticated.
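The challenge-response exchange described above can be sketched as follows. This is a minimal illustration only: the real SIM authentication algorithm (A3/A8 in GSM) is not disclosed here, so HMAC-SHA256 stands in for it, and all function names are hypothetical.

```python
import hashlib
import hmac
import os


def sim_response(ki: bytes, challenge: bytes) -> bytes:
    """Stand-in for the SIM's authentication algorithm (A3 in GSM).

    HMAC-SHA256 is used purely for illustration; the actual algorithm
    and the key Ki are held securely on the SIM and in the network.
    """
    return hmac.new(ki, challenge, hashlib.sha256).digest()


def authenticate(network_ki: bytes, sim_ki: bytes) -> bool:
    """Network sends a random challenge; the SIM computes a reply from Ki;
    the network recomputes the expected reply and compares the two."""
    challenge = os.urandom(16)
    reply = sim_response(sim_ki, challenge)          # computed on the SIM
    expected = sim_response(network_ki, challenge)   # computed in the network
    return hmac.compare_digest(reply, expected)
```

Authentication succeeds only when the SIM's key matches the key the network holds for that subscriber.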

A mobile terminal in accordance with an embodiment of the invention is shown in Figure 2. In a conventional manner the mobile terminal includes a visual display 11, various buttons and keys 13 and an antenna 15 for communication with the radio access network of the mobile telecommunications network 9. The mobile terminal also includes a microphone 17 for sensing audio input such as speech and a loudspeaker 19 for generating an audio output. The visual display 11 may provide a user interface such as a graphical user interface to facilitate access to functions provided by the mobile terminal and may also display messages received by the mobile terminal - such as alphanumeric text and pictures (moving or still). The mobile terminal also of course includes data processing circuitry (not shown) for allowing the user of the mobile terminal to control the terminal and for allowing wireless communications with the mobile telecommunications network 9. The features of the mobile terminal described thus far are conventional.

According to the embodiment the mobile terminal is provided with a motion sensor 21, such as a micro-accelerometer, for detecting movement of the mobile terminal. A pressure sensor 23, such as a piezoelectric strip, is also provided for detecting, for example, squeezing, hitting and/or stroking of the mobile terminal. The sensors 21 and 23 allow the terminal to sense movement and pressure - for example caused by the user of the mobile terminal shaking the terminal or squeezing the terminal. The mobile terminal may be provided with an elastically deformable outer casing and may be configured to allow the outer casing to be resiliently compressed (for example, by squeezing) such that this is sensed by the pressure sensor 23. The pressure sensor 23 may be insensitive to the location of pressure application on the sensor 23 but sensitive to the intensity of the pressure applied.

Other sensors may also be provided that allow the mobile terminal to sense touch inputs (that is, inputs that would stimulate the human sense of touch). For example, the mobile terminal may also be provided with a temperature sensor.

The sensors 21 and 23 generate electrical signals in response to a touch input. These are detected by processing circuitry of the mobile terminal and may be stored or converted into a format suitable for transmission in the message to the mobile telecommunications network 9 and onwardly to another mobile terminal. The generation and transmission of such messages will be discussed further below.

In addition to providing sensors for receiving and recording a touch input, the mobile terminal also includes devices for generating a touch output (that is, devices that provide a stimulus to the human sense of touch). Motion output is created using a movement enabler 25 such as an electromagnetic device. Heat is generated using a heating element 27. Additionally, a pressure generation mechanism 26 may be provided for flexing (expanding/contracting) the resiliently deformable case of the mobile terminal. These output types all stimulate the human sense of touch.

If the user of mobile terminal 1 wishes to communicate with the user of mobile terminal 3 using the mobile telecommunications network 9, data relating to the communication is routed wirelessly between mobile terminal 1 and the local BTS 29. From there the communication data is transmitted to the BSC 31 and to MSC 33 via a fixed or cable link. The mobile telecommunications network core 35 then routes the communication to an appropriate MSC 37 with which the mobile terminal 3 is registered. The communication data is transmitted from MSC 37 to the appropriate BSC 39, and from there to BTS 41. The communication data is transmitted from the BTS 41 wirelessly to the mobile terminal 3.

The communication data may be data representative of the users' voices in a conventional circuit switched voice call. The communication data may also be transmitted during a (packet switched) communication session between the mobile terminal 1 and the mobile terminal 3.

To efficiently facilitate such communication sessions, the third generation partnership project (3GPP) has recently defined a new concept known as IMS (IP-based multimedia subsystem). The aim of IMS is to allow users such as mobile telephone network operators to provide services to their subscribers as efficiently and effectively as possible. For example, the IMS architecture is likely to support the following communication types: voice, video, instant messaging, "presence" (a user's availability for contact), location-based services, email and web. Further communication types are likely to be added in the future. This diverse collection of communication types requires efficient communication session management due to the number of different applications and services that will be developed to support them. The 3GPP have chosen session initiation protocol (SIP) for managing these sessions.

SIP is a session-based protocol designed to establish IP-based communication sessions between two or more end points or users. Once the SIP session has been established, communication between these end points or users can be carried out using a variety of different protocols (for example, those designed for streaming audio and video). These protocols are defined in the SIP session initiation messages.

With IMS, users are no longer restricted to a separate voice call or data session. Sessions can be established between mobile terminals that allow a variety of communication types to be used and media to be exchanged. The sessions are dynamic in nature in that they can be adapted to meet the needs of the end users. For example, two users might start a session with an exchange of instant messages and then decide that they wish to change to a voice call, possibly with video. This is all possible within the IMS framework. If a user wishes to send a file to another user and the users already have a session established between each other (for example, a voice session), the session can be redefined to allow data file exchange to take place. This session redefinition is transparent to the end user.

One application of IMS is push-to-talk over cellular (PoC). PoC allows a communication session to be established between a group of devices such that the user of one of the devices can speak and the users of the or each of the other devices will hear that person speak. During such a communication session each device functions like a two-way radio or walkie-talkie in a one-to-one or one-to-many group mode. Full duplex speech communication between the users of the respective devices during the PoC part of the communication is not possible - only one user can speak at a time.

One feature of PoC is that, when the communication is established, there is an "always on" communication between the terminals. When a user wishes to talk to the or each of the other terminals associated with the communication session, the user issues an appropriate instruction to their device (typically using a soft key - that is, a key whose function is programmable), and the user's speech is captured by their terminal and, instantly or within a relatively short period of time, is transmitted to the or each of the other terminals and is reproduced on those terminals. There is no requirement for the user inputting the speech data to dial the or each other device, nor is there any requirement for the users of the devices receiving the speech to take any action to receive the speech data - it is automatically reproduced by the device when it is received (assuming, of course, the device is operating in an appropriate mode for allowing PoC communication).

PoC is described in the document "Push-to-talk over Cellular (PoC) - architecture, draft version 1.0 - 13th February 2004" available from the Open Mobile Alliance Limited (OMA).

In addition to establishing a PoC communication session using IMS, a PoC communication session could be established over existing GSM/GPRS networks by the exchange of data packets but without IMS.

In the embodiment of the present invention, the touch input of mobile terminal 1 is sensed by sensors 21 and/or 23 and a signal representative thereof is generated by the mobile terminal 1. In accordance with a feature of the present embodiment, this touch input is used as a means of communicating with the user of mobile terminal 3. The signal derived from the sensors 21 and/or 23 is encoded and transmitted to the network core 35 via BTS 29, BSC 31 and MSC 33. In the network core 35 a translation server 43 is provided which receives the encoded signal and generates a suitable output message representative of the encoded signal. The output message is transmitted to the mobile terminal 3 via MSC 37, BSC 39 and BTS 41.

The data communicated between the mobile terminal 1 and the mobile terminal 3 may be transmitted by any suitable means. The data may be transmitted in the circuit switched domain, but is preferably transmitted in the packet switched domain. The data may be transmitted as GPRS data, as an SMS message or as an MMS message. The data may also be transmitted during an IMS communication session (controlled by SIP) between the mobile terminals 1 and 3 (and possibly other mobile terminals). The data may be transmitted as part of a PoC communication session, so that when the message is received by the mobile terminal 3 an appropriate output is generated (preferably immediately) without requiring any action of the user of the mobile terminal 3. In this regard, although PoC relates to push-to-"talk", in fact any type of data can be communicated during such a session.

In a first example, it will be assumed that both mobile terminal 1 and mobile terminal 3 include the features of the mobile terminal shown in Figure 2. In mobile terminal 1, the electrical signals generated by pressure on, movement of or heating of the terminal are recorded and stored by the mobile terminal 1. The electrical signals vary with the intensity of the pressure, movement or heating. How the signals received from the sensors vary with respect to time is recorded. For example, the variation in pressure applied to pressure sensor 23 may be recorded for a predetermined period of, for example, one, two or five seconds (or of any other duration). Simultaneously, or during a different time period, the signals generated by the movement sensor 21 may be recorded and stored. The data processor of the mobile terminal 1 then encodes the signals in a suitable format such that their data content can be extracted by a receiving device. Typically, the data will be encoded as binary data. Suitable methods for encoding such data will be known to those skilled in the art and will not be described further here.
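One way of encoding a time-varying touch input as binary data is to sample the sensors at a fixed rate and pack the samples into a compact message. The sketch below shows this under assumed conventions: the wire format (a small header followed by 16-bit pressure/motion pairs) is invented for illustration and does not come from the patent.

```python
import struct


def encode_touch_samples(samples, sample_rate_hz=50):
    """Pack (pressure, motion) sensor readings into a binary message.

    Format (illustrative only): a 4-byte header holding the sample rate
    and the sample count as big-endian unsigned 16-bit integers, followed
    by one big-endian 16-bit pair per sample.
    """
    header = struct.pack(">HH", sample_rate_hz, len(samples))
    body = b"".join(struct.pack(">HH", p, m) for p, m in samples)
    return header + body


def decode_touch_samples(message):
    """Recover the sample rate and the (pressure, motion) pairs."""
    rate, count = struct.unpack_from(">HH", message, 0)
    samples = [struct.unpack_from(">HH", message, 4 + 4 * i)
               for i in range(count)]
    return rate, samples
```

A receiving device (or the translation server) can decode the message and reconstruct how the input varied over the recorded period.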

A communication session (for example, an IMS, SIP-controlled, communication session) between mobile terminal 1 and mobile terminal 3 is then initiated in a known manner if not already established. The encoded data is transmitted to the network core 35, where it is passed to translation server 43. In this example the translation server 43 need not take any action because the mobile terminals 1 and 3 are the same type of mobile terminal. The translation server 43 then issues a message comprising the encoded data to the mobile terminal 3. On receipt of the message the mobile terminal 3 decodes the message and applies appropriate electrical signals to movement enabler 25, pressure generation mechanism 26 and any other touch output devices provided. The touch output or outputs generated by the mobile terminal 3 will, if the user is in contact with mobile terminal 3, stimulate the touch sense of the user. The touch output is an approximation of the touch input to mobile terminal 1. For example, the squeezing of mobile terminal 1 will temporarily change the shape of its casing. The touch output of the mobile terminal 3 will cause a corresponding temporary change in the shape of the terminal 3. The users of mobile terminals 1 and 3 can therefore communicate with each other using the sense of touch.

If the user of mobile terminal 3 does not have a mobile terminal of the type illustrated in Figure 2, or at least has a mobile terminal of a different type to the mobile terminal 1, the translation server 43 is operative to process the encoded data received from the mobile terminal 1 to convert it into a suitable message for reproduction by the mobile terminal 3. For example, the mobile terminal 3 may not have a movement enabler 25. In such an instance, the translation server may generate the message such that the output produced by the mobile terminal 3 is an audible and/or visible output. That is, data encoded from the motion sensor 21 and/or pressure sensor 23 is detected by the translation server 43 and is converted into an encoded signal for generating a different type of output from the sensed input.

For example, the encoded signals representing the touch input sensed by the motion sensor 21 and the pressure sensor 23 may be analysed by the translation server 43, where it is determined that these signals are indicative of the user of mobile terminal 1 stroking the mobile terminal. The translation server 43 then accesses a look-up table to determine to which type of output this type of input should be mapped. In this example, it is indicated that the output should be mapped to a simulated voice output generating an "oooh" sound. Similarly, if the translation server determines that the signals from the movement sensor 21 and pressure sensor 23 are indicative of the mobile terminal 1 being squeezed by the user, the look-up table may indicate that the appropriate output is an audio voice simulation output generating the sound "aaah". Further, if it is determined from the sensors 21, 23 that the mobile terminal has been shaken, the look-up table may indicate that the appropriate output is a signal to cause the display 11 of the mobile terminal 3 to have a red colour. Furthermore, if the signals derived from the sensors 21, 23 indicate that the mobile terminal has been dropped, the appropriate output message may be a signal to cause the display to flash between red and blue colours.
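The translation server's behaviour in this example amounts to classifying the sensed signals as a gesture and then consulting a look-up table. A minimal sketch, using the stroke/squeeze/shake examples from the text; the classification thresholds and all names are invented for illustration:

```python
def classify_input(pressure_peak, motion_peak):
    """Classify sensor peaks as a gesture. Thresholds are hypothetical."""
    if motion_peak > 200:
        return "shake"
    if pressure_peak > 150:
        return "squeeze"
    if 0 < pressure_peak <= 150:
        return "stroke"
    return "none"


# Look-up table mapping gestures to (output channel, output value),
# following the examples given in the description.
OUTPUT_MAP = {
    "stroke":  ("audio", "oooh"),
    "squeeze": ("audio", "aaah"),
    "shake":   ("display", "red"),
}


def translate(pressure_peak, motion_peak):
    """Return the output the receiving terminal should produce, or None."""
    gesture = classify_input(pressure_peak, motion_peak)
    return OUTPUT_MAP.get(gesture)
```

In practice the table would also record the capabilities of each receiving terminal, so the same gesture could map to a touch output on one terminal and an audible or visual output on another.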

It should of course be understood that these are merely examples of the translation of types of input to types of output. The output may stimulate more than one sense - for example, the output may produce both an audible and a visual stimulus to the user of mobile terminal 3, or a visual output and a touch output.

The mapping of the inputs of mobile terminal 1 to the appropriate output of mobile terminal 3 may be predetermined or preset by the mobile terminal 1, the mobile terminal 3 or the network 9. The mapping may depend upon the functionality of the mobile terminal 1 and/or the mobile terminal 3 - for example, whether the terminal has the facility to detect and/or reproduce movement. The type of message that needs to be generated by the translation server 43 to cause the receiving mobile terminal 3 to produce the desired output type may depend upon the particular type of mobile terminal 3, and this information may be obtained from the look-up table. If the mobile terminal 3 has a standard operating system, this may not be necessary.

As indicated above, the mapping of particular input stimuli to output stimuli may be predetermined or pre-set. The user of mobile terminal 1 and/or the user of mobile terminal 3 may be able to select which input stimuli are mapped to which output stimuli, for example by appropriate data communication with the translation server 43. For example, the users of mobile terminals 1 and 3 may agree a particular form of communication between themselves. For example, the users may agree that the sensed squeezing of the mobile terminal 1 by pressure sensor 23 will cause heat generation by heating element 27 in mobile terminal 3. This allows the users of mobile terminals 1 and 3 to communicate in a manner that will not disturb others around them and which is only understandable to the users. Even if the heat output at the mobile terminal 3 were detected by a person other than the authorised user of mobile terminal 3, that person would not know what the generation of heat signified. If desired, this allows communications between the users of the mobile terminals 1 and 3 that have an element of secrecy.

The table below shows the entries of the look-up table stored in the translation server 43 in relation to the mobile terminal 1, indicating the appropriate output to receiving mobile terminals A, B, C and D, depending upon the input from mobile terminal 1.

[Look-up table reproduced in the original publication as images imgf000016_0001 and imgf000017_0001; the table contents are not recoverable from this text.]

The mapping of particular input stimuli to output stimuli between any pair of mobile terminals may be set by the users of those terminals, or may be preset in dependence upon the input and output facilities of those terminals.

The mapping may also be set or altered by a third party - i.e. not the mobile terminals or the network. For example, mapping data may be obtained from a website and passed to the translation server 43. Such a website may, by way of illustration, be for users interested in massage. Details of the type of mobile terminals 1 and 3 may be provided to the website. The website may then provide for downloading suitable mapping coding that allows the user of mobile terminal 1 to cause the mobile terminal 3 to produce a touch output for performing a particular type of massage. The user of the mobile terminal 1 may do this by providing as an input a particular touch input, alphanumeric character or sound. The website may download a menu for providing a user interface for the user of mobile terminal 1 that provides a convenient mechanism for the desired touch output for mobile terminal 3 to be selected.

As an alternative to, or in addition to, the translation server 43 being provided by the network core 35, the mobile terminal 1 may be provided with a data processing function that generates an appropriately formatted message that produces the desired output stimuli on the mobile terminal 3 when transmitted to that mobile terminal 3 without modification. Alternatively, the receiving mobile terminal 3 may receive encoded signals from the sensors 21 and/or 23 of the sending mobile terminal and may include a data processing function for converting the received data into an appropriate message to generate the desired output stimuli on the mobile terminal 3. These data processing functions may require a considerable amount of processing power, and at present it is preferred that they are performed by the network core 35 rather than by the mobile terminals 1 and 3.
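For the sender-side alternative, the terminal itself must format a message the receiver can act on directly. A minimal sketch follows, assuming a simple JSON layout for a time-varying intensity profile; the format is an illustrative assumption, not one defined in the text.

```python
# Hypothetical sender-side encoding of a touch input that varies with respect
# to time and/or intensity, and the matching receiver-side decoding.
import json

def encode_touch(samples, sensor="pressure"):
    """Pack a time-ordered list of intensities (0.0-1.0) into a message."""
    return json.dumps({"sensor": sensor, "samples": samples})

def decode_touch(message):
    """Recover the sensor name and intensity profile to drive an actuator."""
    data = json.loads(message)
    return data["sensor"], data["samples"]

msg = encode_touch([0.1, 0.5, 0.9, 0.4])
```

Because the receiver replays the decoded profile without further translation, this variant needs no network-side look-up table, at the cost of putting the processing burden on the terminals.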

It should be understood that an output corresponding to the input stimuli of the mobile terminal 1 may be transmitted (possibly simultaneously) to a plurality of receiving terminals. The output stimuli generated by each of the receiving terminals may be different, in dependence upon the data in the relevant part of the look-up table accessed by the translation server 43 in respect of the receiving terminal and/or upon the output facilities of the receiving mobile terminal.
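The per-receiver variation described above can be sketched as a fan-out that selects, for each receiving terminal, an output its hardware actually supports. The facility lists and preference order below are hypothetical assumptions for illustration.

```python
# Hypothetical fan-out of one input event to several receivers, each output
# chosen from that receiver's declared output facilities.
FACILITIES = {
    "terminal_A": ["vibrate", "tone"],
    "terminal_B": ["heat", "vibrate"],
    "terminal_C": ["tone"],  # no touch actuator: falls back to audio
}
PREFERENCE = ["heat", "vibrate", "tone"]  # order tried for this event

def outputs_for(receivers):
    """Pick, per receiver, the first preferred output its facilities support."""
    return {rx: next((o for o in PREFERENCE if o in FACILITIES.get(rx, [])), None)
            for rx in receivers}
```

A single squeeze at the sending terminal could thus emerge as heat on one receiver, vibration on another, and an audible tone on a third, mirroring the look-up-table behaviour described for the translation server 43.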

The touch output generated by the receiving mobile terminal may be in addition to conventional forms of visual and/or audio output. During a single communication session (such as an IMS SIP-controlled communication session), voice data, other audio data, picture data and touch data may be transmitted, providing an enriched communications experience between the users of mobile terminals 1 and 3.

Claims

1. A mobile telecommunications network (9) including a plurality of
terminals (1,3) registered with the network (9), wherein a first of said terminals
(1) includes means (21,23) for sensing the variation of a touch input with
respect to time and/or the intensity of the touch input and generating a signal
representative thereof; and including means for generating a message
representative of said signal; and means for transmitting the message to a
second of said terminals (3) for generating an output in response to the message
that is indicative of the touch input.
2. The network of claim 1, wherein the message is generated by a data
processor of the first terminal (1).
3. The network of claim 1, wherein the message is generated by a data
processor of an entity separate from the first terminal (1).
4. The network of claim 1, wherein the message is generated by a data
processor associated with the mobile telecommunications network (9).
5. The network of claim 1, wherein the output is generated by a data
processor of the second terminal (3).
6. The network of any one of claims 1 to 5, wherein the output is a
simulation or approximation of the touch input.
7. The network of any one of claims 1 to 5, wherein the output is indicative
of the touch input but is an audible and/or visible output.
8. The network of any one of claims 1 to 7, including means (43) for
mapping selected inputs to generate selected outputs to provide a
predetermined relationship therebetween.
9. The network of claim 8, wherein said mapping is determined by the first
terminal (1).
10. The network of claim 8, wherein said mapping is determined by the
second terminal (3).
11. The network of claim 8, wherein said mapping is determined by the
mobile telecommunications network (9).
12. The network of claim 8, wherein said mapping is determined by a third
party entity, separate from said terminals and the mobile telecommunications
network (9).
13. The network of any one of claims 1 to 12, wherein the sensing means
includes a motion sensor (21).
14. The network of claim 13, wherein the motion sensor (21) includes a
micro-accelerometer.
15. The network of any one of claims 1 to 14, wherein the sensing means
includes a pressure sensor (23).
16. The network of claim 15, wherein the pressure sensor (23) includes a
piezoelectric device.
17. The network of any one of claims 1 to 16, wherein the network (9)
comprises a GSM mobile telecommunications network.
18. The network of any one of claims 1 to 16, wherein the network (9)
comprises a UMTS (3G) mobile telecommunications network.
19. The network of any one of claims 1 to 18, wherein the network
comprises a GPRS mobile telecommunications network.
20. The network of any one of claims 1 to 19, wherein the message is
transmitted in a push-to-talk over cellular (PoC) communication session.
21. The network of any one of claims 1 to 20, wherein the message is
transmitted in a session initiation protocol (SIP) session.
22. The network of any one of claims 1 to 19, wherein the message is an
SMS message.
23. The network of any one of claims 1 to 19, wherein the message is an
MMS message.
24. A mobile telecommunications network (9), including means for
receiving a message from a first terminal (1) for delivery to a second terminal
(3) and for converting the message from a first type to a second type,
wherein at least the first type of message includes data enabling the
reproduction of an output by a terminal which output is detectable by the sense
of touch and varies with respect to time and/or varies in intensity in dependence
upon the content of the message.
25. The network of claim 24 wherein the second type of message includes
data enabling the production of an output by a terminal detectable by the sense
of touch.
26. The network of claim 24, wherein the second type of message includes
data enabling the production of an audible output.
27. The network of claim 24, wherein the second type of message includes
data enabling the production of a visible output by a terminal.
28. A terminal (1,3) for use with a mobile telecommunications network (9),
the terminal (1,3) including means (21,23) for sensing a variation of a touch
input with respect to time and/or the intensity of the touch input and generating
a signal representative thereof, a representation of which is for inclusion in a
message for transmission to another terminal for generating an output in
response to the message that is indicative of the touch input.
29. A terminal (1,3) for use with a mobile telecommunications network (9),
the terminal (1,3) including means for receiving a message representative of an
input signal generated by another terminal, means (25,26,27) for generating an
output detectable by the sense of touch and which varies with respect to time
and/or in intensity in dependence upon the content of the message.
30. A method of operating a mobile telecommunications network (9) having
a plurality of terminals (1,3), the method including generating at a first of said
terminals (1) a signal representative of how a touch input varies with respect to
time and/or the intensity of the touch input, generating a message representative of the signal, transmitting the message to a second of said
terminals (3), and generating an output at the said second terminal (3) in
response to the message that is indicative of the touch input.
31. The method of claim 30, including generating the message at the first
terminal (1).
32. The method of claim 30, including generating the message using a data
processor associated with the mobile telecommunications network (9).
33. The method of claim 30, 31 or 32, wherein the output is a simulation or
approximation of the touch input.
34. The method of claim 30, 31, 32 or 33, wherein the output is indicative of
the touch input but is an audible and/or visible output.
35. The method of any one of claims 30 to 34, including receiving said
signal representative of how a touch input varies with respect to time or the
intensity of the touch input, generating said message such that said output is
produced for stimulating a selected human sense.
36. The method of any one of claims 30 to 35, including transmitting the
message in a push-to-talk over cellular (PoC) communication session.
37. The method of any one of claims 30 to 35, including transmitting the
message in a session initiation protocol (SIP) communication session.
38. The method of any one of claims 30 to 35, including transmitting the
message as an SMS message.
39. The method of any one of claims 30 to 35, including transmitting the
message as an MMS message.
40. The method of any one of claims 30 to 39, wherein the mobile
telecommunications network comprises a GSM, GPRS or UMTS (3G) mobile
telecommunications network.
41. A mobile telecommunications network (9) including means for
generating a message containing information for use by the user of a terminal,
means for transmitting the message to said mobile terminal, and means for
conveying the information to the user of said mobile terminal by stimulating
the user's sense of touch.
PCT/GB2005/003046 2004-08-05 2005-08-03 Haptic input and haptic output in a communications networks WO2006013363A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
GB0417469.4 2004-08-05
GB0417469A GB2416962B (en) 2004-08-05 2004-08-05 New communication type for mobile telecommunications networks

Publications (1)

Publication Number Publication Date
WO2006013363A1 true WO2006013363A1 (en) 2006-02-09

Family

ID=32982591

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/GB2005/003046 WO2006013363A1 (en) 2004-08-05 2005-08-03 Haptic input and haptic output in a communications networks

Country Status (2)

Country Link
GB (1) GB2416962B (en)
WO (1) WO2006013363A1 (en)

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2010036050A2 (en) * 2008-09-26 2010-04-01 Lg Electronics Inc. Mobile terminal and control method thereof
WO2011062922A1 (en) * 2009-11-18 2011-05-26 Qualcomm Incorporated System and method of haptic communication at a portable computing device
WO2011064432A1 (en) * 2009-11-24 2011-06-03 Telefonica, S.A. Method for communicating physical stimuli using mobile devices
CN102187647A (en) * 2008-07-15 2011-09-14 伊梅森公司 Systems and methods for mapping message contents to virtual physical properties for vibrotactile messaging
US20130227409A1 (en) * 2011-12-07 2013-08-29 Qualcomm Incorporated Integrating sensation functionalities into social networking services and applications
EP2323351A4 (en) * 2008-09-05 2015-07-08 Sk Telecom Co Ltd Mobile communication terminal that delivers vibration information, and method thereof
EP2866425A4 (en) * 2012-06-20 2016-01-20 Tencent Tech Shenzhen Co Ltd Mobile device communication method, apparatus and communication system
US10101804B1 (en) 2017-06-21 2018-10-16 Z5X Global FZ-LLC Content interaction system and method

Families Citing this family (14)

Publication number Priority date Publication date Assignee Title
EP1936929A1 (en) * 2006-12-21 2008-06-25 Samsung Electronics Co., Ltd Haptic generation method and system for mobile phone
US20080163282A1 (en) * 2006-12-29 2008-07-03 Nokia Corporation Apparatus and system for multimedia meditation
US8315652B2 (en) 2007-05-18 2012-11-20 Immersion Corporation Haptically enabled messaging
US20090091479A1 (en) * 2007-10-04 2009-04-09 Motorola, Inc. Keypad haptic communication
US20100283726A1 (en) * 2007-11-20 2010-11-11 Nokia Corporation user interfaces and associated apparatus and methods
EP2150020A1 (en) * 2008-07-28 2010-02-03 Alcatel, Lucent Method for communicating, a related system for communicating and a related transforming part
US8004391B2 (en) * 2008-11-19 2011-08-23 Immersion Corporation Method and apparatus for generating mood-based haptic feedback
US9696803B2 (en) 2009-03-12 2017-07-04 Immersion Corporation Systems and methods for friction displays and additional haptic effects
US8711118B2 (en) 2012-02-15 2014-04-29 Immersion Corporation Interactivity model for shared feedback on mobile devices
EP2825934A4 (en) * 2012-03-15 2015-11-04 Nokia Technologies Oy A tactile apparatus
US8570296B2 (en) 2012-05-16 2013-10-29 Immersion Corporation System and method for display of multiple data channels on a single haptic display
EP2688321B1 (en) * 2012-07-18 2015-06-24 BlackBerry Limited Method and apparatus for motion based ping during chat mode
US8493354B1 (en) 2012-08-23 2013-07-23 Immersion Corporation Interactivity model for shared feedback on mobile devices
US9147329B2 (en) 2013-05-17 2015-09-29 Edward D. Bugg, JR. Sensory messaging systems and related methods

Citations (8)

Publication number Priority date Publication date Assignee Title
DE3705262A1 (en) * 1987-02-19 1988-09-01 Dikeoulias Vassilios Dipl Ing Device for exchange of movements and positions
WO1998014860A1 (en) * 1996-10-04 1998-04-09 Sense Technology B.V. I.O. System for communication of feelings
JPH1115600A (en) * 1997-04-28 1999-01-22 Matsushita Electric Ind Co Ltd Communication terminal which transmits physical quantity operating on one terminal and which can work received picture and transmission terminal/reception terminal supplied for the same
JP2000049956A (en) * 1998-08-03 2000-02-18 Sharp Corp Communication equipment and communication system
DE10022336A1 (en) * 2000-05-08 2001-11-29 Juergen Rall Connecting electronically controlled physical sexual stimulation devices to Internet involves providing stimulation devices via additional sensing arrangements, interactive data communications
US20030069470A1 (en) * 2001-10-09 2003-04-10 Ching-Chuan Lee Interactive control system of a sexual delight appliance
EP1376316A1 (en) * 2002-06-26 2004-01-02 BRITISH TELECOMMUNICATIONS public limited company Haptic communications
US20040125120A1 (en) * 2001-06-08 2004-07-01 Michael Weiner Method and apparatus for interactive transmission and reception of tactile information

Family Cites Families (5)

Publication number Priority date Publication date Assignee Title
GB2308523A (en) * 1995-12-22 1997-06-25 Northern Telecom Ltd Transferring graphical messages between mobile telephones
EP1271900A1 (en) * 2001-06-01 2003-01-02 Siemens Aktiengesellschaft Keypad system
GB0115822D0 (en) * 2001-06-28 2001-08-22 Koninkl Philips Electronics Nv Data input device
WO2003051062A2 (en) * 2001-10-30 2003-06-19 Immersion Corporation Methods and apparatus for providing haptic feedback in interacting with virtual pets
US7769417B2 (en) * 2002-12-08 2010-08-03 Immersion Corporation Method and apparatus for providing haptic feedback to off-activating area

Patent Citations (8)

Publication number Priority date Publication date Assignee Title
DE3705262A1 (en) * 1987-02-19 1988-09-01 Dikeoulias Vassilios Dipl Ing Device for exchange of movements and positions
WO1998014860A1 (en) * 1996-10-04 1998-04-09 Sense Technology B.V. I.O. System for communication of feelings
JPH1115600A (en) * 1997-04-28 1999-01-22 Matsushita Electric Ind Co Ltd Communication terminal which transmits physical quantity operating on one terminal and which can work received picture and transmission terminal/reception terminal supplied for the same
JP2000049956A (en) * 1998-08-03 2000-02-18 Sharp Corp Communication equipment and communication system
DE10022336A1 (en) * 2000-05-08 2001-11-29 Juergen Rall Connecting electronically controlled physical sexual stimulation devices to Internet involves providing stimulation devices via additional sensing arrangements, interactive data communications
US20040125120A1 (en) * 2001-06-08 2004-07-01 Michael Weiner Method and apparatus for interactive transmission and reception of tactile information
US20030069470A1 (en) * 2001-10-09 2003-04-10 Ching-Chuan Lee Interactive control system of a sexual delight appliance
EP1376316A1 (en) * 2002-06-26 2004-01-02 BRITISH TELECOMMUNICATIONS public limited company Haptic communications

Non-Patent Citations (2)

Title
PATENT ABSTRACTS OF JAPAN vol. 1999, no. 04 30 April 1999 (1999-04-30) *
PATENT ABSTRACTS OF JAPAN vol. 2000, no. 05 14 September 2000 (2000-09-14) *

Cited By (29)

Publication number Priority date Publication date Assignee Title
US9063571B2 (en) 2008-07-15 2015-06-23 Immersion Corporation Systems and methods for shifting haptic feedback function between passive and active modes
US10203756B2 (en) 2008-07-15 2019-02-12 Immersion Corporation Systems and methods for shifting haptic feedback function between passive and active modes
US10198078B2 (en) 2008-07-15 2019-02-05 Immersion Corporation Systems and methods for mapping message contents to virtual physical properties for vibrotactile messaging
US10019061B2 (en) 2008-07-15 2018-07-10 Immersion Corporation Systems and methods for haptic message transmission
CN102187647A (en) * 2008-07-15 2011-09-14 伊梅森公司 Systems and methods for mapping message contents to virtual physical properties for vibrotactile messaging
US9785238B2 (en) 2008-07-15 2017-10-10 Immersion Corporation Systems and methods for transmitting haptic messages
CN104111726B (en) * 2008-07-15 2017-05-24 意美森公司 A method and a system output a haptic effect
US8462125B2 (en) 2008-07-15 2013-06-11 Immersion Corporation Systems and methods for shifting haptic feedback function between passive and active modes
US9612662B2 (en) 2008-07-15 2017-04-04 Immersion Corporation Systems and methods for shifting haptic feedback function between passive and active modes
US8587417B2 (en) 2008-07-15 2013-11-19 Immersion Corporation Systems and methods for mapping message contents to virtual physical properties for vibrotactile messaging
US8638301B2 (en) 2008-07-15 2014-01-28 Immersion Corporation Systems and methods for transmitting haptic messages
US9134803B2 (en) 2008-07-15 2015-09-15 Immersion Corporation Systems and methods for mapping message contents to virtual physical properties for vibrotactile messaging
US8866602B2 (en) 2008-07-15 2014-10-21 Immersion Corporation Systems and methods for mapping message contents to virtual physical properties for vibrotactile messaging
CN104111726A (en) * 2008-07-15 2014-10-22 意美森公司 Systems And Methods For Physics-based Tactile Messaging
US8976112B2 (en) 2008-07-15 2015-03-10 Immersion Corporation Systems and methods for transmitting haptic messages
US10248203B2 (en) 2008-07-15 2019-04-02 Immersion Corporation Systems and methods for physics-based tactile messaging
EP2323351A4 (en) * 2008-09-05 2015-07-08 Sk Telecom Co Ltd Mobile communication terminal that delivers vibration information, and method thereof
US9024870B2 (en) 2008-09-26 2015-05-05 Lg Electronics Inc. Mobile terminal and control method thereof
US8780054B2 (en) 2008-09-26 2014-07-15 Lg Electronics Inc. Mobile terminal and control method thereof
WO2010036050A3 (en) * 2008-09-26 2010-06-17 Lg Electronics Inc. Mobile terminal and control method thereof
WO2010036050A2 (en) * 2008-09-26 2010-04-01 Lg Electronics Inc. Mobile terminal and control method thereof
US9621706B2 (en) 2009-11-18 2017-04-11 Qualcomm Incorporated System and method of haptic communication at a portable computing device
JP2013511897A (en) * 2009-11-18 2013-04-04 クゥアルコム・インコーポレイテッドQualcomm Incorporated Haptic communication system and method in a mobile computing device
CN102668529A (en) * 2009-11-18 2012-09-12 高通股份有限公司 System and method of haptic communication at a portable computing device
WO2011062922A1 (en) * 2009-11-18 2011-05-26 Qualcomm Incorporated System and method of haptic communication at a portable computing device
WO2011064432A1 (en) * 2009-11-24 2011-06-03 Telefonica, S.A. Method for communicating physical stimuli using mobile devices
US20130227409A1 (en) * 2011-12-07 2013-08-29 Qualcomm Incorporated Integrating sensation functionalities into social networking services and applications
EP2866425A4 (en) * 2012-06-20 2016-01-20 Tencent Tech Shenzhen Co Ltd Mobile device communication method, apparatus and communication system
US10101804B1 (en) 2017-06-21 2018-10-16 Z5X Global FZ-LLC Content interaction system and method

Also Published As

Publication number Publication date
GB2416962B (en) 2009-04-01
GB2416962A (en) 2006-02-08
GB0417469D0 (en) 2004-09-08

Similar Documents

Publication Publication Date Title
CN1662920B (en) System, apparatus, and method for effecting network connections via wireless devices using radio frequency identification
KR100751184B1 (en) Method for changing graphical data like avatars by mobile telecommunications terminals
CN1524387B (en) Improvements in message display
US7580678B2 (en) System, apparatus, and method for effecting network connections via wireless devices using radio frequency identification
KR100762629B1 (en) Method for processing back-up service of mobile terminal
CN100334902C (en) Apparatus and method for displaying an image of a speaker in a push-to-talk communication service in a push-to-talk portable terminal
KR100878900B1 (en) Method and arrangement for indicating a size restriction of a message
US20040044774A1 (en) System for providing content sharing and method therefor
KR100660424B1 (en) Unbroken primary connection switching between communications services
US20020021696A1 (en) Method and apparatus for exchange of information in a communication network
US20050149618A1 (en) System and method of transmitting electronic files over to a mobile phone
US7738861B2 (en) Caller identification using push-to-talk protocol for wireless communications devices
KR100584369B1 (en) Method for providing status information of mobile communication terminal in mobile communication system and the mobile communication terminal
EP1643736B1 (en) Apparatus and method for displaying information of calling partner during call waiting in portable wireless terminal
CN1813468B (en) Group call in a communications system
WO2003041379A1 (en) Method and communication network for routing a real-time communication message based on a subscriber profile
US8548532B1 (en) Head unit to handset interface and integration
US7610055B2 (en) Synchronizing information across telecommunications terminals for multiple users
JP2005507625A (en) The method of transferring personalized items between communication terminals
KR20030081430A (en) Multimedia messaging method and system
EP2119201A1 (en) Device and method for providing and displaying animated sms messages
CN1802826B (en) Method for transmitting messages in an MMS-based communications system
KR20120048704A (en) Methods and apparatus for communicating by vibrating or moving mobile devices
CN1954588A (en) Portable electronic devices and method for usage of ring tones customized by the calling part
CN100397918C (en) Methods and apparatus for delivering a message to two or more associated wireless communication devices

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A1

Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BW BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE EG ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KM KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NA NG NI NO NZ OM PG PH PL PT RO RU SC SD SE SG SK SL SM SY TJ TM TN TR TT TZ UA UG US UZ VC VN YU ZA ZM ZW

AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): BW GH GM KE LS MW MZ NA SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IS IT LT LU LV MC NL PL PT RO SE SI SK TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG

NENP Non-entry into the national phase in:

Ref country code: DE

121 Ep: the epo has been informed by wipo that ep was designated in this application
122 Ep: pct application non-entry in european phase