GB2416962A - Haptic communication in mobile telecommunications networks - Google Patents
- Publication number
- GB2416962A (Application GB0417469A)
- Authority
- GB
- United Kingdom
- Prior art keywords
- message
- network
- terminal
- output
- mobile telecommunications
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/016—Input arrangements with force or tactile feedback as computer generated output to the user
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04M—TELEPHONIC COMMUNICATION
- H04M1/00—Substation equipment, e.g. for use by subscribers
- H04M1/02—Constructional features of telephone sets
- H04M1/21—Combinations with auxiliary equipment, e.g. with clocks or memoranda pads
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04M—TELEPHONIC COMMUNICATION
- H04M1/00—Substation equipment, e.g. for use by subscribers
- H04M1/72—Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
- H04M1/724—User interfaces specially adapted for cordless or mobile telephones
- H04M1/72403—User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality
- H04M1/7243—User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality with interactive means for internal management of messages
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04M—TELEPHONIC COMMUNICATION
- H04M1/00—Substation equipment, e.g. for use by subscribers
- H04M1/72—Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
- H04M1/725—Cordless telephones
-
- H04Q7/32—
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04M—TELEPHONIC COMMUNICATION
- H04M2250/00—Details of telephonic subscriber devices
- H04M2250/12—Details of telephonic subscriber devices including a sensor for measuring a physical value, e.g. temperature or motion
Abstract
A mobile telecommunications network may be provided with mobile terminals including motion sensors 21 and pressure sensors 23. Movement and/or pressure applied to a mobile terminal can be sensed by these sensors 21,23 and converted into an electrical signal. This electrical signal is encoded and transmitted to the network core. The network core generates a message to a recipient mobile terminal having a movement enabler 25 and/or pressure enabler 26. When the message is received by the receiving mobile terminal, the movement and/or pressure enabler generate an approximation of the touch input generated at the sending mobile terminal. The users of mobile terminals are therefore able to communicate with each other by the sense of touch.
Description
NEW COMMUNICATION TYPE FOR MOBILE
TELECOMMUNICATIONS NETWORKS
The present invention relates to a mobile telecommunications network, to terminals for use with a mobile telecommunications network and to a method of operating a mobile telecommunications network, and in particular to an arrangement in which communications can be enhanced by using the sense of touch.
In addition to voice calls between mobile terminals registered with a mobile telecommunications network, it is possible to send alphanumeric data in the form of an SMS or text message. Recent developments allow the transmission of multimedia messages (MMS) such as drawings, photographs, music and video clips. The present invention seeks to provide an additional type of communication using a mobile telecommunications network.
According to a first aspect of the present invention, there is provided a mobile telecommunications network including a plurality of terminals registered with the network, wherein a first of said terminals includes means for sensing the variation of a touch input with respect to time and generating a signal representative thereof; and including means for generating a message representative of said signal; and means for transmitting the message to a second of said terminals for generating an output in response to the message that is indicative of the touch input.
The message may be generated by the first terminal, the mobile telecommunications network or a separate entity.
The output may be a simulation or approximation of the input - it may stimulate the touch sense of the user of the second terminal to convey the touch input by the user of the first terminal, allowing the users to communicate by the sense of touch.
Alternatively, the output may be indicative of the touch input but may stimulate a different sense of the user of the second terminal - such as sight or hearing.
This arrangement may be used when the second terminal is not capable of generating a touch output.
Particular touch inputs generate particular outputs (whether stimulating touch, sight or hearing). Particular inputs may be mapped to particular outputs. For example, shaking the first terminal may generate a particular audible message at the second terminal. This mapping may be pre-set and/or may be set by the users of the terminals or by a third party.
According to a second aspect of the present invention, there is provided a mobile telecommunications network, including means for receiving a message from a first terminal for delivery to a second terminal and for converting the message from a first type to a second type, wherein at least the first type of message includes data enabling the reproduction of an output by a terminal which output is detectable by the sense of touch and varies with respect to time in dependence upon the content of the message.
The data in the first type of message enabling the reproduction of an output by a terminal may be data input from a terminal (which will subsequently be reproduced in some form) or data for generating an output from a terminal.
According to a third aspect of the present invention, there is provided a terminal for use with a mobile telecommunications network, the terminal including means for sensing a variation of a touch input with respect to time and generating a signal representative thereof, a representation of which is for inclusion in a message for transmission to another terminal for generating an output in response to the message that is indicative of the touch input.
According to a fourth aspect of the present invention, there is provided a terminal for use with a mobile telecommunications network, the terminal including means for receiving a message representative of an input signal generated by another terminal, means for generating an output detectable by the sense of touch and which varies with respect to time in dependence upon the content of the message.
According to a fifth aspect of the present invention, there is provided a method of operating a mobile telecommunications network having a plurality of terminals, the method including generating at a first of said terminals a signal representative of how a touch input varies with respect to time, generating a message representative of the signal, transmitting the message to a second of said terminals, and generating an output at the said second terminal in response to the message that is indicative of the touch input.
According to a sixth aspect of the present invention, there is provided a mobile telecommunications network including means for generating a message containing information for use by the user of a terminal, means for transmitting the message to said mobile terminal, and means for conveying the information to the user of said mobile terminal by stimulating the user's sense of touch.
Humans have five senses: sight, hearing, touch, smell and taste.
Known mobile terminals can detect visual and audible stimuli, convert these into a message and transmit this to another terminal via a mobile telecommunications network. For example, a user of a first mobile terminal may record a video clip (comprising sound and moving pictures) of themselves
and transmit this as a message to a second mobile terminal, where it is reproduced, stimulating the senses of sight and hearing of the recipient. The present invention provides an enhancement to communications by allowing a touch input to be sensed by a mobile terminal and/or a touch output to be reproduced on a mobile terminal.
In the specification the term "touch" means anything that is detectable by the human sense of touch, and includes such stimuli as heating and vibration.
Some mobile terminals have the facility to vibrate in order to alert the user of an incoming call or message. However, the vibration is simply triggered by a mobile terminal on receipt of an incoming call or message. The vibration does not convey the content of the message.
Conventional mobile telephones are capable of receiving a touch input in the sense that they have buttons and other controls which are depressed. However, the nature of the touch input is not recorded and conveyed as part of a message.
The depression of the key may result in data being recorded in a message (for example a letter in an SMS message), but the nature of the touching of the key (or how it varies with time) is not recorded or conveyed.
According to one aspect of the present invention the variation of a touch input with respect to time is sensed. The intensity of the touch input may be sensed.
The term mobile telecommunications "network" used in this specification does not necessarily refer to a single network operated by a particular (legal) entity.
The network might comprise a plurality of separately operated networks, or a part of one of such networks.
For a better understanding of the present invention, an embodiment will now be described by way of example, with reference to the accompanying drawings, in which:

Figure 1 shows schematically principal elements of a mobile telecommunications network; and

Figure 2 shows schematically additional components provided to a mobile telecommunications terminal in accordance with an embodiment of the invention.
As shown in Figure 1, mobile terminals 1,3,5 and 7 are registered with a GSM or UMTS (3G) mobile or cellular telecommunications network 9. The mobile terminals may be hand held mobile telephones, personal digital assistants (PDAs) or laptop computers equipped with a datacard (or, of course, any combination of these). The mobile terminals communicate wirelessly with the mobile telecommunications network 9 via a radio access network comprising base transceiver stations (BTSs) and base station controllers (BSCs).
Communications between the mobile terminals and the network 9 are routed from the radio access network via mobile switching centres (MSCs), which may be connected by a fixed (cable) link to the network 9. Of course, in practice, a typical mobile telecommunications network 9 will have many thousands of subscribers, each with one or more terminals.
Each of the mobile terminals 1,3,5,7 is provided with a respective subscriber identity module (SIM). During the manufacturing process of each SIM, authentication information is stored thereon under control of the network 9.
The network 9 itself stores details of each of the SIMs issued under its control.
In operation of the network 9, a mobile terminal is authenticated (for example, when the user activates the terminal in the network with a view to making and receiving calls) by the network sending a challenge to the mobile terminal. The received challenge is passed to the SIM associated with the mobile terminal and the SIM calculates a reply (dependent on predetermined information held on the SIM - typically an authentication algorithm and a unique key Ki) and transmits it back to the network 9. The network receives the reply from the mobile terminal. Using information pre-stored concerning the content of the relevant SIM and the nature of the challenge sent to the mobile terminal, the network calculates the expected value of the reply from the mobile terminal. If the reply received matches the expected calculated reply, the SIM and the associated mobile terminal are considered to be authenticated.
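The challenge-response exchange described above can be sketched as follows. Real GSM SIMs use the operator's A3/A8 algorithms with the key Ki; the HMAC below is a stand-in chosen purely for illustration, so the function names and key sizes are assumptions rather than the actual GSM procedure.

```python
import hashlib
import hmac
import os

# Illustrative sketch only: HMAC-SHA256 stands in for the SIM's
# operator-specific authentication algorithm (A3/A8 in GSM).

def sim_response(ki: bytes, challenge: bytes) -> bytes:
    """SIM side: derive a reply from the secret key Ki and the challenge."""
    return hmac.new(ki, challenge, hashlib.sha256).digest()

def network_authenticates(ki_on_record: bytes, challenge: bytes,
                          reply: bytes) -> bool:
    """Network side: recompute the expected reply from the stored copy
    of Ki and compare it with the reply actually received."""
    expected = sim_response(ki_on_record, challenge)
    return hmac.compare_digest(expected, reply)

ki = os.urandom(16)         # unique key Ki shared by the SIM and the network
challenge = os.urandom(16)  # random challenge sent to the terminal
assert network_authenticates(ki, challenge, sim_response(ki, challenge))
```

Because only the expected reply is compared, Ki itself never travels over the air; that property is what the sketch is intended to show.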
A mobile terminal in accordance with an embodiment of the invention is shown in Figure 2. In a conventional manner the mobile terminal includes a visual display 11, various buttons and keys 13 and an antenna 15 for communication with the radio access network of the mobile telecommunications network 9.
The mobile terminal also includes a microphone 17 for sensing audio input such as speech and a loudspeaker 19 for generating an audio output. The visual display 11 may provide a user interface such as a graphical user interface to facilitate access to functions provided by the mobile terminal and may also display messages received by the mobile terminal such as alphanumeric text and pictures (moving or still). The mobile terminal also of course includes data processing circuitry (not shown) for allowing the user of the mobile terminal to control the terminal and for allowing wireless communications with the mobile telecommunications network 9. The features of the mobile terminal described thus far are conventional.
According to the embodiment the mobile terminal is provided with a motion sensor 21, such as a micro-accelerometer for detecting movement of the mobile terminal. A pressure sensor 23, such as a piezoelectric strip, is also provided for detecting, for example, squeezing, hitting and/or stroking of the mobile terminal. The sensors 21 and 23 allow the terminal to sense movement and pressure - for example caused by the user of the mobile terminal shaking the terminal or squeezing the terminal. The mobile terminal may be provided with an elastically deformable outer casing and may be configured to allow the outer casing to be resiliently compressed (for example, by squeezing) such that this is sensed by the pressure sensor 23.
Other sensors may also be provided that allow the mobile terminal to sense touch inputs (that is, inputs that would stimulate the human sense of touch).
For example, the mobile terminal may also be provided with a temperature sensor.
The sensors 21 and 23 generate electrical signals in response to a touch input.
These are detected by processing circuitry of the mobile terminal and may be stored or converted into a format suitable for transmission in the message to the mobile telecommunications network 9 and onwardly to another mobile terminal. The generation and transmission of such messages will be discussed further below.
In addition to providing sensors for receiving and recording a touch input, the mobile terminal also includes devices for generating a touch output (that is, devices that provide a stimulus to the human sense of touch). Motion output is created using a movement enabler 25 such as an electromagnetic device. Heat is generated using a heating element 27. Additionally, a pressure generation mechanism 26 may be provided for flexing (expanding/contracting) the resiliently deformable case of the mobile terminal. These output types all stimulate the human sense of touch.
If the user of mobile terminal 1 wishes to communicate with the user of mobile terminal 3 using the mobile telecommunications network 9, data relating to the communication is routed wirelessly between mobile terminal 1 and the local BTS 29. From there the communication data is transmitted to the BSC 31 and to MSC 33 via a fixed or cable link. The mobile telecommunications network core 35 then routes the communication to an appropriate MSC 37 with which the mobile terminal 3 is registered. The communication data is transmitted from MSC 37 to the appropriate BSC 39, and from there to BTS 41. The communication data is transmitted from the BTS 41 wirelessly to the mobile terminal 3.
The communication data may be data representative of the users' voices in a conventional circuit switched voice call. The communication data may also be transmitted during a (packet switched) communication session between the mobile terminal 1 and the mobile terminal 3.
To efficiently facilitate such communication sessions, the third generation partnership project (3GPP) has recently defined a new concept known as IMS (IP-based multimedia subsystem). The aim of IMS is to allow providers such as mobile telephone network operators to provide services to their subscribers as efficiently and effectively as possible. For example, the IMS architecture is likely to support the following communication types: voice, video, instant messaging, "presence" (a user's availability for contact), location-based services, email and web. Further communication types are likely to be added in the future. This diverse collection of communication types requires efficient communication session management due to the number of different applications and services that will be developed to support them. The 3GPP has chosen session initiation protocol (SIP) for managing these sessions.
SIP is a session-based protocol designed to establish IP-based communication sessions between two or more points or users. Once the SIP session has been established, communication between these end points or users can be carried out using a variety of different protocols (for example, those designed for streaming audio and video). These protocols are defined in the SIP session initiation messages.
With IMS, users are no longer restricted to a separate voice call or data session.
Sessions can be established between mobile terminals that allow a variety of communication types to be used and media to be exchanged. The sessions are dynamic in nature in that they can be adapted to meet the needs of the end users. For example, two users might start a session with an exchange of instant messages and then decide that they wish to change to a voice call, possibly with video. This is all possible within the IMS framework. If a user wishes to send a file to another user and the users already have a session established between each other (for example, a voice session), the session can be redefined to allow data file exchange to take place. This session redefinition is transparent to the end user.
One application of IMS is push-to-talk over cellular (PoC). PoC allows a communication session to be established between a group of devices such that the user of one of the devices can speak and the users of the or each of the other devices will hear that person speak. During such a communication session each device functions like a two-way radio or walkie-talkie in a one-to-one or one-to-many group mode. Full duplex speech communication between the users of the respective devices during the PoC part of the communication is not possible - only one user can speak at a time.
One feature of PoC is that, when the communication is established, there is an "always on" communication between the terminals. When a user wishes to talk to the or each of the other terminals associated with the communication session, the user issues an appropriate instruction to their device (typically using a soft key - that is, a key whose function is programmable), and the user's speech is captured by their terminal instantly, or within a relatively short period of time, is transmitted to the or each of the other terminals and is reproduced on those terminals. There is no requirement for the user inputting the speech data to dial the or each other device, and nor is there any requirement for the users of the devices receiving the speech to take any action to receive the speech data - it is automatically reproduced by the device when it is received (assuming, of course, the device is operating in an appropriate mode for allowing PoC communication).
PoC is described in the document "Push-to-talk over Cellular (PoC) architecture, draft version 1.0 - 13th February 2004" available from the Open Mobile Alliance Limited (OMA).
In addition to establishing a PoC communication session using IMS, a PoC communication session could be established over existing GSM/GPRS networks by the exchange of data packets but without IMS.
In the embodiment of the present invention, the touch input of mobile terminal 1 is sensed by sensors 21 and/or 23 and a signal representative thereof is generated by the mobile terminal 1. In accordance with a feature of the present embodiment, this touch input is used as a means of communicating with the user of mobile terminal 3. The signal derived from the sensors 21 and/or 23 is encoded and transmitted to the network core 35 via BTS 29, BSC 31 and MSC 33. In the network core 35 a translation server 43 is provided which receives the encoded signal and generates a suitable output message representative of the encoded signal. The output message is transmitted to the mobile terminal 3 via MSC 37, BSC 39 and BTS 41.
The data communicated between the mobile terminal 1 and the mobile terminal 3 may be transmitted by any suitable means. The data may be transmitted in the circuit switched domain, but is preferably transmitted in the packet switched domain. The data may be transmitted as GPRS data, as an SMS message or an MMS message. The data may also be transmitted during an IMS communication session (controlled by SIP) between the mobile terminals 1 and 3 (and possibly other mobile terminals). The data may be transmitted as part of a PoC communication session, so that when the message is received by the mobile terminal 3 an appropriate output is generated (preferably immediately) without requiring any action of the user of the mobile terminal 3. In this regard, although PoC relates to push-to-"talk", in fact any type of data can be communicated during such a session.
In a first example, it will be assumed that both mobile terminal 1 and mobile terminal 3 include the features of the mobile terminal shown in Figure 2. In mobile terminal 1 the electrical signals generated by the sensors in response to pressure, movement or heating of the terminal are recorded and stored by the mobile terminal 1. How the signals received from the sensors vary with respect to time is recorded. For example, the variation in pressure applied to pressure sensor 23 may be recorded for a predetermined period of, for example, one, two, or five seconds (or of any other duration). Simultaneously, or during a different time period, the signals generated by the movement sensor 21 may be recorded and stored.
The data processor of the mobile terminal 1 then encodes the signals in a suitable format such that their data content can be extracted by a receiving device. Typically, the data will be encoded as binary data. Suitable methods for encoding such data will be known to those skilled in the art and will not be described further here.
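By way of illustration only, the sampled sensor signals might be packed into a simple binary format such as the following. The header layout, sample rate and field widths are assumptions, since the description deliberately leaves the encoding open.

```python
import struct

# Hypothetical binary format for touch-input samples: a small header
# (sample rate and sample count, both uint16, little-endian) followed
# by interleaved (pressure, motion) pairs as 32-bit floats.

SAMPLE_RATE_HZ = 50  # assumed sampling rate, not taken from the patent

def encode_touch(pressure: list, motion: list) -> bytes:
    """Pack equal-length pressure and motion sample streams into bytes."""
    assert len(pressure) == len(motion)
    payload = struct.pack("<HH", SAMPLE_RATE_HZ, len(pressure))
    for p, m in zip(pressure, motion):
        payload += struct.pack("<ff", p, m)
    return payload

def decode_touch(data: bytes):
    """Recover the sample rate and the two sample streams."""
    rate, count = struct.unpack_from("<HH", data, 0)
    pairs = list(struct.iter_unpack("<ff", data[4:]))
    pressure = [p for p, _ in pairs]
    motion = [m for _, m in pairs]
    return rate, pressure, motion
```

A receiving device (or the translation server 43) would call `decode_touch` on the message payload to recover the time-varying signals for reproduction.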
A communication session (for example, an IMS (SIP-controlled) communication session) between mobile terminal 1 and mobile terminal 3 is then initiated in a known manner if not already established. The encoded data is transmitted to the network core 35, where it is passed to translation server 43. In this example the translation server 43 need not take any action because the mobile terminals 1 and 3 are the same type of mobile terminal. The translation server 43 then issues a message comprising the encoded data to the mobile terminal 3. On receipt of the message the mobile terminal 3 decodes the message and applies appropriate electrical signals to movement enabler 25, pressure generation mechanism 26 and any other touch output devices provided. The touch output or outputs generated by the mobile terminal 3 will, if the user is in contact with mobile terminal 3, stimulate the touch sense of the user. The touch output is an approximation of the touch input to mobile terminal 1. For example, the squeezing of mobile terminal 1 will temporarily change the shape of its casing.
The touch output of the mobile terminal 3 will cause a corresponding temporary change in the shape of the terminal 3. The users of mobile terminals 1 and 3 can therefore communicate with each other using the sense of touch.
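The receiving terminal's playback step might look like the following minimal sketch. The enabler interface is invented for illustration; the description specifies only that the decoded signals drive the movement enabler 25 and pressure generation mechanism 26.

```python
import time

class PressureEnabler:
    """Hypothetical stand-in for the case-flexing mechanism (26)."""
    def __init__(self):
        self.history = []

    def flex(self, level: float):
        # In a real terminal this would actuate the deformable casing;
        # here we simply record the commanded flex level.
        self.history.append(level)

def play_back(samples: list, rate_hz: int, enabler: PressureEnabler,
              realtime: bool = False):
    """Replay recorded pressure samples at the original sample rate,
    approximating the touch input applied to the sending terminal."""
    interval = 1.0 / rate_hz
    for level in samples:
        enabler.flex(level)
        if realtime:
            time.sleep(interval)

enabler = PressureEnabler()
play_back([0.2, 0.8, 0.4], rate_hz=50, enabler=enabler)
assert enabler.history == [0.2, 0.8, 0.4]
```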
If the user of mobile terminal 3 does not have a mobile terminal of the type illustrated in Figure 2, or at least has a mobile terminal of a different type to the mobile terminal 1, the translation server 43 is operative to process the encoded data received from the mobile terminal 1 to convert it into a suitable message for reproduction by the mobile terminal 3. For example, the mobile terminal 3 may not have a movement enabler 25. In such an instance, the translation server may generate the message such that the output produced by the mobile terminal 3 is an audible and/or visible output. That is, data encoded from the motion sensor 21 and/or pressure sensor 23 is detected by the translation server 43 and is converted into an encoded signal for generating a different type of output from the sensed input.
For example, the encoded signals representing the touch input sensed by the motion sensor 21 and the pressure sensor 23 may be analysed by the translation server 43, where it is determined that these signals are indicative of the user of mobile terminal 1 stroking the mobile terminal. The translation server 43 then accesses a look-up table to determine to which type of output this type of input should be mapped. In this example, it is indicated that the output should be mapped to a simulated voice output generating an "oooh" sound. Similarly, if the translation server determines that the signals from the movement sensor 21 and pressure sensor 23 are indicative of the mobile terminal 1 being squeezed by the user, the look-up table may indicate that the appropriate output is an audio voice simulation output generating the sound "aaah". Further, if it is determined from the sensors 21,23 that the mobile terminal has been shaken, the look-up table may indicate that the appropriate output is a signal to cause the display 11 of the mobile terminal 3 to have a red colour. Furthermore, if the signals derived from the sensors 21,23 indicate that the mobile terminal has been dropped, the appropriate output message may be a signal to cause the display to flash between red and blue colours.
It should of course be understood that these are merely examples of the translation of types of input to types of output. The output may stimulate more than one sense - for example, the output may produce both an audible and a visual stimulus to the user of mobile terminal 3, or a visual output and a touch output.
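The translation server's classification and look-up step might be sketched as follows. The gesture thresholds, names and output descriptors are illustrative assumptions, as the description leaves the classification method and table format open.

```python
# Hypothetical look-up table mapping (gesture, output capability of the
# receiving terminal) to an output descriptor, per the examples above.
MAPPING = {
    ("stroke",  "haptic"): {"type": "touch", "pattern": "stroke"},
    ("stroke",  "audio"):  {"type": "audio", "clip": "oooh"},
    ("squeeze", "audio"):  {"type": "audio", "clip": "aaah"},
    ("shake",   "visual"): {"type": "display", "colour": "red"},
    ("drop",    "visual"): {"type": "display", "flash": ["red", "blue"]},
}

def classify(pressure_peak: float, motion_peak: float) -> str:
    """Crude threshold-based gesture classifier (illustrative only).

    A drop shows high motion with almost no grip pressure; a shake shows
    high motion while the terminal is held; a squeeze shows high
    pressure with little motion; anything else is treated as a stroke.
    """
    if motion_peak > 8.0:
        return "drop" if pressure_peak < 0.1 else "shake"
    if pressure_peak > 5.0:
        return "squeeze"
    return "stroke"

def translate(pressure_peak: float, motion_peak: float, capability: str):
    """Translation-server step: classify the input, then look up the
    output appropriate to the receiving terminal's capability."""
    gesture = classify(pressure_peak, motion_peak)
    return MAPPING.get((gesture, capability))
```

For instance, a gentle stroke delivered to an audio-only terminal resolves to the "oooh" clip, while the same input sent to a haptic-capable terminal would be reproduced as a touch pattern.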
The mapping of the inputs of mobile terminal 1 to the appropriate output of mobile terminal 3 may be predetermined or preset by the mobile terminal 1, the mobile terminal 3 or the network 9. The mapping may depend upon the functionality of the mobile terminal 1 and/or the mobile terminal 3 - for example, whether the terminal has the facility to detect and/or reproduce movement. The type of message that needs to be generated by the translation server 43 to cause the receiving mobile terminal 3 to produce the desired output type may depend upon the particular type of mobile terminal 3, and this information may be obtained from the look-up table. If the mobile terminal 3 has a standard operating system, this may not be necessary.
As indicated above, the mapping of particular input stimuli to output stimuli may be predetermined or pre-set. The user of mobile terminal 1 and/or the user of mobile terminal 3 may be able to select which input stimuli are mapped to which output stimuli, for example by appropriate data communication with the translation server 43. For example, the users of mobile terminals 1 and 3 may agree a particular form of communication between themselves. The users may agree, for instance, that the sensed squeezing of the mobile terminal 1 by pressure sensor 23 will cause heat generation by heating element 27 in mobile terminal 3. This allows the users of mobile terminals 1 and 3 to communicate in a manner that will not disturb others around them and which is only understandable to the users. Even if the heat output at the mobile terminal 3 were detected by a person other than the authorised user of mobile terminal 3, that person would not know what the generation of heat signified. If desired, this allows communications between the users of the mobile terminals 1 and 3 that have an element of secrecy.
The table below shows the entries of the look-up table stored in the translation server 43 in relation to the mobile terminal 1, indicating the appropriate output to receiving mobile terminals A, B, C and D, depending upon the input from mobile terminal 1.
Mobile terminal 1 input | Output of Mobile terminal A | Output of Mobile terminal B | Output of Mobile terminal C | Output of Mobile terminal D
---|---|---|---|---
Vertical stroke | Simulation of vertical stroke | Audio: "oooh" | Vibration | MMS image: stroke
Squeeze | Simulation of squeeze | Audio: "aaah" | Heat generation | MMS image: squeeze
Shake | Simulation of shake | Display colour to red | Audio: "John is fine" | MMS image: shake
Drop | Simulation of drop | Display flashes red/blue | Audio: "John is cross" | MMS image: drop

The mapping of particular input stimuli to output stimuli between any pair of mobile terminals may be set by the users of those terminals, or may be preset in dependence upon the input and output facilities of those terminals.
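The look-up table above can be expressed directly as a nested dictionary, mapping each sensed input gesture to the output action for each receiving terminal. The entries mirror the table in the text; only the data-structure choice is an illustrative assumption.

```python
# The translation server's look-up table as a nested dict:
# input gesture -> receiving terminal -> output action.
TRANSLATION_TABLE = {
    "vertical_stroke": {"A": "simulate vertical stroke", "B": 'audio: "oooh"',
                        "C": "vibration", "D": "MMS image: stroke"},
    "squeeze":         {"A": "simulate squeeze", "B": 'audio: "aaah"',
                        "C": "heat generation", "D": "MMS image: squeeze"},
    "shake":           {"A": "simulate shake", "B": "display colour to red",
                        "C": 'audio: "John is fine"', "D": "MMS image: shake"},
    "drop":            {"A": "simulate drop", "B": "display flashes red/blue",
                        "C": 'audio: "John is cross"', "D": "MMS image: drop"},
}

def translate(gesture: str, terminal: str) -> str:
    """Map a sensed input gesture to the output for a given receiving terminal."""
    return TRANSLATION_TABLE[gesture][terminal]

print(translate("squeeze", "C"))  # heat generation
```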
The mapping may also be set or altered by a third party - i.e. not the mobile terminals or the network. For example, mapping data may be obtained from a website and passed to the translation server 43. Such a website may, by way of illustration, be for users interested in massage. Details of the type of mobile terminals 1 and 3 may be provided to the website. The website may then provide for downloading suitable mapping code that allows the user of mobile terminal 1 to cause the mobile terminal 3 to produce a touch output for performing a particular type of massage. The user of the mobile terminal 1 may do this by providing as an input a particular touch input, alphanumeric character or sound. The website may download a menu providing a user interface for the user of mobile terminal 1 that gives a convenient mechanism for selecting the desired touch output for mobile terminal 3.
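A third-party mapping download of the kind described might be handled as below. The JSON schema, field names and rule values are all hypothetical; the patent does not specify a payload format.

```python
import json

# Hypothetical mapping payload a third-party website might supply to the
# translation server. The schema is an assumption for illustration.
payload = json.dumps({
    "pair": {"sender": "terminal-1", "receiver": "terminal-3"},
    "mappings": [
        {"input": "squeeze", "output": "heat"},
        {"input": "key:M", "output": "massage-pattern-2"},
    ],
})

def install_mappings(raw: str) -> dict:
    """Parse downloaded mapping data into input -> output rules that a
    translation server could store against the terminal pair."""
    data = json.loads(raw)
    return {m["input"]: m["output"] for m in data["mappings"]}

rules = install_mappings(payload)
print(rules["squeeze"])  # heat
```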
As an alternative to, or in addition to, the translation server 43 being provided by the network core 35, the mobile terminal 1 may be provided with a data processing function that generates an appropriately formatted message that produces the desired output stimuli on the mobile terminal 3 when transmitted to that mobile terminal 3 without modification. Alternatively, the receiving mobile terminal 3 may receive encoded signals from the sensors 21 and/or 23 of the sending mobile terminal and may include a data processing function for converting those received data to an appropriate message to generate the desired output stimuli on the mobile terminal 3. These data processing functions may require a considerable amount of processing power, and at present it is preferred that they are performed by the network core 35, rather than by the mobile terminals 1 and 3.
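The sender-side alternative, in which the terminal itself encodes its sensor readings into a message, could look roughly like this. The field names and sample format are assumptions for illustration, not an encoding defined in the patent.

```python
import json
import time

def encode_touch_message(samples, sensor="pressure"):
    """Package time-stamped sensor samples as a compact JSON message.
    Timestamps are stored as offsets from the first sample to keep it small."""
    t0 = samples[0][0]
    return json.dumps({
        "sensor": sensor,
        "t0": t0,
        "events": [(round(t - t0, 3), v) for t, v in samples],
    })

def decode_touch_message(raw):
    """Recover (timestamp, value) pairs for driving an output actuator."""
    msg = json.loads(raw)
    return [(msg["t0"] + dt, v) for dt, v in msg["events"]]

now = time.time()
samples = [(now, 0.1), (now + 0.05, 0.8), (now + 0.10, 0.3)]
decoded = decode_touch_message(encode_touch_message(samples))
print(len(decoded))  # 3
```

The round trip preserves the time variation of the touch input, which is what the receiving terminal needs to reproduce it.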
It should be understood that an output corresponding to the input stimuli of the mobile terminal 1 may be transmitted (possibly simultaneously) to a plurality of receiving terminals. The output stimuli generated by each of the receiving terminals may be different, in dependence upon the data in the relevant part of the look-up table accessed by the translation server 43 in respect of the receiving terminal and/or upon the output facilities of the receiving mobile terminal.
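Fan-out to several receiving terminals, with a per-recipient output choice, might be sketched as follows. The recipient names, capability sets and preference order are illustrative assumptions.

```python
# Hypothetical per-recipient output capabilities, as the translation server
# might hold them in its look-up table.
RECIPIENT_OUTPUTS = {
    "alice-phone": {"vibration", "audio"},
    "bob-phone": {"audio"},
    "carol-phone": {"image"},
}

PREFERENCE = ["vibration", "audio", "image"]  # assumed preference order

def fan_out(gesture: str, recipients):
    """Choose, per recipient, the best available output for one sensed input."""
    deliveries = {}
    for r in recipients:
        caps = RECIPIENT_OUTPUTS.get(r, set())
        output = next((o for o in PREFERENCE if o in caps), "text")
        deliveries[r] = f"{gesture} -> {output}"
    return deliveries

print(fan_out("squeeze", RECIPIENT_OUTPUTS))
```

One input thus yields a different output stimulus at each terminal, as the paragraph above describes.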
The touch output generated by the receiving mobile terminal may be in addition to conventional forms of visual and/or audio output. During a single communication session (such as an IMS SIP-controlled communication session), voice data, other audio data, picture data and touch data may be transmitted, providing an enriched communications experience between the users of mobile terminals 1 and 3.
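Carrying several media types, including touch data, in one session could be modelled with simple tagged frames. The frame format below is an illustrative assumption; it is not taken from the patent, SIP or IMS.

```python
# Sketch of a single session stream carrying mixed media, including touch data.
# Frame layout (assumed): 1-byte media tag, 2-byte big-endian length, payload.

def make_frame(media_type: str, payload: bytes) -> bytes:
    """Prefix a payload with a media tag and its length."""
    tags = {"voice": 0, "audio": 1, "picture": 2, "touch": 3}
    return bytes([tags[media_type]]) + len(payload).to_bytes(2, "big") + payload

def parse_frames(stream: bytes):
    """Split a byte stream back into (media_type, payload) pairs."""
    names = ["voice", "audio", "picture", "touch"]
    frames, i = [], 0
    while i < len(stream):
        tag, length = stream[i], int.from_bytes(stream[i + 1:i + 3], "big")
        frames.append((names[tag], stream[i + 3:i + 3 + length]))
        i += 3 + length
    return frames

stream = make_frame("voice", b"pcm...") + make_frame("touch", b"\x01\x80")
print([t for t, _ in parse_frames(stream)])  # ['voice', 'touch']
```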
Claims (44)
1. A mobile telecommunications network including a plurality of terminals registered with the network, wherein a first of said terminals includes means for sensing the variation of a touch input with respect to time and generating a signal representative thereof; and including means for generating a message representative of said signal; and means for transmitting the message to a second of said terminals for generating an output in response to the message that is indicative of the touch input.
2. The network of claim 1, wherein the message is generated by a data processor of the first terminal.
3. The network of claim 1, wherein the message is generated by a data processor of an entity separate from the first terminal.
4. The network of claim 1, wherein the message is generated by a data processor associated with the mobile telecommunications network.
5. The network of claim 1, wherein the output is generated by a data processor of the second terminal.
6. The network of any one of claims 1 to 5, wherein the output is a simulation or approximation of the touch input.
7. The network of any one of claims 1 to 5, wherein the output is indicative of the touch input but is an audible and/or visible output.
8. The network of any one of claims 1 to 7, including means for mapping selected inputs to generate selected outputs to provide a predetermined relationship therebetween.
9. The network of claim 8, wherein said mapping is determined by the first terminal.
10. The network of claim 8, wherein said mapping is determined by the second terminal.
11. The network of claim 8, wherein said mapping is determined by the mobile telecommunications network.
12. The network of claim 8, wherein said mapping is determined by a third party entity, separate from said terminals and the mobile telecommunications network.
13. The network of any one of claims 1 to 12, wherein the sensing means includes a motion sensor.
14. The network of claim 13, wherein the motion sensor includes a microaccelerometer.
15. The network of any one of claims 1 to 14, wherein the sensing means includes a pressure sensor.
16. The network of claim 15, wherein the pressure sensor includes a piezoelectric device.
17. The network of any one of claims 1 to 16, wherein the network comprises a GSM mobile telecommunications network.
18. The network of any one of claims 1 to 16, wherein the network comprises a UMTS (3G) mobile telecommunications network.
19. The network of any one of claims 1 to 18, wherein the network comprises a GPRS mobile telecommunications network.
20. The network of any one of claims 1 to 19, wherein the message is transmitted in a push-to-talk over cellular (PoC) communication session.
21. The network of any one of claims 1 to 20, wherein the message is transmitted in a session initiation protocol (SIP) session.
22. The network of any one of claims 1 to 19, wherein the message is an SMS message.
23. The network of any one of claims 1 to 19, wherein the message is an MMS message.
24. A mobile telecommunications network, including means for receiving a message from a first terminal for delivery to a second terminal and for converting the message from a first type to a second type, wherein at least the first type of message includes data enabling the reproduction of an output by a terminal which output is detectable by the sense of touch and varies with respect to time in dependence upon the content of the message.
25. The network of claim 24 wherein the second type of message includes data enabling the production of an output by a terminal detectable by the sense of touch.
26. The network of claim 24, wherein the second type of message includes data enabling the production of an audible output.
27. The network of claim 24, wherein the second type of message includes data enabling the production of a visible output by a terminal.
28. A terminal for use with a mobile telecommunications network, the terminal including means for sensing a variation of a touch input with respect to time and generating a signal representative thereof, a representation of which is for inclusion in a message for transmission to another terminal for generating an output in response to the message that is indicative of the touch input.
29. A terminal for use with a mobile telecommunications network, the terminal including means for receiving a message representative of an input signal generated by another terminal, means for generating an output detectable by the sense of touch and which varies with respect to time in dependence upon the content of the message.
30. A method of operating a mobile telecommunications network having a plurality of terminals, the method including generating at a first of said terminals a signal representative of how a touch input varies with respect to time, generating a message representative of the signal, transmitting the message to a second of said terminals, and generating an output at the said second terminal in response to the message that is indicative of the touch input.
31. The method of claim 30, including generating the message at the first terminal.
32. The method of claim 30, including generating the message using a data processor associated with the mobile telecommunications network.
33. The method of claim 30, 31 or 32, wherein the output is a simulation or approximation of the touch input.
34. The method of claim 30, 31, 32 or 33, wherein the output is indicative of the touch input but is an audible and/or visible output.
35. The method of any one of claims 30 to 34, including receiving said signal representative of how a touch input varies with respect to time, generating said message such that said output is produced for stimulating a selected human sense.
36. The method of any one of claims 30 to 35, including transmitting the message in a push-to-talk over cellular (PoC) communication session.
37. The method of any one of claims 30 to 35, including transmitting the message in a session initiation protocol (SIP) communication session.
38. The method of any one of claims 30 to 35, including transmitting the message as an SMS message.
39. The method of any one of claims 30 to 35, including transmitting the message as an MMS message.
40. The method of any one of claims 30 to 39, wherein the mobile telecommunications network comprises a GSM, GPRS or UMTS (3G) mobile telecommunications network.
41. A mobile telecommunication network including means for generating a message containing information for use by the user of a terminal, means for transmitting the message to said mobile terminal, and means for conveying the information to the user of said mobile terminal by stimulating the user's sense of touch.
42. A mobile telecommunications network substantially as hereinbefore described with reference to and/or substantially as illustrated in any one of or any combination of the accompanying drawings.
43. A terminal for use with a mobile telecommunications network, substantially as hereinbefore described with reference to and/or substantially as illustrated in any one of or any combination of the accompanying drawings.
44. A method of operating a mobile telecommunications network, substantially as hereinbefore described with reference to and/or substantially as illustrated in any one of or any combination of the accompanying drawings.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
GB0417469A GB2416962B (en) | 2004-08-05 | 2004-08-05 | New communication type for mobile telecommunications networks |
PCT/GB2005/003046 WO2006013363A1 (en) | 2004-08-05 | 2005-08-03 | Haptic input and haptic output in a communications networks |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
GB0417469A GB2416962B (en) | 2004-08-05 | 2004-08-05 | New communication type for mobile telecommunications networks |
Publications (3)
Publication Number | Publication Date |
---|---|
GB0417469D0 GB0417469D0 (en) | 2004-09-08 |
GB2416962A true GB2416962A (en) | 2006-02-08 |
GB2416962B GB2416962B (en) | 2009-04-01 |
Family
ID=32982591
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
GB0417469A Expired - Fee Related GB2416962B (en) | 2004-08-05 | 2004-08-05 | New communication type for mobile telecommunications networks |
Country Status (2)
Country | Link |
---|---|
GB (1) | GB2416962B (en) |
WO (1) | WO2006013363A1 (en) |
Cited By (20)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2008081304A1 (en) * | 2006-12-29 | 2008-07-10 | Nokia Corporation | Apparatus and system for multimedia mediation |
WO2008144108A1 (en) * | 2007-05-18 | 2008-11-27 | Immersion Corporation | Haptical content in a text message |
WO2009045996A2 (en) * | 2007-10-04 | 2009-04-09 | Motorola, Inc. | Keypad haptic communication |
WO2009065421A1 (en) * | 2007-11-20 | 2009-05-28 | Nokia Corporation | Improvements in or relating to user interfaces and associated apparatus and methods |
WO2010009145A1 (en) | 2008-07-15 | 2010-01-21 | Immersion Corporation | Systems and methods for mapping message contents to virtual physical properties for vibrotactile messaging |
EP2150020A1 (en) * | 2008-07-28 | 2010-02-03 | Alcatel, Lucent | Method for communicating, a related system for communicating and a related transforming part |
ES2362776A1 (en) * | 2009-11-24 | 2011-07-13 | Telefonica,S.A. | Method for communicating physical stimuli using mobile devices |
US8004391B2 (en) * | 2008-11-19 | 2011-08-23 | Immersion Corporation | Method and apparatus for generating mood-based haptic feedback |
US8412282B2 (en) * | 2006-12-21 | 2013-04-02 | Samsung Electronics Co., Ltd | Haptic generation method and system for mobile phone |
WO2013085837A1 (en) * | 2011-12-07 | 2013-06-13 | Qualcomm Incorporated | Integrating sensation functionalities into social networking services and applications |
WO2013136133A1 (en) | 2012-03-15 | 2013-09-19 | Nokia Corporation | A tactile apparatus link |
EP2688321A1 (en) * | 2012-07-18 | 2014-01-22 | BlackBerry Limited | Method and apparatus for motion based ping during chat mode |
WO2014186751A1 (en) * | 2013-05-17 | 2014-11-20 | Bugg Jr Edward D | Sensory messaging systems and related methods |
US9696803B2 (en) | 2009-03-12 | 2017-07-04 | Immersion Corporation | Systems and methods for friction displays and additional haptic effects |
US9746923B2 (en) | 2009-03-12 | 2017-08-29 | Immersion Corporation | Systems and methods for providing features in a friction display wherein a haptic effect is configured to vary the coefficient of friction |
US9874935B2 (en) | 2009-03-12 | 2018-01-23 | Immersion Corporation | Systems and methods for a texture engine |
US9927873B2 (en) | 2009-03-12 | 2018-03-27 | Immersion Corporation | Systems and methods for using textures in graphical user interface widgets |
US10007340B2 (en) | 2009-03-12 | 2018-06-26 | Immersion Corporation | Systems and methods for interfaces featuring surface-based haptic effects |
US10466791B2 (en) | 2012-02-15 | 2019-11-05 | Immersion Corporation | Interactivity model for shared feedback on mobile devices |
US10564721B2 (en) | 2009-03-12 | 2020-02-18 | Immersion Corporation | Systems and methods for using multiple actuators to realize textures |
Families Citing this family (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2010027190A2 (en) * | 2008-09-05 | 2010-03-11 | 에스케이텔레콤 주식회사 | Mobile communication terminal that delivers vibration information, and method thereof |
WO2010036050A2 (en) * | 2008-09-26 | 2010-04-01 | Lg Electronics Inc. | Mobile terminal and control method thereof |
US9621706B2 (en) * | 2009-11-18 | 2017-04-11 | Qualcomm Incorporated | System and method of haptic communication at a portable computing device |
US8711118B2 (en) | 2012-02-15 | 2014-04-29 | Immersion Corporation | Interactivity model for shared feedback on mobile devices |
US8570296B2 (en) | 2012-05-16 | 2013-10-29 | Immersion Corporation | System and method for display of multiple data channels on a single haptic display |
CN103516867B (en) * | 2012-06-20 | 2019-01-22 | 腾讯科技(深圳)有限公司 | Mobile device call method, device and phone system |
US9788298B1 (en) * | 2016-12-01 | 2017-10-10 | Immersion Corporation | Smart surfaces for visuo-haptics notifications |
US10743087B2 (en) | 2017-06-21 | 2020-08-11 | Z5X Global FZ-LLC | Smart furniture content interaction system and method |
US10101804B1 (en) | 2017-06-21 | 2018-10-16 | Z5X Global FZ-LLC | Content interaction system and method |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
GB2308523A (en) * | 1995-12-22 | 1997-06-25 | Northern Telecom Ltd | Transferring graphical messages between mobile telephones |
US20020180698A1 (en) * | 2001-06-01 | 2002-12-05 | Michael Kaelbling | Keypads |
US20030064686A1 (en) * | 2001-06-28 | 2003-04-03 | Thomason Graham G. | Data input device |
WO2003051062A2 (en) * | 2001-10-30 | 2003-06-19 | Immersion Corporation | Methods and apparatus for providing haptic feedback in interacting with virtual pets |
WO2004053670A2 (en) * | 2002-12-08 | 2004-06-24 | Immersion Corporation | Method and apparatus for providing haptic feedback |
Family Cites Families (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
DE3705262C2 (en) * | 1987-02-19 | 1994-11-24 | Dikeoulias Vassilios Dipl Ing | Device arrangement for the transmission of human movements |
NL1004195C1 (en) * | 1996-10-04 | 1998-04-07 | Christine Karman | System for transmitting touch sensation via a computer network. |
JPH1115600A (en) * | 1997-04-28 | 1999-01-22 | Matsushita Electric Ind Co Ltd | Communication terminal which transmits physical quantity operating on one terminal and which can work received picture and transmission terminal/reception temrinal supplied for the same |
JP2000049956A (en) * | 1998-08-03 | 2000-02-18 | Sharp Corp | Communication equipment and communication system |
DE10022336A1 (en) * | 2000-05-08 | 2001-11-29 | Juergen Rall | Connecting electronically controlled physical sexual stimulation devices to Internet involves providing stimulation devices via additional sensing arrangements, interactive data communications |
US20040125120A1 (en) * | 2001-06-08 | 2004-07-01 | Michael Weiner | Method and apparatus for interactive transmission and reception of tactile information |
US6592516B2 (en) * | 2001-10-09 | 2003-07-15 | Ching-Chuan Lee | Interactive control system of a sexual delight appliance |
EP1376316A1 (en) * | 2002-06-26 | 2004-01-02 | BRITISH TELECOMMUNICATIONS public limited company | Haptic communications |
- 2004-08-05: GB application GB0417469A (GB2416962B) - not active, Expired - Fee Related
- 2005-08-03: WO application PCT/GB2005/003046 (WO2006013363A1) - active, Application Filing
Non-Patent Citations (1)
Title |
---|
"Mobiles get a sense of touch" BBC News, 21th January 2003 Downloaded from http://news.bbc.co.uk/1/hi/technology/2677813.stm Whole document * |
Cited By (59)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8412282B2 (en) * | 2006-12-21 | 2013-04-02 | Samsung Electronics Co., Ltd | Haptic generation method and system for mobile phone |
WO2008081304A1 (en) * | 2006-12-29 | 2008-07-10 | Nokia Corporation | Apparatus and system for multimedia mediation |
US8315652B2 (en) | 2007-05-18 | 2012-11-20 | Immersion Corporation | Haptically enabled messaging |
WO2008144108A1 (en) * | 2007-05-18 | 2008-11-27 | Immersion Corporation | Haptical content in a text message |
WO2009045996A2 (en) * | 2007-10-04 | 2009-04-09 | Motorola, Inc. | Keypad haptic communication |
WO2009045996A3 (en) * | 2007-10-04 | 2009-08-13 | Motorola Inc | Keypad haptic communication |
WO2009065421A1 (en) * | 2007-11-20 | 2009-05-28 | Nokia Corporation | Improvements in or relating to user interfaces and associated apparatus and methods |
US8587417B2 (en) | 2008-07-15 | 2013-11-19 | Immersion Corporation | Systems and methods for mapping message contents to virtual physical properties for vibrotactile messaging |
EP2741177A1 (en) | 2008-07-15 | 2014-06-11 | Immersion Corporation | Systems and Methods for Transmitting Haptic Messages |
EP3206381A1 (en) * | 2008-07-15 | 2017-08-16 | Immersion Corporation | Systems and methods for mapping message contents to virtual physical properties for vibrotactile messaging |
WO2010009149A3 (en) * | 2008-07-15 | 2010-04-01 | Immersion Corporation | Systems and methods for transmitting haptic messages |
US9612662B2 (en) | 2008-07-15 | 2017-04-04 | Immersion Corporation | Systems and methods for shifting haptic feedback function between passive and active modes |
US9910496B2 (en) | 2008-07-15 | 2018-03-06 | Immersion Corporation | Systems and methods for mapping message contents to virtual physical properties for vibrotactile messaging |
US8462125B2 (en) | 2008-07-15 | 2013-06-11 | Immersion Corporation | Systems and methods for shifting haptic feedback function between passive and active modes |
WO2010009145A1 (en) | 2008-07-15 | 2010-01-21 | Immersion Corporation | Systems and methods for mapping message contents to virtual physical properties for vibrotactile messaging |
US10198078B2 (en) | 2008-07-15 | 2019-02-05 | Immersion Corporation | Systems and methods for mapping message contents to virtual physical properties for vibrotactile messaging |
US9785238B2 (en) | 2008-07-15 | 2017-10-10 | Immersion Corporation | Systems and methods for transmitting haptic messages |
EP3480680A1 (en) | 2008-07-15 | 2019-05-08 | Immersion Corporation | Systems and methods for transmitting haptic messages |
US8638301B2 (en) | 2008-07-15 | 2014-01-28 | Immersion Corporation | Systems and methods for transmitting haptic messages |
EP2723107A1 (en) | 2008-07-15 | 2014-04-23 | Immersion Corporation | Systems and methods for transmitting haptic messages |
US10203756B2 (en) | 2008-07-15 | 2019-02-12 | Immersion Corporation | Systems and methods for shifting haptic feedback function between passive and active modes |
JP2014139797A (en) * | 2008-07-15 | 2014-07-31 | Immersion Corp | Systems and methods for physical law-based tactile messaging |
US10019061B2 (en) | 2008-07-15 | 2018-07-10 | Immersion Corporation | Systems and methods for haptic message transmission |
US8866602B2 (en) | 2008-07-15 | 2014-10-21 | Immersion Corporation | Systems and methods for mapping message contents to virtual physical properties for vibrotactile messaging |
US10248203B2 (en) | 2008-07-15 | 2019-04-02 | Immersion Corporation | Systems and methods for physics-based tactile messaging |
US8976112B2 (en) | 2008-07-15 | 2015-03-10 | Immersion Corporation | Systems and methods for transmitting haptic messages |
US9063571B2 (en) | 2008-07-15 | 2015-06-23 | Immersion Corporation | Systems and methods for shifting haptic feedback function between passive and active modes |
US9134803B2 (en) | 2008-07-15 | 2015-09-15 | Immersion Corporation | Systems and methods for mapping message contents to virtual physical properties for vibrotactile messaging |
US10416775B2 (en) | 2008-07-15 | 2019-09-17 | Immersion Corporation | Systems and methods for shifting haptic feedback function between passive and active modes |
EP2150020A1 (en) * | 2008-07-28 | 2010-02-03 | Alcatel, Lucent | Method for communicating, a related system for communicating and a related transforming part |
US8390439B2 (en) | 2008-11-19 | 2013-03-05 | Immersion Corporation | Method and apparatus for generating mood-based haptic feedback |
US10289201B2 (en) | 2008-11-19 | 2019-05-14 | Immersion Corporation | Method and apparatus for generating mood-based haptic feedback |
US8004391B2 (en) * | 2008-11-19 | 2011-08-23 | Immersion Corporation | Method and apparatus for generating mood-based haptic feedback |
US9841816B2 (en) | 2008-11-19 | 2017-12-12 | Immersion Corporation | Method and apparatus for generating mood-based haptic feedback |
US10466792B2 (en) | 2009-03-12 | 2019-11-05 | Immersion Corporation | Systems and methods for friction displays and additional haptic effects |
US10564721B2 (en) | 2009-03-12 | 2020-02-18 | Immersion Corporation | Systems and methods for using multiple actuators to realize textures |
US9874935B2 (en) | 2009-03-12 | 2018-01-23 | Immersion Corporation | Systems and methods for a texture engine |
US9696803B2 (en) | 2009-03-12 | 2017-07-04 | Immersion Corporation | Systems and methods for friction displays and additional haptic effects |
US10379618B2 (en) | 2009-03-12 | 2019-08-13 | Immersion Corporation | Systems and methods for using textures in graphical user interface widgets |
US9927873B2 (en) | 2009-03-12 | 2018-03-27 | Immersion Corporation | Systems and methods for using textures in graphical user interface widgets |
US10007340B2 (en) | 2009-03-12 | 2018-06-26 | Immersion Corporation | Systems and methods for interfaces featuring surface-based haptic effects |
US9746923B2 (en) | 2009-03-12 | 2017-08-29 | Immersion Corporation | Systems and methods for providing features in a friction display wherein a haptic effect is configured to vary the coefficient of friction |
US10073526B2 (en) | 2009-03-12 | 2018-09-11 | Immersion Corporation | Systems and methods for friction displays and additional haptic effects |
US10073527B2 (en) | 2009-03-12 | 2018-09-11 | Immersion Corporation | Systems and methods for providing features in a friction display including a haptic effect based on a color and a degree of shading |
US10620707B2 (en) | 2009-03-12 | 2020-04-14 | Immersion Corporation | Systems and methods for interfaces featuring surface-based haptic effects |
US10747322B2 (en) | 2009-03-12 | 2020-08-18 | Immersion Corporation | Systems and methods for providing features in a friction display |
US10248213B2 (en) | 2009-03-12 | 2019-04-02 | Immersion Corporation | Systems and methods for interfaces featuring surface-based haptic effects |
ES2362776A1 (en) * | 2009-11-24 | 2011-07-13 | Telefonica,S.A. | Method for communicating physical stimuli using mobile devices |
WO2013085837A1 (en) * | 2011-12-07 | 2013-06-13 | Qualcomm Incorporated | Integrating sensation functionalities into social networking services and applications |
CN103988216A (en) * | 2011-12-07 | 2014-08-13 | 高通股份有限公司 | Integrating sensation functionalities into social networking services and applications |
US10466791B2 (en) | 2012-02-15 | 2019-11-05 | Immersion Corporation | Interactivity model for shared feedback on mobile devices |
US10146315B2 (en) | 2012-03-15 | 2018-12-04 | Nokia Technologies Oy | Tactile apparatus link |
US9886091B2 (en) | 2012-03-15 | 2018-02-06 | Nokia Technologies Oy | Tactile apparatus link |
EP2825934A4 (en) * | 2012-03-15 | 2015-11-04 | Nokia Technologies Oy | A tactile apparatus link |
US10579148B2 (en) | 2012-03-15 | 2020-03-03 | Nokia Technologies Oy | Tactile apparatus link |
WO2013136133A1 (en) | 2012-03-15 | 2013-09-19 | Nokia Corporation | A tactile apparatus link |
EP2688321A1 (en) * | 2012-07-18 | 2014-01-22 | BlackBerry Limited | Method and apparatus for motion based ping during chat mode |
US9147329B2 (en) | 2013-05-17 | 2015-09-29 | Edward D. Bugg, JR. | Sensory messaging systems and related methods |
WO2014186751A1 (en) * | 2013-05-17 | 2014-11-20 | Bugg Jr Edward D | Sensory messaging systems and related methods |
Also Published As
Publication number | Publication date |
---|---|
GB2416962B (en) | 2009-04-01 |
WO2006013363A1 (en) | 2006-02-09 |
GB0417469D0 (en) | 2004-09-08 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2006013363A1 (en) | Haptic input and haptic output in a communications networks | |
KR101402243B1 (en) | Mobile terminal for providing haptic service and method thereof | |
KR100643078B1 (en) | Apparatus and method for displaying information of calling partner during call waiting in portable wireless terminal | |
EP1662767A2 (en) | Ringtone service server, mobile communication terminal and method for setting incoming call notification in a mobile communication terminal | |
US20070127645A1 (en) | Technique for providing secondary information to a user equipment | |
US20070230678A1 (en) | Technique for providing caller-originated alert signals | |
US20070226240A1 (en) | Technique for providing data objects prior to call establishment | |
KR101554057B1 (en) | Method and computer readable medium for modifying caller behavior | |
KR20060033868A (en) | Group call in a communication system | |
CN101820603B (en) | System and method for displaying information about calling user to called user on mobile communication network | |
CN101317434B (en) | Method, system and terminal unit for acquiring media characteristic information | |
KR100678086B1 (en) | Apparatus and method for setting multimedia using mms message in mobile terminal | |
KR100888340B1 (en) | Voice Message Transmission System for Using MultiModal Plug-in in End Terminal Browser Based And Method thereof | |
CN102685698B (en) | Realize a method for cross operator data SMS forward, Apparatus and system | |
NL1031015C2 (en) | Communication system, communication method and communication device. | |
KR101349156B1 (en) | Method for Sharing Status Information, System, Server, Mobile Communication Terminal And Computer-Readable Recording Medium with Program therefor | |
KR100692659B1 (en) | Method, Device and Recording-Medium for providing Ring-Back-tone Service by using multi-media data of receiver | |
KR20060109525A (en) | Mobile communication system and method for providing lettering service | |
KR20030041549A (en) | Method for reducing to download in multimedia messaging service | |
KR20070032879A (en) | Method and System for Providing Image Call Service for Mobile Telecommunication Terminal | |
KR100993323B1 (en) | Method and apparatus for transmitting a calling display message | |
KR100986264B1 (en) | Transmitting Method of Multimedia Data Memoried in Mobile Phone | |
KR20070049702A (en) | Mobile terminal for outputting short message applied special effect | |
KR20070030064A (en) | Appratus and method for picture in common using sms in mobile terminal | |
KR100667341B1 (en) | Method and device for providing bi-directional group message service |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PCNP | Patent ceased through non-payment of renewal fee |
Effective date: 20160805 |