CN103533168A - Sensibility information interacting method and system and sensibility interaction device - Google Patents
- Publication number
- CN103533168A CN103533168A CN201310486324.6A CN201310486324A CN103533168A CN 103533168 A CN103533168 A CN 103533168A CN 201310486324 A CN201310486324 A CN 201310486324A CN 103533168 A CN103533168 A CN 103533168A
- Authority
- CN
- China
- Prior art keywords
- mobile terminal
- information
- interaction device
- emotion information
- emotional interaction
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Landscapes
- User Interface Of Digital Computer (AREA)
Abstract
The invention discloses a sensibility (emotion) information interaction method, an information interaction system, and a sensibility interaction device. The method comprises the following steps: a first mobile terminal receives sensibility information of a preset category from a second mobile terminal and sends the received sensibility information of the preset category to the sensibility interaction device; the sensibility interaction device determines, according to the mapping relation between preset-category sensibility information and prompt information, the prompt information corresponding to the sensibility information received from the first mobile terminal, and controls an information output unit to output the determined prompt information. With the embodiments of the invention, sensibility information interaction can be realized conveniently and intuitively through the sensibility interaction device.
Description
Technical field
The present invention relates to the field of information interaction, and in particular to an emotion information interaction method, an emotion information interaction system, and an emotional interaction device.
Background technology
With the increasingly wide use of mobile terminals such as mobile phones, people can exchange information through them, for example by sending short messages. Suppose user A is in a good mood one day and wishes user B to share it: user A sends a short message from his mobile terminal to user B's mobile terminal, the message containing content written by user A (for example, "I am very happy today; I had a delicious meal at a certain restaurant"), and user B receives the message on his mobile terminal and shares user A's good mood. This way of communicating has the following defect: user B may not have the mobile terminal at hand, or the terminal may be in silent mode, so that user B cannot view the message in time and cannot intuitively experience the mood user A is sharing.
Summary of the invention
The main purpose of the present invention is to provide an emotion information interaction method, system, and emotional interaction device, so as to realize emotion information interaction conveniently through the emotional interaction device.
The invention provides an emotion information interaction method, the method comprising:
a first mobile terminal receives emotion information of a preset category from a second mobile terminal, and sends the received emotion information of the preset category to an emotional interaction device;
the emotional interaction device determines, according to a preset mapping relation between emotion information and prompt information, the prompt information corresponding to the emotion information received from the first mobile terminal, and controls its information output unit to output the determined prompt information.
Preferably, the emotional interaction device is an item of jewellery.
Preferably, the method further comprises:
the first mobile terminal establishes a communication connection with the emotional interaction device through a wifi unit, a Bluetooth unit, or an NFC unit.
Preferably, the prompt information comprises vibration information, sound-and-light information, and/or image information.
Preferably, before the step in which the first mobile terminal receives the emotion information of the preset category from the second mobile terminal and sends the received emotion information of the preset category to the emotional interaction device, the method further comprises:
establishing, among the predetermined first mobile terminal, second mobile terminal, and emotional interaction device, a pairing relationship for interaction of emotion information of the preset category.
Preferably, after the step of establishing the pairing relationship for interaction of emotion information of the preset category among the predetermined first mobile terminal, second mobile terminal, and emotional interaction device, the method further comprises:
cancelling the established pairing relationship for interaction of emotion information of the preset category among the first mobile terminal, the second mobile terminal, and the emotional interaction device.
The invention also provides an emotion information interaction system, comprising a first mobile terminal and an emotional interaction device communicatively connected with the first mobile terminal, wherein:
the first mobile terminal is configured to receive emotion information of a preset category from a second mobile terminal, and to send the received emotion information of the preset category to the emotional interaction device;
the emotional interaction device is configured to determine, according to a preset mapping relation between emotion information and prompt information, the prompt information corresponding to the emotion information received from the first mobile terminal, and to control its information output unit to output the determined prompt information.
Preferably, the prompt information comprises vibration information, sound-and-light information, and/or image information.
The invention also provides an emotional interaction device, comprising: a wireless communication module, a processor connected with the wireless communication module, and an information output unit connected with the processor, wherein:
the wireless communication module is configured to receive the emotion information of the preset category sent by the first mobile terminal and to send the received emotion information to the processor, the emotion information of the preset category sent by the first mobile terminal being what the first mobile terminal received from the second mobile terminal;
the processor is configured to determine, according to the preset mapping relation between emotion information and prompt information, the prompt information corresponding to the emotion information received from the first mobile terminal;
the information output unit is configured to output the prompt information determined by the processor.
Preferably, the wireless communication module is a wifi unit, a Bluetooth unit, or an NFC unit.
With the emotion information interaction method, system, and emotional interaction device proposed by the invention, the emotional interaction device receives from the first mobile terminal the emotion information of the preset category that the first mobile terminal received from the second mobile terminal, determines the corresponding prompt information according to the preset mapping relation between emotion information and prompt information, and controls its information output unit to output the determined prompt information, so that emotion information interaction can be realized conveniently and intuitively through the emotional interaction device.
Description of the drawings
Fig. 1 is a schematic flow chart of the first embodiment of the emotion information interaction method of the present invention;
Fig. 2 is a schematic flow chart of the second embodiment of the emotion information interaction method of the present invention;
Fig. 3 is a schematic flow chart of the third embodiment of the emotion information interaction method of the present invention;
Fig. 4 is a schematic flow chart of the fourth embodiment of the emotion information interaction method of the present invention;
Fig. 5 is a schematic structural diagram of the emotion information interaction system of the present invention;
Fig. 6 is a schematic diagram of the specific structure of the matching module of the first mobile terminal of the emotion information interaction system of the present invention;
Fig. 7 is a schematic diagram of the specific structure of the cancelling module of the first mobile terminal of the emotion information interaction system of the present invention;
Fig. 8 is a schematic structural diagram of the emotional interaction device of the present invention.
The realization of the objects, functional characteristics, and advantages of the present invention will be further described with reference to the accompanying drawings in connection with the embodiments.
Embodiment
It should be understood that the specific embodiments described herein are only intended to explain the present invention, and are not intended to limit it.
With reference to Fig. 1, which is a schematic flow chart of the first embodiment of the emotion information interaction method of the present invention, the method comprises the following steps:
S10: the first mobile terminal receives emotion information of a preset category from the second mobile terminal, and sends the received emotion information of the preset category to the emotional interaction device.
The first mobile terminal and the second mobile terminal may both be mobile phones, or may be other portable mobile terminals. The second mobile terminal may send the emotion information of the preset category to the first mobile terminal over the Internet, which includes the mobile Internet, which in turn includes cloud computing platforms; the second mobile terminal may also send the emotion information of the preset category to the first mobile terminal through GPRS/CDMA communication.
The emotion information of the preset category comprises words people commonly use to express emotion, such as: very happy, very glad, very sad, very sorrowful, very angry, very furious, very grieved, heartbroken, very worried, very depressed, very charming, very sexy, etc.
S20: the emotional interaction device determines, according to the preset mapping relation between emotion information and prompt information, the prompt information corresponding to the emotion information received from the first mobile terminal, and controls its information output unit to output the determined prompt information.
The user can set the mapping relation between emotion information and prompt information according to actual conditions. The prompt information comprises vibration information, sound-and-light information, and/or image information; in the mapping relation, a piece of emotion information may correspond to vibration information only, to sound-and-light information only, to image information only, to vibration and sound-and-light information together, to vibration and image information together, to sound-and-light and image information together, or to vibration, sound-and-light, and image information all at once. The vibration information comprises vibration intensity, vibration count, duration of each vibration, interval between vibrations, etc.; the sound-and-light information comprises sound information (sound content, volume, playback count, interval between playbacks, etc.) and light information (light colour, continuous or intermittent flashing, continuous flashing duration, intermittent flashing count, etc.); the image information comprises a moon pattern, a heart pattern, a blowing-kiss pattern, a holding-hands pattern, a smiley pattern, etc. The preset mapping relation between emotion information and prompt information is shown in Table 1.
Table 1
So that the same emotion information can be presented as varied prompt information, a plurality of prompt-information entries may be set for one piece of emotion information in the preset mapping relation. For example, when the emotion information is "very happy", the prompt information may correspond to: "vibration: intensity strong, vibrate twice, each vibration lasting 5 seconds with a 1-second interval between vibrations; image information: smiley".
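The emotion-to-prompt mapping described above can be sketched in code. This is a minimal illustrative Python sketch, not part of the patent disclosure; names such as `PROMPT_MAP` and `resolve_prompt` and the exact entry layout are assumptions.

```python
# Each preset-category emotion maps to one or more prompt-information
# entries; an entry may combine vibration, sound/light, and image parts,
# matching the combinations the description enumerates.
PROMPT_MAP = {
    "very happy": [
        {
            "vibration": {"intensity": "strong", "count": 2,
                          "duration_s": 5, "interval_s": 1},
            "image": "smiley",
        },
    ],
    "very glad": [
        {"vibration": {"intensity": "weak", "count": 1, "duration_s": 5}},
    ],
}

def resolve_prompt(emotion: str) -> list:
    """Return the prompt-information entries mapped to an emotion,
    or an empty list when the emotion has no mapping."""
    return PROMPT_MAP.get(emotion, [])

print(resolve_prompt("very happy")[0]["image"])  # smiley
```

Keeping the entries as a list is what allows one emotion to carry several prompt variants, as in the "very happy" example above.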
Further, this emotional interaction device is jewellery.These jewellery can be bangle, necklace, bullion etc.In daily life, people like wearing jewellery miscellaneous conventionally.These jewellery directly contact with human body, and prompting message corresponding to emotion information the second mobile terminal being sent by these jewellery expressed, can be more direct.As when this emotional interaction device is bangle, in step S20, definite prompting message is during for " vibration: a little less than oscillation intensity is; vibration number once; vibration time is 5 seconds ", user can directly experience vibration information, and then knows that the emotion information that the second mobile terminal sends is " being very glad ", also can share the good mood of the second mobile phone users together.
With reference to Fig. 2, which is a schematic flow chart of the second embodiment of the emotion information interaction method of the present invention.
Based on the first embodiment of the emotion information interaction method above, before step S10 the method further comprises:
S30: the first mobile terminal establishes a communication connection with the emotional interaction device through a wifi unit, a Bluetooth unit, or an NFC unit.
NFC is the abbreviation of Near Field Communication; the NFC unit uses near field communication to realize the connection.
With reference to Fig. 3, which is a schematic flow chart of the third embodiment of the emotion information interaction method of the present invention.
Based on the first embodiment of the emotion information interaction method above, before step S10 the method further comprises:
S40: establishing, among the predetermined first mobile terminal, second mobile terminal, and emotional interaction device, a pairing relationship for interaction of emotion information of the preset category.
In step S40, establishing the pairing relationship for interaction of emotion information of the preset category among the first mobile terminal, the second mobile terminal, and the emotional interaction device enables the first mobile terminal to send the emotion information of the preset category received from the second mobile terminal to the emotional interaction device.
Specifically, step S40 comprises the following steps:
S41: the first mobile terminal receives the unique identification code sent by the emotional interaction device, and receives from the second mobile terminal the second mobile terminal's unique identification code and the unique identification code of the emotional interaction device to be paired.
The first mobile terminal may, when establishing the communication connection with the emotional interaction device, send it a request for its unique identification code; after receiving the request, the emotional interaction device feeds its unique identification code back to the first mobile terminal. Alternatively, the first mobile terminal may send no such request and instead directly receive the unique identification code actively pushed by the emotional interaction device.
S42: the first mobile terminal judges whether the emotional interaction device's unique identification code is identical to the unique identification code of the emotional interaction device to be paired; if identical, step S43 is performed.
In step S42, if the unique identification code received from the emotional interaction device is identical to that of the device to be paired, the pairing relationship for interaction of emotion information of the preset category can be established among the first mobile terminal, the second mobile terminal, and the emotional interaction device.
For example, if the unique identification code the first mobile terminal receives from the emotional interaction device is 123456, and the unique identification code of the device to be paired received from the second mobile terminal is also 123456, it is determined that the pairing relationship for interaction of emotion information of the preset category can be established among the first mobile terminal, the second mobile terminal, and the emotional interaction device.
S43: establishing the pairing relationship for interaction of emotion information of the preset category among the first mobile terminal, the second mobile terminal, and the emotional interaction device.
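The pairing check in steps S41-S43 reduces to comparing the two unique identification codes. The following Python sketch is illustrative only; the function name `try_pair` is an assumption, not part of the disclosure.

```python
def try_pair(code_from_device: str, code_to_pair: str) -> bool:
    """S42/S43: pairing is established only when the identification code
    received from the emotional interaction device matches the to-be-paired
    code supplied by the second mobile terminal."""
    return code_from_device == code_to_pair

# The example from the description: both codes are 123456, so pairing
# can be established; a mismatched code cannot pair.
print(try_pair("123456", "123456"))  # True
print(try_pair("123456", "654321"))  # False
```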
With reference to Fig. 4, which is a schematic flow chart of the fourth embodiment of the emotion information interaction method of the present invention.
Based on the third embodiment of the emotion information interaction method above, after step S40 the method further comprises:
S50: cancelling the established pairing relationship for interaction of emotion information of the preset category among the first mobile terminal, the second mobile terminal, and the emotional interaction device.
In step S50, cancelling the pairing relationship for interaction of emotion information of the preset category among the first mobile terminal, the second mobile terminal, and the emotional interaction device means that the first mobile terminal no longer sends the emotion information of the preset category received from the second mobile terminal to the emotional interaction device.
Specifically, step S50 comprises:
S51: the first mobile terminal sends a cancel-pairing request to the second mobile terminal, the request carrying the emotional interaction device's unique identification code.
S52: the second mobile terminal receives the cancel-pairing request and sends a cancel-pairing agreement to the first mobile terminal.
In step S52, after receiving the cancel-pairing request, if the second mobile terminal agrees to cancel the pairing relationship with the emotional interaction device corresponding to the unique identification code carried in the request, it sends a cancel-pairing agreement to the first mobile terminal.
S53: the first mobile terminal receives the cancel-pairing agreement sent by the second mobile terminal, and cancels the established pairing relationship for interaction of emotion information of the preset category among the first mobile terminal, the second mobile terminal, and the emotional interaction device corresponding to the unique identification code.
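The request-agree-cancel exchange in steps S51-S53 can be sketched as a two-object handshake. This is a hypothetical Python sketch; the class and method names (`FirstTerminal`, `SecondTerminal`, `request_cancel`, `approve_cancel`) are assumptions for illustration.

```python
class SecondTerminal:
    def approve_cancel(self, device_code: str) -> bool:
        # S52: the second terminal receives the cancel-pairing request
        # and agrees to cancel the pairing for this device code.
        return True

class FirstTerminal:
    def __init__(self):
        # Identification codes of emotional interaction devices currently paired.
        self.pairings = {"123456"}

    def request_cancel(self, peer: SecondTerminal, device_code: str) -> bool:
        # S51: send a cancel request carrying the device's unique code;
        # S53: drop the local pairing only after the peer agrees.
        if peer.approve_cancel(device_code):
            self.pairings.discard(device_code)
            return True
        return False

ft = FirstTerminal()
ft.request_cancel(SecondTerminal(), "123456")
print(ft.pairings)  # set()
```

Cancelling only after the peer's agreement mirrors the description: the first terminal never unilaterally drops a pairing the second terminal still expects.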
With reference to Fig. 5, which is a schematic structural diagram of the emotion information interaction system of the present invention, the system comprises: a first mobile terminal 10, and an emotional interaction device 20 communicatively connected with the first mobile terminal 10, wherein:
the first mobile terminal 10 receives emotion information of a preset category from a second mobile terminal 30, and sends the received emotion information of the preset category to the emotional interaction device 20;
the emotional interaction device 20 determines, according to the preset mapping relation between emotion information and prompt information, the prompt information corresponding to the emotion information received from the first mobile terminal, and controls its information output unit 23 to output the determined prompt information.
The first mobile terminal and the second mobile terminal may both be mobile phones, or may be other portable mobile terminals. The second mobile terminal may send the emotion information of the preset category to the first mobile terminal over the Internet, or through GPRS/CDMA communication.
The emotion information of the preset category comprises words people commonly use to express emotion, such as: very happy, very glad, very sad, very sorrowful, very angry, very furious, very grieved, heartbroken, very worried, very depressed, very charming, very sexy, etc.
The user can set the mapping relation between emotion information and prompt information according to actual conditions. The prompt information comprises vibration information, sound-and-light information, and/or image information; in the mapping relation, a piece of emotion information may correspond to vibration information only, to sound-and-light information only, to image information only, to vibration and sound-and-light information together, to vibration and image information together, to sound-and-light and image information together, or to vibration, sound-and-light, and image information all at once. The vibration information comprises vibration intensity, vibration count, duration of each vibration, interval between vibrations, etc.; the sound-and-light information comprises sound information (sound content, volume, playback count, interval between playbacks, etc.) and light information (light colour, continuous or intermittent flashing, continuous flashing duration, intermittent flashing count, etc.); the image information comprises a moon pattern, a heart pattern, a blowing-kiss pattern, a holding-hands pattern, a smiley pattern, etc. The preset mapping relation between emotion information and prompt information is as shown in Table 1 above.
So that the same emotion information can be presented as varied prompt information, a plurality of prompt-information entries may be set for one piece of emotion information in the preset mapping relation. For example, when the emotion information is "very happy", the prompt information may correspond to: "vibration: intensity strong, vibrate twice, each vibration lasting 5 seconds with a 1-second interval between vibrations; image information: smiley".
Further, the emotional interaction device 20 is an item of jewellery, such as a bangle or a necklace. In daily life people commonly like to wear all kinds of jewellery, and jewellery is in direct contact with the body, so expressing through the jewellery the prompt information corresponding to the emotion information sent by the second mobile terminal can be more direct. For example, when the emotional interaction device is a bangle and the bangle outputs through its information output unit the prompt information "vibration: intensity weak, vibrate once, lasting 5 seconds", the user can directly feel the vibration, and thus know that the emotion information sent by the second mobile terminal is "very glad" and share the good mood of the second mobile terminal's user.
Further, the first mobile terminal 10 comprises a wifi unit, a Bluetooth unit, or an NFC unit for establishing a communication connection with the emotional interaction device 20.
NFC is the abbreviation of Near Field Communication; the NFC unit uses near field communication to realize the connection.
Correspondingly, the emotional interaction device 20 also comprises a wifi unit, a Bluetooth unit, or an NFC unit to establish the communication connection with the first mobile terminal 10.
Further, the first mobile terminal 10 comprises a matching module 11 for establishing, among the predetermined first mobile terminal, second mobile terminal, and emotional interaction device, the pairing relationship for interaction of emotion information of the preset category.
The matching module 11 establishes the pairing relationship for interaction of emotion information of the preset category among the first mobile terminal, the second mobile terminal, and the emotional interaction device, enabling the first mobile terminal to send the emotion information of the preset category received from the second mobile terminal to the emotional interaction device.
Specifically, as shown in Fig. 6, which illustrates the specific structure of the matching module 11, the matching module 11 comprises a first receiving unit 111, a judging unit 112, and an establishing unit 113, wherein:
the first receiving unit 111 is configured to receive the unique identification code sent by the emotional interaction device, and to receive from the second mobile terminal the second mobile terminal's unique identification code and the unique identification code of the emotional interaction device to be paired.
The first mobile terminal may, when establishing the communication connection with the emotional interaction device, send it a request for its unique identification code; after receiving the request, the emotional interaction device feeds its unique identification code back to the first mobile terminal, and the first mobile terminal 10 receives the fed-back code through the first receiving unit 111. Alternatively, the first mobile terminal may send no such request and instead directly receive, through the first receiving unit 111, the unique identification code actively pushed by the emotional interaction device.
The judging unit 112 is configured to judge whether the emotional interaction device's unique identification code received by the first receiving unit 111 is identical to the unique identification code of the emotional interaction device to be paired.
When the judging unit 112 judges that the two are identical, the pairing relationship for interaction of emotion information of the preset category can be established among the first mobile terminal, the second mobile terminal, and the emotional interaction device. For example, if the unique identification code the first receiving unit 111 of the first mobile terminal 10 receives from the emotional interaction device is 123456, and the unique identification code of the device to be paired received from the second mobile terminal 30 is also 123456, it is determined that the pairing relationship can be established among the first mobile terminal 10, the second mobile terminal 30, and the emotional interaction device 20.
The establishing unit 113 is configured to establish the pairing relationship for interaction of emotion information of the preset category among the first mobile terminal, the second mobile terminal, and the emotional interaction device.
Further, the first mobile terminal 10 also comprises a cancelling module 12 for cancelling the established pairing relationship for interaction of emotion information of the preset category among the first mobile terminal, the second mobile terminal, and the emotional interaction device.
The cancelling module 12 cancels the pairing relationship for interaction of emotion information of the preset category among the first mobile terminal, the second mobile terminal, and the emotional interaction device, so that the first mobile terminal no longer sends the emotion information of the preset category received from the second mobile terminal to the emotional interaction device.
Specifically, as shown in Fig. 7, which illustrates the specific structure of the cancelling module 12, the cancelling module 12 comprises a first transceiving unit 121 and a cancelling unit 122, wherein:
the first transceiving unit 121 is configured to send to the second mobile terminal a cancel-pairing request carrying the emotional interaction device's unique identification code, and to receive the cancel-pairing agreement sent by the second mobile terminal.
The cancelling unit 122 is configured, when the first transceiving unit 121 receives the cancel-pairing agreement sent by the second mobile terminal, to cancel the established pairing relationship for interaction of emotion information of the preset category among the first mobile terminal, the second mobile terminal, and the emotional interaction device corresponding to the unique identification code.
The second mobile terminal comprises a second transceiving unit configured to receive the cancel-pairing request sent by the first mobile terminal through the first transceiving unit, and to send the cancel-pairing agreement to the first mobile terminal.
With reference to Fig. 8, which is a schematic structural diagram of the emotional interaction device of the present invention, the emotional interaction device comprises: a wireless communication module 21, a processor 22 connected with the wireless communication module 21, and an information output unit 23 connected with the processor 22, wherein:
the wireless communication module 21 is configured to receive the emotion information of the preset category sent by the first mobile terminal and to send the received emotion information to the processor 22, the emotion information of the preset category sent by the first mobile terminal being what the first mobile terminal received from the second mobile terminal;
the processor 22 is configured to determine, according to the preset mapping relation between emotion information and prompt information, the prompt information corresponding to the emotion information received from the first mobile terminal;
the information output unit 23 is configured to output the prompt information determined by the processor.
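The wireless communication module → processor → information output unit path of Fig. 8 can be sketched as three cooperating objects. This Python sketch is purely illustrative; the class names, method names, and the single-entry `PROMPT_MAP` are assumptions, not part of the disclosure.

```python
# Assumed one-entry mapping standing in for Table 1.
PROMPT_MAP = {"very happy": "vibration: strong, twice; image: smiley"}

class OutputUnit:
    def output(self, prompt: str) -> str:
        # Stands in for driving the vibrator, sound/light, or display.
        return f"OUTPUT: {prompt}"

class Processor:
    def __init__(self, output_unit: OutputUnit):
        self.output_unit = output_unit

    def handle(self, emotion: str) -> str:
        # Look up the prompt information mapped to the received emotion
        # information and drive the output unit with it.
        prompt = PROMPT_MAP.get(emotion, "no mapping")
        return self.output_unit.output(prompt)

class WirelessModule:
    def __init__(self, processor: Processor):
        self.processor = processor

    def receive(self, emotion: str) -> str:
        # Forward emotion information received from the first mobile
        # terminal on to the processor.
        return self.processor.handle(emotion)

device = WirelessModule(Processor(OutputUnit()))
print(device.receive("very happy"))  # OUTPUT: vibration: strong, twice; image: smiley
```

The one-way chaining (module holds processor, processor holds output unit) mirrors the connections stated in the device claim.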
This first mobile terminal and the second mobile terminal can be all mobile phone, and this first mobile terminal and the second mobile terminal can be also other portable mobile termianls.This second mobile terminal can send the emotion information of pre-set categories to this first mobile terminal by the Internet, this Internet comprises mobile Internet, and this mobile Internet comprises cloud computing platform; This second mobile terminal also can send by GPRS/CDMA communication modes the emotion information of pre-set categories to this first mobile terminal.
The emotion information of this pre-set categories comprises the normally used word showing emotion of people, as: very delight, be very glad, very sad, very sad, very angry, very angry, very sad, be hard hit, very worried, be very much depressed, very sexy, very sexy etc.
The user may set the mapping relations between emotion information and prompting messages according to actual needs. A prompting message comprises vibration information, sound-and-light information and/or image information. In these mapping relations, an item of emotion information may correspond to vibration information only, to sound-and-light information only, or to image information only; it may also correspond to vibration information together with sound-and-light information, vibration information together with image information, sound-and-light information together with image information, or to vibration information, sound-and-light information and image information all at once. The vibration information includes vibration intensity, the number of vibrations, the duration of each vibration, the interval between vibrations, and so on. The sound-and-light information includes sound information and light information, where the sound information includes the sound content, sound volume, number of playbacks and interval between playbacks, and the light information includes the light colour, continuous or intermittent flashing, flashing duration and number of flashes. The image information includes, for example, a moon pattern, a heart pattern, a blown-kiss pattern, a hand-in-hand pattern and a smiley-face pattern. The mapping relations between preset emotion information and prompting messages are as shown in Table 1 above.
To present the same emotion information through diversified prompting messages, one item of emotion information may be mapped to a plurality of prompting messages in the mapping relations. For example, when the emotion information is "very happy", the prompting message may be: "Vibration: intensity strong; vibrate twice; each vibration lasts 5 seconds, with a 1-second interval between vibrations. Image: smiley face."
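The one-to-many configuration described above — a single item of emotion information carrying several prompt components at once — might be set up as in this sketch; `set_mapping` and the field names are assumptions for illustration, not part of the specification.

```python
# User-editable mapping relations; one emotion may carry several
# prompt components at once (here, vibration plus an image pattern).
mapping = {}


def set_mapping(emotion, **components):
    # Each keyword argument is one prompt component of the message.
    mapping[emotion] = dict(components)


set_mapping(
    "very happy",
    vibration={"intensity": "strong", "count": 2,
               "duration_s": 5, "interval_s": 1},
    image="smiley face",
)
```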
When the prompting message is vibration information, the information output unit 23 is a vibrator; when it is image information, the unit is a display unit; when it is sound-and-light information, the unit comprises a loudspeaker and an LED lamp; when it is vibration information and image information, the unit comprises a vibrator and a display unit; when it is vibration information and sound-and-light information, the unit comprises a vibrator, a loudspeaker and an LED lamp; when it is sound-and-light information and image information, the unit comprises a loudspeaker, an LED lamp and a display unit; and when it is vibration information, sound-and-light information and image information, the unit comprises a vibrator, a loudspeaker, an LED lamp and a display unit.
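The hardware combinations enumerated above follow directly from which components a prompting message contains, which a sketch like the following could encode. The unit and component names mirror the description, but the function itself is an illustrative assumption.

```python
# Output hardware required for each prompt component, per the
# combinations enumerated in the description.
UNIT_FOR = {
    "vibration": ["vibrator"],
    "sound_light": ["loudspeaker", "LED lamp"],
    "image": ["display unit"],
}


def units_needed(prompt):
    # Collect the hardware units for every component present in the
    # prompting message, in a fixed order.
    units = []
    for component in ("vibration", "sound_light", "image"):
        if component in prompt:
            units.extend(UNIT_FOR[component])
    return units


print(units_needed({"vibration": "strong", "image": "smiley face"}))
```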
Further, the wireless communication module 21 is a wifi unit, a Bluetooth unit or an NFC unit.
Further, the emotional interaction device 20 may be a piece of jewellery, such as a bangle or a necklace. In daily life, people commonly like to wear all kinds of jewellery. Because jewellery is in direct contact with the body, expressing through the jewellery the prompting message corresponding to the emotion information sent by the second mobile terminal is more direct. For example, when the emotional interaction device is a bangle and the prompting message determined in step S20 is "Vibration: intensity weak; vibrate once; vibration lasts 5 seconds", the user directly feels the vibration, thereby learns that the emotion information sent by the second mobile terminal is "very glad", and can share the good mood of the user of the second mobile terminal.
The foregoing is merely the preferred embodiments of the present invention and does not thereby limit the scope of the claims of the present invention; any equivalent structural or process transformation made using the contents of the specification and drawings of the present invention, or any direct or indirect use thereof in other related technical fields, shall likewise fall within the scope of patent protection of the present invention.
Claims (10)
1. An emotion information interaction method, characterized in that the method comprises:
a first mobile terminal receiving emotion information of a preset category from a second mobile terminal, and sending the received emotion information of the preset category to an emotional interaction device;
the emotional interaction device determining, according to mapping relations between preset emotion information and prompting messages, the prompting message corresponding to the emotion information received from the first mobile terminal, and controlling its information output unit to output the determined prompting message.
2. The method according to claim 1, characterized in that the emotional interaction device is jewellery.
3. The method according to claim 1, characterized in that the method further comprises:
the first mobile terminal establishing a communication connection with the emotional interaction device through a wifi unit, a Bluetooth unit or an NFC unit.
4. The method according to claim 1, characterized in that the prompting message comprises vibration information, sound-and-light information and/or image information.
5. The method according to claim 1, characterized in that, before the step of the first mobile terminal receiving the emotion information of the preset category from the second mobile terminal and sending the received emotion information of the preset category to the emotional interaction device, the method further comprises:
establishing, among the predetermined first mobile terminal, second mobile terminal and emotional interaction device, a pairing relationship for exchanging the emotion information of the preset category.
6. The method according to claim 5, characterized in that, after the step of establishing the pairing relationship for exchanging the emotion information of the preset category among the predetermined first mobile terminal, second mobile terminal and emotional interaction device, the method further comprises:
cancelling the established pairing relationship for exchanging the emotion information of the preset category among the first mobile terminal, the second mobile terminal and the emotional interaction device.
7. An emotion information interaction system, characterized in that it comprises a first mobile terminal and an emotional interaction device communicatively connected with the first mobile terminal, wherein:
the first mobile terminal is configured to receive emotion information of a preset category from a second mobile terminal, and to send the received emotion information of the preset category to the emotional interaction device;
the emotional interaction device is configured to determine, according to mapping relations between preset emotion information and prompting messages, the prompting message corresponding to the emotion information received from the first mobile terminal, and to control its information output unit to output the determined prompting message.
8. The system according to claim 7, characterized in that the prompting message comprises vibration information, sound-and-light information and/or image information.
9. An emotional interaction device, characterized in that the emotional interaction device comprises a wireless communication module, a processor connected with the wireless communication module, and an information output unit connected with the processor, wherein:
the wireless communication module is configured to receive the emotion information of the preset category sent by the first mobile terminal, and to send the received emotion information of the preset category to the processor; the emotion information of the preset category sent by the first mobile terminal is emotion information that the first mobile terminal received from the second mobile terminal;
the processor is configured to determine, according to mapping relations between preset emotion information and prompting messages, the prompting message corresponding to the emotion information received from the first mobile terminal;
the information output unit is configured to output the prompting message determined by the processor.
10. The emotional interaction device according to claim 9, characterized in that the wireless communication module is a wifi unit, a Bluetooth unit or an NFC unit.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201310486324.6A CN103533168A (en) | 2013-10-16 | 2013-10-16 | Sensibility information interacting method and system and sensibility interaction device |
Publications (1)
Publication Number | Publication Date |
---|---|
CN103533168A true CN103533168A (en) | 2014-01-22 |
Family
ID=49934825
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201310486324.6A Pending CN103533168A (en) | 2013-10-16 | 2013-10-16 | Sensibility information interacting method and system and sensibility interaction device |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN103533168A (en) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN104301202A (en) * | 2014-02-25 | 2015-01-21 | 王石强 | Vibration information expression method and system for instant messaging |
CN112311410A (en) * | 2020-11-03 | 2021-02-02 | 赵天辉 | Emotion interaction system and device |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101917512A (en) * | 2010-07-26 | 2010-12-15 | 宇龙计算机通信科技(深圳)有限公司 | Method and system for displaying head picture of contact person and mobile terminal |
US20110184721A1 (en) * | 2006-03-03 | 2011-07-28 | International Business Machines Corporation | Communicating Across Voice and Text Channels with Emotion Preservation |
CN102263802A (en) * | 2010-05-26 | 2011-11-30 | 不嘴炮工作室股份有限公司 | Long-distance interactive device and interactive unit thereof |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US9241061B2 (en) | Cell phone peripheral device, communication terminal and method for a cell phone peripheral device communicating with a cell phone | |
CN107095647B (en) | System and method for wireless device pairing | |
CN106251890B (en) | A kind of methods, devices and systems of recording song audio | |
CN106531165A (en) | Portable smart home voice control system and control method adopting same | |
CN108429972B (en) | Music playing method, device, terminal, earphone and readable storage medium | |
CN108431764A (en) | Electronic equipment and the method operated for control electronics | |
CN106201816B (en) | Reminding method and device | |
WO2015172519A1 (en) | Wearable device and user information exchange method based on wearable device | |
CN102868426A (en) | Watch-type man-machine mediated electronic equipment for conveniently acquiring information of mobilephone | |
JP2017147652A (en) | Information processing apparatus | |
CN110427073A (en) | Watchband adjusting method, wearable device and computer readable storage medium | |
CN108055735A (en) | A kind of sound control intelligent music lamp | |
CN110012148A (en) | A kind of bracelet control method, bracelet and computer readable storage medium | |
CN104519163A (en) | Intelligent wearable device with phone answering function | |
CN108694945A (en) | A kind of intelligent sound sofa | |
CN202004844U (en) | Mobile terminal utilizing gravity sensor to identify user's use habits | |
CN103533168A (en) | Sensibility information interacting method and system and sensibility interaction device | |
TW200818845A (en) | Method for utilizing a mobile communication device to search an object and a related mobile communication device | |
US10747181B2 (en) | Electronic talking stick | |
CN102438054A (en) | Method and system for dynamic setting of alarm clock time | |
CN111491286A (en) | Emergency rescue method, device and terminal | |
CN111167084A (en) | Rehabilitation training method, system and storage medium | |
CN206209638U (en) | Intelligent massaging mouse pad | |
CN107219769A (en) | The control method of lighting sound integration apparatus, apparatus and system | |
CN105641900B (en) | A kind of respiratory state based reminding method and electronic equipment and system |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
C06 | Publication | ||
PB01 | Publication | ||
C10 | Entry into substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
C12 | Rejection of a patent application after its publication | ||
RJ01 | Rejection of invention patent application after publication |
Application publication date: 20140122 |