CN105204748A - Terminal interaction method and device - Google Patents

Terminal interaction method and device

Info

Publication number
CN105204748A
CN105204748A (application CN201410302455.9A; granted as CN105204748B)
Authority
CN
China
Prior art keywords
terminal
manipulation
communication interface
message communication
interaction
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201410302455.9A
Other languages
Chinese (zh)
Other versions
CN105204748B (en)
Inventor
朱鹏程
王炜
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Advanced New Technologies Co Ltd
Original Assignee
Alibaba Group Holding Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Alibaba Group Holding Ltd filed Critical Alibaba Group Holding Ltd
Priority to CN201410302455.9A priority Critical patent/CN105204748B/en
Publication of CN105204748A publication Critical patent/CN105204748A/en
Application granted granted Critical
Publication of CN105204748B publication Critical patent/CN105204748B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Landscapes

  • User Interface Of Digital Computer (AREA)

Abstract

The invention provides a terminal interaction method and device. The method comprises: selecting a designated contact displayed on a message communication interface on a terminal screen, according to a detected trigger operation performed by a user on the terminal screen; generating, on the terminal screen, a manipulation mark corresponding to the designated contact; generating corresponding user behavior data according to a detected drag operation performed by the user on the manipulation mark; and sending the user behavior data to a server, so that the server generates a corresponding interaction manipulation instruction. With this technical scheme, interaction between terminals can be achieved conveniently during message communication, which helps improve the user experience.

Description

Terminal interaction method and device
Technical field
The present application relates to the field of communication technology, and in particular to a terminal interaction method and device.
Background art
With the development of Internet technology, more and more users tend to communicate by way of message communication. For example, through short messages or instant messaging tools, a user can send text, pictures and the like to the other party to exchange information. However, richer interaction modes are needed to improve on the message communication modes of the related art.
Summary of the invention
In view of this, the present application provides a new technical scheme that allows interaction between terminals to be achieved conveniently during message communication, helping improve the user experience.
To achieve the above object, the present application provides the following technical schemes:
According to a first aspect of the present application, a terminal interaction method is proposed, comprising:
selecting a designated contact on a message communication interface shown on a terminal screen, according to a detected trigger operation performed by a user on the terminal screen;
generating, on the terminal screen, a manipulation mark corresponding to the designated contact;
generating corresponding user behavior data according to a detected drag operation performed by the user on the manipulation mark; and
sending the user behavior data to a server, so that the server generates a corresponding interaction manipulation instruction.
According to a second aspect of the present application, a terminal interaction device is proposed, comprising:
a selection unit that selects a designated contact on a message communication interface shown on a terminal screen, according to a detected trigger operation performed by a user on the terminal screen;
a mark generation unit that generates, on the terminal screen, a manipulation mark corresponding to the designated contact;
a data generation unit that generates corresponding user behavior data according to a detected drag operation performed by the user on the manipulation mark; and
a sending unit that sends the user behavior data to a server, so that the server generates a corresponding interaction manipulation instruction.
According to a third aspect of the present application, a terminal interaction method is proposed, comprising:
selecting a designated contact on a message communication interface shown on a terminal screen, according to a detected trigger operation performed by a user on the terminal screen;
generating, on the terminal screen, a manipulation mark corresponding to the designated contact;
generating corresponding user behavior data according to a detected drag operation performed by the user on the manipulation mark; and
sending the user behavior data, either directly or via a server, to the terminal of the designated contact, so that the terminal of the designated contact generates a corresponding interaction manipulation instruction.
According to a fourth aspect of the present application, a terminal interaction device is proposed, comprising:
a selection unit that selects a designated contact on a message communication interface shown on a terminal screen, according to a detected trigger operation performed by a user on the terminal screen;
a mark generation unit that generates, on the terminal screen, a manipulation mark corresponding to the designated contact;
a data generation unit that generates corresponding user behavior data according to a detected drag operation performed by the user on the manipulation mark; and
a sending unit that sends the user behavior data, either directly or via a server, to the terminal of the designated contact, so that the terminal of the designated contact generates a corresponding interaction manipulation instruction.
According to a fifth aspect of the present application, a terminal interaction method is proposed, comprising:
selecting a designated contact on a message communication interface shown on a terminal screen, according to a detected trigger operation performed by a user on the terminal screen;
generating, on the terminal screen, a manipulation mark corresponding to the designated contact;
generating corresponding user behavior data according to a detected drag operation performed by the user on the manipulation mark; and
generating a corresponding interaction manipulation instruction according to the user behavior data, and sending it, either directly or via a server, to the terminal of the designated contact.
According to a sixth aspect of the present application, a terminal interaction device is proposed, comprising:
a selection unit that selects a designated contact on a message communication interface shown on a terminal screen, according to a detected trigger operation performed by a user on the terminal screen;
a mark generation unit that generates, on the terminal screen, a manipulation mark corresponding to the designated contact;
a data generation unit that generates corresponding user behavior data according to a detected drag operation performed by the user on the manipulation mark;
a generation unit that generates a corresponding interaction manipulation instruction according to the user behavior data; and
a sending unit that sends the generated interaction manipulation instruction, either directly or via a server, to the terminal of the designated contact.
As can be seen from the above technical schemes, the present application uses drag operations on a message communication interface to achieve interaction between terminals conveniently, helping improve the user experience.
Brief description of the drawings
Fig. 1 is a schematic flow diagram of a terminal interaction method according to an exemplary embodiment of the present application;
Fig. 2 is a schematic flow diagram of selecting a contact according to an exemplary embodiment of the present application;
Figs. 3A-3B are schematic interface diagrams of a trigger operation according to an exemplary embodiment of the present application;
Figs. 4A-4B are schematic interface diagrams of generating a manipulation mark according to an exemplary embodiment of the present application;
Fig. 5 is a schematic interface diagram of dragging a manipulation mark according to an exemplary embodiment of the present application;
Figs. 6A-6B are schematic interface diagrams of an interaction effect according to an exemplary embodiment of the present application;
Fig. 7 is a schematic diagram of implementing an interactive operation between terminals according to an exemplary embodiment of the present application;
Fig. 8 is a schematic diagram of implementing an interactive operation between terminals according to another exemplary embodiment of the present application;
Fig. 9 is a schematic diagram of implementing an interactive operation between terminals according to yet another exemplary embodiment of the present application;
Fig. 10 is a schematic diagram of implementing an interactive operation between terminals according to yet another exemplary embodiment of the present application;
Fig. 11 is a schematic diagram of implementing an interactive operation between terminals according to yet another exemplary embodiment of the present application;
Figs. 12A-12B are schematic interface diagrams of terminal interaction according to an exemplary embodiment of the present application;
Fig. 13 is a schematic interface diagram of achieving an interaction effect according to a drag track, according to an exemplary embodiment of the present application;
Fig. 14 is a schematic interface diagram of achieving an interaction effect according to a drag track, according to another exemplary embodiment of the present application;
Fig. 15 is a schematic structural diagram of an electronic device according to an exemplary embodiment of the present application;
Fig. 16 is a schematic block diagram of a terminal interaction device according to an exemplary embodiment of the present application;
Fig. 17 is a schematic block diagram of a terminal interaction device according to another exemplary embodiment of the present application;
Fig. 18 is a schematic block diagram of a terminal interaction device according to yet another exemplary embodiment of the present application.
Detailed description of embodiments
When users interact by way of message communication, the message communication may take the form of short messages (SMS) or multimedia messages (MMS), or the form of instant messaging such as WeChat or QQ. By installing a corresponding application on a PC, or on a mobile device such as a mobile phone or tablet computer, a user can send messages such as text and pictures to other users. In traditional communication modes, however, the user can only enter text or pictures in a fixed input box, which makes the interaction overly monotonous.
In the technical scheme of the present application, by contrast, drag operations on the message communication interface allow terminal interaction between users to be achieved conveniently, helping improve the user experience. The present application is further described through the following embodiments:
Referring to Fig. 1, which shows a schematic flow diagram of a terminal interaction method according to an exemplary embodiment of the present application, the method comprises:
Step 102: selecting a designated contact on a message communication interface shown on a terminal screen, according to a detected trigger operation performed by a user on the terminal screen;
Step 104: generating, on the terminal screen, a manipulation mark corresponding to the designated contact;
Step 106: generating corresponding user behavior data according to a detected drag operation performed by the user on the manipulation mark;
Step 108: sending the user behavior data to a server, so that the server generates a corresponding interaction manipulation instruction.
In this technical scheme, the user can send text, pictures and the like through the message communication interface to carry out traditional message communication, and can also generate a manipulation mark by a trigger on the terminal screen, thereby achieving interaction control based on drag operations. This enriches the terminal interaction modes and helps improve the user experience.
In step 102, the user's trigger operation on the terminal screen selects a contact on the message communication interface. The detailed process, shown in Fig. 2, comprises:
Step 202: when the user's trigger operation on the terminal screen is detected, determining the trigger position of that operation on the terminal screen. Since the relative position between the message communication interface and the terminal screen is fixed while the interface is displayed, this trigger position can also be regarded as the position of the trigger operation on the message communication interface.
The user's trigger operation can take many forms. In one exemplary embodiment, on a touch-enabled mobile device such as a mobile phone or tablet computer, the user may click, double-click or long-press on the terminal screen in the manner shown in Fig. 3A as the trigger operation. In another exemplary embodiment, on a desktop computer, or on a mobile device connected to a peripheral such as a mouse via OTG (On-The-Go) technology, the user may control a cursor displayed on the terminal screen through the peripheral, and click, double-click or long-press in the manner shown in Fig. 3B as the trigger operation.
Step 204: based on the trigger position, judging whether it lies within the region corresponding to the avatar of any contact on the message communication interface. If so, proceed to step 206; otherwise return to step 202 and judge again based on a subsequent trigger operation.
Step 206: selecting the contact corresponding to the trigger position. For example, as shown in Fig. 3A, the terminal detects a touch operation by user Lucy on the terminal screen and obtains the concrete trigger position; when this trigger position lies within the region corresponding to the avatar of user Jame (for example, when the trigger position overlaps the region of Jame's avatar, the trigger position is considered to lie within that region), Jame is selected. Similarly, in Fig. 3B, if the cursor clicks on Jame's avatar and the click lasts longer than a preset time threshold, user Lucy is considered to have triggered on Jame, and Jame is selected.
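The contact-selection judgment of steps 202 through 206 can be sketched as a simple hit test, assuming rectangular avatar regions and a rectangular touch contact area; the `Rect` and `pick_contact` names, the coordinates, and the overlap rule are all illustrative choices, not taken from the patent:

```python
from dataclasses import dataclass

@dataclass
class Rect:
    x: float
    y: float
    w: float
    h: float

    def overlaps(self, other):
        # Axis-aligned rectangles overlap unless one lies entirely
        # beyond an edge of the other.
        return not (self.x + self.w <= other.x or other.x + other.w <= self.x
                    or self.y + self.h <= other.y or other.y + other.h <= self.y)

def pick_contact(touch, avatars):
    """Steps 202-206 in miniature: return the first contact whose avatar
    region overlaps the touch area, or None so the caller keeps listening."""
    for name, region in avatars.items():
        if region.overlaps(touch):
            return name
    return None

avatars = {"Jame": Rect(10, 10, 48, 48), "Adam": Rect(10, 70, 48, 48)}
assert pick_contact(Rect(30, 30, 8, 8), avatars) == "Jame"   # inside Jame's avatar
assert pick_contact(Rect(200, 200, 8, 8), avatars) is None   # missed every avatar
```

A real terminal would run this test against the avatar layout of the currently displayed message communication interface each time a trigger operation is detected.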
In step 104, the manipulation mark generated on the terminal screen may take any pattern, such as the circle shown in Fig. 4A. The manipulation mark can be made to float above the message communication interface; that is, the "manipulation mark" and the "message communication interface" can be understood as residing in different layers, with the layer of the manipulation mark above the layer of the message communication interface. Thus, as shown in Fig. 4A, when the user drags the manipulation mark by touching the touch screen (or through a peripheral such as a mouse), the mark moves with the drag operation, but the drag operation itself does not affect the message communication interface.
Of course, to make the user's operation more enjoyable and to associate the contact selection with the subsequent dragging of the manipulation mark, the manipulation mark can be derived from the selected contact. For example, as shown in Fig. 4B, when Lucy long-presses Jame's avatar so that Jame is selected, the avatar picture of Jame can be extracted and floated above the message communication interface to serve as the manipulation mark.
Assume Lucy has selected Jame as the operation target, and a manipulation mark bearing Jame's avatar picture has been generated on the message communication interface. Lucy can then drag this manipulation mark in a certain manner, so that a corresponding interaction effect is presented on the message communication interface on Jame's terminal screen. In one exemplary embodiment, Lucy's drag operation on the manipulation mark is, as shown in Fig. 5, to move the mark back and forth along a first direction (for example, toward the lower left) and a second direction opposite to the first direction (for example, toward the upper right).
Corresponding to the reciprocating motion of the manipulation mark shown in Fig. 5, Lucy's terminal collects corresponding user behavior data, such as the drag track of the drag operation, and sends this user behavior data to the server. The server generates a corresponding interaction manipulation instruction and sends it to Jame's terminal; Jame's terminal then executes the instruction to realize the corresponding interaction effect. Specifically, the interaction effect may comprise at least one, or a combination, of the following: shaking the corresponding message communication interface, presenting a preset picture on the corresponding message communication interface, and vibrating the corresponding terminal. For example, Fig. 6A shows an interface on Jame's terminal that contains entrances to the message communication interfaces between Jame and Lucy and between Jame and other users such as Adam (i.e. the rectangular sub-interfaces in Fig. 6A, each corresponding to a contact: the first sub-interface corresponds to Adam and enters the message communication interface of Jame and Adam, while the second corresponds to Lucy and enters the message communication interface of Jame and Lucy). Based on Lucy's drag operation, the sub-interface corresponding to Lucy can be made to shake so as to prompt Jame. After Jame taps into the message communication interface with Lucy, the picture shown in Fig. 6B (for example Lucy's avatar) can be displayed on that interface, while the interface and/or Jame's terminal shakes or vibrates at the same time.
Further, the reciprocating frequency of the manipulation mark can be made proportional to the intensity of the interaction effect corresponding to the interaction manipulation instruction. For example, if Lucy hopes to get a reply from Jame as soon as possible, the reciprocating frequency of the manipulation mark while Lucy drags it can be recorded in the user behavior data. The higher the reciprocating frequency, the more strongly the interface on Jame's terminal shakes and/or the terminal vibrates, expressing Lucy's eagerness and helping improve the user experience.
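The proportional relation between reciprocating frequency and effect intensity could, under one assumed design, be a clamped linear mapping; the frequency range and the `vibration_intensity` helper below are invented purely for illustration:

```python
def vibration_intensity(frequency_hz, f_min=0.5, f_max=5.0):
    """Map the reciprocating-drag frequency (Hz) to an effect intensity
    in [0, 1]: a faster back-and-forth drag yields a proportionally
    stronger shake/vibration, clamped at the ends of the range."""
    clamped = max(f_min, min(f_max, frequency_hz))
    return (clamped - f_min) / (f_max - f_min)

assert vibration_intensity(0.5) == 0.0    # slowest drag: weakest effect
assert vibration_intensity(5.0) == 1.0    # fastest drag: strongest effect
assert vibration_intensity(2.75) == 0.5   # midpoint of the range
assert vibration_intensity(99.0) == 1.0   # out-of-range input is clamped
```

The receiving terminal could scale its vibration amplitude or shake animation by this value.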
It can be seen that in the above embodiment, the drag operation (reciprocating motion) on the manipulation mark on Lucy's terminal side is presented as shaking of the sub-interface or message communication interface, and/or vibration of the terminal, on Jame's terminal side. Based on similar means, the terminal interaction process of the present application can be realized in full. The whole process is described in detail with reference to Fig. 7, again taking as an example users Lucy and Jame carrying out message communication through their respective terminals A and B.
On Lucy's side:
On the message communication interface on terminal A, Lucy long-presses Jame's avatar to select Jame as the operation target. Terminal A floats a corresponding manipulation mark, which may specifically be Jame's avatar picture. Lucy drags this manipulation mark; terminal A records the corresponding user behavior data packet and uploads it to the server.
On the server side:
The server receives the user behavior data packet from terminal A, parses out the user behavior data therein, such as the drag track and the motion frequency, and matches it against the pre-stored correspondence between user behavior data and interaction manipulation instructions to determine the interaction manipulation instruction corresponding to the packet. It then sends that interaction manipulation instruction to terminal B.
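The server-side matching step might, in a minimal sketch, be a lookup against a pre-stored correspondence table. The packet fields, rule names and instruction identifiers below are hypothetical, chosen only to mirror the drag patterns described in the embodiments:

```python
# Hypothetical pre-stored correspondence between recognized user behavior
# (drag-track types) and interaction manipulation instructions.
RULES = {
    "reciprocate": "SHAKE_AND_VIBRATE",  # back-and-forth drag (Fig. 5)
    "circle":      "SHOW_SMILEY",        # circular track (Fig. 13)
    "wave":        "SHOW_DIZZY",         # wave-shaped track (Fig. 14)
}

def resolve_instruction(packet):
    """Parse a behavior-data packet (track type + frequency) and return the
    matching interaction instruction, or None when nothing matches."""
    instruction = RULES.get(packet.get("track"))
    if instruction is None:
        return None
    return {"cmd": instruction, "intensity": packet.get("frequency", 1.0)}

pkt = {"track": "reciprocate", "frequency": 3.0}
assert resolve_instruction(pkt) == {"cmd": "SHAKE_AND_VIBRATE", "intensity": 3.0}
assert resolve_instruction({"track": "zigzag"}) is None  # unrecognized track
```

The resulting instruction, much smaller than the raw behavior data, is what would be forwarded to terminal B.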
On Jame's side:
Terminal B receives the interaction manipulation instruction from the server and executes it, so that the corresponding interaction effect is presented on terminal B: for example, the corresponding message communication interface shakes, a preset picture is presented on the corresponding message communication interface, or the terminal vibrates.
In the interaction process shown in Fig. 7, terminal A only needs to upload the user behavior data packet directly, while the server determines the corresponding interaction manipulation instruction. Terminal A therefore does not need to pre-store the user behavior data, the interaction manipulation instructions and the correspondence between the two, which helps lower the configuration requirements on terminal A and makes full use of the storage and computing capabilities of the server. Meanwhile, terminal B only needs to receive the interaction manipulation instruction from the server, whose data volume is much smaller than that of the user behavior data packet. This helps reduce the bandwidth occupation and traffic consumption of terminal B, and in particular helps reduce data charges on a mobile network.
Of course, the above flow can obviously be completed in other ways. In one illustrative embodiment, as shown in Fig. 8, the server may not determine the interaction manipulation instruction itself, but directly forward the user behavior data packet from terminal A to terminal B; terminal B then looks up and executes the corresponding interaction manipulation instruction according to its pre-stored user behavior data, interaction manipulation instructions and the correspondence between the two, to realize the corresponding interaction effect.
In another illustrative embodiment, as shown in Fig. 9, the interaction manipulation instruction may also be determined by terminal A itself, rather than by the server or terminal B. Specifically, after terminal A has sampled the user behavior data packet, it directly looks up the interaction manipulation instruction corresponding to the sampled packet according to its pre-stored user behavior data, interaction manipulation instructions and the correspondence between the two, and then sends that interaction manipulation instruction to terminal B via the server. Since the data volume of the interaction manipulation instruction is smaller than that of the user behavior data packet, this helps reduce the data traffic consumed by terminals A and B in the interaction process and avoids high cost on a mobile network.
In another illustrative embodiment, assume that instant messaging is carried out between terminal A and terminal B, taking terminal A as the initiator of the communication. Based on the user's operation, after terminal A sends the server a request to communicate with terminal B, the server informs terminal A of the IP address, TCP (Transmission Control Protocol) port number and so on of terminal B. Based on this information, terminal A can establish a direct connection with terminal B, realizing point-to-point communication (i.e. a peer-to-peer communication mode) that requires no relay through the server. As shown in Fig. 10, when the point-to-point communication mode is adopted between terminal A and terminal B, terminal A can send the user behavior data packet directly to terminal B without server forwarding; terminal B then, according to the received user behavior data packet and its pre-stored user behavior data, interaction manipulation instructions and the correspondence between the two, looks up and executes the corresponding interaction manipulation instruction to realize the corresponding interaction effect.
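The point-to-point path of Fig. 10 can be illustrated with a plain TCP exchange. The loopback address here stands in for the IP address and port number the server would announce, and the JSON packet format is an invented stand-in for the user behavior data packet:

```python
import json
import socket
import threading

def terminal_b(listener, received):
    """Terminal B: accept one direct connection and decode a behavior packet."""
    conn, _ = listener.accept()
    with conn:
        received.append(json.loads(conn.recv(4096).decode()))

# Loopback stands in for terminal B's announced IP address / TCP port.
listener = socket.socket()
listener.bind(("127.0.0.1", 0))
listener.listen(1)
host, port = listener.getsockname()

received = []
t = threading.Thread(target=terminal_b, args=(listener, received))
t.start()

# Terminal A sends the packet directly; no server relay sits in the data path.
packet = {"track": "reciprocate", "frequency": 3.0}
with socket.create_connection((host, port)) as c:
    c.sendall(json.dumps(packet).encode())

t.join()
listener.close()
assert received == [packet]  # terminal B got the packet unchanged
```

Terminal B would then feed the decoded packet into its local correspondence lookup, exactly as in the server-forwarding variants.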
In another illustrative embodiment, as shown in Fig. 11, when the point-to-point communication mode is adopted between terminal A and terminal B, terminal A can likewise, after sampling the user behavior data packet, look up the corresponding interaction manipulation instruction according to its pre-stored user behavior data, interaction manipulation instructions and the correspondence between the two, and send that interaction manipulation instruction directly to terminal B, without data relay by the server.
In addition, considering that terminal A may also be on a mobile network, and thus sensitive to data traffic, the present application proposes a further improvement. In the embodiments of Figs. 7, 8 and 10 and the like, after collecting user Lucy's drag operation and generating the corresponding user behavior data, terminal A searches for local pre-stored data matching that user behavior data; only if such data exists is the user behavior data (which may be packaged as a user behavior data packet) sent to the server, and otherwise it is not sent. When Lucy's drag operation does not match the local pre-stored data, Lucy most likely does not wish to trigger terminal interaction with Jame. For example, suppose Lucy originally intended to perform the "reciprocating" drag operation shown in Fig. 5 but changed her mind midway through the drag. If Lucy simply lifts her finger, terminal A may match the corresponding user behavior data to "reciprocating motion" and send it to the server, causing terminal B to present the interface-shaking and/or terminal-vibration interaction effect. If instead Lucy, rather than lifting her finger immediately, continues the drag along a random track on terminal A's screen, the user behavior data of this drag operation will not match the local pre-stored data; terminal A will then not send it to the server, no interaction with terminal B will be triggered, and terminal A's mobile-network traffic will not be wasted.
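This local pre-match gate might be sketched as follows, assuming the drag has already been classified into a track type; the `KNOWN_TRACKS` set and `maybe_upload` helper are illustrative, not part of the patent:

```python
# Hypothetical local pre-stored data: the track types terminal A recognizes.
KNOWN_TRACKS = {"reciprocate", "circle", "wave"}

def maybe_upload(track_kind, send):
    """Send the behavior data only if it matches local pre-stored data;
    an unmatched track (e.g. a random abort scribble) is silently dropped,
    saving mobile-network traffic. Returns whether anything was sent."""
    if track_kind in KNOWN_TRACKS:
        send({"track": track_kind})
        return True
    return False

sent = []
assert maybe_upload("reciprocate", sent.append) is True    # matched: uploaded
assert maybe_upload("random-scribble", sent.append) is False  # dropped locally
assert sent == [{"track": "reciprocate"}]
```

In the Fig. 9 and Fig. 11 variants the same gate would apply before the local instruction lookup instead of before the upload.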
It should be noted that although the embodiments shown in Figs. 3A-6B are described taking the message communication interface between Lucy and Jame as an example, the message communication interface may obviously also be a group communication interface, i.e. an interface used for a communication process involving more than two contacts. Lucy can still select any contact on this message communication interface and perform a drag operation to achieve similar terminal interaction; the detailed process is not repeated here.
In the above embodiments, Lucy achieves terminal interaction with Jame by directly triggering and selecting another contact on the terminal screen she uses, and performing a drag operation on the manipulation mark. In fact, the present application also proposes many other ways, described in detail below through exemplary embodiments.
Please refer to Figure 12 A, still for the terminal interaction between Lucy and Jame.Assuming that the Lucy long head portrait by oneself on a terminal screen, thus trigger the selection to Lucy self; Based on the selection to Lucy self, appear manipulation mark on a terminal screen in one's mind, this manipulation mark can adopt the head portrait picture of Lucy to be the form of expression, this manipulation is identified tightr with associating between the selection to Lucy self.Then, Lucy drags manipulation mark, the region that the head portrait making it be dragged to another contact person is corresponding, such as makes manipulation mark all or part of overlapping with the head portrait of Jame, then triggers the terminal interaction between Lucy and Jame.Particularly, can at the icon being corresponded to this terminal interaction type by display near the head portrait of Jame that operates, than " lip " as illustrated in fig. 12, to represent that this terminal interaction is for " Lucy kisses Jame ".
Corresponding, Figure 12 B shows the display situation of the terminal screen that Jame uses.If terminal screen is directly shown as the message communication interface of Jame and Lucy, then directly can appear " lip " pattern in one's mind on this message communication interface, can also shake this message communication interface and/or the terminal of Jame be vibrated.Or, if what the interface that terminal screen shows comprised the message communication interface of Jame and Lucy enters openning interface, then can shake this sub-interface, the pattern of " lip " can also be demonstrated at this sub-near interface, facilitate Jame to the preview of concrete interaction content.
Of course, in this mode of selecting the user herself and then triggering terminal interaction with another contact, the drag track traced by the manipulation mark before it reaches the region of the other contact's avatar can also be detected, and the corresponding interaction manipulation instruction determined according to that drag track, so as to extend the number of terminal interaction types.
In one exemplary embodiment, Lucy can adopt the dragging manner shown in Fig. 13, making the drag track include a circle; the interaction manipulation instruction corresponding to the drag track is then looked up through a matching operation on the server or on the terminal of Jame's side. Specifically, this interaction manipulation instruction may, for example, present the "smiley face" pattern shown on the right of Fig. 13 on the message communication interface of the terminal on Jame's side, and may also shake that interface and/or vibrate the terminal on Jame's side.
In another exemplary embodiment, Lucy can adopt the dragging manner shown in Fig. 14, making the drag track include a "wave" shape; the interaction manipulation instruction corresponding to the drag track is then looked up through a matching operation on the server or on the terminal of Jame's side. Specifically, this interaction manipulation instruction may, for example, present the "dizzy" pattern shown on the right of Fig. 14 on the message communication interface of the terminal on Jame's side, and may also shake that interface and/or vibrate the terminal on Jame's side.
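Distinguishing a "circle" track (Fig. 13) from a "wave" track (Fig. 14) could be done with crude geometric heuristics, for example path closure for circles and oscillation of y while x advances for waves. The thresholds and the `classify_track` helper below are arbitrary illustrative choices, not the patent's matching method:

```python
import math

def classify_track(points):
    """Crude classifier for a drag track (list of (x, y) samples):
    'circle' if the path closes on itself, 'wave' if y oscillates while
    x advances monotonically, otherwise 'unknown'."""
    length = sum(math.dist(a, b) for a, b in zip(points, points[1:]))
    closure = math.dist(points[0], points[-1])
    if length > 0 and closure < 0.2 * length:   # endpoints nearly meet
        return "circle"
    xs = [p[0] for p in points]
    ys = [p[1] for p in points]
    x_monotonic = all(b >= a for a, b in zip(xs, xs[1:]))
    dy = [b - a for a, b in zip(ys, ys[1:])]
    sign_changes = sum(1 for a, b in zip(dy, dy[1:]) if a * b < 0)
    if x_monotonic and sign_changes >= 2:       # y goes up and down repeatedly
        return "wave"
    return "unknown"

circle = [(math.cos(t), math.sin(t))
          for t in [i * 2 * math.pi / 16 for i in range(17)]]
wave = [(x / 4, math.sin(x)) for x in range(10)]
assert classify_track(circle) == "circle"
assert classify_track(wave) == "wave"
```

Whichever side performs the matching (server or receiving terminal) could map the resulting label to the corresponding pattern, shake or vibration effect.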
Fig. 15 shows a schematic structural diagram of an electronic device according to an exemplary embodiment of the present application. Referring to Fig. 15, at the hardware level, the electronic device comprises a processor, an internal bus, a network interface, memory and non-volatile storage, and may of course also comprise hardware required by other services. The processor reads the corresponding computer program from the non-volatile storage into memory and runs it, forming the terminal interaction device at the logical level. Of course, besides the software implementation, the present application does not exclude other implementations, such as logic devices or a combination of software and hardware; that is, the execution subject of the following processing flow is not limited to logical units, and may also be hardware or logic devices.
Referring to Figure 16, in a software implementation, when the above electronic device adopts the processing shown in Figure 7, the terminal interaction device may comprise a selection unit, an identifier generation unit, a data generation unit, and a transmitting unit, wherein:
the selection unit selects, according to a detected trigger operation of a user on a terminal screen, a designated contact on a message communication interface displayed on the terminal screen;
the identifier generation unit generates, on the terminal screen, a manipulation identifier corresponding to the designated contact;
the data generation unit generates corresponding user behavior data according to a detected drag operation of the user on the manipulation identifier; and
the transmitting unit sends the user behavior data to a server, so that a corresponding interaction manipulation instruction is generated by the server.
Optionally, the selection unit detects a trigger position corresponding to the trigger operation, and if the trigger position is located in a region corresponding to the avatar of any contact, selects that contact.
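The optional hit test performed by the selection unit can be illustrated as follows; the rectangular avatar region and the names used here are assumptions for illustration, since the patent does not fix a region shape.

```python
from dataclasses import dataclass

@dataclass
class AvatarRegion:
    """Screen rectangle occupied by one contact's avatar."""
    contact_id: str
    x: float
    y: float
    width: float
    height: float

    def contains(self, px, py):
        # Inclusive bounds check against the rectangle.
        return (self.x <= px <= self.x + self.width
                and self.y <= py <= self.y + self.height)

def select_contact(trigger_pos, regions):
    """Return the contact whose avatar region contains the trigger
    position, or None if the trigger falls outside every avatar.
    Mirrors the selection unit's behaviour as a sketch."""
    px, py = trigger_pos
    for region in regions:
        if region.contains(px, py):
            return region.contact_id
    return None
```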
Optionally, the identifier generation unit extracts the avatar picture of the designated contact and floats the avatar picture over the message communication interface, to serve as the manipulation identifier.
Optionally, when the designated contact is not the user, if the detected drag operation of the user on the manipulation identifier is a reciprocating movement along a first direction and a second direction opposite to the first direction, the interaction manipulation instruction is: presenting a corresponding interaction effect on the message communication interface displayed on the terminal screen of the designated contact.
Optionally, the reciprocating frequency of the manipulation identifier is positively correlated with the severity of the interaction effect corresponding to the interaction manipulation instruction.
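One plausible way to realize the positive correlation between reciprocating frequency and effect severity is a capped monotone mapping; the one-level-per-Hz rule and the level cap below are assumptions, as the patent only requires that severity grow with frequency.

```python
def effect_intensity(reversal_count, duration_s, max_level=10):
    """Map the reciprocating (back-and-forth) frequency of the drag to
    an effect severity level between 1 and max_level.

    reversal_count: number of direction reversals detected in the drag.
    duration_s: duration of the drag in seconds.
    """
    if duration_s <= 0:
        raise ValueError("duration must be positive")
    frequency_hz = reversal_count / duration_s
    # One severity level per Hz, clamped to [1, max_level]; any other
    # monotone mapping would satisfy the positive correlation equally.
    return min(max_level, max(1, round(frequency_hz)))
```

The receiving terminal could then scale, for example, shake amplitude or vibration duration by the returned level.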
Optionally, when the designated contact is the user himself or herself, if the detected drag operation of the user on the manipulation identifier is dragging the manipulation identifier to a region corresponding to the avatar of another contact, the interaction manipulation instruction is: presenting a corresponding interaction effect on the message communication interface displayed on the terminal screen of the other contact.
Optionally, the detected drag operation of the user on the manipulation identifier further comprises: a drag track preceding the dragging of the manipulation identifier to the region corresponding to the avatar of the other contact;
wherein the interaction effect corresponding to the interaction manipulation instruction is related to the drag track.
Optionally, the interaction effect comprises at least one or a combination of the following: shaking the corresponding message communication interface, presenting a preset picture on the corresponding message communication interface, and vibrating the corresponding terminal.
Optionally, the device further comprises:
a lookup unit, which looks up whether locally pre-stored data matching the user behavior data exists;
wherein, if the lookup result is that such data exists, the transmitting unit sends the user behavior data to the server, and otherwise does not send it.
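The lookup unit's gating of the transmitting unit can be sketched as below; the membership-style match predicate and the callable standing in for the transmitting unit are illustrative assumptions.

```python
def maybe_send(user_behavior_data, prestored_patterns, send):
    """Send behavior data to the server only if it matches some locally
    pre-stored pattern; otherwise drop it.

    send: callable standing in for the transmitting unit.
    prestored_patterns: the locally pre-stored data; simple membership
    is used as the match predicate for illustration.
    """
    if user_behavior_data in prestored_patterns:
        send(user_behavior_data)
        return True
    return False
```

This local filter avoids sending drag data to the server when no interaction manipulation instruction could match it anyway.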
Optionally, the manipulation identifier floats over the message communication interface.
Optionally, the device further comprises:
a receiving unit, which receives from the server an interaction manipulation instruction from the terminal of any contact; and
a presentation unit, which executes the received interaction manipulation instruction and presents a corresponding interaction effect on the message communication interface corresponding to that contact.
Referring to Figure 17, in a software implementation, when the above electronic device adopts the processing shown in Figure 8 or Figure 10, the terminal interaction device may comprise a selection unit, an identifier generation unit, a data generation unit, and a transmitting unit, wherein:
the selection unit selects, according to a detected trigger operation of a user on a terminal screen, a designated contact on a message communication interface displayed on the terminal screen;
the identifier generation unit generates, on the terminal screen, a manipulation identifier corresponding to the designated contact;
the data generation unit generates corresponding user behavior data according to a detected drag operation of the user on the manipulation identifier; and
the transmitting unit sends the user behavior data, directly or via a server, to the terminal of the designated contact, so that a corresponding interaction manipulation instruction is generated by the terminal of the designated contact.
Optionally, the selection unit detects a trigger position corresponding to the trigger operation, and if the trigger position is located in a region corresponding to the avatar of any contact, selects that contact.
Optionally, the identifier generation unit extracts the avatar picture of the designated contact and floats the avatar picture over the message communication interface, to serve as the manipulation identifier.
Optionally, when the designated contact is not the user, if the detected drag operation of the user on the manipulation identifier is a reciprocating movement along a first direction and a second direction opposite to the first direction, the interaction manipulation instruction is: presenting a corresponding interaction effect on the message communication interface displayed on the terminal screen of the designated contact.
Optionally, the reciprocating frequency of the manipulation identifier is positively correlated with the severity of the interaction effect corresponding to the interaction manipulation instruction.
Optionally, when the designated contact is the user himself or herself, if the detected drag operation of the user on the manipulation identifier is dragging the manipulation identifier to a region corresponding to the avatar of another contact, the interaction manipulation instruction is: presenting a corresponding interaction effect on the message communication interface displayed on the terminal screen of the other contact.
Optionally, the detected drag operation of the user on the manipulation identifier further comprises: a drag track preceding the dragging of the manipulation identifier to the region corresponding to the avatar of the other contact;
wherein the interaction effect corresponding to the interaction manipulation instruction is related to the drag track.
Optionally, the interaction effect comprises at least one or a combination of the following: shaking the corresponding message communication interface, presenting a preset picture on the corresponding message communication interface, and vibrating the corresponding terminal.
Optionally, the device further comprises:
a lookup unit, which looks up whether locally pre-stored data matching the user behavior data exists;
wherein, if the lookup result is that such data exists, the transmitting unit sends the user behavior data to the server, and otherwise does not send it.
Optionally, the manipulation identifier floats over the message communication interface.
Optionally, the device further comprises:
a receiving unit, which receives, directly or via a server, user behavior data from the terminal of any contact;
a generation unit, which generates a corresponding interaction manipulation instruction according to the received user behavior data; and
a presentation unit, which executes the generated interaction manipulation instruction and presents a corresponding interaction effect on the message communication interface corresponding to that contact.
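For this variant, the receiving side's pipeline (receive behavior data, generate an interaction manipulation instruction, present the effect) might look like the sketch below; the instruction table mirrors the Figure 13/14 examples, but its keys and field names are assumptions for illustration.

```python
# Illustrative mapping from received behavior data to instructions,
# echoing the Figure 13/14 examples (field names are assumptions).
INSTRUCTION_TABLE = {
    "circle": {"show": "smiley", "shake_ui": True, "vibrate": True},
    "wave": {"show": "dizzy", "shake_ui": True, "vibrate": True},
}

def handle_peer_behavior(user_behavior_data, presenter):
    """Receiving-side pipeline sketch: the peer terminal turns received
    behavior data into an interaction manipulation instruction and
    hands it to `presenter`, a callable abstracting the UI layer.
    Returns the instruction, or None for unrecognized data."""
    instruction = INSTRUCTION_TABLE.get(user_behavior_data)
    if instruction is None:
        return None  # no matching instruction: no effect presented
    presenter(instruction)
    return instruction
```

In a real client, `presenter` would shake the message communication interface, show the preset picture, and trigger device vibration as the instruction requires.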
Referring to Figure 18, in a software implementation, when the above electronic device adopts the processing shown in Figure 9 or Figure 11, the terminal interaction device may comprise a selection unit, an identifier generation unit, a data generation unit, a generation unit, and a transmitting unit, wherein:
the selection unit selects, according to a detected trigger operation of a user on a terminal screen, a designated contact on a message communication interface displayed on the terminal screen;
the identifier generation unit generates, on the terminal screen, a manipulation identifier corresponding to the designated contact;
the data generation unit generates corresponding user behavior data according to a detected drag operation of the user on the manipulation identifier;
the generation unit generates a corresponding interaction manipulation instruction according to the user behavior data; and
the transmitting unit sends the generated interaction manipulation instruction, directly or via a server, to the terminal of the designated contact.
Optionally, the selection unit detects a trigger position corresponding to the trigger operation, and if the trigger position is located in a region corresponding to the avatar of any contact, selects that contact.
Optionally, the identifier generation unit extracts the avatar picture of the designated contact and floats the avatar picture over the message communication interface, to serve as the manipulation identifier.
Optionally, when the designated contact is not the user, if the detected drag operation of the user on the manipulation identifier is a reciprocating movement along a first direction and a second direction opposite to the first direction, the interaction manipulation instruction is: presenting a corresponding interaction effect on the message communication interface displayed on the terminal screen of the designated contact.
Optionally, the reciprocating frequency of the manipulation identifier is positively correlated with the severity of the interaction effect corresponding to the interaction manipulation instruction.
Optionally, when the designated contact is the user himself or herself, if the detected drag operation of the user on the manipulation identifier is dragging the manipulation identifier to a region corresponding to the avatar of another contact, the interaction manipulation instruction is: presenting a corresponding interaction effect on the message communication interface displayed on the terminal screen of the other contact.
Optionally, the detected drag operation of the user on the manipulation identifier further comprises: a drag track preceding the dragging of the manipulation identifier to the region corresponding to the avatar of the other contact;
wherein the interaction effect corresponding to the interaction manipulation instruction is related to the drag track.
Optionally, the interaction effect comprises at least one or a combination of the following: shaking the corresponding message communication interface, presenting a preset picture on the corresponding message communication interface, and vibrating the corresponding terminal.
Optionally, the manipulation identifier floats over the message communication interface.
Optionally, the device further comprises:
a receiving unit, which receives, directly or via a server, an interaction manipulation instruction from the terminal of any contact; and
a presentation unit, which executes the received interaction manipulation instruction and presents a corresponding interaction effect on the message communication interface corresponding to that contact.
Therefore, through drag operations on the message communication interface, the application conveniently realizes the interaction process between terminals, which helps improve the user experience.
In a typical configuration, a computing device comprises one or more processors (CPUs), input/output interfaces, network interfaces, and memory.
Memory may include volatile memory in a computer-readable medium, such as random access memory (RAM), and/or non-volatile memory, such as read-only memory (ROM) or flash memory (flash RAM). Memory is an example of a computer-readable medium.
Computer-readable media include permanent and non-permanent, removable and non-removable media, and may implement information storage by any method or technology. The information may be computer-readable instructions, data structures, program modules, or other data. Examples of computer storage media include, but are not limited to, phase-change memory (PRAM), static random access memory (SRAM), dynamic random access memory (DRAM), other types of random access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), flash memory or other memory technologies, compact disc read-only memory (CD-ROM), digital versatile disc (DVD) or other optical storage, magnetic cassettes, magnetic tape or disk storage or other magnetic storage devices, or any other non-transmission medium that can be used to store information accessible by a computing device. As defined herein, computer-readable media do not include transitory media, such as modulated data signals and carrier waves.
It should also be noted that the terms "comprise", "include", or any other variants thereof are intended to cover non-exclusive inclusion, so that a process, method, article, or device comprising a series of elements includes not only those elements, but also other elements not expressly listed, or elements inherent to such a process, method, article, or device. Without further limitation, an element defined by the phrase "comprising a ..." does not exclude the existence of other identical elements in the process, method, article, or device comprising that element.
The above are merely preferred embodiments of the application and are not intended to limit the application. Any modification, equivalent replacement, improvement, and the like made within the spirit and principles of the application shall fall within the scope of protection of the application.

Claims (30)

1. A terminal interaction method, characterized by comprising:
selecting, according to a detected trigger operation of a user on a terminal screen, a designated contact on a message communication interface displayed on the terminal screen;
generating, on the terminal screen, a manipulation identifier corresponding to the designated contact;
generating corresponding user behavior data according to a detected drag operation of the user on the manipulation identifier; and
sending the user behavior data to a server, so that a corresponding interaction manipulation instruction is generated by the server.
2. The method according to claim 1, characterized in that selecting, according to a detected trigger operation of a user on a terminal screen, a designated contact on a message communication interface displayed on the terminal screen comprises:
detecting a trigger position corresponding to the trigger operation; and
if the trigger position is located in a region corresponding to the avatar of any contact, selecting that contact.
3. The method according to claim 1, characterized in that generating, on the terminal screen, a manipulation identifier corresponding to the designated contact comprises:
extracting the avatar picture of the designated contact, and floating the avatar picture over the message communication interface to serve as the manipulation identifier.
4. The method according to claim 1, characterized in that, when the designated contact is not the user, if the detected drag operation of the user on the manipulation identifier is a reciprocating movement along a first direction and a second direction opposite to the first direction, the interaction manipulation instruction is: presenting a corresponding interaction effect on the message communication interface displayed on the terminal screen of the designated contact.
5. The method according to claim 4, characterized in that the reciprocating frequency of the manipulation identifier is positively correlated with the severity of the interaction effect corresponding to the interaction manipulation instruction.
6. The method according to claim 1, characterized in that, when the designated contact is the user himself or herself, if the detected drag operation of the user on the manipulation identifier is dragging the manipulation identifier to a region corresponding to the avatar of another contact, the interaction manipulation instruction is: presenting a corresponding interaction effect on the message communication interface displayed on the terminal screen of the other contact.
7. The method according to claim 6, characterized in that the detected drag operation of the user on the manipulation identifier further comprises: a drag track preceding the dragging of the manipulation identifier to the region corresponding to the avatar of the other contact;
wherein the interaction effect corresponding to the interaction manipulation instruction is related to the drag track.
8. The method according to any one of claims 4 to 7, characterized in that the interaction effect comprises at least one or a combination of the following: shaking the corresponding message communication interface, presenting a preset picture on the corresponding message communication interface, and vibrating the corresponding terminal.
9. The method according to claim 1, characterized by further comprising:
looking up whether locally pre-stored data matching the user behavior data exists; and
if such data exists, sending the user behavior data to the server, and otherwise not sending it.
10. The method according to claim 1, characterized in that the manipulation identifier floats over the message communication interface.
11. The method according to claim 1, characterized by further comprising:
receiving, from the server, an interaction manipulation instruction from the terminal of any contact; and
executing the received interaction manipulation instruction, and presenting a corresponding interaction effect on the message communication interface corresponding to that contact.
12. A terminal interaction device, characterized by comprising:
a selection unit, which selects, according to a detected trigger operation of a user on a terminal screen, a designated contact on a message communication interface displayed on the terminal screen;
an identifier generation unit, which generates, on the terminal screen, a manipulation identifier corresponding to the designated contact;
a data generation unit, which generates corresponding user behavior data according to a detected drag operation of the user on the manipulation identifier; and
a transmitting unit, which sends the user behavior data to a server, so that a corresponding interaction manipulation instruction is generated by the server.
13. The device according to claim 12, characterized in that:
the selection unit detects a trigger position corresponding to the trigger operation, and if the trigger position is located in a region corresponding to the avatar of any contact, selects that contact.
14. The device according to claim 12, characterized in that the identifier generation unit extracts the avatar picture of the designated contact and floats the avatar picture over the message communication interface to serve as the manipulation identifier.
15. The device according to claim 12, characterized in that, when the designated contact is not the user, if the detected drag operation of the user on the manipulation identifier is a reciprocating movement along a first direction and a second direction opposite to the first direction, the interaction manipulation instruction is: presenting a corresponding interaction effect on the message communication interface displayed on the terminal screen of the designated contact.
16. The device according to claim 15, characterized in that the reciprocating frequency of the manipulation identifier is positively correlated with the severity of the interaction effect corresponding to the interaction manipulation instruction.
17. The device according to claim 12, characterized in that, when the designated contact is the user himself or herself, if the detected drag operation of the user on the manipulation identifier is dragging the manipulation identifier to a region corresponding to the avatar of another contact, the interaction manipulation instruction is: presenting a corresponding interaction effect on the message communication interface displayed on the terminal screen of the other contact.
18. The device according to claim 17, characterized in that the detected drag operation of the user on the manipulation identifier further comprises: a drag track preceding the dragging of the manipulation identifier to the region corresponding to the avatar of the other contact;
wherein the interaction effect corresponding to the interaction manipulation instruction is related to the drag track.
19. The device according to any one of claims 15 to 18, characterized in that the interaction effect comprises at least one or a combination of the following: shaking the corresponding message communication interface, presenting a preset picture on the corresponding message communication interface, and vibrating the corresponding terminal.
20. The device according to claim 12, characterized by further comprising:
a lookup unit, which looks up whether locally pre-stored data matching the user behavior data exists;
wherein, if the lookup result is that such data exists, the transmitting unit sends the user behavior data to the server, and otherwise does not send it.
21. The device according to claim 12, characterized in that the manipulation identifier floats over the message communication interface.
22. The device according to claim 12, characterized by further comprising:
a receiving unit, which receives from the server an interaction manipulation instruction from the terminal of any contact; and
a presentation unit, which executes the received interaction manipulation instruction and presents a corresponding interaction effect on the message communication interface corresponding to that contact.
23. A terminal interaction method, characterized by comprising:
selecting, according to a detected trigger operation of a user on a terminal screen, a designated contact on a message communication interface displayed on the terminal screen;
generating, on the terminal screen, a manipulation identifier corresponding to the designated contact;
generating corresponding user behavior data according to a detected drag operation of the user on the manipulation identifier; and
sending the user behavior data, directly or via a server, to the terminal of the designated contact, so that a corresponding interaction manipulation instruction is generated by the terminal of the designated contact.
24. The method according to claim 23, characterized by further comprising:
receiving, directly or via a server, user behavior data from the terminal of any contact;
generating a corresponding interaction manipulation instruction according to the received user behavior data; and
executing the generated interaction manipulation instruction, and presenting a corresponding interaction effect on the message communication interface corresponding to that contact.
25. A terminal interaction device, characterized by comprising:
a selection unit, which selects, according to a detected trigger operation of a user on a terminal screen, a designated contact on a message communication interface displayed on the terminal screen;
an identifier generation unit, which generates, on the terminal screen, a manipulation identifier corresponding to the designated contact;
a data generation unit, which generates corresponding user behavior data according to a detected drag operation of the user on the manipulation identifier; and
a transmitting unit, which sends the user behavior data, directly or via a server, to the terminal of the designated contact, so that a corresponding interaction manipulation instruction is generated by the terminal of the designated contact.
26. The device according to claim 25, characterized by further comprising:
a receiving unit, which receives, directly or via a server, user behavior data from the terminal of any contact;
a generation unit, which generates a corresponding interaction manipulation instruction according to the received user behavior data; and
a presentation unit, which executes the generated interaction manipulation instruction and presents a corresponding interaction effect on the message communication interface corresponding to that contact.
27. A terminal interaction method, characterized by comprising:
selecting, according to a detected trigger operation of a user on a terminal screen, a designated contact on a message communication interface displayed on the terminal screen;
generating, on the terminal screen, a manipulation identifier corresponding to the designated contact;
generating corresponding user behavior data according to a detected drag operation of the user on the manipulation identifier; and
generating a corresponding interaction manipulation instruction according to the user behavior data, and sending the generated interaction manipulation instruction, directly or via a server, to the terminal of the designated contact.
28. The method according to claim 27, characterized by further comprising:
receiving, directly or via a server, an interaction manipulation instruction from the terminal of any contact; and
executing the received interaction manipulation instruction, and presenting a corresponding interaction effect on the message communication interface corresponding to that contact.
29. A terminal interaction device, characterized by comprising:
a selection unit, which selects, according to a detected trigger operation of a user on a terminal screen, a designated contact on a message communication interface displayed on the terminal screen;
an identifier generation unit, which generates, on the terminal screen, a manipulation identifier corresponding to the designated contact;
a data generation unit, which generates corresponding user behavior data according to a detected drag operation of the user on the manipulation identifier;
a generation unit, which generates a corresponding interaction manipulation instruction according to the user behavior data; and
a transmitting unit, which sends the generated interaction manipulation instruction, directly or via a server, to the terminal of the designated contact.
30. The device according to claim 29, characterized by further comprising:
a receiving unit, which receives, directly or via a server, an interaction manipulation instruction from the terminal of any contact; and
a presentation unit, which executes the received interaction manipulation instruction and presents a corresponding interaction effect on the message communication interface corresponding to that contact.
CN201410302455.9A 2014-06-27 2014-06-27 Terminal interaction method and its device Active CN105204748B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201410302455.9A CN105204748B (en) 2014-06-27 2014-06-27 Terminal interaction method and its device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201410302455.9A CN105204748B (en) 2014-06-27 2014-06-27 Terminal interaction method and its device

Publications (2)

Publication Number Publication Date
CN105204748A true CN105204748A (en) 2015-12-30
CN105204748B CN105204748B (en) 2019-09-17

Family

ID=54952471

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201410302455.9A Active CN105204748B (en) 2014-06-27 2014-06-27 Terminal interaction method and its device

Country Status (1)

Country Link
CN (1) CN105204748B (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106302137A (en) * 2016-10-31 2017-01-04 努比亚技术有限公司 Group chat message processing apparatus and method
CN106888317A (en) * 2017-01-03 2017-06-23 努比亚技术有限公司 A kind of interaction processing method, device and terminal
CN111290722A (en) * 2020-01-20 2020-06-16 北京大米未来科技有限公司 Screen sharing method, device and system, electronic equipment and storage medium
CN114928524A (en) * 2022-05-20 2022-08-19 浪潮思科网络科技有限公司 Interaction method, equipment and medium for WEB side and switch

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1842003A (en) * 2005-03-30 2006-10-04 广州市领华科技有限公司 Method for realizing instant communication with a plurality of linkmen in single conversational window
CN101465816A (en) * 2007-12-19 2009-06-24 腾讯科技(深圳)有限公司 Method and system for displaying instant communication dynamic effect
US20100299392A1 (en) * 2009-05-19 2010-11-25 Shih-Chien Chiou Method for controlling remote devices using instant message
CN102750555A (en) * 2012-06-28 2012-10-24 北京理工大学 Expression robot applied to instant messaging tool
CN102790731A (en) * 2012-07-18 2012-11-21 上海量明科技发展有限公司 Triggering transmission method, client and system by instant messaging tool
CN102833183A (en) * 2012-08-16 2012-12-19 上海量明科技发展有限公司 Instant messaging interactive interface moving method, client and system


Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106302137A (en) * 2016-10-31 2017-01-04 努比亚技术有限公司 Group chat message processing apparatus and method
CN106888317A (en) * 2017-01-03 2017-06-23 努比亚技术有限公司 Interaction processing method, device and terminal
CN111290722A (en) * 2020-01-20 2020-06-16 北京大米未来科技有限公司 Screen sharing method, device and system, electronic equipment and storage medium
CN114928524A (en) * 2022-05-20 2022-08-19 浪潮思科网络科技有限公司 Interaction method, device and medium of WEB terminal and switch
CN114928524B (en) * 2022-05-20 2024-03-26 浪潮思科网络科技有限公司 Interaction method, device and medium of WEB terminal and switch

Also Published As

Publication number Publication date
CN105204748B (en) 2019-09-17

Similar Documents

Publication Publication Date Title
JP7013466B2 (en) Application data processing methods, equipment, and computer programs
CN107636584B (en) Follow mode and position tagging of virtual workspace viewports in a collaborative system
US10545658B2 (en) Object processing and selection gestures for forming relationships among objects in a collaboration system
US10228835B2 (en) Method for displaying information, and terminal equipment
US20150019980A1 (en) Multi-dimensional content platform for a network
KR20160017050A (en) Collaboration system including a spatial event map
WO2013123757A1 (en) File data transmission method and device
TW201535257A (en) Identifying relationships between message threads
JP2009519627A (en) System, method and computer program product for concurrent media collaboration
CN104125508A (en) Video sharing method and terminal
JP7407928B2 (en) File comments, comment viewing methods, devices, computer equipment and computer programs
CN102436344B (en) Context menu
JP2014514668A (en) Multi-input gestures in hierarchical domains
CN105205072B (en) Webpage information display method and system
CN112215924A (en) Picture comment processing method and device, electronic equipment and storage medium
CN105204748A (en) Terminal interaction method and device
JP2020161135A (en) Method and system for displaying chat thread
CN112764857A (en) Information processing method and device and electronic equipment
US10652105B2 (en) Display apparatus and controlling method thereof
CN108092872A (en) Communication method and device
CN112083866A (en) Expression image generation method and device
WO2017011084A1 (en) System and method for interaction between touch points on a graphical display
CN111290722A (en) Screen sharing method, device and system, electronic equipment and storage medium
CN112291412B (en) Application program control method and device and electronic equipment
TW201426488A (en) Real-time sharing method, electronic device and computer program product

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
TR01 Transfer of patent right

Effective date of registration: 20191211

Address after: P.O. Box 31119, Grand Exhibition Hall, Hibiscus Street, 802 West Bay Road, Grand Cayman, Cayman Islands

Patentee after: Advanced New Technologies Co., Ltd.

Address before: Fourth Floor, Capital Building, P.O. Box 847, Grand Cayman, Cayman Islands

Patentee before: Alibaba Group Holding Limited