CN105204748B - Terminal interaction method and its device - Google Patents

Terminal interaction method and its device

Info

Publication number
CN105204748B
CN105204748B (application CN201410302455.9A)
Authority
CN
China
Prior art keywords
manipulation
terminal
communication interface
message communication
user
Prior art date
Legal status
Active
Application number
CN201410302455.9A
Other languages
Chinese (zh)
Other versions
CN105204748A (en)
Inventor
朱鹏程
王炜
Current Assignee
Advanced New Technologies Co Ltd
Original Assignee
Alibaba Group Holding Ltd
Priority date
Filing date
Publication date
Application filed by Alibaba Group Holding Ltd
Priority to CN201410302455.9A
Publication of CN105204748A
Application granted
Publication of CN105204748B

Landscapes

  • User Interface Of Digital Computer (AREA)

Abstract

The application provides a terminal interaction method and its device. The method includes: selecting, according to a trigger operation of a user detected on a terminal screen, a designated contact on a message communication interface displayed on the terminal screen; generating, on the terminal screen, a manipulation mark corresponding to the designated contact; generating corresponding user behavior data according to a detected drag operation of the user on the manipulation mark; and sending the user behavior data to a server, so that the server generates a corresponding interaction manipulation instruction. With the technical solution of the application, an interaction process between terminals can be realized conveniently during message communication, helping to improve the user experience.

Description

Terminal interaction method and its device
Technical field
This application relates to the field of communication technology, and in particular to a terminal interaction method and its device.
Background technique
With the continuous development of Internet technology, more and more users tend to communicate by way of message communication. For example, through short messages or instant messaging tools, a user can send text, pictures and the like to the other party, thereby exchanging information. However, the message communication manner in the related art needs to be optimized with richer interaction modes.
Summary of the invention
In view of this, the application provides a new technical solution, which can conveniently realize an interaction process between terminals during message communication, helping to improve the user experience.
To achieve the above objective, the application provides the following technical solutions:
According to a first aspect of the application, a terminal interaction method is proposed, comprising:
selecting, according to a trigger operation of a user detected on a terminal screen, a designated contact on a message communication interface displayed on the terminal screen;
generating, on the terminal screen, a manipulation mark corresponding to the designated contact;
generating corresponding user behavior data according to a detected drag operation of the user on the manipulation mark;
sending the user behavior data to a server, so that the server generates a corresponding interaction manipulation instruction.
According to a second aspect of the application, a terminal interaction device is proposed, comprising:
a selecting unit, configured to select, according to a trigger operation of a user detected on a terminal screen, a designated contact on a message communication interface displayed on the terminal screen;
a mark generation unit, configured to generate, on the terminal screen, a manipulation mark corresponding to the designated contact;
a data generation unit, configured to generate corresponding user behavior data according to a detected drag operation of the user on the manipulation mark;
a sending unit, configured to send the user behavior data to a server, so that the server generates a corresponding interaction manipulation instruction.
According to a third aspect of the application, a terminal interaction method is proposed, comprising:
selecting, according to a trigger operation of a user detected on a terminal screen, a designated contact on a message communication interface displayed on the terminal screen;
generating, on the terminal screen, a manipulation mark corresponding to the designated contact;
generating corresponding user behavior data according to a detected drag operation of the user on the manipulation mark;
sending the user behavior data to the terminal of the designated contact, directly or via a server, so that the terminal of the designated contact generates a corresponding interaction manipulation instruction.
According to a fourth aspect of the application, a terminal interaction device is proposed, comprising:
a selecting unit, configured to select, according to a trigger operation of a user detected on a terminal screen, a designated contact on a message communication interface displayed on the terminal screen;
a mark generation unit, configured to generate, on the terminal screen, a manipulation mark corresponding to the designated contact;
a data generation unit, configured to generate corresponding user behavior data according to a detected drag operation of the user on the manipulation mark;
a sending unit, configured to send the user behavior data to the terminal of the designated contact, directly or via a server, so that the terminal of the designated contact generates a corresponding interaction manipulation instruction.
According to a fifth aspect of the application, a terminal interaction method is proposed, comprising:
selecting, according to a trigger operation of a user detected on a terminal screen, a designated contact on a message communication interface displayed on the terminal screen;
generating, on the terminal screen, a manipulation mark corresponding to the designated contact;
generating corresponding user behavior data according to a detected drag operation of the user on the manipulation mark;
generating a corresponding interaction manipulation instruction according to the user behavior data, and sending it, directly or via a server, to the terminal of the designated contact.
According to a sixth aspect of the application, a terminal interaction device is proposed, comprising:
a selecting unit, configured to select, according to a trigger operation of a user detected on a terminal screen, a designated contact on a message communication interface displayed on the terminal screen;
a mark generation unit, configured to generate, on the terminal screen, a manipulation mark corresponding to the designated contact;
a data generation unit, configured to generate corresponding user behavior data according to a detected drag operation of the user on the manipulation mark;
a generation unit, configured to generate a corresponding interaction manipulation instruction according to the user behavior data;
a sending unit, configured to send the generated interaction manipulation instruction, directly or via a server, to the terminal of the designated contact.
It can be seen from the above technical solutions that, by means of a drag operation on the message communication interface, the application conveniently realizes an interaction process between terminals, helping to improve the user experience.
Brief description of the drawings
Fig. 1 shows a schematic flowchart of a terminal interaction method according to an exemplary embodiment of the application;
Fig. 2 shows a schematic flowchart of selecting a contact according to an exemplary embodiment of the application;
Figs. 3A-3B show schematic interface diagrams of a trigger operation according to an exemplary embodiment of the application;
Figs. 4A-4B show schematic interface diagrams of generating a manipulation mark according to an exemplary embodiment of the application;
Fig. 5 shows a schematic interface diagram of dragging a manipulation mark according to an exemplary embodiment of the application;
Figs. 6A-6B show schematic interface diagrams of an interaction effect according to an exemplary embodiment of the application;
Fig. 7 shows a schematic diagram of realizing an interactive operation between terminals according to an exemplary embodiment of the application;
Fig. 8 shows a schematic diagram of realizing an interactive operation between terminals according to another exemplary embodiment of the application;
Fig. 9 shows a schematic diagram of realizing an interactive operation between terminals according to yet another exemplary embodiment of the application;
Fig. 10 shows a schematic diagram of realizing an interactive operation between terminals according to yet another exemplary embodiment of the application;
Fig. 11 shows a schematic diagram of realizing an interactive operation between terminals according to yet another exemplary embodiment of the application;
Figs. 12A-12B show schematic interface diagrams of a terminal interaction according to an exemplary embodiment of the application;
Fig. 13 shows a schematic interface diagram of realizing an interaction effect according to a drag path, according to an exemplary embodiment of the application;
Fig. 14 shows a schematic interface diagram of realizing an interaction effect according to a drag path, according to another exemplary embodiment of the application;
Fig. 15 shows a schematic structural diagram of an electronic device according to an exemplary embodiment of the application;
Fig. 16 shows a schematic block diagram of a terminal interaction device according to an exemplary embodiment of the application;
Fig. 17 shows a schematic block diagram of a terminal interaction device according to another exemplary embodiment of the application;
Fig. 18 shows a schematic block diagram of a terminal interaction device according to yet another exemplary embodiment of the application.
Detailed description of embodiments
When users interact by way of message communication, the communication may take the form of SMS or MMS messages, or the form of instant messaging such as WeChat or QQ. By installing a corresponding application program on a PC, or on a mobile device such as a mobile phone or tablet computer, a user can send messages such as text and pictures to other users. However, in traditional communication modes the user can only enter text, pictures and the like in a fixed input box, and the interaction mode is overly monotonous.
In the technical solution of the application, by means of a drag operation on the message communication interface, terminal interaction can be conveniently realized between users, helping to improve the user experience. To further describe the application, the following embodiments are provided:
Referring to Fig. 1, a schematic flowchart of a terminal interaction method according to an exemplary embodiment of the application is shown. The terminal interaction method includes:
Step 102: selecting, according to a trigger operation of a user detected on a terminal screen, a designated contact on a message communication interface displayed on the terminal screen;
Step 104: generating, on the terminal screen, a manipulation mark corresponding to the designated contact;
Step 106: generating corresponding user behavior data according to a detected drag operation of the user on the manipulation mark;
Step 108: sending the user behavior data to a server, so that the server generates a corresponding interaction manipulation instruction.
In this technical solution, the user can send text, pictures and the like through the message communication interface, realizing traditional message communication; the user can also generate a manipulation mark through a trigger on the terminal screen, thereby realizing interactive control based on drag operations. This enriches the terminal interaction modes and helps to improve the user experience.
In step 102, the selection of a contact on the message communication interface is realized through the trigger operation of the user on the terminal screen. The detailed process is shown in Fig. 2 and includes:
Step 202: when a trigger operation of the user is detected on the terminal screen, determining the trigger position corresponding to the trigger operation on the terminal screen. Since the relative positional relationship between the message communication interface and the terminal screen is fixed while the message communication interface is displayed on the terminal screen, the trigger position can also be regarded as the position corresponding to the trigger operation on the message communication interface.
The trigger operation of the user may take many forms. As an exemplary embodiment, for touch-enabled mobile devices such as mobile phones and tablet computers, the manner shown in Fig. 3A may be used: a click, double-click, long press or the like on the terminal screen serves as the trigger operation. Alternatively, as another exemplary embodiment, for a desktop computer, or for a mobile device connected to a peripheral such as a mouse through OTG (On-The-Go) technology, a cursor displayed on the terminal screen can be controlled by the peripheral, and the manner shown in Fig. 3B may be used: a click, double-click, long press or the like on the terminal screen serves as the trigger operation.
Step 204: based on the above trigger position, judging whether the trigger position is located in the region corresponding to the avatar of any contact on the message communication interface. If the trigger position is located in the region corresponding to the avatar of a contact, go to step 206; otherwise, return to step 202 and judge again based on a subsequent trigger operation.
Step 206: selecting the contact corresponding to the trigger position. For example, as shown in Fig. 3A, the terminal detects a touch operation of user Lucy on the terminal screen and obtains the specific trigger position; when the trigger position is located in the region corresponding to the avatar of user Jame (for example, the trigger position overlaps the avatar of Jame, in which case the trigger position is considered to be located in the region corresponding to that avatar), Jame is selected. Similarly, in Fig. 3B, if the cursor clicks the avatar of Jame and the click duration exceeds a preset time threshold, it is considered that user Lucy has triggered Jame, and Jame should be selected.
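As a hedged illustration of the hit test in steps 202-206 (the class and field names such as AvatarRegion are assumptions for the sketch, not part of the patent), the trigger position can simply be compared against each contact's avatar rectangle:

```java
import java.util.List;

// Illustrative sketch of steps 202-206: map a trigger position to the contact
// whose avatar region contains that position.
public class ContactHitTester {

    public static class AvatarRegion {
        final String contactId;
        final int left, top, right, bottom; // avatar bounds in screen coordinates

        public AvatarRegion(String contactId, int left, int top, int right, int bottom) {
            this.contactId = contactId;
            this.left = left; this.top = top; this.right = right; this.bottom = bottom;
        }

        boolean contains(int x, int y) {
            return x >= left && x <= right && y >= top && y <= bottom;
        }
    }

    // Returns the id of the selected contact, or null if the trigger position
    // does not fall inside any avatar region (step 204 fails, back to step 202).
    public static String selectContact(List<AvatarRegion> avatars, int triggerX, int triggerY) {
        for (AvatarRegion region : avatars) {
            if (region.contains(triggerX, triggerY)) {
                return region.contactId; // step 206: contact selected
            }
        }
        return null;
    }
}
```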
In step 104, the manipulation mark generated on the terminal screen may take any pattern, for example the circle shown in Fig. 4A. The manipulation mark can float above the message communication interface; that is, the "manipulation mark" and the "message communication interface" can be understood as residing on different "layers", with the layer of the manipulation mark above the layer of the message communication interface. Thus, as shown in Fig. 4A, when the user drags the manipulation mark by touch on the touch screen (or through a peripheral such as a mouse), the manipulation mark moves with the drag operation, but the drag operation itself does not affect the message communication interface.
Of course, in order to make the operation more enjoyable for the user and to relate operations such as selecting a contact and dragging the manipulation mark to each other, when the user selects a contact through a trigger operation, for example as shown in Fig. 4B, when Lucy's finger long-presses the avatar of Jame so that Jame is selected, the avatar picture of Jame can be extracted and floated above the message communication interface to serve as the manipulation mark.
Suppose Lucy has selected Jame as the operation object, and a manipulation mark in the form of Jame's avatar picture has been generated on the message communication interface. Lucy can then drag the manipulation mark in a certain way, so that a corresponding interaction effect is shown on the message communication interface on Jame's terminal screen. For example, as an exemplary embodiment, Lucy's drag operation on the manipulation mark is as follows: as shown in Fig. 5, the manipulation mark is moved back and forth along a first direction (e.g. towards the lower left) and a second direction opposite to the first direction (e.g. towards the upper right).
Corresponding to the reciprocating motion of the manipulation mark shown in Fig. 5, Lucy's terminal collects corresponding user behavior data, such as the drag track of the drag operation, and sends the user behavior data to the server. The server generates a corresponding interaction manipulation instruction and sends it to Jame's terminal; Jame's terminal then realizes the corresponding interaction effect by executing the interaction manipulation instruction. Specifically, the interaction effect may include at least one of the following, or a combination thereof: the corresponding message communication interface shakes, a preset picture is shown on the corresponding message communication interface, or the corresponding terminal vibrates. For example, Fig. 6A shows the interface on Jame's terminal, which aggregates the entrances to the message communication interfaces between Jame and Lucy as well as other users (such as Adam); each of the rectangular sub-interfaces in Fig. 6A corresponds to a contact: the first sub-interface corresponds to Adam and is used to enter the message communication interface between Jame and Adam, and the second sub-interface corresponds to Lucy and is used to enter the message communication interface between Jame and Lucy. Based on Lucy's drag operation, the sub-interface corresponding to Lucy can be made to shake, so as to prompt Jame. After Jame taps to enter the message communication interface with Lucy, the picture shown in Fig. 6B (such as Lucy's avatar) can be displayed on that interface, and at the same time that interface can be shaken and/or Jame's terminal vibrated.
Further, the reciprocating frequency of the manipulation mark can be positively correlated with the intensity of the interaction effect corresponding to the interaction manipulation instruction. Specifically, if Lucy wishes to get a reply from Jame as soon as possible, then while Lucy drags the manipulation mark, the frequency of its reciprocating motion can be recorded in the user behavior data. Thus, if a higher reciprocating frequency is identified, the interface on Jame's terminal shakes and/or the terminal vibrates more intensely, expressing Lucy's eagerness and helping to improve the user experience.
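As one hedged illustration of that positive correlation (the thresholds, helper names and effect parameters are assumptions, not values taken from the patent), the reciprocating frequency extracted from the drag samples could be mapped to a vibration duration and a shake amplitude roughly as follows:

```java
// Illustrative sketch: map the reciprocating frequency of the drag operation
// to the intensity of the interaction effect. Thresholds are arbitrary examples.
public class EffectIntensityMapper {

    public static class InteractionEffect {
        final long vibrationMillis;   // how long the receiving terminal vibrates
        final int shakeAmplitudePx;   // how strongly the interface shakes

        InteractionEffect(long vibrationMillis, int shakeAmplitudePx) {
            this.vibrationMillis = vibrationMillis;
            this.shakeAmplitudePx = shakeAmplitudePx;
        }
    }

    // directionChanges: number of times the drag reversed direction
    // durationMillis: total duration of the drag operation
    public static InteractionEffect fromDrag(int directionChanges, long durationMillis) {
        // one back-and-forth cycle corresponds to two direction changes
        double frequencyHz = (directionChanges / 2.0) / (durationMillis / 1000.0);

        if (frequencyHz < 1.0) {
            return new InteractionEffect(200, 5);    // gentle nudge
        } else if (frequencyHz < 3.0) {
            return new InteractionEffect(500, 15);   // noticeable shake
        } else {
            return new InteractionEffect(1000, 30);  // urgent: strongest effect
        }
    }
}
```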
It can be seen that, in the above embodiment, a drag operation (reciprocating motion) on the manipulation mark on the Lucy-side terminal is embodied on the Jame-side terminal as the shaking of the sub-interface / message communication interface and/or the vibration of the terminal. The terminal interaction process of the application can be realized in a similar manner; the whole process can be described in detail with reference to Fig. 7, still taking as an example users Lucy and Jame carrying out message communication through their respective terminals A and B.
Lucy's side:
For example, Lucy selects Jame as the operation object on the message communication interface on terminal A by long-pressing Jame's avatar; a corresponding manipulation mark, which may specifically be Jame's avatar picture, floats up on terminal A; Lucy drags the manipulation mark, and terminal A records a corresponding user behavior data packet and uploads it to the server.
Server side:
The server receives the user behavior data packet from terminal A, parses out the user behavior data therein, such as the drag track and motion frequency, determines the interaction manipulation instruction corresponding to the user behavior data packet by matching against a pre-stored correspondence between user behavior data and interaction manipulation instructions, and sends the interaction manipulation instruction to terminal B.
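A minimal sketch of that server-side matching step, assuming a simple rule table from behavior-data features to instruction codes (the feature names, instruction codes and matching rules are illustrative assumptions):

```java
import java.util.LinkedHashMap;
import java.util.Map;
import java.util.function.Predicate;

// Illustrative sketch of the server-side lookup: parse the features of the
// user behavior data packet and match them against pre-stored rules to pick
// an interaction manipulation instruction for terminal B.
public class InstructionMatcher {

    public static class BehaviorData {
        final String trackShape;      // e.g. "reciprocating", "circle", "wave"
        final double frequencyHz;     // reciprocating frequency, if any

        public BehaviorData(String trackShape, double frequencyHz) {
            this.trackShape = trackShape;
            this.frequencyHz = frequencyHz;
        }
    }

    // Pre-stored correspondence: rule -> instruction code sent to terminal B.
    private final Map<Predicate<BehaviorData>, String> rules = new LinkedHashMap<>();

    public InstructionMatcher() {
        rules.put(d -> "reciprocating".equals(d.trackShape), "SHAKE_INTERFACE_AND_VIBRATE");
        rules.put(d -> "circle".equals(d.trackShape), "SHOW_SMILEY_AND_SHAKE");
        rules.put(d -> "wave".equals(d.trackShape), "SHOW_DIZZY_AND_SHAKE");
    }

    // Returns the matched instruction, or null if no rule matches.
    public String match(BehaviorData data) {
        for (Map.Entry<Predicate<BehaviorData>, String> rule : rules.entrySet()) {
            if (rule.getKey().test(data)) {
                return rule.getValue();
            }
        }
        return null;
    }
}
```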
Jame's side:
Jame receives the interaction manipulation instruction from the server through terminal B and executes the instruction, so that a corresponding interaction effect is shown on terminal B; for example, the corresponding message communication interface shakes, a preset picture is shown on the corresponding message communication interface, or the corresponding terminal vibrates.
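On the receiving terminal the instruction only needs to be dispatched to the corresponding effect. A sketch of that dispatch is shown below; the instruction codes match the illustrative matcher above, and the Effects callbacks are placeholders rather than real platform APIs:

```java
// Illustrative sketch of terminal B executing a received interaction
// manipulation instruction. The Effects interface stands in for whatever
// UI / vibration facilities the actual terminal platform provides.
public class InstructionExecutor {

    public interface Effects {
        void shakeConversationInterface(String contactId);
        void showPicture(String contactId, String pictureName);
        void vibrateTerminal(long millis);
    }

    private final Effects effects;

    public InstructionExecutor(Effects effects) {
        this.effects = effects;
    }

    public void execute(String instruction, String fromContactId) {
        switch (instruction) {
            case "SHAKE_INTERFACE_AND_VIBRATE":
                effects.shakeConversationInterface(fromContactId);
                effects.vibrateTerminal(500);
                break;
            case "SHOW_SMILEY_AND_SHAKE":
                effects.showPicture(fromContactId, "smiley");
                effects.shakeConversationInterface(fromContactId);
                break;
            case "SHOW_DIZZY_AND_SHAKE":
                effects.showPicture(fromContactId, "dizzy");
                effects.shakeConversationInterface(fromContactId);
                break;
            default:
                // unknown instruction: ignore rather than fail
                break;
        }
    }
}
```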
Meanwhile in interactive process shown in Fig. 7, terminal A only needs direct upload user behavioral data packet, and by server Corresponding interactive manipulation instruction is determined, without user behavior data, interaction manipulation instruction and between the two is stored in advance Corresponding relationship helps to reduce the configuration requirement to terminal A, and makes full use of the storage and operational capability of server.Meanwhile eventually End B only needs to receive the interaction manipulation instruction from server, and data volume is much smaller than user behavior data packet, helps to reduce band Width is occupied and is consumed to the flow of terminal B, especially facilitates to reduce campus network under mobile network.
Of course, the above process can obviously also be completed in other ways. As an exemplary embodiment, as shown in Fig. 8, the server may not determine the interaction manipulation instruction at all, but instead directly forward the user behavior data packet from terminal A to terminal B; terminal B then looks up and executes the corresponding interaction manipulation instruction according to the pre-stored user behavior data, interaction manipulation instructions and the correspondence between them, so as to realize the corresponding interaction effect.
As another exemplary embodiment, for example as shown in Fig. 9, besides being determined by the server or terminal B, the interaction manipulation instruction can also be determined by terminal A itself. Specifically, after terminal A has sampled the user behavior data packet, it directly looks up the interaction manipulation instruction corresponding to the sampled user behavior data packet according to the pre-stored user behavior data, interaction manipulation instructions and the correspondence between them, and then sends the interaction manipulation instruction to terminal B through the server. Since the data volume of the interaction manipulation instruction is smaller than that of the user behavior data packet, this helps to reduce the data traffic consumed by terminal A and terminal B in the interaction process, avoiding relatively high costs under a mobile network.
As yet another exemplary embodiment, suppose that instant messaging is realized between terminal A and terminal B, taking the case of terminal A initiating the communication as an example. Based on the user's operation, after terminal A initiates a communication request towards terminal B at the server, the server informs terminal A of information such as the IP address and TCP (Transmission Control Protocol) port number of terminal B; based on this information, terminal A can establish a direct connection with terminal B, thereby realizing point-to-point communication (i.e. a peer-to-peer communication mode) without relaying through the server. As shown in Fig. 10, assuming that the point-to-point communication mode is used between terminal A and terminal B, terminal A can send the user behavior data packet directly to terminal B without forwarding by the server; then, according to the received user behavior data packet, terminal B looks up and executes the corresponding interaction manipulation instruction according to the pre-stored user behavior data, interaction manipulation instructions and the correspondence between them, so as to realize the corresponding interaction effect.
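A minimal sketch of that peer-to-peer handoff, assuming the server has already returned the peer's address; the length-prefixed framing is an illustrative assumption, and real deployments would also have to handle NAT traversal, which the patent does not discuss:

```java
import java.io.IOException;
import java.io.OutputStream;
import java.net.Socket;
import java.nio.charset.StandardCharsets;

// Illustrative sketch: once the server has told terminal A the IP address and
// TCP port of terminal B, terminal A opens a direct connection and sends the
// user behavior data packet without any server relay.
public class PeerToPeerSender {

    public static void sendBehaviorData(String peerIp, int peerTcpPort, byte[] behaviorDataPacket)
            throws IOException {
        try (Socket socket = new Socket(peerIp, peerTcpPort);
             OutputStream out = socket.getOutputStream()) {
            // Simple length-prefixed framing (an assumption, not specified by the patent).
            String header = behaviorDataPacket.length + "\n";
            out.write(header.getBytes(StandardCharsets.US_ASCII));
            out.write(behaviorDataPacket);
            out.flush();
        }
    }
}
```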
As yet another exemplary embodiment, as shown in Fig. 11, when the point-to-point communication mode is used between terminal A and terminal B, terminal A can likewise, after sampling the user behavior data packet, look up the corresponding interaction manipulation instruction according to the pre-stored user behavior data, interaction manipulation instructions and the correspondence between them, and send the interaction manipulation instruction directly to terminal B without data relay by the server.
In addition, considering that terminal A may also be on a mobile network and therefore sensitive to the amount of data transferred, the application proposes a further improvement: based on the embodiments of Fig. 7, Fig. 8 and Fig. 10, after terminal A collects the drag operation of user Lucy and generates the corresponding user behavior data, it looks up whether there is local pre-stored data matching that user behavior data; if so, the user behavior data (which may be packaged as a user behavior data packet) is sent to the server, otherwise it is not sent. When Lucy's drag operation does not match the local pre-stored data, Lucy probably does not intend to trigger a terminal interaction with Jame. For example, suppose Lucy originally wished to perform the "reciprocating" drag operation shown in Fig. 5 but changed her mind midway through the drag: if Lucy simply lifted her finger, terminal A might match the corresponding user behavior data to "reciprocating motion" and send it to the server, causing terminal B to produce the interface-shaking and/or terminal-vibration interaction effect. But if, instead of lifting her finger immediately, Lucy continues the drag along a random track on the screen of terminal A, the user behavior data corresponding to that drag operation will not match the local pre-stored data; terminal A will not send the user behavior data to the server, so no interaction with terminal B is triggered and no traffic of terminal A on the mobile network is wasted.
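A small sketch of that local gate, assuming the pre-stored data is a set of known track shapes and that classification of the drag track happens elsewhere (both are assumptions made for the illustration):

```java
import java.util.Set;

// Illustrative sketch: only upload the user behavior data when the drag track
// matches some locally pre-stored pattern; otherwise silently drop it to save
// mobile-network traffic.
public class LocalMatchGate {

    public interface Uploader {
        void upload(byte[] userBehaviorDataPacket);
    }

    private final Set<String> preStoredTrackShapes; // e.g. {"reciprocating", "circle", "wave"}
    private final Uploader uploader;

    public LocalMatchGate(Set<String> preStoredTrackShapes, Uploader uploader) {
        this.preStoredTrackShapes = preStoredTrackShapes;
        this.uploader = uploader;
    }

    // Returns true if the data was sent, false if it was discarded.
    public boolean maybeSend(String classifiedTrackShape, byte[] userBehaviorDataPacket) {
        if (preStoredTrackShapes.contains(classifiedTrackShape)) {
            uploader.upload(userBehaviorDataPacket);
            return true;
        }
        return false; // no match: do not send, do not trigger an interaction
    }
}
```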
It should be understood that, although the embodiments shown in Figs. 3A-6B are described by taking the message communication interface between Lucy and Jame as an example, the message communication interface may obviously also be a group-chat communication interface, i.e. the message communication interface may serve a communication process involving more than two people. Lucy can still realize a similar terminal interaction by selecting and dragging any contact on that message communication interface; the detailed process is not repeated here.
In the above embodiments, Lucy realizes a terminal interaction with Jame by directly triggering and selecting another contact on the terminal screen she is using and performing a drag operation on the manipulation mark. In fact, the application also proposes many other manners, which are described in detail below with an exemplary embodiment.
Please refer to Fig. 12A, still taking the terminal interaction between Lucy and Jame as an example. Suppose Lucy long-presses her own avatar on the terminal screen, thereby triggering the selection of Lucy herself; based on that selection, a manipulation mark floats up on the terminal screen, and the manipulation mark may take Lucy's avatar picture as its representation, so that the association between the manipulation mark and the selection of Lucy herself is even closer. Then Lucy drags the manipulation mark to the region corresponding to the avatar of another contact, for example so that the manipulation mark completely or partially overlaps Jame's avatar, which triggers the terminal interaction between Lucy and Jame (a sketch of this overlap-based triggering is given below). Specifically, an icon corresponding to the type of terminal interaction can be shown near the avatar of the operated contact Jame, such as the "lips" shown in Fig. 12A, indicating the terminal interaction "Lucy kisses Jame".
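A minimal sketch of that triggering, assuming rectangular avatar bounds and a simple overlap test; the interaction-type label is an illustrative stand-in, not terminology from the patent:

```java
// Illustrative sketch of the Fig. 12A variant: the user drags a manipulation
// mark (her own avatar) onto another contact's avatar; any overlap between the
// two rectangles triggers a terminal interaction towards that contact.
public class DropTargetDetector {

    public static class Rect {
        final int left, top, right, bottom;
        public Rect(int left, int top, int right, int bottom) {
            this.left = left; this.top = top; this.right = right; this.bottom = bottom;
        }
        boolean overlaps(Rect other) {
            return left < other.right && other.left < right
                && top < other.bottom && other.top < bottom;
        }
    }

    // Returns the interaction to trigger, or null if the mark was dropped
    // somewhere that does not overlap the target avatar.
    public static String onDrop(Rect manipulationMarkBounds, Rect targetAvatarBounds,
                                String targetContactId) {
        if (manipulationMarkBounds.overlaps(targetAvatarBounds)) {
            return "KISS:" + targetContactId; // e.g. shown as a "lips" icon near the avatar
        }
        return null;
    }
}
```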
Correspondingly, Fig. 12B shows the display on the terminal screen used by Jame. If the terminal screen is currently displaying the message communication interface between Jame and Lucy, the "lips" pattern can float up directly on that message communication interface, and the message communication interface can also be shaken and/or Jame's terminal vibrated. Alternatively, if the interface shown on the terminal screen contains the entrance sub-interface of the message communication interface between Jame and Lucy, that sub-interface can be shaken, and the "lips" pattern can also be shown near that sub-interface, which allows Jame to preview the specific interaction content.
Of course, based on this manner of selecting the user herself and triggering a terminal interaction with another contact, in order to extend the number of terminal interaction types, the drag track before the manipulation mark is dragged to the region corresponding to the other contact's avatar can also be detected, and the corresponding interaction manipulation instruction determined according to that drag track.
For example, as an exemplary embodiment, Lucy can use the drag manner shown in Fig. 13, i.e. make the drag track contain a circle; the interaction manipulation instruction corresponding to that drag track is then looked up in the matching operation performed by the server or by the terminal on Jame's side. Specifically, for example, the interaction manipulation instruction may be used to show a "smiley" pattern on the message communication interface of the terminal on Jame's side, and that message communication interface can also be shaken and/or the terminal on Jame's side vibrated.
As another exemplary embodiment, Lucy can also use the drag manner shown in Fig. 14, i.e. make the drag track contain a "wave" shape; the interaction manipulation instruction corresponding to that drag track is then looked up in the matching operation performed by the server or by the terminal on Jame's side. Specifically, for example, the interaction manipulation instruction may be used to show the "dizzy" pattern shown on the right side of Fig. 14 on the message communication interface of the terminal on Jame's side, and that message communication interface can also be shaken and/or the terminal on Jame's side vibrated.
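The patent does not specify how a drag track is classified into the "circle" or "wave" shapes mentioned above; the following is one hedged possibility based on whether the track closes on itself and how often it reverses direction, purely as an illustrative heuristic:

```java
import java.util.List;

// Illustrative heuristic (not from the patent) for classifying a drag track
// into the shapes used in the examples: "circle", "wave", or "reciprocating".
public class TrackShapeClassifier {

    public static class Point {
        final float x, y;
        public Point(float x, float y) { this.x = x; this.y = y; }
    }

    public static String classify(List<Point> track) {
        if (track.size() < 3) {
            return "unknown";
        }
        Point first = track.get(0);
        Point last = track.get(track.size() - 1);

        float minX = first.x, maxX = first.x, minY = first.y, maxY = first.y;
        double pathLength = 0;
        for (int i = 1; i < track.size(); i++) {
            Point p = track.get(i);
            minX = Math.min(minX, p.x); maxX = Math.max(maxX, p.x);
            minY = Math.min(minY, p.y); maxY = Math.max(maxY, p.y);
            pathLength += Math.hypot(p.x - track.get(i - 1).x, p.y - track.get(i - 1).y);
        }
        double netDisplacement = Math.hypot(last.x - first.x, last.y - first.y);
        double width = Math.max(maxX - minX, 1);
        double height = Math.max(maxY - minY, 1);
        double aspect = Math.max(width, height) / Math.min(width, height);

        if (netDisplacement < 0.2 * pathLength) {
            // Track roughly closes on itself: a round-ish bounding box suggests a
            // circle, a strongly elongated one suggests back-and-forth reciprocation.
            return aspect < 2.0 ? "circle" : "reciprocating";
        }
        return countYReversals(track) >= 3 ? "wave" : "unknown";
    }

    private static int countYReversals(List<Point> track) {
        int reversals = 0;
        float previousDelta = 0;
        for (int i = 1; i < track.size(); i++) {
            float delta = track.get(i).y - track.get(i - 1).y;
            if (delta != 0 && previousDelta != 0 && Math.signum(delta) != Math.signum(previousDelta)) {
                reversals++;
            }
            if (delta != 0) {
                previousDelta = delta;
            }
        }
        return reversals;
    }
}
```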
Fig. 15 shows a schematic structural diagram of an electronic device according to an exemplary embodiment of the application. Referring to Fig. 15, at the hardware level the electronic device includes a processor, an internal bus, a network interface, a memory and a non-volatile memory, and of course may also include hardware required by other services. The processor reads the corresponding computer program from the non-volatile memory into the memory and runs it, forming the terminal interaction device at the logical level. Of course, besides the software implementation, the application does not exclude other implementations, such as logic devices or a combination of software and hardware; that is, the execution subject of the following processing flow is not limited to logical units, and may also be hardware or a logic device.
Referring to Fig. 16, in the software implementation, when the above electronic device adopts the processing mode shown in Fig. 7, the terminal interaction device may include a selecting unit, a mark generation unit, a data generation unit and a sending unit, wherein:
the selecting unit is configured to select, according to a trigger operation of a user detected on a terminal screen, a designated contact on a message communication interface displayed on the terminal screen;
the mark generation unit is configured to generate, on the terminal screen, a manipulation mark corresponding to the designated contact;
the data generation unit is configured to generate corresponding user behavior data according to a detected drag operation of the user on the manipulation mark;
the sending unit is configured to send the user behavior data to a server, so that the server generates a corresponding interaction manipulation instruction.
Optionally, the selecting unit detects the trigger position corresponding to the trigger operation, and if the trigger position is located in the region corresponding to the avatar of any contact, selects that contact.
Optionally, the mark generation unit extracts the avatar picture of the designated contact and floats the avatar picture above the message communication interface to serve as the manipulation mark.
Optionally, when the designated contact is not the user, if the detected drag operation of the user on the manipulation mark is: moving back and forth along a first direction and a second direction opposite to the first direction, the interaction manipulation instruction is used to: show a corresponding interaction effect on the message communication interface displayed on the terminal screen of the designated contact.
Optionally, the reciprocating frequency of the manipulation mark is positively correlated with the intensity of the interaction effect corresponding to the interaction manipulation instruction.
Optionally, when the designated contact is the user himself/herself, if the detected drag operation of the user on the manipulation mark is: dragging the manipulation mark to the region corresponding to the avatar of another contact, the interaction manipulation instruction is used to: show a corresponding interaction effect on the message communication interface displayed on the terminal screen of the other contact.
Optionally, the detected drag operation of the user on the manipulation mark further includes: the drag track before the manipulation mark is dragged to the region corresponding to the avatar of the other contact;
wherein the interaction effect corresponding to the interaction manipulation instruction is related to the drag track.
Optionally, the interaction effect includes at least one of the following, or a combination thereof: the corresponding message communication interface shakes, a preset picture is shown on the corresponding message communication interface, or the corresponding terminal vibrates.
Optionally, the device further includes:
a searching unit, configured to look up whether there is local pre-stored data matching the user behavior data;
wherein, when the lookup result is that such data exists, the sending unit sends the user behavior data to the server, and otherwise does not send it.
Optionally, the manipulation mark floats above the message communication interface.
Optionally, the device further includes:
a receiving unit, configured to receive, from a server, an interaction manipulation instruction coming from the terminal of any contact;
a showing unit, configured to execute the received interaction manipulation instruction and show a corresponding interaction effect on the message communication interface corresponding to that contact.
Referring to Fig. 17, in the software implementation, when the above electronic device adopts the processing mode shown in Fig. 8 or Fig. 10, the terminal interaction device may include a selecting unit, a mark generation unit, a data generation unit and a sending unit, wherein:
the selecting unit is configured to select, according to a trigger operation of a user detected on a terminal screen, a designated contact on a message communication interface displayed on the terminal screen;
the mark generation unit is configured to generate, on the terminal screen, a manipulation mark corresponding to the designated contact;
the data generation unit is configured to generate corresponding user behavior data according to a detected drag operation of the user on the manipulation mark;
the sending unit is configured to send the user behavior data to the terminal of the designated contact, directly or via a server, so that the terminal of the designated contact generates a corresponding interaction manipulation instruction.
Optionally, the selecting unit detects the trigger position corresponding to the trigger operation, and if the trigger position is located in the region corresponding to the avatar of any contact, selects that contact.
Optionally, the mark generation unit extracts the avatar picture of the designated contact and floats the avatar picture above the message communication interface to serve as the manipulation mark.
Optionally, when the designated contact is not the user, if the detected drag operation of the user on the manipulation mark is: moving back and forth along a first direction and a second direction opposite to the first direction, the interaction manipulation instruction is used to: show a corresponding interaction effect on the message communication interface displayed on the terminal screen of the designated contact.
Optionally, the reciprocating frequency of the manipulation mark is positively correlated with the intensity of the interaction effect corresponding to the interaction manipulation instruction.
Optionally, when the designated contact is the user himself/herself, if the detected drag operation of the user on the manipulation mark is: dragging the manipulation mark to the region corresponding to the avatar of another contact, the interaction manipulation instruction is used to: show a corresponding interaction effect on the message communication interface displayed on the terminal screen of the other contact.
Optionally, the detected drag operation of the user on the manipulation mark further includes: the drag track before the manipulation mark is dragged to the region corresponding to the avatar of the other contact;
wherein the interaction effect corresponding to the interaction manipulation instruction is related to the drag track.
Optionally, the interaction effect includes at least one of the following, or a combination thereof: the corresponding message communication interface shakes, a preset picture is shown on the corresponding message communication interface, or the corresponding terminal vibrates.
Optionally, the device further includes:
a searching unit, configured to look up whether there is local pre-stored data matching the user behavior data;
wherein, when the lookup result is that such data exists, the sending unit sends the user behavior data to the server, and otherwise does not send it.
Optionally, the manipulation mark floats above the message communication interface.
Optionally, the device further includes:
a receiving unit, configured to receive, directly or via a server, user behavior data from the terminal of any contact;
a generation unit, configured to generate a corresponding interaction manipulation instruction according to the received user behavior data;
a showing unit, configured to execute the generated interaction manipulation instruction and show a corresponding interaction effect on the message communication interface corresponding to that contact.
Referring to Fig. 18, in the software implementation, when the above electronic device adopts the processing mode shown in Fig. 9 or Fig. 11, the terminal interaction device may include a selecting unit, a mark generation unit, a data generation unit, a generation unit and a sending unit, wherein:
the selecting unit is configured to select, according to a trigger operation of a user detected on a terminal screen, a designated contact on a message communication interface displayed on the terminal screen;
the mark generation unit is configured to generate, on the terminal screen, a manipulation mark corresponding to the designated contact;
the data generation unit is configured to generate corresponding user behavior data according to a detected drag operation of the user on the manipulation mark;
the generation unit is configured to generate a corresponding interaction manipulation instruction according to the user behavior data;
the sending unit is configured to send the generated interaction manipulation instruction, directly or via a server, to the terminal of the designated contact.
Optionally, the selecting unit detects the trigger position corresponding to the trigger operation, and if the trigger position is located in the region corresponding to the avatar of any contact, selects that contact.
Optionally, the mark generation unit extracts the avatar picture of the designated contact and floats the avatar picture above the message communication interface to serve as the manipulation mark.
Optionally, when the designated contact is not the user, if the detected drag operation of the user on the manipulation mark is: moving back and forth along a first direction and a second direction opposite to the first direction, the interaction manipulation instruction is used to: show a corresponding interaction effect on the message communication interface displayed on the terminal screen of the designated contact.
Optionally, the reciprocating frequency of the manipulation mark is positively correlated with the intensity of the interaction effect corresponding to the interaction manipulation instruction.
Optionally, when the designated contact is the user himself/herself, if the detected drag operation of the user on the manipulation mark is: dragging the manipulation mark to the region corresponding to the avatar of another contact, the interaction manipulation instruction is used to: show a corresponding interaction effect on the message communication interface displayed on the terminal screen of the other contact.
Optionally, the detected drag operation of the user on the manipulation mark further includes: the drag track before the manipulation mark is dragged to the region corresponding to the avatar of the other contact;
wherein the interaction effect corresponding to the interaction manipulation instruction is related to the drag track.
Optionally, the interaction effect includes at least one of the following, or a combination thereof: the corresponding message communication interface shakes, a preset picture is shown on the corresponding message communication interface, or the corresponding terminal vibrates.
Optionally, the manipulation mark floats above the message communication interface.
Optionally, the device further includes:
a receiving unit, configured to receive, directly or via a server, an interaction manipulation instruction from the terminal of any contact;
a showing unit, configured to execute the received interaction manipulation instruction and show a corresponding interaction effect on the message communication interface corresponding to that contact.
In summary, by means of the drag operation on the message communication interface, the application conveniently realizes the interaction process between terminals, helping to improve the user experience.
In a typical configuration, a computing device includes one or more processors (CPUs), an input/output interface, a network interface and memory.
The memory may include forms such as non-persistent memory, random access memory (RAM) and/or non-volatile memory in computer-readable media, such as read-only memory (ROM) or flash memory (flash RAM). Memory is an example of a computer-readable medium.
Computer-readable media include permanent and non-permanent, removable and non-removable media, and can store information by any method or technology. The information may be computer-readable instructions, data structures, program modules or other data. Examples of computer storage media include, but are not limited to, phase-change memory (PRAM), static random access memory (SRAM), dynamic random access memory (DRAM), other types of random access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), flash memory or other memory technologies, compact disc read-only memory (CD-ROM), digital versatile disc (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other non-transmission medium that can be used to store information accessible by a computing device. As defined herein, computer-readable media do not include transitory media, such as modulated data signals and carrier waves.
It should also be noted that the terms "comprise", "include" or any other variants thereof are intended to cover non-exclusive inclusion, so that a process, method, article or device that includes a series of elements includes not only those elements, but also other elements not explicitly listed, or elements inherent to such a process, method, article or device. In the absence of further limitation, an element defined by the phrase "including a ..." does not exclude the existence of other identical elements in the process, method, article or device that includes that element.
The above are merely preferred embodiments of the application and are not intended to limit the application. Any modification, equivalent replacement, improvement and the like made within the spirit and principles of the application shall fall within the scope of protection of the application.

Claims (34)

1. A terminal interaction method, characterized by comprising:
selecting, according to a trigger operation of a user detected on a terminal screen, a designated contact on a message communication interface displayed on the terminal screen, wherein the message communication interface is a group-chat communication interface;
generating, on the terminal screen, a manipulation mark corresponding to the designated contact;
generating corresponding user behavior data according to a detected drag operation of the user on the manipulation mark;
sending the user behavior data to a server, so that the server generates a corresponding interaction manipulation instruction; wherein, when the designated contact is the user himself/herself, if the detected drag operation of the user on the manipulation mark is: dragging the manipulation mark to the region corresponding to the avatar of another contact, the interaction manipulation instruction is used to: show a corresponding interaction effect on the message communication interface displayed on the terminal screen of the other contact.
2. The method according to claim 1, characterized in that selecting, according to a trigger operation of a user detected on a terminal screen, a designated contact on a message communication interface displayed on the terminal screen comprises:
detecting the trigger position corresponding to the trigger operation;
if the trigger position is located in the region corresponding to the avatar of any contact, selecting that contact.
3. The method according to claim 1, characterized in that generating, on the terminal screen, a manipulation mark corresponding to the designated contact comprises:
extracting the avatar picture of the designated contact, and floating the avatar picture above the message communication interface to serve as the manipulation mark.
4. The method according to claim 1, characterized in that, when the designated contact is not the user, if the detected drag operation of the user on the manipulation mark is: moving back and forth along a first direction and a second direction opposite to the first direction, the interaction manipulation instruction is used to: show a corresponding interaction effect on the message communication interface displayed on the terminal screen of the designated contact.
5. The method according to claim 4, characterized in that the reciprocating frequency of the manipulation mark is positively correlated with the intensity of the interaction effect corresponding to the interaction manipulation instruction.
6. The method according to claim 1, characterized in that the detected drag operation of the user on the manipulation mark further comprises: the drag track before the manipulation mark is dragged to the region corresponding to the avatar of the other contact;
wherein the interaction effect corresponding to the interaction manipulation instruction is related to the drag track.
7. The method according to any one of claims 4 to 6, characterized in that the interaction effect comprises at least one of the following, or a combination thereof: the corresponding message communication interface shakes, a preset picture is shown on the corresponding message communication interface, or the corresponding terminal vibrates.
8. The method according to claim 1, characterized by further comprising:
looking up whether there is local pre-stored data matching the user behavior data;
if so, sending the user behavior data to the server, and otherwise not sending it.
9. The method according to claim 1, characterized in that the manipulation mark floats above the message communication interface.
10. The method according to claim 1, characterized by further comprising:
receiving, from a server, an interaction manipulation instruction from the terminal of any contact;
executing the received interaction manipulation instruction, and showing a corresponding interaction effect on the message communication interface corresponding to that contact.
11. A terminal interaction device, characterized by comprising:
a selecting unit, configured to select, according to a trigger operation of a user detected on a terminal screen, a designated contact on a message communication interface displayed on the terminal screen, wherein the message communication interface is a group-chat communication interface;
a mark generation unit, configured to generate, on the terminal screen, a manipulation mark corresponding to the designated contact;
a data generation unit, configured to generate corresponding user behavior data according to a detected drag operation of the user on the manipulation mark;
a sending unit, configured to send the user behavior data to a server, so that the server generates a corresponding interaction manipulation instruction; wherein, when the designated contact is the user himself/herself, if the detected drag operation of the user on the manipulation mark is: dragging the manipulation mark to the region corresponding to the avatar of another contact, the interaction manipulation instruction is used to: show a corresponding interaction effect on the message communication interface displayed on the terminal screen of the other contact.
12. The device according to claim 11, characterized in that:
the selecting unit detects the trigger position corresponding to the trigger operation, and if the trigger position is located in the region corresponding to the avatar of any contact, selects that contact.
13. The device according to claim 11, characterized in that the mark generation unit extracts the avatar picture of the designated contact and floats the avatar picture above the message communication interface to serve as the manipulation mark.
14. The device according to claim 11, characterized in that, when the designated contact is not the user, if the detected drag operation of the user on the manipulation mark is: moving back and forth along a first direction and a second direction opposite to the first direction, the interaction manipulation instruction is used to: show a corresponding interaction effect on the message communication interface displayed on the terminal screen of the designated contact.
15. The device according to claim 14, characterized in that the reciprocating frequency of the manipulation mark is positively correlated with the intensity of the interaction effect corresponding to the interaction manipulation instruction.
16. The device according to claim 11, characterized in that the detected drag operation of the user on the manipulation mark further comprises: the drag track before the manipulation mark is dragged to the region corresponding to the avatar of the other contact;
wherein the interaction effect corresponding to the interaction manipulation instruction is related to the drag track.
17. The device according to any one of claims 14 to 16, characterized in that the interaction effect comprises at least one of the following, or a combination thereof: the corresponding message communication interface shakes, a preset picture is shown on the corresponding message communication interface, or the corresponding terminal vibrates.
18. The device according to claim 11, characterized by further comprising:
a searching unit, configured to look up whether there is local pre-stored data matching the user behavior data;
wherein, when the lookup result is that such data exists, the sending unit sends the user behavior data to the server, and otherwise does not send it.
19. The device according to claim 11, characterized in that the manipulation mark floats above the message communication interface.
20. The device according to claim 11, characterized by further comprising:
a receiving unit, configured to receive, from a server, an interaction manipulation instruction from the terminal of any contact;
a showing unit, configured to execute the received interaction manipulation instruction and show a corresponding interaction effect on the message communication interface corresponding to that contact.
21. a kind of terminal interaction method characterized by comprising
According to the trigger action of the user detected on a terminal screen, the message communication being shown on the terminal screen is chosen Designated contact on interface, wherein the message communication interface is group chat communication interface;
The manipulation mark for corresponding to the designated contact is generated on the terminal screen;
According to the user detected to the drag operation of the manipulation mark, corresponding user behavior data is generated;
The user behavior data is sent to the terminal of the designated contact directly or by server, by the specified connection It is the corresponding interactive manipulation instruction of terminal generation of people;When the designated contact is the user itself, if detect Drag operation of the user to the manipulation mark are as follows: the head portrait that the manipulation mark is dragged to another contact person is corresponding Region, then the interactive manipulation instruction is used for: the message communication interface on the terminal screen for being shown in another contact person On show corresponding interaction effect.
22. according to the method for claim 21, which is characterized in that further include:
The user behavior data of the terminal from any contact person is received directly or by server;
According to the user behavior data received, corresponding interactive manipulation instruction is generated;
The interaction manipulation instruction generated is executed, and shows corresponding friendship on the corresponding message communication interface of any contact person Mutual effect.
23. A terminal interaction device, comprising:
a selecting unit, configured to select, according to a trigger operation of a user detected on a terminal screen, a designated contact on a message communication interface displayed on the terminal screen, wherein the message communication interface is a group-chat communication interface;
a mark generating unit, configured to generate, on the terminal screen, a manipulation mark corresponding to the designated contact;
a data generating unit, configured to generate corresponding user behavior data according to a detected drag operation performed by the user on the manipulation mark;
a sending unit, configured to send the user behavior data, directly or via a server, to a terminal of the designated contact, so that the terminal of the designated contact generates a corresponding interactive manipulation instruction; wherein, when the designated contact is the user himself or herself, if the detected drag operation performed by the user on the manipulation mark is dragging the manipulation mark to a region corresponding to an avatar of another contact, the interactive manipulation instruction is used to display a corresponding interaction effect on the message communication interface displayed on the terminal screen of the other contact.
24. The device according to claim 23, further comprising:
a receiving unit, configured to receive, directly or via a server, user behavior data from a terminal of any contact;
a generating unit, configured to generate a corresponding interactive manipulation instruction according to the received user behavior data;
a displaying unit, configured to execute the generated interactive manipulation instruction and display the corresponding interaction effect on the message communication interface corresponding to that contact.
25. A terminal interaction method, comprising:
selecting, according to a trigger operation of a user detected on a terminal screen, a designated contact on a message communication interface displayed on the terminal screen, wherein the message communication interface is a group-chat communication interface;
generating, on the terminal screen, a manipulation mark corresponding to the designated contact;
generating corresponding user behavior data according to a detected drag operation performed by the user on the manipulation mark;
generating a corresponding interactive manipulation instruction according to the user behavior data, and sending the interactive manipulation instruction, directly or via a server, to a terminal of the designated contact; wherein, when the designated contact is the user himself or herself, if the detected drag operation performed by the user on the manipulation mark is dragging the manipulation mark to a region corresponding to an avatar of another contact, the interactive manipulation instruction is used to display a corresponding interaction effect on the message communication interface displayed on the terminal screen of the other contact.
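Claim 25 differs from claim 21 in where the instruction is produced: here the sending terminal maps the behavior data to an interactive manipulation instruction itself and transmits the instruction rather than the raw data. The sketch below illustrates only that local generation step, with an assumed gesture-to-effect mapping and invented names (LocalInstructionSender, generateInstruction).

```java
public class LocalInstructionSender {

    record UserBehaviorData(String targetContact, String gesture) {}
    record ManipulationInstruction(String targetContact, String effect) {}

    /** The sending terminal maps behavior data to an instruction locally (claim 25). */
    static ManipulationInstruction generateInstruction(UserBehaviorData data) {
        String effect = switch (data.gesture()) {
            case "shake" -> "SHAKE_INTERFACE";
            case "drag-to-avatar" -> "SHOW_PRESET_PICTURE";
            default -> "VIBRATE_TERMINAL";
        };
        return new ManipulationInstruction(data.targetContact(), effect);
    }

    /** The finished instruction, not the raw data, is what travels to the other terminal. */
    static void send(ManipulationInstruction instruction, boolean viaServer) {
        System.out.printf("Sending instruction %s for %s via %s%n",
                instruction.effect(), instruction.targetContact(),
                viaServer ? "server" : "direct connection");
    }

    public static void main(String[] args) {
        send(generateInstruction(new UserBehaviorData("bob", "shake")), false);
    }
}
```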
26. The method according to claim 25, further comprising:
receiving, directly or via a server, an interactive manipulation instruction from a terminal of any contact;
executing the received interactive manipulation instruction, and displaying the corresponding interaction effect on the message communication interface corresponding to that contact.
27. A terminal interaction device, comprising:
a selecting unit, configured to select, according to a trigger operation of a user detected on a terminal screen, a designated contact on a message communication interface displayed on the terminal screen, wherein the message communication interface is a group-chat communication interface;
a mark generating unit, configured to generate, on the terminal screen, a manipulation mark corresponding to the designated contact;
a data generating unit, configured to generate corresponding user behavior data according to a detected drag operation performed by the user on the manipulation mark;
a generating unit, configured to generate a corresponding interactive manipulation instruction according to the user behavior data;
a sending unit, configured to send the generated interactive manipulation instruction, directly or via a server, to a terminal of the designated contact; wherein, when the designated contact is the user himself or herself, if the detected drag operation performed by the user on the manipulation mark is dragging the manipulation mark to a region corresponding to an avatar of another contact, the interactive manipulation instruction is used to display a corresponding interaction effect on the message communication interface displayed on the terminal screen of the other contact.
28. The device according to claim 27, further comprising:
a receiving unit, configured to receive, directly or via a server, an interactive manipulation instruction from a terminal of any contact;
a displaying unit, configured to execute the received interactive manipulation instruction and display the corresponding interaction effect on the message communication interface corresponding to that contact.
29. An electronic device, comprising:
a processor; and
a memory for storing instructions executable by the processor;
wherein the processor is configured to execute the instructions to implement the method according to any one of claims 1 to 10.
30. A computer-readable storage medium having computer instructions stored thereon, wherein the instructions, when executed by a processor, implement the steps of the method according to any one of claims 1 to 10.
31. An electronic device, comprising:
a processor; and
a memory for storing instructions executable by the processor;
wherein the processor is configured to execute the instructions to implement the method according to any one of claims 21 to 22.
32. A computer-readable storage medium having computer instructions stored thereon, wherein the instructions, when executed by a processor, implement the steps of the method according to any one of claims 21 to 22.
33. An electronic device, comprising:
a processor; and
a memory for storing instructions executable by the processor;
wherein the processor is configured to execute the instructions to implement the method according to any one of claims 25 to 26.
34. A computer-readable storage medium having computer instructions stored thereon, wherein the instructions, when executed by a processor, implement the steps of the method according to any one of claims 25 to 26.
CN201410302455.9A 2014-06-27 2014-06-27 Terminal interaction method and its device Active CN105204748B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201410302455.9A CN105204748B (en) 2014-06-27 2014-06-27 Terminal interaction method and its device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201410302455.9A CN105204748B (en) 2014-06-27 2014-06-27 Terminal interaction method and its device

Publications (2)

Publication Number Publication Date
CN105204748A CN105204748A (en) 2015-12-30
CN105204748B (en) 2019-09-17

Family

ID=54952471

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201410302455.9A Active CN105204748B (en) 2014-06-27 2014-06-27 Terminal interaction method and its device

Country Status (1)

Country Link
CN (1) CN105204748B (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106302137A (en) * 2016-10-31 2017-01-04 努比亚技术有限公司 Group chat message processing apparatus and method
CN106888317B (en) * 2017-01-03 2019-08-09 努比亚技术有限公司 A kind of interaction processing method, device and terminal
CN111290722A (en) * 2020-01-20 2020-06-16 北京大米未来科技有限公司 Screen sharing method, device and system, electronic equipment and storage medium
CN114928524B (en) * 2022-05-20 2024-03-26 浪潮思科网络科技有限公司 Interaction method, device and medium of WEB terminal and switch

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1842003A (en) * 2005-03-30 2006-10-04 广州市领华科技有限公司 Method for realizing instant communication with a plurality of linkmen in single conversational window
CN101465816A (en) * 2007-12-19 2009-06-24 腾讯科技(深圳)有限公司 Method and system for displaying instant communication dynamic effect
CN102750555A (en) * 2012-06-28 2012-10-24 北京理工大学 Expression robot applied to instant messaging tool
CN102790731A (en) * 2012-07-18 2012-11-21 上海量明科技发展有限公司 Triggering transmission method, client and system by instant messaging tool
CN102833183A (en) * 2012-08-16 2012-12-19 上海量明科技发展有限公司 Instant messaging interactive interface moving method, client and system

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100299392A1 (en) * 2009-05-19 2010-11-25 Shih-Chien Chiou Method for controlling remote devices using instant message

Also Published As

Publication number Publication date
CN105204748A (en) 2015-12-30

Similar Documents

Publication Publication Date Title
US11455093B2 (en) Capturing and sending multimedia as electronic messages
KR102058976B1 (en) Application method and device
CN105204748B (en) Terminal interaction method and its device
CN110189089A (en) Method, system and mobile device for communication
WO2015023406A1 (en) Capture and retrieval of a personalized mood icon
US9798441B2 (en) Displaying a post unit within a stream interface
TW201546704A (en) Instant messaging (1)
EP3103250A1 (en) Highlighting univiewed video messages
CN109587031A (en) Data processing method
CN111817947A (en) Message display system, method, device and storage medium for communication application
WO2017011084A1 (en) System and method for interaction between touch points on a graphical display
CN108171079A (en) A kind of collecting method based on terminal, device, terminal and storage medium
TW202001685A (en) Data processing method and apparatus, electronic device and readable medium
CN109525697A (en) Contact person shares and the method, apparatus and terminal of display
CN110543582A (en) image-based query method and device
CN106911551B (en) Method and device for processing identification picture
CN109426416A (en) Message method, device and equipment in instant messaging tools
CN104038513A (en) Information processing device and information processing method
WO2023115316A1 (en) Screen mirroring method and apparatus, and storage medium and electronic device
CN116633896A (en) Interaction method and device based on expression image, storage medium and electronic equipment
TWI567628B (en) Long press the message immediately after the search method
CN103870509A (en) Browser resources storage method and terminal equipment

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant
TR01 Transfer of patent right

Effective date of registration: 20191211

Address after: P.O. Box 31119, Grand Pavilion, Hibiscus Way, 802 West Bay Road, Grand Cayman, Cayman Islands

Patentee after: Advanced New Technologies Co., Ltd.

Address before: Fourth Floor, Capital Building, P.O. Box 847, Grand Cayman, Cayman Islands

Patentee before: Alibaba Group Holding Limited

TR01 Transfer of patent right