CN110134452B - Object processing method and device - Google Patents

Object processing method and device

Info

Publication number
CN110134452B
CN110134452B (application CN201810132665.6A)
Authority
CN
China
Prior art keywords
control
selected object
area
user
display area
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201810132665.6A
Other languages
Chinese (zh)
Other versions
CN110134452A (en)
Inventor
钟磊 (Zhong Lei)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Alibaba Group Holding Ltd
Original Assignee
Alibaba Group Holding Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Alibaba Group Holding Ltd filed Critical Alibaba Group Holding Ltd
Priority to CN201810132665.6A priority Critical patent/CN110134452B/en
Priority to TW107139646A priority patent/TW201935187A/en
Priority to PCT/CN2019/074138 priority patent/WO2019154258A1/en
Publication of CN110134452A publication Critical patent/CN110134452A/en
Application granted granted Critical
Publication of CN110134452B publication Critical patent/CN110134452B/en

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 - Interaction techniques based on graphical user interfaces [GUI]
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 9/00 - Arrangements for program control, e.g. control units
    • G06F 9/06 - Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F 9/44 - Arrangements for executing specific programs
    • G06F 9/445 - Program loading or initiating
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 9/00 - Arrangements for program control, e.g. control units
    • G06F 9/06 - Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F 9/44 - Arrangements for executing specific programs
    • G06F 9/445 - Program loading or initiating
    • G06F 9/44521 - Dynamic linking or loading; Link editing at or after load time, e.g. Java class loading
    • G06F 9/44526 - Plug-ins; Add-ons
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 9/00 - Arrangements for program control, e.g. control units
    • G06F 9/06 - Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F 9/44 - Arrangements for executing specific programs
    • G06F 9/451 - Execution arrangements for user interfaces

Landscapes

  • Engineering & Computer Science (AREA)
  • Software Systems (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

One or more embodiments of the present specification provide an object processing method and apparatus. The method may include: displaying, for a communication session interface between a local user and a peer user, a corresponding object input area, where the object input area includes a candidate object display area, a selected object display area, and a selected object control area; the selected object display area is used to display candidate objects selected in the candidate object display area, and the selected object control area is used to display operation controls customized by the local user; and performing, according to a received trigger instruction for an operation control, a corresponding processing operation on an object in the selected object display area.

Description

Object processing method and device
Technical Field
One or more embodiments of the present disclosure relate to the field of communications technologies, and in particular, to an object processing method and apparatus.
Background
In the related art, a communication application may establish a communication session between users, who then communicate through the interface corresponding to that session. The communication session interface may provide an object input area in which a user selects the objects to be sent.
Disclosure of Invention
In view of this, one or more embodiments of the present disclosure provide an object processing method and apparatus.
To achieve the above object, one or more embodiments of the present disclosure provide the following technical solutions:
according to a first aspect of one or more embodiments of the present specification, there is provided an object processing method including:
displaying, for a communication session interface between a local user and a peer user, a corresponding object input area, where the object input area includes a candidate object display area, a selected object display area, and a selected object control area, the selected object display area being used to display candidate objects selected in the candidate object display area, and the selected object control area being used to display operation controls customized by the local user;
and performing, according to a received trigger instruction for an operation control, a corresponding processing operation on an object in the selected object display area.
According to a second aspect of one or more embodiments of the present specification, there is provided an object processing apparatus including:
a display unit, configured to display, for a communication session interface between a local user and a peer user, a corresponding object input area, where the object input area includes a candidate object display area, a selected object display area, and a selected object control area, the selected object display area being used to display candidate objects selected in the candidate object display area, and the selected object control area being used to display operation controls customized by the local user;
and a processing unit, configured to perform, according to a received trigger instruction for an operation control, a corresponding processing operation on an object in the selected object display area.
Drawings
Fig. 1 is a schematic diagram of a communication system according to an exemplary embodiment.
Fig. 2 is a flowchart of an object processing method according to an exemplary embodiment.
Fig. 3 is a schematic diagram of a communication session interface according to an exemplary embodiment.
Fig. 4 is a schematic diagram illustrating an expression input area according to an exemplary embodiment.
FIG. 5 is a schematic diagram illustrating content entered within an input box according to an exemplary embodiment.
Fig. 6 is a schematic diagram of sending input content according to an exemplary embodiment.
Fig. 7 is a schematic diagram of an adjustment operation control according to an exemplary embodiment.
Fig. 8 is a schematic diagram of an adjusted expression control area according to an exemplary embodiment.
Fig. 9 is a schematic diagram of another adjusted expression control area according to an exemplary embodiment.
Fig. 10 is a diagram illustrating deletion of input content according to an exemplary embodiment.
Fig. 11 is a diagram illustrating a portion of input content being deleted according to an exemplary embodiment.
Fig. 12 is a schematic diagram of deleting further input content according to an exemplary embodiment.
Fig. 13 is a schematic diagram of adding another type of operation control to the expression control area according to an exemplary embodiment.
Fig. 14 is a schematic diagram of an overall adjustment of the expression control area according to an exemplary embodiment.
Fig. 15 is a schematic structural diagram of an apparatus provided in an exemplary embodiment.
Fig. 16 is a block diagram of an object processing apparatus according to an exemplary embodiment.
Detailed Description
Reference will now be made in detail to the exemplary embodiments, examples of which are illustrated in the accompanying drawings. When the following description refers to the accompanying drawings, like numbers in different drawings represent the same or similar elements unless otherwise indicated. The implementations described in the following exemplary embodiments do not represent all implementations consistent with one or more embodiments of the present specification. Rather, they are merely examples of apparatus and methods consistent with certain aspects of one or more embodiments of the specification, as detailed in the claims which follow.
It should be noted that: in other embodiments, the steps of the corresponding methods are not necessarily performed in the order shown and described herein. In some other embodiments, the methods may include more or fewer steps than those described herein. Moreover, a single step described in this specification may be broken down into multiple steps for description in other embodiments; multiple steps described in this specification may be combined into a single step in other embodiments.
Fig. 1 is a schematic diagram of a communication system according to an exemplary embodiment. As shown in fig. 1, the system may include a server 11, a network 12, and a plurality of electronic devices such as the mobile phones 13, 14, and 15.
The server 11 may be a physical server comprising a separate host, or the server 11 may be a virtual server carried by a cluster of hosts. During operation, the server 11 may run a server-side program of a communication application.
The mobile phones 13-15 are just one type of electronic device a user may employ. The user may also use devices such as tablet devices, notebook computers, personal digital assistants (PDAs), and wearable devices (e.g., smart glasses or smart watches), which is not limited by one or more embodiments of this specification. During operation, the electronic device runs the client-side program of a communication application, and the corresponding communication function is then realized through cooperation between the server-side program running on the server 11 and the client-side programs running on electronic devices such as the mobile phones 13-15. It should be noted that the client may be pre-installed on the electronic device, so that the client-side program can be started and run there; of course, when an online "client" based on, for example, HTML5 technology is used, the client can be obtained and run without installing the corresponding application on the electronic device.
The network 12 over which the mobile phones 13-15 interact with the server 11 may include various types of wired or wireless networks. In one embodiment, the network 12 may include the public switched telephone network (PSTN) and the Internet.
In one embodiment, electronic devices such as the mobile phones 13-15 implement the object processing scheme of this specification by running the client-side program of a communication application, for use during communication. In other words, the scheme may be implemented on an electronic device such as the mobile phones 13-15 alone, without participation by the server 11. Of course, in some cases, the local user's customization scheme for the selected object control area may also be uploaded to the server 11, so that the local user can share the customization across different electronic devices.
Fig. 2 is a flowchart of an object processing method according to an exemplary embodiment. As shown in fig. 2, the method is applied to an electronic device (such as the mobile phones 13-15 shown in fig. 1), and may include the following steps:
Step 202: displaying, for a communication session interface between a local user and a peer user, a corresponding object input area, where the object input area includes a candidate object display area, a selected object display area, and a selected object control area; the selected object display area is used to display candidate objects selected in the candidate object display area, and the selected object control area is used to display operation controls customized by the local user.
In one embodiment, the local user is the user logged in on the electronic device that runs the object processing scheme of this specification. "Local user" and "peer user" are relative concepts: in a communication between user A and user B, from user A's perspective user A is the local user and user B is the peer user, while from user B's perspective user B is the local user and user A is the peer user.
In an embodiment, the peer user may be a single user, with the corresponding communication session interface being a one-to-one chat interface; or the peer user may be multiple users, with the corresponding communication session interface being a group chat interface, which is not limited in this specification.
In one embodiment, the peer user may be any party communicating with the local user: another user different from the local user, or even another electronic device associated with the local user (for example, when a user logged in to the same account on both a PC and a mobile phone is using the mobile phone, the mobile phone corresponds to the "local user" and the PC corresponds to the "peer user").
In one embodiment, the object input area may be located within the communication session interface, for example as part of it; or the object input area may be separate from the communication session interface, for example shown floating above it, or shown in a separate interface distinct from it.
In an embodiment, the candidate object display area displays candidate objects to the local user for selection; a selected candidate object is added to the selected object display area and may then be called a selected object. Rather than being sent to the peer user immediately upon selection, a selected object is first shown in the selected object display area, so that the local user can apply processing operations to it through the selected object control area.
In one embodiment, the selected object display area may include the information input box in the communication session interface, which can also display other types of input content besides selected objects. In other embodiments, the selected object display area may be an area other than the information input box, such as an area dedicated to displaying selected objects, which is not limited in this specification.
In an embodiment, the candidate object may include at least one of: an emoticon character (e.g., a colon ":" combined with a parenthesis ")" to form ":)", indicating a smile), an expression pattern corresponding to a coded character in a preset character set (e.g., an emoji pattern corresponding to a Unicode-encoded character), and the like, which is not limited in this specification. Unlike pictures, photos, documents, and the like, a candidate object is first displayed in the selected object display area after being selected, rather than being sent directly to the peer user, so that the local user can process the selected object in the selected object display area through the selected object control area.
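To make this concrete, the following Kotlin sketch shows one plausible representation of the selected object display area's mixed content: a list of typed items in which emoji-style objects sit alongside plain text, with the positioning cursor expressed as an index. This is a minimal illustration under assumed names (InputItem, InputBox); the specification does not prescribe any particular data model.

```kotlin
// Minimal sketch of a mixed-content input box: the selected object display
// area holds both plain text and emoji-style objects. All names here are
// hypothetical; the specification does not prescribe a data model.
sealed class InputItem {
    data class Text(val value: String) : InputItem()
    data class EmojiObject(val code: String) : InputItem() // e.g. an emoji code point
}

class InputBox {
    val items = mutableListOf<InputItem>()
    var cursor = 0 // number of items before the positioning cursor

    fun insert(item: InputItem) {
        items.add(cursor, item) // a selected candidate is shown here first,
        cursor++                // not sent to the peer user immediately
    }
}
```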
In one embodiment, the operation controls in the selected object control area may include at least one of: a delete control, a send control, an undo control, an editing control for a specified attribute, and the like, which is not limited in this specification. The specified attribute may include size, shape, color, transparency, and the like.
In an embodiment, according to a received control adjustment instruction from the local user, the operation controls in the selected object control area may be custom-adjusted so that the area meets the local user's personalized needs. Custom adjustments may include: deleting an existing operation control, adding a new operation control, changing the function of an existing operation control (for example, turning a control originally used to adjust the zoom factor into a control that performs deletion), and the like.
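As an illustration of such custom adjustment, the sketch below models the selected object control area as an editable list of controls supporting deletion, addition, and re-binding of an existing control's function; OperationControl and ControlRegion are assumed names, not terms from the specification.

```kotlin
// Sketch of a user-customizable control area. Each control pairs an id and
// label with an action; all names here are illustrative assumptions.
data class OperationControl(val id: String, val label: String, val action: () -> Unit)

class ControlRegion {
    private val controls = mutableListOf<OperationControl>()

    fun add(control: OperationControl) { controls.add(control) }  // add a new control
    fun remove(id: String) { controls.removeAll { it.id == id } } // delete an existing control

    // Re-bind an existing control to a different function, e.g. turning a
    // zoom control into a delete control, as described above.
    fun rebind(id: String, newAction: () -> Unit) {
        val i = controls.indexOfFirst { it.id == id }
        if (i >= 0) controls[i] = controls[i].copy(action = newAction)
    }

    fun visibleControls(): List<OperationControl> = controls.toList()
}
```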
In an embodiment, at least one of the size and the position of the selected object control area may be adjusted according to a received area adjustment instruction from the local user. For example, the selected object control area may be enlarged for easier operation, or reduced to enlarge the candidate object display area or the selected object display area. As another example, adjusting the position of the selected object control area changes the relative positions of the control area, the candidate object display area, and the selected object display area to suit the local user's habits. When resizing, the shape of the selected object control area may be locked (for example, a rectangular area may keep its aspect ratio) so that the shape stays unchanged during the adjustment; alternatively, resizing may be accompanied by a shape change, which is not limited in this specification.
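The aspect-ratio locking described above reduces to simple geometry, as in the following sketch, which resizes a rectangular control area with the ratio optionally locked. The Region type and field names are assumptions for illustration.

```kotlin
// Sketch of resizing the control area, optionally keeping its aspect ratio
// locked so the shape stays unchanged. The geometry type is illustrative.
data class Region(val x: Float, val y: Float, val width: Float, val height: Float)

fun resize(region: Region, newWidth: Float, newHeight: Float, lockAspect: Boolean): Region {
    val h =
        if (lockAspect) newWidth * region.height / region.width // height follows the locked ratio
        else newHeight                                          // free resize may change the shape
    return region.copy(width = newWidth, height = h)
}
```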
Step 206: performing, according to a received trigger instruction for an operation control, a corresponding processing operation on an object in the selected object display area.
In an embodiment, when the trigger instruction is issued for an operation control with a delete function, the objects located in the selected object display area before the positioning cursor may be deleted one by one in reverse order, while other types of input content in the selected object display area are retained. By limiting the delete function to objects in the selected object display area, the local user can quickly delete objects without moving the positioning cursor, and other types of input content are protected from accidental deletion when the area also contains such content.
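A minimal sketch of this delete behavior, reusing the hypothetical InputItem and InputBox types from the earlier sketch: scanning backwards from the positioning cursor, the nearest emoji object is removed while text and other content types are skipped, so repeated triggers delete objects one by one in reverse order.

```kotlin
// Sketch of the delete-control behavior: remove the nearest emoji object
// before the cursor, skipping (and thus retaining) other input content.
// Reuses the illustrative InputItem/InputBox types defined earlier.
fun deleteLastObjectBeforeCursor(box: InputBox): Boolean {
    for (i in box.cursor - 1 downTo 0) {
        if (box.items[i] is InputItem.EmojiObject) {
            box.items.removeAt(i)
            box.cursor = i // cursor lands where the removed object was
            return true
        }
    }
    return false // no object before the cursor; nothing is deleted
}
```

In the walkthrough below (figs. 10-12), two successive calls would first remove the trailing pattern and then skip the text "no problem" to remove the earlier pattern, precisely because the scan ignores non-object items.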
In an embodiment, a predefined mapping corresponding to the peer user may be obtained, where the mapping records the operation controls the local user has bound for that peer user; the operation controls displayed in the selected object control area are then determined from this mapping. Since the local user may have different needs and habits with different peer users, recording a predefined mapping per peer user lets the local user operate efficiently and accurately with each of them, which helps improve communication efficiency.
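One plausible shape for such a per-peer mapping is a dictionary keyed by peer identifier with a fallback to default controls, as in this sketch; the storage, key type, and class name (ControlBindings) are assumptions.

```kotlin
// Sketch of the predefined per-peer mapping: which operation controls the
// control area shows depends on the peer of the current session.
class ControlBindings(private val defaults: List<String>) {
    private val byPeer = mutableMapOf<String, List<String>>() // peer id -> control ids

    fun bind(peerId: String, controlIds: List<String>) { byPeer[peerId] = controlIds }

    // Controls to display for this session; unbound peers get the defaults.
    fun controlsFor(peerId: String): List<String> = byPeer[peerId] ?: defaults
}
```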
In an embodiment, besides the operation controls acting on selected objects, the selected object control area may contain other function controls customized by the local user to implement corresponding processing functions. For example, the selected object control area may include a shortcut send control; when a trigger instruction for it is received, a shortcut message corresponding to the control may be generated and quickly sent to the peer user through the communication session interface.
In an embodiment, the generated shortcut message may include content expressing a preset meaning, such as "like", "birthday blessing", or "new year blessing", which is not limited in this specification. The preset meaning may be determined from default settings, the historical communication content between the local user and the peer user, past, ongoing, or upcoming events corresponding to the current date, past, ongoing, or upcoming events corresponding to the current geographical location, and so on. For example, the default preset meaning may be "like"; when the historical communication content includes "happy birthday", the preset meaning may be "happy birthday"; when a festival is being celebrated at the current geographical location, the preset meaning may be a "festival blessing". The shortcut message may include various types of message content such as text, graphics, and patterns, which is not limited in this specification.
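The signal-priority logic for choosing a preset meaning might look like the sketch below, which checks session history, the current date, and the current location before falling back to the default "like"; the specific heuristics and names are illustrative assumptions, not the specification's method.

```kotlin
// Sketch of resolving a shortcut message's preset meaning from the signals
// listed above. The heuristics are illustrative assumptions only.
import java.time.LocalDate
import java.time.MonthDay

fun resolvePresetMeaning(
    history: List<String>, // recent messages between the local user and the peer user
    today: LocalDate,
    localFestival: String? // a festival at the current location, if any
): String = when {
    history.any { it.contains("happy birthday", ignoreCase = true) } -> "birthday blessing"
    MonthDay.from(today) == MonthDay.of(1, 1) -> "new year blessing"
    localFestival != null -> "festival blessing"
    else -> "like" // the default setting mentioned above
}
```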
For ease of understanding, the technical solutions of one or more embodiments of this specification are described below using the communication application "WeChat" as an example. Assume that user A communicates with user B, user A uses the mobile phone 13, and the WeChat client running on the mobile phone 13 implements the object processing scheme of this specification.
Fig. 3 is a schematic diagram of a communication session interface according to an exemplary embodiment. As shown in fig. 3, the mobile phone 13 may present to user A, through the running WeChat client, a communication session interface 300 with user B, so that user A can communicate with user B based on this interface. The communication session interface 300 includes a launch icon 301 for an expression input area; triggering the launch icon 301 opens the corresponding expression input area.
For example, fig. 4 is a schematic diagram illustrating an expression input area according to an exemplary embodiment. In response to user A's trigger operation on the launch icon 301 shown in fig. 3, the mobile phone 13 may show user A the expression input area 40 shown in fig. 4. When the expression input area 40 is shown, the launch icon 301 of fig. 3 may be switched to the close icon 302 shown in fig. 4, used to return from fig. 4 to the state of fig. 3. As for the relationship between the communication session interface 300 and the expression input area 40 in fig. 4, the expression input area 40 may be considered displayed within the communication session interface 300; or the display range of the communication session interface 300 may be considered shrunk upward to make room for the expression input area 40, in which case the two are independent of each other. In some embodiments, the expression input area 40 may take other positions relative to the communication session interface 300, such as floating above it, which is not limited in this specification. In some embodiments, the expression input area 40 may be displayed in a separate interface; for example, in response to user A triggering the launch icon 301, the mobile phone 13 may jump from the communication session interface 300 to that separate interface to display the expression input area 40.
In an embodiment, the expression input area 40 may include an expression display area 401 for displaying candidate expression patterns; for example, the candidates may include the patterns 401a through 401l shown in fig. 4, and user A may view more candidates by scrolling up and down. The expression input area 40 may include an input box 402 for presenting input content; for example, fig. 5 is a schematic diagram of displaying input content in an input box according to an exemplary embodiment: when user A selects the pattern 401b in the expression display area 401, the corresponding input pattern 402a is shown in the input box 402. Other types of input content, such as text, may also be displayed in the input box 402, which is not limited in this specification. The expression input area 40 may further include an expression control area 403 that exposes operation controls for the input content in the input box 402, such as the delete control 403a and the send control 403b shown in figs. 4-5, to implement corresponding processing operations.
For example, when the input content in the input box 402 is the input pattern 402a shown in fig. 5, the input pattern 402a may be deleted in response to user A triggering the delete control 403a, returning to the state shown in fig. 4. As another example, fig. 6 is a schematic diagram of sending input content according to an exemplary embodiment: in response to user A triggering the send control 403b, a communication message 60 as shown in fig. 6 may be generated from the input pattern 402a in the input box 402, so that the input pattern 402a is sent to user B through the communication message 60.
Fig. 7 is a schematic diagram of adjusting operation controls according to an exemplary embodiment. In response to a preset operation performed by user A on the expression control area 403, such as long-pressing any operation control in the area or double-tapping a blank spot in it, the expression control area 403 may switch from the normal manipulation state shown in figs. 4-6 to the adjustment state shown in fig. 7. For example, after the preset operation, a removal identifier (rendered as a small inline badge image in the original figures) may be shown at the upper-right corner of each operation control in the expression control area 403, such as the delete control 403a and the send control 403b, from which it can be determined that the area has switched from the normal state to the adjustment state.
In one embodiment, triggering the removal identifier of any operation control deletes that control from the expression control area 403. For example, fig. 8 is a schematic diagram of an adjusted expression control area according to an exemplary embodiment; in response to user A triggering the removal identifier at the upper-right corner of the delete control 403a, the delete control 403a is removed from the expression control area 403, so that only the send control 403b remains, as shown in fig. 8.
In an embodiment, the expression control area 403 in the adjustment state may display an "add" option 403c, as shown in fig. 7; user A may add more operation controls to the expression control area 403 by triggering the "add" option 403c, thereby extending the area's functions. For example, fig. 9 is a schematic diagram of another adjusted expression control area according to an exemplary embodiment; as shown in fig. 9, assume that user A has added an undo control 403d by triggering the "add" option 403c, so that user A can perform the corresponding undo function via the undo control 403d.
Fig. 10 is a schematic diagram of deleting input content according to an exemplary embodiment. As shown in fig. 10, the input content in the input box 402 includes the input pattern 402b, the text "no problem", the input pattern 402c, and so on. Suppose that after forming this input content, user A wants to delete the input patterns 402b and 402c; the input patterns in the input box 402 can then be deleted quickly by triggering the delete control 403a in the expression control area 403. Before user A triggers the delete control 403a, the positioning cursor in the input box 402 sits after the input pattern 402c. When user A triggers the delete control 403a the first time, the input pattern 402c is deleted, leaving the input pattern 402b and the text "no problem" shown in fig. 11, with the positioning cursor now after the text "no problem". When user A triggers the delete control 403a a second time, the text "no problem" is skipped and the input pattern 402b is deleted directly, leaving only the text "no problem" as shown in fig. 12, without user A having to move the positioning cursor behind the input pattern 402b manually. In other words, when text and input patterns coexist in the input box 402, the delete control 403a enables continuous, quick deletion of the input patterns without user A manually adjusting the cursor position.
Through the trigger operations on the delete control 403a described above, the input patterns 402b and 402c shown in fig. 10 are deleted, so in the state shown in fig. 12 only the text "no problem" remains in the input box 402. If user A then finds that the input pattern 402b should have been kept, one option is to manually move the positioning cursor in front of the text "no problem" and reselect the pattern 401a in the expression display area 401; another option is quick restoration via the undo control 403d. In response to user A triggering the undo control 403d, the mobile phone 13 undoes the most recently executed history operation. Since the history operations in the above embodiment were, in order, "delete the input pattern 402c by triggering the delete control 403a" and "delete the input pattern 402b by triggering the delete control 403a", the most recent operation is the latter; undoing it quickly returns to the state shown in fig. 11, in which the input content in the input box 402 again includes the input pattern 402b and the text "no problem".
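The undo behavior in this walkthrough amounts to a command stack: each deletion records an inverse action (re-inserting the removed pattern at its old index), and the undo control replays the most recent one. A minimal sketch, with all names hypothetical:

```kotlin
// Sketch of the undo control's behavior as a stack of inverse actions.
class UndoStack {
    private val inverses = ArrayDeque<() -> Unit>()

    fun record(inverse: () -> Unit) { inverses.addLast(inverse) }

    // Undo the most recently executed history operation, if any.
    fun undoLast(): Boolean {
        val inverse = inverses.removeLastOrNull() ?: return false
        inverse()
        return true
    }
}
```

Here, deleting the input pattern 402b would record an inverse that re-inserts 402b before the text "no problem", so triggering the undo control returns the input box to the state of fig. 11.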
Fig. 13 is a schematic diagram of adding another type of operation control to the expression control area according to an exemplary embodiment. Besides operation controls that manipulate the input content in the input box 402, other types of operation controls, not necessarily acting on that input content, may be added to the expression control area 403. For example, the expression control area 403 may include a shortcut send control 403e as shown in fig. 13; by triggering it, user A can quickly send a shortcut message to user B. The message content of the shortcut message expresses a preset meaning. For example, the preset meaning may default to "like", so that user A sends a corresponding like pattern to user B by triggering the shortcut send control 403e. As another example, the preset meaning may be configured according to the historical communication between user A and user B, such as relating to "birthday blessing" after user A has sent "happy birthday" to user B. As yet another example, the preset meaning may relate to past, ongoing, or upcoming events corresponding to the current date or the current geographical location, or to other specific events, which is not limited in this specification.
Besides customizing the operation controls inside the expression control area 403, user A may configure the expression control area 403 as a whole, such as adjusting its size and position. For example, fig. 14 is a schematic diagram of an overall adjustment of the expression control area according to an exemplary embodiment; as shown in fig. 14, while the expression control area 403 is in the adjustment state, user A may long-press it and drag it in some direction, for example from right to left, moving the expression control area 403 from the right edge to the left edge of the expression input area 40 and thereby completing the position adjustment. In addition, user A may resize the expression control area 403 by, for example, long-pressing one of its vertices and dragging it in some direction.
For example, when the expression control area 403 in fig. 14 is rectangular, the rectangle's aspect ratio may be locked and the area resized at that ratio. Alternatively, the shape of the expression control area 403 may be left unlocked, so that user A can adjust its shape while resizing.
Fig. 15 is a schematic structural diagram of an apparatus according to an exemplary embodiment. Referring to fig. 15, at the hardware level the apparatus includes a processor 1502, an internal bus 1504, a network interface 1506, a memory 1508, and a non-volatile storage 1510, and may of course also include other hardware required by the service. The processor 1502 reads the corresponding computer program from the non-volatile storage 1510 into the memory 1508 and runs it, forming the object processing apparatus at the logical level. Besides software implementations, one or more embodiments of this specification do not exclude other implementations, such as logic devices or combinations of software and hardware; that is, the execution subject of the following processing flow is not limited to logical units and may also be hardware or logic devices.
Referring to fig. 16, in a software implementation, the object processing apparatus may include:
a display unit 1601, configured to display, for a communication session interface between a local user and a peer user, a corresponding object input area, where the object input area includes a candidate object display area, a selected object display area, and a selected object control area; the selected object display area is used to display candidate objects selected in the candidate object display area, and the selected object control area is used to display operation controls customized by the local user;
a processing unit 1602, configured to perform, according to a received trigger instruction for an operation control, a corresponding processing operation on an object in the selected object display area.
Optionally, the operation controls in the selected object control area include at least one of:
a delete control, a send control, an undo control, and an editing control for a specified attribute.
Optionally, the apparatus further includes:
a control adjustment unit 1603, configured to custom-adjust the operation controls in the selected object control area according to a received control adjustment instruction from the local user.
Optionally, the apparatus further includes:
a region adjustment unit 1604, configured to adjust at least one of the size and the position of the selected object control area according to a received region adjustment instruction from the local user.
Optionally, the apparatus further includes:
a deletion unit 1605, configured to, when the trigger instruction is issued for an operation control with a delete function, delete the objects located in the selected object display area before the positioning cursor one by one in reverse order, while retaining other types of input content in the selected object display area.
Optionally, the apparatus further includes:
an obtaining unit 1606, configured to obtain a predefined mapping corresponding to the peer user, where the predefined mapping records the operation controls bound by the local user for the peer user;
a determining unit 1607, configured to determine the operation controls displayed in the selected object control area according to the predefined mapping.
Optionally, the candidate object includes at least one of: an emoticon character, and an expression pattern corresponding to a coded character in a preset character set.
Optionally, the selected object display area includes: an information input box in the communication session interface.
Optionally, the selected object control area is further used to show a shortcut send control; the apparatus further includes:
a generating unit 1608, configured to, when a trigger instruction for the shortcut send control is received, generate a shortcut message corresponding to the shortcut send control, to be quickly sent to the peer user through the communication session interface.
The systems, apparatuses, modules or units described in the above embodiments may be specifically implemented by a computer chip or an entity, or implemented by a product with certain functions. A typical implementation device is a computer, which may take the form of a personal computer, laptop computer, cellular telephone, camera phone, smart phone, personal digital assistant, media player, navigation device, email messaging device, game console, tablet computer, wearable device, or a combination of any of these devices.
In a typical configuration, a computer includes one or more processors (CPUs), input/output interfaces, network interfaces, and memory.
The memory may include volatile memory in a computer-readable medium, such as random access memory (RAM), and/or non-volatile memory, such as read-only memory (ROM) or flash memory (flash RAM). Memory is an example of a computer-readable medium.
Computer-readable media include permanent and non-permanent, removable and non-removable media, and may implement information storage by any method or technology. The information may be computer-readable instructions, data structures, program modules, or other data. Examples of computer storage media include, but are not limited to, phase-change memory (PRAM), static random access memory (SRAM), dynamic random access memory (DRAM), other types of random access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), flash memory or other memory technologies, compact disc read-only memory (CD-ROM), digital versatile discs (DVD) or other optical storage, magnetic cassettes, magnetic disk storage, quantum memory, graphene-based storage media or other magnetic storage devices, or any other non-transmission medium that can be used to store information accessible by a computing device. As defined herein, computer-readable media do not include transitory computer-readable media such as modulated data signals and carrier waves.
It should also be noted that the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but also other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a …" does not exclude the presence of other like elements in a process, method, article, or apparatus comprising that element.
The foregoing description has been directed to specific embodiments of this disclosure. Other embodiments are within the scope of the following claims. In some cases, the actions or steps recited in the claims may be performed in a different order than in the embodiments and still achieve desirable results. In addition, the processes depicted in the accompanying figures do not necessarily require the particular order shown, or sequential order, to achieve desirable results. In some embodiments, multitasking and parallel processing may also be possible or may be advantageous.
The terminology used in the description of the one or more embodiments is for the purpose of describing the particular embodiments only and is not intended to be limiting of the description of the one or more embodiments. As used in one or more embodiments of the present specification and the appended claims, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It should also be understood that the term "and/or" as used herein refers to and encompasses any and all possible combinations of one or more of the associated listed items.
It should be understood that although the terms first, second, third, etc. may be used in one or more embodiments of the present description to describe various information, such information should not be limited to these terms. These terms are only used to distinguish one type of information from another. For example, first information may also be referred to as second information, and similarly, second information may also be referred to as first information, without departing from the scope of one or more embodiments herein. The word "if" as used herein may be interpreted as "upon", "when", or "in response to determining", depending on the context.
The above description is only for the purpose of illustrating the preferred embodiments of the one or more embodiments of the present disclosure, and is not intended to limit the scope of the one or more embodiments of the present disclosure, and any modifications, equivalent substitutions, improvements, etc. made within the spirit and principle of the one or more embodiments of the present disclosure should be included in the scope of the one or more embodiments of the present disclosure.

Claims (16)

1. An object processing method, comprising:
displaying, for a communication session interface between a local user and a peer user, a corresponding object input area, wherein the object input area comprises a candidate object display area, a selected object display area, and a selected object control area, the selected object display area is used to display candidate objects selected in the candidate object display area, and the selected object control area is used to display operation controls customized by the local user;
performing, according to a received trigger instruction for an operation control, a corresponding processing operation on an object in the selected object display area; and
when the trigger instruction is issued for an operation control with a delete function, deleting the objects located in the selected object display area before the positioning cursor one by one in reverse order, while retaining other types of input content in the selected object display area.
2. The method of claim 1, wherein the operation controls in the selected object control area comprise at least one of:
a delete control, a send control, an undo control, and an editing control for a specified attribute.
3. The method of claim 1, further comprising:
custom-adjusting the operation controls in the selected object control area according to a received control adjustment instruction from the local user.
4. The method of claim 1, further comprising:
adjusting at least one of the size and the position of the selected object control area according to a received area adjustment instruction from the local user.
5. The method of claim 1, further comprising:
obtaining a predefined mapping corresponding to the peer user, wherein the predefined mapping records the operation controls bound by the local user for the peer user; and
determining the operation controls displayed in the selected object control area according to the predefined mapping.
6. The method of claim 1, wherein the candidate object comprises at least one of: an emoticon character, and an expression pattern corresponding to a coded character in a preset character set.
7. The method of claim 1, wherein the selected object display area comprises: an information input box in the communication session interface.
8. The method of claim 1, wherein the selected object control area is further used to show a shortcut send control, the method further comprising:
when a trigger instruction for the shortcut send control is received, generating a shortcut message corresponding to the shortcut send control, to be quickly sent to the peer user through the communication session interface.
9. An object processing apparatus, comprising:
a display unit, configured to display, for a communication session interface between a local user and a peer user, a corresponding object input area, wherein the object input area comprises a candidate object display area, a selected object display area, and a selected object control area, the selected object display area is used to display candidate objects selected in the candidate object display area, and the selected object control area is used to display operation controls customized by the local user;
a processing unit, configured to perform, according to a received trigger instruction for an operation control, a corresponding processing operation on an object in the selected object display area; and
a deletion unit, configured to, when the trigger instruction is issued for an operation control with a delete function, delete the objects located in the selected object display area before the positioning cursor one by one in reverse order, while retaining other types of input content in the selected object display area.
10. The apparatus of claim 9, wherein the operation controls in the selected object control area comprise at least one of:
a delete control, a send control, an undo control, and an editing control for a specified attribute.
11. The apparatus of claim 9, further comprising:
a control adjustment unit, configured to custom-adjust the operation controls in the selected object control area according to a received control adjustment instruction from the local user.
12. The apparatus of claim 9, further comprising:
a region adjustment unit, configured to adjust at least one of the size and the position of the selected object control area according to a received region adjustment instruction from the local user.
13. The apparatus of claim 9, further comprising:
an obtaining unit, configured to obtain a predefined mapping corresponding to the peer user, wherein the predefined mapping records the operation controls bound by the local user for the peer user; and
a determining unit, configured to determine the operation controls displayed in the selected object control area according to the predefined mapping.
14. The apparatus of claim 9, wherein the candidate object comprises at least one of: an emoticon character, and an expression pattern corresponding to a coded character in a preset character set.
15. The apparatus of claim 9, wherein the selected object display area comprises: an information input box in the communication session interface.
16. The apparatus of claim 9, wherein the selected object control area is further used to show a shortcut send control, the apparatus further comprising:
a generating unit, configured to, when a trigger instruction for the shortcut send control is received, generate a shortcut message corresponding to the shortcut send control, to be quickly sent to the peer user through the communication session interface.
CN201810132665.6A 2018-02-09 2018-02-09 Object processing method and device Active CN110134452B (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
CN201810132665.6A CN110134452B (en) 2018-02-09 2018-02-09 Object processing method and device
TW107139646A TW201935187A (en) 2018-02-09 2018-11-08 Object processing method and apparatus
PCT/CN2019/074138 WO2019154258A1 (en) 2018-02-09 2019-01-31 Object processing method and apparatus

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201810132665.6A CN110134452B (en) 2018-02-09 2018-02-09 Object processing method and device

Publications (2)

Publication Number Publication Date
CN110134452A CN110134452A (en) 2019-08-16
CN110134452B 2022-10-25

Family

ID=67548136

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810132665.6A Active CN110134452B (en) 2018-02-09 2018-02-09 Object processing method and device

Country Status (3)

Country Link
CN (1) CN110134452B (en)
TW (1) TW201935187A (en)
WO (1) WO2019154258A1 (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112905079B (en) * 2019-11-19 2022-12-13 北京搜狗科技发展有限公司 Data processing method, device and medium
CN113552992A (en) * 2021-08-03 2021-10-26 网易(杭州)网络有限公司 Control display control method, device, equipment and medium
CN114035716A (en) * 2021-11-11 2022-02-11 北京字跳网络技术有限公司 Display control method, display control device, electronic equipment and storage medium
CN114385264A (en) * 2022-01-12 2022-04-22 挂号网(杭州)科技有限公司 Configuration method and device, background, electronic equipment and storage medium

Citations (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1691788A (en) * 1998-02-04 2005-11-02 松下电器产业株式会社 Method and device for erasing message from wireless communication device having a paging function
CN101227424A (en) * 2007-12-20 2008-07-23 腾讯科技(深圳)有限公司 Information exhibiting method as well as subscriber terminal
CN104598245A (en) * 2015-01-29 2015-05-06 广东欧珀移动通信有限公司 Chatting method and device and mobile terminal
CN104933113A (en) * 2014-06-06 2015-09-23 北京搜狗科技发展有限公司 Expression input method and device based on semantic understanding
CN104954605A (en) * 2014-03-31 2015-09-30 京瓷办公信息系统株式会社 Image forming apparatus, image forming system, and image forming method
CN105892372A (en) * 2016-05-31 2016-08-24 北京光年无限科技有限公司 Intelligent robot expression output method and intelligent robot
CN105930828A (en) * 2016-04-15 2016-09-07 腾讯科技(深圳)有限公司 Expression classification identification control method and device
CN106445395A (en) * 2016-12-06 2017-02-22 Tcl集团股份有限公司 Message display method and device
CN106575166A (en) * 2014-08-11 2017-04-19 张锐 Methods for processing handwritten inputted characters, splitting and merging data and encoding and decoding processing
CN106886364A (en) * 2017-02-14 2017-06-23 深圳市金立通信设备有限公司 A kind of text handling method and terminal based on speech recognition
CN206311916U (en) * 2016-05-31 2017-07-07 北京光年无限科技有限公司 A kind of intelligent robot of exportable expression
CN107025039A (en) * 2017-04-11 2017-08-08 北京小度信息科技有限公司 Information processing method and device

Family Cites Families (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN100428126C (en) * 2006-03-31 2008-10-22 腾讯科技(深圳)有限公司 Method for editing picture in customer end contents transmission window and customer end
US20100088262A1 (en) * 2008-09-29 2010-04-08 Neuric Technologies, Llc Emulated brain
CN102368196B (en) * 2011-10-02 2016-05-04 上海量明科技发展有限公司 Method, terminal and the system of customer end contents transmission window inediting dynamic picture
CN102693094A (en) * 2012-06-12 2012-09-26 上海量明科技发展有限公司 Method, client side and system for adjusting characters in instant messaging
CN102819325A (en) * 2012-07-21 2012-12-12 上海量明科技发展有限公司 Input method and system for obtaining a plurality of character presenting effects
CN103744576B (en) * 2013-11-08 2016-12-07 维沃移动通信有限公司 A kind of method and system at the operation interface for realizing mobile terminal
CN105634909B (en) * 2014-10-29 2020-01-14 腾讯科技(深圳)有限公司 Message display method and message display device
CN104834442A (en) * 2015-03-05 2015-08-12 广州鑫界数字科技有限公司 Information editing method and system based on wechat development mode
CN107508749B (en) * 2017-09-18 2019-11-19 维沃移动通信有限公司 A kind of message method and mobile terminal

Patent Citations (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1691788A (en) * 1998-02-04 2005-11-02 松下电器产业株式会社 Method and device for erasing message from wireless communication device having a paging function
CN101227424A (en) * 2007-12-20 2008-07-23 腾讯科技(深圳)有限公司 Information exhibiting method as well as subscriber terminal
CN104954605A (en) * 2014-03-31 2015-09-30 京瓷办公信息系统株式会社 Image forming apparatus, image forming system, and image forming method
CN104933113A (en) * 2014-06-06 2015-09-23 北京搜狗科技发展有限公司 Expression input method and device based on semantic understanding
CN106575166A (en) * 2014-08-11 2017-04-19 张锐 Methods for processing handwritten inputted characters, splitting and merging data and encoding and decoding processing
CN104598245A (en) * 2015-01-29 2015-05-06 广东欧珀移动通信有限公司 Chatting method and device and mobile terminal
CN105930828A (en) * 2016-04-15 2016-09-07 腾讯科技(深圳)有限公司 Expression classification identification control method and device
CN105892372A (en) * 2016-05-31 2016-08-24 北京光年无限科技有限公司 Intelligent robot expression output method and intelligent robot
CN206311916U (en) * 2016-05-31 2017-07-07 北京光年无限科技有限公司 A kind of intelligent robot of exportable expression
CN106445395A (en) * 2016-12-06 2017-02-22 Tcl集团股份有限公司 Message display method and device
CN106886364A (en) * 2017-02-14 2017-06-23 深圳市金立通信设备有限公司 A kind of text handling method and terminal based on speech recognition
CN107025039A (en) * 2017-04-11 2017-08-08 北京小度信息科技有限公司 Information processing method and device

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
"齐齐玩转QQ" [Having Fun with QQ Together]; 伍裕标 (Wu Yubiao); 《电子与电脑》 [Electronics & Computer]; 2003-07-08 (No. 07); p. 107 *

Also Published As

Publication number Publication date
TW201935187A (en) 2019-09-01
CN110134452A (en) 2019-08-16
WO2019154258A1 (en) 2019-08-15

Similar Documents

Publication Publication Date Title
CN110134452B (en) Object processing method and device
US10254928B1 (en) Contextual card generation and delivery
JP2018166012A (en) Communication user interface systems and methods
US10061493B2 (en) Method and device for creating and editing object-inserted images
US11531646B2 (en) Facilitating generation and utilization of group folders
WO2019242542A1 (en) Screenshot processing method and device
TW201935891A (en) Communication method and device
US11625160B2 (en) Content navigation method and user interface
CN106527864B (en) Interface display method and device
US20240121208A1 (en) Electronic messaging platform that allows users to edit messages after sending
CN108092872B (en) Communication method and device
CN112083978A (en) Event sharing method and device
US10824313B2 (en) Method and device for creating and editing object-inserted images
CN111897607A (en) Application interface loading and interaction method, device and storage medium
CN105204718B (en) Information processing method and electronic equipment
US10283082B1 (en) Differential opacity position indicator
CN110875975A (en) Information processing method and device
CN114995699A (en) Interface interaction method and device
CN113296906B (en) Task configuration method and device
CN112416482B (en) Interface switching method and device
US11054968B2 (en) Method and a device for managing a plurality of messages simultaneously
CN106415626B (en) Group selection initiated from a single item
CN114712849B (en) Cross-platform application operation method and device, electronic equipment and storage medium
JP7045121B1 (en) Program, information processing device, image editing method, and image display method
CN109993815B (en) Method and device for processing picture object in electronic document

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
REG Reference to a national code; ref country code: HK; ref legal event code: DE; ref document number: 40012224
GR01 Patent grant