CN111147356A - Information processing method and device - Google Patents

Information processing method and device

Info

Publication number
CN111147356A
CN111147356A (application CN201911390145.6A)
Authority
CN
China
Prior art keywords
expression
target
expressions
reply
determining
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201911390145.6A
Other languages
Chinese (zh)
Inventor
王雨婷
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shanghai Lianshang Network Technology Co Ltd
Original Assignee
Shanghai Lianshang Network Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shanghai Lianshang Network Technology Co Ltd filed Critical Shanghai Lianshang Network Technology Co Ltd
Priority to CN201911390145.6A
Publication of CN111147356A
Legal status: Pending

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L51/00 User-to-user messaging in packet-switching networks, transmitted according to store-and-forward or real-time protocols, e.g. e-mail
    • H04L51/04 Real-time or near real-time messaging, e.g. instant messaging [IM]
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/50 Information retrieval; Database structures therefor; File system structures therefor of still image data
    • G06F16/51 Indexing; Data structures therefor; Storage structures
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/50 Information retrieval; Database structures therefor; File system structures therefor of still image data
    • G06F16/53 Querying
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 Pattern recognition
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L51/00 User-to-user messaging in packet-switching networks, transmitted according to store-and-forward or real-time protocols, e.g. e-mail
    • H04L51/02 User-to-user messaging in packet-switching networks, transmitted according to store-and-forward or real-time protocols, e.g. e-mail using automatic reactions or user delegation, e.g. automatic replies or chatbot-generated messages
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L51/00 User-to-user messaging in packet-switching networks, transmitted according to store-and-forward or real-time protocols, e.g. e-mail
    • H04L51/07 User-to-user messaging in packet-switching networks, transmitted according to store-and-forward or real-time protocols, e.g. e-mail characterised by the inclusion of specific contents
    • H04L51/10 Multimedia information

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Signal Processing (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Data Mining & Analysis (AREA)
  • General Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Databases & Information Systems (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Evolutionary Computation (AREA)
  • Evolutionary Biology (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Artificial Intelligence (AREA)
  • Multimedia (AREA)
  • Software Systems (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The embodiment of the application discloses an information processing method and device. One embodiment of the method comprises: in response to detecting that a condition for automatic expression reply is met, determining an expression presented in the conversation interface as a reply object; acquiring expressions in the same category as the reply object as target expressions; and displaying at least one of the acquired target expressions in the current conversation interface and sending the at least one target expression or an identifier thereof to the server, so that the server sends the at least one target expression to the session peer of the terminal. The scheme provided by the embodiment of the application can automatically reply with an expression in the conversation interface, sparing the user the process of finding an expression to reply with and making expression replies faster. Meanwhile, because the expression used is in the same category as the reply object, the probability that the reply expression is appropriate is improved.

Description

Information processing method and device
Technical Field
The embodiments of the application relate to the field of computer technology, in particular to internet technology, and specifically to an information processing method and device.
Background
More and more users use conversation applications for social communication. Users often use expressions (emoticons) during a conversation, and in particular reply with an expression after receiving one sent by the other party.
In the related art, when a user wants to send an expression, the user must first decide which expression to send before performing the corresponding operation; that is, the user has to purposefully find and send an expression matching the one he or she intends to send. For example, the user needs to actively open an expression browsing interface and select an expression presented there to send; as another example, the user needs to actively input text to call up expressions matching the text, then select and send one.
Disclosure of Invention
The embodiment of the application provides an information processing method and device.
In a first aspect, an embodiment of the present application provides an information processing method, including: in response to detecting that a condition for automatic expression reply is met, determining an expression presented in the conversation interface as a reply object; acquiring expressions in the same category as the reply object as target expressions; and displaying at least one of the acquired target expressions in the current conversation interface and sending the at least one target expression or an identifier thereof to the server, so that the server sends the at least one target expression to the session peer of the terminal.
In a second aspect, an embodiment of the present application provides an information processing apparatus, including: a determining unit configured to determine an expression presented in the conversation interface as a reply object in response to detecting that a condition for automatic expression reply is met; an acquiring unit configured to acquire expressions in the same category as the reply object as target expressions; and a sending unit configured to display at least one of the acquired target expressions in the current conversation interface and send the at least one target expression or an identifier thereof to the server, so that the server sends the at least one target expression to the session peer of the terminal.
In a third aspect, an embodiment of the present application provides an electronic device, including: one or more processors; and a storage device storing one or more programs which, when executed by the one or more processors, cause the one or more processors to implement a method as in any embodiment of the information processing method.
In a fourth aspect, an embodiment of the present application provides a computer-readable storage medium on which a computer program is stored, the program, when executed by a processor, implementing a method as in any embodiment of the information processing method.
According to the information processing scheme provided by the embodiment of the application, in response to detecting that a condition for automatic expression reply is met, the expression presented in the conversation interface is first determined as the reply object. Then, expressions in the same category as the reply object are acquired as target expressions. Finally, at least one of the acquired target expressions is displayed in the current conversation interface, and the at least one target expression or its identifier is sent to the server, so that the server sends the at least one target expression to the session peer of the terminal. The scheme can automatically reply with an expression in the conversation interface, sparing the user the process of finding an expression to reply with, making expression replies faster, and automatically selecting and sending an expression for the user even when the user does not know which expression to reply with. Meanwhile, because the expression used is in the same category as the reply object, the probability that the reply expression is appropriate is improved.
Drawings
Other features, objects and advantages of the present application will become more apparent upon reading of the following detailed description of non-limiting embodiments thereof, made with reference to the accompanying drawings in which:
FIG. 1 is an exemplary system architecture diagram to which some embodiments of the present application may be applied;
FIG. 2 is a flow diagram of one embodiment of an information processing method according to the present application;
FIG. 3 is a schematic diagram of an application scenario of an information processing method according to the present application;
FIG. 4 is a flow diagram of yet another embodiment of an information processing method according to the present application;
FIG. 5 is a schematic block diagram of one embodiment of an information processing apparatus according to the present application;
FIG. 6 is a schematic block diagram of a computer system suitable for use in implementing an electronic device according to some embodiments of the present application.
Detailed Description
The present application will be described in further detail below with reference to the accompanying drawings and embodiments. It is to be understood that the specific embodiments described herein are merely illustrative of the relevant invention and are not restrictive of it. It should also be noted that, for ease of description, only the portions related to the relevant invention are shown in the drawings.
It should be noted that the embodiments in the present application and the features in the embodiments may be combined with each other without conflict. The present application will be described in detail below with reference to the accompanying drawings and in conjunction with the embodiments.
Fig. 1 shows an exemplary system architecture 100 to which embodiments of the information processing method or information processing apparatus of the present application may be applied.
As shown in fig. 1, the system architecture 100 may include terminal devices 101, 102, 103, a network 104, and a server 105. The network 104 serves as a medium for providing communication links between the terminal devices 101, 102, 103 and the server 105. Network 104 may include various connection types, such as wired, wireless communication links, or fiber optic cables, to name a few.
The user may use the terminal devices 101, 102, 103 to interact with the server 105 via the network 104 to receive or send messages or the like. Various communication client applications, such as a social application, a video application, a live application, an instant messaging tool, a mailbox client, etc., may be installed on the terminal devices 101, 102, 103.
Here, the terminal devices 101, 102, and 103 may be hardware or software. When the terminal devices 101, 102, 103 are hardware, they may be various electronic devices having a display screen, including but not limited to smart phones, tablet computers, e-book readers, laptop portable computers, desktop computers, and the like. When the terminal devices 101, 102, 103 are software, they can be installed in the electronic devices listed above and implemented as multiple pieces of software or software modules (for example, to provide distributed services) or as a single piece of software or software module, which is not specifically limited herein.
The server 105 may be a server providing various services, such as a background server providing support for the terminal devices 101, 102, 103. The background server may analyze and perform other processing on the received data such as the expression, and feed back a processing result (e.g., an expression for reply) to the terminal device.
It should be noted that the information processing method provided in the embodiment of the present application may be executed by the server 105 or the terminal devices 101, 102, and 103, and accordingly, the information processing apparatus may be provided in the server 105 or the terminal devices 101, 102, and 103.
It should be understood that the number of terminal devices, networks, and servers in fig. 1 is merely illustrative. There may be any number of terminal devices, networks, and servers, as desired for implementation.
With continued reference to FIG. 2, a flow 200 of one embodiment of an information processing method according to the present application is shown. The information processing method is applied to the terminal and comprises the following steps:
Step 201, in response to detecting that a condition for automatic expression reply is met, determining an expression presented in the conversation interface as a reply object.
In this embodiment, the execution body of the information processing method (for example, the server or the terminal device shown in fig. 1) may, in response to detecting that a condition for automatic expression reply is met, determine an expression in the conversation interface currently presented by the terminal as a reply object. Specifically, determining the expression as the reply object may be implemented by determining the expression itself or an identifier of the expression. The reply object is an expression sent by the session peer, and the execution body can reply to it automatically.
In practice, the condition for automatic expression reply may take various forms; for example, it may be that the user of the terminal inputs an instruction such as "reply with an expression".
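For illustration only, the terminal-side logic of step 201 could be organized as in the following Python sketch. The message representation, the auto_reply_enabled flag and the function name are assumptions made for the example, not part of the disclosure.

```python
from typing import Optional


def determine_reply_object(messages: list,
                           auto_reply_enabled: bool,
                           user_selected: Optional[dict] = None) -> Optional[dict]:
    """Step 201 (sketch): pick the expression in the conversation interface to reply to.

    `messages` stands for the session's messages, newest last; an expression
    message is represented here as a dict with "type" == "expression".
    """
    if user_selected is not None:
        # An operation instructing an expression reply (e.g. a long press) was detected.
        return user_selected
    if auto_reply_enabled and messages:
        newest = messages[-1]
        if isinstance(newest, dict) and newest.get("type") == "expression":
            return newest
    return None


# Example: the setting information exists locally and a new expression message arrives.
incoming = {"type": "expression", "id": "cat_wave_01", "category": "cute-cat-series"}
print(determine_reply_object([incoming], auto_reply_enabled=True))
```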
Step 202, acquiring expressions in the same category as the reply object as target expressions.
In this embodiment, the execution body may acquire expressions in the same category as the reply object and take them as target expressions. Specifically, the execution body may search a local expression library for expressions of the same category, or may send the reply object to the server so that the server determines expressions of the same category in its database.
In practice, the categories of expressions may be pre-set; for example, if two expressions have similar styles, they may be regarded as being in the same category.
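A minimal sketch of the category lookup in step 202, assuming a flat in-memory expression library; a real terminal could equally send the reply object (or its identifier) to the server and let the server run the same comparison against its database. All field names are illustrative.

```python
def find_target_expressions(library: list, reply_object: dict) -> list:
    """Step 202 (sketch): expressions in the same pre-set category as the reply object."""
    return [e for e in library
            if e.get("category") == reply_object.get("category")
            and e.get("id") != reply_object.get("id")]


library = [
    {"id": "cat_wave_01", "category": "cute-cat-series"},
    {"id": "cat_sleep_02", "category": "cute-cat-series"},
    {"id": "dog_run_07", "category": "dog-series"},
]
reply_object = {"id": "cat_wave_01", "category": "cute-cat-series"}
print(find_target_expressions(library, reply_object))  # only the other cat expression remains
```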
Step 203, displaying at least one of the acquired target expressions in the current conversation interface, and sending the at least one target expression or an identifier thereof to the server, so that the server sends the at least one target expression to the session peer of the terminal.
In this embodiment, the execution body may display some or all of the acquired target expressions and send the displayed expressions to the server, so that the server sends them to the session peer; in this way, the expressions displayed at the terminal can also be displayed at the session peer. Specifically, there may be one or more session peers, that is, the terminals of at least one other user participating in the session to which the conversation interface belongs.
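A hedged sketch of step 203: display the chosen expressions and notify the server so it can forward them to the session peer. The endpoint, payload layout and urllib-based transport are assumptions; the disclosure only requires that the at least one target expression, or its identifier, reaches the server.

```python
import json
import urllib.request


def reply_with_expressions(target_expressions: list, server_url: str) -> None:
    """Step 203 (sketch): show at least one target expression locally and notify the server."""
    chosen = target_expressions[:1]                     # display at least one target expression
    for expression in chosen:
        print(f"[conversation interface] replying with expression {expression['id']}")

    payload = json.dumps({"expression_ids": [e["id"] for e in chosen]}).encode("utf-8")
    request = urllib.request.Request(server_url, data=payload,
                                     headers={"Content-Type": "application/json"})
    with urllib.request.urlopen(request) as response:   # the server forwards to the session peer
        print("server acknowledged:", response.status)
```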
The method provided by the above embodiment of the application can automatically reply with an expression in the conversation interface, sparing the user the process of finding an expression to reply with and making expression replies faster. Meanwhile, because the expression used is in the same category as the reply object, the probability that the reply expression is appropriate is improved.
In some optional implementations of this embodiment, step 201 may include: in response to determining that setting information for automatic expression reply exists locally and that an expression message to be presented in the conversation interface is newly received, determining the expression corresponding to the expression message as the reply object; or, in response to detecting an operation instructing an expression reply, determining the expression indicated by the operation as the reply object.
In these optional implementations, if the execution body detects that the preset setting information exists locally and newly receives an expression message to be presented in the conversation interface, the expression corresponding to the expression message may be used as the reply object. The setting information may indicate that expression messages presented in the conversation interface are to be replied to automatically. An expression message is a session message whose content is an expression, so there is a correspondence between the expression message and the expression. In practice, if the execution body detects an operation instructing an expression reply, the expression indicated by the operation may be determined as the reply object. Specifically, the operation may be any of various user operations, such as clicking or long-pressing the reply object.
Specifically, the local presence of the setting information for automatic expression reply together with a newly received expression message to be presented in the conversation interface, and the operation instructing an expression reply, each constitute a condition for automatic expression reply.
In these implementations, the terminal can automatically reply to a newly received expression without any user operation, making expression replies more convenient and quick; alternatively, automatic expression reply is triggered by a user operation, so that the automatic reply flow better matches the user's wishes.
In some optional application scenarios of these implementations, determining the expression indicated by an operation instructing an expression reply as the reply object may include: in response to detecting a preset operation on an expression in the conversation interface, displaying at least two options including an expression reply option, and in response to detecting an operation on the expression reply option, determining that expression as the reply object; or, in response to detecting an operation on an expression reply button in the conversation interface, determining the expression corresponding to the most recently received expression message to be presented in the conversation interface as the reply object.
In these optional application scenarios, the user may perform a preset operation, such as a long press or a double click, on one of the expressions in the conversation interface. The execution body then displays at least two options, which may include an expression reply option and may further include, for example, a message deletion option. An operation on the expression reply option indicates that the expression receiving the preset operation is to be determined as the reply object.
There may also be an expression reply button in the conversation interface. By operating the expression reply button, the user can instruct the execution body to determine the expression corresponding to the most recently received expression message as the reply object.
In these application scenarios, both the preset operation on an expression in the conversation interface and the operation on the expression reply button in the conversation interface can serve as conditions for automatic expression reply.
These application scenarios allow the user to select the expression to be replied to, or to trigger an automatic expression reply in real time through the expression reply button, which increases the controllability of the automatic reply process and avoids automatically replying to an expression the user does not want to reply to, or replying when the user does not want to reply.
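The long-press and reply-button branches described above could be wired up roughly as follows; the option labels, handler names and pending-state dict are hypothetical.

```python
from typing import Optional


def on_long_press(expression: dict, pending: dict) -> list:
    """Long-press branch: remember the pressed expression and show at least two options."""
    pending["candidate"] = expression
    return ["Reply with expression", "Delete message"]   # illustrative option labels


def on_option_chosen(option: str, pending: dict) -> Optional[dict]:
    """If the expression reply option is chosen, the long-pressed expression is the reply object."""
    if option == "Reply with expression":
        return pending.get("candidate")
    return None


def on_reply_button(newest_expression_message: dict) -> dict:
    """Reply-button branch: the most recently received expression message supplies the reply object."""
    return newest_expression_message


# Example flow: long-press an expression, then choose the expression reply option.
pending_state: dict = {}
options = on_long_press({"id": "cat_wave_01", "category": "cute-cat-series"}, pending_state)
print(on_option_chosen(options[0], pending_state))
```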
In some optional implementations of this embodiment, step 202 may include: searching a local database or a cloud database, according to related information of the reply object, for expressions matching the related information of the reply object, wherein the related information includes at least one of the following: text label, series, style, author, and collection of works to which the expression belongs.
In these optional implementations, the execution body may search a local or cloud database for expressions matching the related information of the reply object. That is, expressions of the same category may be expressions whose related information matches. Text labels match if they are the same or contain a common field, such as the labels "sleep" and "go to sleep". Two series match if they are the same. Two styles match if they are the same or similar; similar styles may be pre-set, such as a cute style and a playful style. Authors match if they are the same author. Collections match if the two expressions belong to the same collection of works.
These implementations can determine, according to the matching relationships between the related information, an automatically replied expression that is well matched to the reply object, making the automatic reply more accurate.
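A sketch of how the related-information matching could be expressed, assuming each expression record carries text_label, series, style, author and collection fields and that similar style pairs are pre-set; none of these field names come from the patent text.

```python
def related_info_matches(a: dict, b: dict) -> bool:
    """Two expressions match if any one piece of related information matches."""
    similar_styles = {("cute", "playful"), ("playful", "cute")}   # illustrative pre-set pairs

    # Text labels match when they are identical or share a common field.
    if a.get("text_label") and b.get("text_label"):
        if a["text_label"] in b["text_label"] or b["text_label"] in a["text_label"]:
            return True
    # Series, author and collection match when they are identical.
    for key in ("series", "author", "collection"):
        if a.get(key) and a.get(key) == b.get(key):
            return True
    # Styles match when identical or listed as similar.
    style_a, style_b = a.get("style"), b.get("style")
    if style_a and style_b and (style_a == style_b or (style_a, style_b) in similar_styles):
        return True
    return False


a = {"text_label": "go to sleep", "style": "cute"}
b = {"text_label": "sleep", "style": "playful"}
print(related_info_matches(a, b))   # True: the text labels share the field "sleep"
```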
In some optional implementations of this embodiment, displaying at least one of the acquired target expressions in the current conversation interface in step 203 may include: in response to the number of acquired target expressions being at least two, determining and displaying at least one target expression from the acquired target expressions based on historical behavior data of the logged-in user, wherein the historical behavior data includes at least one of the following: historical expression sending data and historical expression download data.
In these optional implementations, when more than one target expression is acquired, the execution body may select and display at least one of them based on the historical behavior data of the terminal's logged-in user. The historical expression sending data may include the number of times and/or the frequency with which the user has used each expression, or may be the expressions the user has used, or has used more often. The historical expression download data may include expressions downloaded locally by the user and/or expressions the user once downloaded (and has possibly since deleted).
In practice, the execution body may determine the at least one target expression in various ways. For example, it may take, from the at least two target expressions, those that appear in the historical expression sending data and/or the historical expression download data as the at least one target expression. Alternatively, it may take, from the at least two target expressions, those whose related information matches the historical expression sending data and/or the historical expression download data. In addition, the execution body may determine the at least one target expression from the historical expression sending data and/or download data in descending order of the number of times or frequency of use.
Through the user's expression-related historical behavior data, these implementations can accurately determine expressions that are likely to match the user's preferences, so that the automatically replied expression better matches the user's wishes.
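One possible way to narrow the candidates using the logged-in user's history, assuming the sending history is a list of expression identifiers (so counts give the number of times each was sent) and the download history is a set of identifiers; both representations are illustrative.

```python
from collections import Counter


def pick_by_history(targets: list, sent_history: list, downloaded_ids: set) -> list:
    """Select at least one target expression from at least two, using the user's history."""
    send_counts = Counter(sent_history)

    # Prefer targets the user has already sent, ordered by how often they were sent.
    already_sent = sorted((t for t in targets if send_counts[t["id"]] > 0),
                          key=lambda t: send_counts[t["id"]], reverse=True)
    if already_sent:
        return already_sent[:1]          # e.g. the target expression sent the most times

    # Otherwise fall back to targets the user has at least downloaded.
    downloaded = [t for t in targets if t["id"] in downloaded_ids]
    return downloaded[:1] if downloaded else targets[:1]


targets = [{"id": "cat_sleep_02"}, {"id": "cat_laugh_03"}]
print(pick_by_history(targets, ["cat_laugh_03", "cat_laugh_03"], set()))  # the most-sent target
```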
In some optional application scenarios of these implementations, determining and displaying at least one target expression from the acquired target expressions based on the logged-in user's expression-related historical behavior data may include: determining the number of times the logged-in user has sent each acquired target expression; and determining the target expression with the largest number of sends as the at least one target expression and displaying it.
In these optional application scenarios, the execution body may determine, among the at least two acquired target expressions, the target expression that has been sent the most times and take it as the at least one target expression.
According to the number of times the user has sent each expression, these application scenarios can accurately determine the user's favorite expression among all target expressions, so that the automatically replied expression better matches the user's wishes.
In some optional implementations of this embodiment, before step 203, the method may further include: displaying the found target expressions; and, in response to detecting a selection operation on the found target expressions, taking the target expression indicated by the selection operation as the at least one target expression.
In these optional implementations, the execution body lets the user select from the at least two target expressions, thereby increasing the controllability of the automatic reply process. Meanwhile, since the user can choose an expression to his or her liking, the automatically replied expression better matches the user's wishes.
With continued reference to fig. 3, fig. 3 is a schematic diagram of an application scenario of the information processing method according to the present embodiment. In the application scenario of fig. 3, the execution body 301 may, in response to detecting that the condition 302 for automatic expression reply is met, determine an expression 303 presented in the conversation interface as the reply object. The execution body 301 acquires expressions in the same category as the reply object as target expressions 304. The execution body 301 displays at least one target expression 305 among the acquired target expressions in the current conversation interface and sends the at least one target expression or its identifier to the server, so that the server sends it to the session peer of the terminal.
With further reference to FIG. 4, a flow 400 of yet another embodiment of an information processing method is shown. The flow 400 of the information processing method, applied to the server, includes the following steps:
Step 401, receiving, from the target terminal, an expression or an identifier thereof serving as the reply object, wherein the expression serving as the reply object is determined by the target terminal in response to detecting that a condition for automatic expression reply is met.
Step 402, searching the database for expressions in the same category as the reply object as target expressions.
Step 403, returning at least one of the found target expressions to the target terminal.
In this embodiment, the execution body may receive the expression, or the identifier of the expression, sent by the terminal. Specifically, the terminal determines the expression or its identifier as the reply object and sends it to the server when it detects that the condition for automatic expression reply is met. It should be noted that, unless otherwise specified, the technical means and main implementations of the same or similar steps performed at the terminal and at the server may be the same.
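A server-side sketch covering steps 401 to 403, assuming the database is simply a flat list of expression records; a production server would query a real database by identifier. The function and field names are assumptions.

```python
def handle_reply_request(reply_object_id: str, database: list) -> list:
    """Steps 401-403 (sketch): look up same-category expressions and return at least one."""
    reply_object = next((e for e in database if e["id"] == reply_object_id), None)
    if reply_object is None:
        return []
    targets = [e for e in database
               if e["category"] == reply_object["category"] and e["id"] != reply_object_id]
    return targets[:1]   # at least one of the found target expressions goes back to the terminal


database = [
    {"id": "cat_wave_01", "category": "cute-cat-series"},
    {"id": "cat_sleep_02", "category": "cute-cat-series"},
    {"id": "dog_run_07", "category": "dog-series"},
]
print(handle_reply_request("cat_wave_01", database))
```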
The method provided by the above embodiment of the application can automatically reply with an expression in the conversation interface, sparing the user the process of finding an expression to reply with and making the reply process quicker. Meanwhile, because the expression used is in the same category as the reply object, the probability that the reply expression is appropriate is improved.
In some optional implementations of this embodiment, step 402 may include: searching the cloud database, according to related information of the reply object, for expressions matching the related information of the reply object, wherein the related information includes at least one of the following: text label, series, style, author, and collection of works to which the expression belongs.
These implementations can determine, according to the matching relationships between the related information, an automatically replied expression that is well matched to the reply object, making the automatic reply more accurate.
In some optional implementations of this embodiment, step 403 may include: in response to the number of found target expressions being at least two, determining at least one target expression from the found target expressions based on expression-related historical behavior data of the logged-in user of the target terminal, wherein the historical behavior data includes at least one of the following: historical expression sending data and historical expression download data; or, in response to the number of found target expressions being at least two, determining at least one target expression from the found target expressions based on the historical expression sending data of a plurality of users.
In these optional implementations, the execution body may determine the at least one target expression based on the historical expression sending data of a plurality of users; specifically, the plurality of users may be a large number of users. Various approaches may be used to determine the at least one target expression from the large number of users' historical expression sending data. For example, the execution body may take the expression sent most frequently by those users within a preset historical period (such as the past week) as the at least one target expression.
Through the users' expression-related historical behavior data, these implementations can accurately determine expressions that are likely to match the user's preferences, so that the automatically replied expression better matches the user's wishes. In addition, when the logged-in user has little historical data, a more appropriate automatic expression reply can be made based on the historical data of a large number of users.
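A sketch of the fallback to many users' sending data, assuming that data is available as a flat list of expression identifiers sent within a preset historical period (for example, the past week); the representation is an assumption made for the example.

```python
from collections import Counter


def pick_by_crowd(targets: list, crowd_sends: list) -> list:
    """Rank target expressions by how often a large number of users have sent them."""
    counts = Counter(crowd_sends)
    ranked = sorted(targets, key=lambda t: counts[t["id"]], reverse=True)
    return ranked[:1]    # the most frequently sent target expression(s)


targets = [{"id": "cat_sleep_02"}, {"id": "cat_laugh_03"}]
print(pick_by_crowd(targets, ["cat_sleep_02", "cat_sleep_02", "cat_laugh_03"]))
```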
In some optional application scenarios of these implementations, determining at least one target expression from the found target expressions based on the expression-related historical behavior data of the logged-in user of the target terminal may include: determining the number of times the logged-in user has sent each found target expression; and determining the target expression with the largest number of sends as the at least one target expression.
According to the number of times the user has sent each expression, these application scenarios can accurately determine the user's favorite expression among all target expressions, so that the automatically replied expression better matches the user's wishes.
In some optional application scenarios of these implementations, determining at least one target expression from the found target expressions based on the historical expression sending data of a plurality of users may include: determining the at least one target expression from the found target expressions in descending order of the number of times they have been sent in the historical expression sending data of the plurality of users; or determining the at least one target expression from the found target expressions in descending order of the number of times they have been sent in reply to the reply object in the historical expression sending data of the plurality of users; or determining the at least one target expression from the found target expressions in descending order of the number of times they have been sent in reply to expressions of the same type as the reply object in the historical expression sending data of the plurality of users.
In these optional application scenarios, the execution body may, according to the historical expression sending data of a plurality of users, determine the target expressions sent more often by users as the at least one target expression. In addition, the execution body may determine the target expressions more often used by the plurality of users as replies, either among replies to the reply object itself or among replies to expressions of the same type as the reply object. The number of frequently sent target expressions determined by these approaches may be one or more; that is, the determined target expressions may be sorted in descending order of the number of sends, and at least one target expression is taken from the frequently sent end.
These application scenarios can determine an appropriate reply expression using expressions frequently sent, or frequently used as replies, by a large number of users.
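The reply-context variants could be sketched as follows, assuming the server records, for a large number of users, which expression was sent in reply to which expression; the tuple layout is purely illustrative.

```python
from collections import Counter


def pick_by_crowd_replies(targets: list, crowd_replies: list, reply_object: dict) -> list:
    """Rank targets by how often many users replied with them to the reply object or its type.

    `crowd_replies` is assumed to be a list of
    (replied_to_id, replied_to_category, sent_expression_id) tuples.
    """
    counts = Counter(
        sent_id for replied_to_id, replied_to_category, sent_id in crowd_replies
        if replied_to_id == reply_object["id"]                  # replies to this exact expression
        or replied_to_category == reply_object["category"]     # or to same-type expressions
    )
    ranked = sorted(targets, key=lambda t: counts[t["id"]], reverse=True)
    return ranked[:1]
```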
In some optional implementations of this embodiment, step 403 may include: sending the at least one target expression to the target terminal and to the session peer of the target terminal, wherein the session peer is a peer terminal that communicates with the target terminal in real time through the conversation interface; or sending the found target expressions to the target terminal and receiving at least one target expression in return, wherein the at least one target expression is determined by the target terminal, from the found target expressions, based on a selection operation; and sending the at least one target expression to the session peer.
In these optional implementations, the execution body may automatically determine the at least one target expression and send it both to the target terminal and to the session peer of the target terminal. Alternatively, after finding the target expressions, the execution body may feed them back to the target terminal so that the target terminal can display them to the user and the user can select at least one of them. After receiving the at least one target expression selected by the user, the server then sends it to the session peer.
These implementations allow the user to select from the found target expressions, which increases the controllability of the automatic reply process. Meanwhile, since the user can choose an expression to his or her liking, the automatically replied expression better matches the user's wishes.
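Finally, the two delivery branches might be organized as below; send_to_terminal, send_to_peer and wait_for_selection are placeholder callables standing in for the server's messaging layer and are not named in the disclosure.

```python
def deliver_reply(targets: list, send_to_terminal, send_to_peer, wait_for_selection=None) -> None:
    """Forward the reply expression(s) either directly or after a user selection round-trip."""
    if wait_for_selection is None:
        # Branch 1: the server picked at least one target expression itself and forwards it
        # both to the target terminal and to the session peer(s).
        chosen = targets[:1]
        send_to_terminal(chosen)
        send_to_peer(chosen)
    else:
        # Branch 2: all found targets go back to the terminal, the user selects,
        # and only the returned selection is forwarded to the session peer(s).
        send_to_terminal(targets)
        chosen = wait_for_selection()
        send_to_peer(chosen)


deliver_reply(
    targets=[{"id": "cat_sleep_02"}],
    send_to_terminal=lambda exprs: print("to terminal:", exprs),
    send_to_peer=lambda exprs: print("to session peer:", exprs),
)
```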
With further reference to fig. 5, as an implementation of the method shown in the above figures, the present application provides an embodiment of an information processing apparatus. This apparatus embodiment corresponds to the method embodiment shown in fig. 2 and, in addition to the features described below, may include the same or corresponding features or effects as that method embodiment. The apparatus can be applied to various electronic devices.
As shown in fig. 5, the information processing apparatus 500 of the present embodiment includes a determining unit 501, an acquiring unit 502 and a sending unit 503. The determining unit 501 is configured to determine an expression presented in the conversation interface as a reply object in response to detecting that a condition for automatic expression reply is met; the acquiring unit 502 is configured to acquire expressions in the same category as the reply object as target expressions; the sending unit 503 is configured to display at least one of the acquired target expressions in the current conversation interface and send the at least one target expression or an identifier thereof to the server, so that the server sends the at least one target expression to the session peer of the terminal.
In some embodiments, the determining unit 501 of the information processing apparatus 500 determines, in response to detecting that the condition for automatic expression reply is met, an expression in the conversation interface currently presented by the terminal as the reply object. Specifically, determining the expression as the reply object may be implemented by determining the expression itself or an identifier of the expression. The reply object is an expression sent by the session peer, and the apparatus can reply to it automatically.
In some embodiments, the acquiring unit 502 may acquire expressions in the same category as the reply object and take them as target expressions. Specifically, it may search a local expression library for expressions of the same category, or may send the reply object to the server so that the server determines expressions of the same category in its database.
In some embodiments, the sending unit 503 may display some or all of the acquired target expressions and send the displayed expressions to the server, so that the server sends them to the session peer; in this way, the expressions displayed at the terminal can also be displayed at the session peer.
The apparatus provided by the above embodiment of the application can automatically reply with an expression in the conversation interface, sparing the user the process of finding an expression to reply with and making expression replies faster. Meanwhile, because the expression used is in the same category as the reply object, the probability that the reply expression is appropriate is improved.
As shown in fig. 6, electronic device 600 may include a processing means (e.g., central processing unit, graphics processor, etc.) 601 that may perform various appropriate actions and processes in accordance with a program stored in a Read Only Memory (ROM) 602 or a program loaded from a storage means 608 into a Random Access Memory (RAM) 603. In the RAM 603, various programs and data necessary for the operation of the electronic device 600 are also stored. The processing device 601, the ROM 602, and the RAM 603 are connected to each other via a bus 604. An input/output (I/O) interface 605 is also connected to bus 604.
Generally, the following devices may be connected to the I/O interface 605: input devices 606 including, for example, a touch screen, touch pad, keyboard, mouse, camera, microphone, accelerometer, gyroscope, etc.; output devices 607 including, for example, a Liquid Crystal Display (LCD), a speaker, a vibrator, and the like; storage 608 including, for example, tape, hard disk, etc.; and a communication device 609. The communication means 609 may allow the electronic device 600 to communicate with other devices wirelessly or by wire to exchange data. While fig. 6 illustrates an electronic device 600 having various means, it is to be understood that not all illustrated means are required to be implemented or provided. More or fewer devices may alternatively be implemented or provided. Each block shown in fig. 6 may represent one device or may represent multiple devices as desired.
In particular, according to an embodiment of the present disclosure, the processes described above with reference to the flowcharts may be implemented as computer software programs. For example, embodiments of the present disclosure include a computer program product comprising a computer program embodied on a computer readable medium, the computer program comprising program code for performing the method illustrated in the flow chart. In such an embodiment, the computer program may be downloaded and installed from a network via the communication means 609, or may be installed from the storage means 608, or may be installed from the ROM 602. The computer program, when executed by the processing device 601, performs the above-described functions defined in the methods of embodiments of the present disclosure. It should be noted that the computer readable medium of the embodiments of the present disclosure may be a computer readable signal medium or a computer readable storage medium or any combination of the two. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples of the computer readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In embodiments of the disclosure, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. In embodiments of the present disclosure, however, a computer readable signal medium may comprise a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: electrical wires, optical cables, RF (radio frequency), etc., or any suitable combination of the foregoing.
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present application. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The units described in the embodiments of the present application may be implemented by software or hardware. The described units may also be provided in a processor and may, for example, be described as: a processor including a determining unit, an acquiring unit and a sending unit. The names of these units do not in some cases constitute a limitation on the units themselves; for example, the determining unit may also be described as "a unit that determines an expression presented in the conversation interface as a reply object in response to detecting that a condition for automatic expression reply is met".
As another aspect, the present application also provides a computer-readable medium, which may be contained in the apparatus described in the above embodiments, or may exist separately without being assembled into the apparatus. The computer-readable medium carries one or more programs which, when executed by the apparatus, cause the apparatus to: in response to detecting that a condition for automatic expression reply is met, determine an expression presented in the conversation interface as a reply object; acquire expressions in the same category as the reply object as target expressions; and display at least one of the acquired target expressions in the current conversation interface and send the at least one target expression or an identifier thereof to the server, so that the server sends the at least one target expression to the session peer of the terminal.
The above description is only a preferred embodiment of the application and an illustration of the technical principles employed. It will be appreciated by those skilled in the art that the scope of the invention disclosed herein is not limited to technical solutions formed by the particular combination of the above technical features, but also covers other technical solutions formed by any combination of the above features or their equivalents without departing from the inventive concept, for example, technical solutions formed by replacing the above features with (but not limited to) features having similar functions disclosed in the present application.

Claims (15)

1. An information processing method, applied to a terminal, the method comprising:
in response to detecting that a condition for automatic expression reply is met, determining an expression presented in a conversation interface as a reply object;
acquiring expressions in the same category as the reply object as target expressions; and
displaying at least one of the acquired target expressions in the current conversation interface, and sending the at least one target expression or an identifier thereof to a server, so that the server sends the at least one target expression to a session peer of the terminal.
2. The method of claim 1, wherein the determining, in response to detecting that a condition for automatic expression reply is met, an expression presented in the conversation interface as a reply object comprises:
in response to determining that setting information for automatic expression reply exists locally and that an expression message to be presented in the conversation interface is newly received, determining the expression corresponding to the expression message as the reply object; or
in response to detecting an operation instructing an expression reply, determining the expression indicated by the operation as the reply object.
3. The method of claim 2, wherein the determining, in response to detecting an operation instructing an expression reply, the expression indicated by the operation as the reply object comprises:
in response to detecting a preset operation on an expression in the conversation interface, displaying at least two options including an expression reply option; and in response to detecting an operation on the expression reply option, determining that expression as the reply object;
or
in response to detecting an operation on an expression reply button in the conversation interface, determining the expression corresponding to the most recently received expression message to be presented in the conversation interface as the reply object.
4. The method of claim 1, wherein the acquiring expressions in the same category as the reply object as target expressions comprises:
searching a local database or a cloud database, according to related information of the reply object, for expressions matching the related information of the reply object, wherein the related information comprises at least one of the following: text label, series, style, author, and collection of works to which the expression belongs.
5. The method of claim 1, wherein the displaying at least one of the acquired target expressions in the current conversation interface comprises:
in response to the number of acquired target expressions being at least two, determining and displaying at least one target expression from the acquired target expressions based on historical behavior data of a logged-in user, wherein the historical behavior data comprises at least one of the following: historical expression sending data and historical expression download data.
6. The method of claim 5, wherein the determining and displaying at least one target expression from the acquired target expressions based on the expression-related historical behavior data of the logged-in user comprises:
determining the number of times the logged-in user has sent each acquired target expression; and
determining the target expression with the largest number of sends as the at least one target expression and displaying it.
7. The method of claim 1, wherein, before displaying at least one of the acquired target expressions in the current conversation interface, the method further comprises:
displaying the found target expressions; and
in response to detecting a selection operation on the found target expressions, taking the target expression indicated by the selection operation as the at least one target expression.
8. An information processing method, applied to a server, the method comprising:
receiving, from a target terminal, an expression or an identifier thereof serving as a reply object, wherein the expression serving as the reply object is determined by the target terminal in response to detecting that a condition for automatic expression reply is met;
searching a database for expressions in the same category as the reply object as target expressions; and
returning at least one of the found target expressions to the target terminal.
9. The method of claim 8, wherein the searching a database for expressions in the same category as the reply object as target expressions comprises:
searching a cloud database, according to related information of the reply object, for expressions matching the related information of the reply object, wherein the related information comprises at least one of the following: text label, series, style, author, and collection of works to which the expression belongs.
10. The method of claim 8, wherein the returning at least one of the found target expressions to the target terminal comprises:
in response to the number of found target expressions being at least two, determining at least one target expression from the found target expressions based on expression-related historical behavior data of a logged-in user of the target terminal, wherein the historical behavior data comprises at least one of the following: historical expression sending data and historical expression download data;
or
in response to the number of found target expressions being at least two, determining at least one target expression from the found target expressions based on historical expression sending data of a plurality of users.
11. The method of claim 10, wherein the determining at least one target expression from the found target expressions based on the expression-related historical behavior data of the logged-in user of the target terminal comprises:
determining the number of times the logged-in user has sent each found target expression; and
determining the target expression with the largest number of sends as the at least one target expression.
12. The method of claim 10, wherein the determining at least one target expression from the found target expressions based on the historical expression sending data of the plurality of users comprises:
determining the at least one target expression from the found target expressions in descending order of the number of times they have been sent in the historical expression sending data of the plurality of users;
or
determining the at least one target expression from the found target expressions in descending order of the number of times they have been sent in reply to the reply object in the historical expression sending data of the plurality of users;
or
determining the at least one target expression from the found target expressions in descending order of the number of times they have been sent in reply to expressions of the same type as the reply object in the historical expression sending data of the plurality of users.
13. The method of claim 8, wherein the returning at least one of the found target expressions to the target terminal comprises:
sending the at least one target expression to the target terminal and to a session peer of the target terminal, wherein the session peer comprises a peer terminal that communicates with the target terminal in real time through the conversation interface;
or
sending the found target expressions to the target terminal, and receiving at least one target expression in return, wherein the at least one target expression is determined by the target terminal from the found target expressions based on a selection operation; and sending the at least one target expression to the session peer.
14. An electronic device, comprising:
one or more processors;
a storage device for storing one or more programs which, when executed by the one or more processors, cause the one or more processors to implement the method of any one of claims 1-13.
15. A computer-readable storage medium on which a computer program is stored, the program, when executed by a processor, implementing the method of any one of claims 1-13.
CN201911390145.6A 2019-12-30 2019-12-30 Information processing method and device Pending CN111147356A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911390145.6A CN111147356A (en) 2019-12-30 2019-12-30 Information processing method and device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201911390145.6A CN111147356A (en) 2019-12-30 2019-12-30 Information processing method and device

Publications (1)

Publication Number Publication Date
CN111147356A true CN111147356A (en) 2020-05-12

Family

ID=70521665

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911390145.6A Pending CN111147356A (en) 2019-12-30 2019-12-30 Information processing method and device

Country Status (1)

Country Link
CN (1) CN111147356A (en)

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20130084923A1 (en) * 2011-10-04 2013-04-04 Joseph Schuman Systems and methods for users to receive and/or reply to information affiliated with communication attempts while remaining substantially disconnected from mobile electronic device(s) and/or networks
CN107707452A (en) * 2017-09-12 2018-02-16 阿里巴巴集团控股有限公司 For the information displaying method, device and electronic equipment of expression
US20190199663A1 (en) * 2017-12-21 2019-06-27 International Business Machines Corporation Chat message processing
CN109831572A (en) * 2018-12-14 2019-05-31 深圳壹账通智能科技有限公司 Chat picture control method, device, computer equipment and storage medium
CN109783677A (en) * 2019-01-21 2019-05-21 三角兽(北京)科技有限公司 Answering method, return mechanism, electronic equipment and computer readable storage medium
CN109871165A (en) * 2019-02-01 2019-06-11 天津字节跳动科技有限公司 Display methods, device, terminal device and the server that expression is responded
CN110263197A (en) * 2019-06-12 2019-09-20 腾讯科技(深圳)有限公司 A kind of image search method, device, computer equipment and storage medium
CN110427513A (en) * 2019-07-10 2019-11-08 百度在线网络技术(北京)有限公司 Picture recommendation method and system


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication (application publication date: 20200512)