CN114995924A - Information display processing method, device, terminal and storage medium - Google Patents

Information display processing method, device, terminal and storage medium

Info

Publication number
CN114995924A
CN114995924A
Authority
CN
China
Prior art keywords
session
display element
message
conversation
interactive
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202110226477.1A
Other languages
Chinese (zh)
Inventor
谭敏
高晓宇
王斌
龙辉
孙迟
赵冲
段恺
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Tencent Technology Shenzhen Co Ltd
Original Assignee
Tencent Technology Shenzhen Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tencent Technology Shenzhen Co Ltd
Priority: CN202110226477.1A
Publication: CN114995924A
Legal status: Pending


Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 9/451: Execution arrangements for user interfaces
    • G06F 3/0481: Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F 3/0484: Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F 3/0488: Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser, e.g. input of commands through traced gestures

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Software Systems (AREA)
  • Information Transfer Between Computers (AREA)

Abstract

The present application discloses an information display processing method, apparatus, terminal and storage medium, belonging to the field of human-computer interaction. The method includes the following steps: displaying a chat session interface; displaying a session message in the chat session interface in response to a first user account sending the session message or in response to receiving the session message from a second user account; and, in response to trigger information being present in the session message, displaying an interactive display element corresponding to the trigger information on a session display element in the chat session interface, where the session display element is a display element associated with the session message. By displaying the interactive display element corresponding to the trigger information on the session display element, the trigger information and the session display element are associated through the interactive display element; when the session display element changes in the chat session interface, the interactive display element corresponding to the trigger information changes with it, improving the diversity of the interactive display elements corresponding to the trigger information.

Description

Information display processing method, device, terminal and storage medium
Technical Field
The present application relates to the field of communications technologies, and in particular, to an information display processing method, an information display processing apparatus, a terminal, and a storage medium.
Background
A user chats with friends through an instant messaging application installed on a terminal.
The user and a friend hold a chat session in a chat session window. The chat session window includes an input control, in which the user can edit a session message and send it to the friend through the instant messaging application. Session message types include text, pictures, videos, expressions, files, virtual item packages, cards and coupons, user business cards, and the like. As a mode of emotional expression, expressions can convey the user's mood vividly.
In the related art, when a user interacts with a friend through an expression, the special-effect animation corresponding to the expression is preset. That is, whenever the user triggers the expression, the special-effect animation corresponding to it is exactly the same, so the display style is fixed and monotonous.
Disclosure of Invention
Embodiments of the present application provide an information display processing method, apparatus, terminal, and storage medium. An interactive display element corresponding to trigger information is displayed on a session display element, so that the trigger information and the session display element are associated through the interactive display element. When the session display element changes in the chat session interface, the interactive display element corresponding to the trigger information changes as well, improving the diversity of the interactive display elements corresponding to the trigger information. The technical solution includes the following:
according to an aspect of the present application, there is provided an information display processing method including the steps of:
displaying a chat session interface;
displaying a session message in the chat session interface in response to a first user account sending the session message or in response to receiving the session message from a second user account;
and, in response to trigger information being present in the session message, displaying an interactive display element corresponding to the trigger information on a session display element in the chat session interface, where the session display element is a display element associated with the session message.
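The three claimed steps above can be sketched in code. The following is a minimal, hypothetical Python sketch; the names `find_trigger` and `render_session`, and the dictionary-based model of a "display element", are illustrative assumptions, not part of the claimed implementation:

```python
from typing import Optional

def find_trigger(message: str, keywords: list) -> Optional[str]:
    """Return the first configured keyword found in the session message, if any."""
    for kw in keywords:
        if kw in message:
            return kw
    return None

def render_session(message: str, keywords: list) -> dict:
    """Display the session message; when trigger information is present,
    attach an interactive display element to the session display element
    (modeled here as the message bubble)."""
    element = {"bubble": message, "interactive_element": None}
    trigger = find_trigger(message, keywords)
    if trigger is not None:
        # Associate the trigger information with the session display element
        # through the interactive display element.
        element["interactive_element"] = "effect:" + trigger
    return element
```

For instance, `render_session("happy birthday!", ["happy birthday"])` would yield a bubble with an attached `effect:happy birthday` element, while a message without trigger information yields a plain bubble.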
According to another aspect of the present application, there is provided an information display processing apparatus including:
the display module is used for displaying a chat session interface;
the display module is used for responding to a session message sent by a first user account or responding to the session message received from a second user account, and displaying the session message in the chat session interface;
the display module is configured to, in response to the presence of the trigger information in the session message, display an interactive display element corresponding to the trigger information on a session display element in the chat session interface, where the session display element is a display element associated with the session message.
According to another aspect of the present application, there is provided a terminal including: a processor and a memory having stored therein at least one instruction, at least one program, set of codes, or set of instructions that is loaded and executed by the processor to implement the information display processing method as described above.
According to another aspect of the present application, there is provided a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements the information display processing method as described above.
According to another aspect of the application, a computer program product or computer program is provided, comprising computer instructions stored in a computer readable storage medium. The processor of the computer device reads the computer instructions from the computer-readable storage medium, and executes the computer instructions, so that the computer device executes the information display processing method as described above.
The beneficial effects brought by the technical scheme provided by the embodiment of the application at least comprise the following effects:
When the trigger information is displayed in the chat session interface, the interactive display element corresponding to the trigger information is displayed on the session display element, so that the trigger information and the session display element are associated through the interactive display element.
Drawings
To illustrate the technical solutions in the embodiments of the present application more clearly, the drawings needed in the description of the embodiments are briefly introduced below. Obviously, the drawings described below are only some embodiments of the present application, and those skilled in the art can derive other drawings from them without creative effort.
FIG. 1 is a block diagram of a computer system provided in an exemplary embodiment of the present application;
FIG. 2 is a flow chart of an information display processing method provided by an exemplary embodiment of the present application;
FIG. 3 is a diagram of a chat session interface provided by an exemplary embodiment of the present application;
FIG. 4 is a diagram of a chat session interface provided by another exemplary embodiment of the present application;
FIG. 5 is a flow chart of an information display processing method provided by another exemplary embodiment of the present application;
FIG. 6 is a diagram of a chat session interface provided by another exemplary embodiment of the present application;
FIG. 7 is a flow chart of an information display processing method provided by another exemplary embodiment of the present application;
FIG. 8 is a block diagram of a computer system provided in another example embodiment of the present application;
FIG. 9 is a flow chart of an information display processing method provided by another exemplary embodiment of the present application;
FIG. 10 is a diagram of a chat session interface provided by another exemplary embodiment of the present application;
FIG. 11 is a flowchart of an information display processing method provided by another exemplary embodiment of the present application;
FIG. 12 is a diagram of a chat session interface provided by another exemplary embodiment of the present application;
FIG. 13 is a diagram of a chat session interface provided by an exemplary embodiment of the application;
FIG. 14 is a flowchart of an information display processing method provided by another exemplary embodiment of the present application;
FIG. 15 is a block diagram of an information display processing device provided in an exemplary embodiment of the present application;
FIG. 16 is a block diagram of a computer device provided in an exemplary embodiment of the present application.
Detailed Description
To make the objects, technical solutions and advantages of the present application more clear, the following detailed description of the embodiments of the present application will be made with reference to the accompanying drawings.
First, terms referred to in the embodiments of the present application are described:
Instant messaging (IM) application: an application used by two or more people to transmit text messages, voice messages, audio calls, video calls, and files in real time over a network. Instant messaging includes two types of architectures, the C/S architecture and the B/S architecture, where the C/S architecture refers to the Client/Server architecture and the B/S architecture refers to the Browser/Server architecture. Under the C/S architecture, a user needs to install a client of the instant messaging application on a terminal to communicate online with other users in real time; under the B/S architecture, a user does not need to install a client and communicates online with other users in real time through a browser.
FIG. 1 illustrates a schematic diagram of a computer system provided in an exemplary embodiment of the present application. The computer system may be implemented as a system having an emoticon display function, and the computer system 100 includes: first terminal 110, server 120, second terminal 130.
The first terminal 110 is installed and operated with a first client 111 supporting instant messaging, and the first client 111 may be an application or a web client having an instant messaging function. When the first terminal 110 runs the first client 111, a user interface of the first client 111 is displayed on a screen of the first terminal 110. The application program may be any one of an instant messaging program, a social-type program, a voice call program, a conference program, a web community program, a payment program, a shopping program, a dating program, a marriage program, a video-type (including short video) application program, a live broadcast application program, and a music-type application program. In the embodiment, the application is exemplified as an instant messenger. The first terminal 110 is a terminal used by the first user 112, and the first client 111 logs in a first user account of the first user 112.
The second terminal 130 is installed and operated with a second client 131 supporting instant messaging, and the second client 131 may be an application or a web client having an instant messaging function. When the second terminal 130 runs the second client 131, a user interface of the second client 131 is displayed on the screen of the second terminal 130. The application program may be any one of an instant messaging program, a social program, a voice call program, a conference program, a web community program, a payment program, a shopping program, a friend-making program, a marriage program, a video (including short videos) application program, a live broadcast application program, and a music application program. In the embodiment, the application is exemplified as an instant messenger. The second terminal 130 is a terminal used by the second user 132, and the second client 131 has the second user account of the second user 132 registered thereon.
In some embodiments, the first client 111 and the second client 131 are clients of the same application, e.g., clients of the same instant messaging application. The first client 111 and the second client 131 have a messaging function; illustratively, a message may be a text message, a picture message, a voice message, an emoticon message, a transfer message, or the like. In one example, the first client 111 may send an emoticon and the second client 131 may receive it; in another example, the second client 131 may send an emoticon and the first client 111 may receive it. A first user account is logged in to the first client 111, and a second user account is logged in to the second client 131. In some embodiments, the first user account and the second user account have a friend relationship, that is, the first user account is in the friend list of the second client 131 and the second user account is in the friend list of the first client 111; in other embodiments, the first user account and the second user account do not have a friend relationship, i.e., the first user account is not in the friend list of the second client 131 and the second user account is not in the friend list of the first client 111.
Illustratively, the server 120 may be a background server of the first client 111 and the second client 131 described above. The server 120 may be an independent physical server, a server cluster or distributed system formed by multiple physical servers, or a cloud server providing basic cloud computing services such as cloud services, cloud databases, cloud computing, cloud functions, cloud storage, network services, cloud communication, middleware services, domain name services, security services, a Content Delivery Network (CDN), and big data and artificial intelligence platforms. The server 120 may communicate with the first terminal 110 and the second terminal 130 through a wired or wireless network, for example relaying messages sent and received between the first terminal 110 and the second terminal 130. Optionally, the server 120 undertakes the primary computing work and the first terminal 110 and the second terminal 130 undertake the secondary computing work; alternatively, the server 120 undertakes the secondary computing work and the first terminal 110 and the second terminal 130 undertake the primary computing work; alternatively, the server 120, the first terminal 110, and the second terminal 130 perform cooperative computing using a distributed computing architecture.
Illustratively, the server 120 includes a processor 122, a user account database 123, an instant messaging service module 124, and a user-facing Input/Output Interface (I/O Interface) 125. The processor 122 is configured to load instructions stored in the server 120 and process data in the user account database 123 and the instant messaging service module 124; the user account database 123 is configured to store data of the user accounts used by the first terminal 110, the second terminal 130, and the other terminals 140, such as the avatar of a user account, the nickname of a user account, the group where a user account is located, and the like; the instant messaging service module 124 is used to provide multiple chat rooms (two-person or multi-person chats) for users to chat, send expressions, send red packets, and the like; the user-facing I/O interface 125 is used to establish communication with the first terminal 110 and/or the second terminal 130 through a wireless or wired network to exchange data.
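As an illustration of the server-side modules enumerated above, the following Python sketch models the user account database and the message-relay role of the instant messaging service module. All class and field names are assumptions for illustration only, not the patent's implementation:

```python
from dataclasses import dataclass, field

@dataclass
class AccountRecord:
    """Data stored per user account (nickname, avatar, groups)."""
    nickname: str
    avatar: str = ""
    groups: list = field(default_factory=list)

@dataclass
class Server:
    """Schematic server holding an account database and relaying messages."""
    accounts: dict = field(default_factory=dict)

    def register(self, account_id: str, nickname: str) -> None:
        # User account database: store data of the user accounts.
        self.accounts[account_id] = AccountRecord(nickname=nickname)

    def relay(self, sender: str, receiver: str, body: str) -> dict:
        # Instant messaging service: relay a session message between terminals.
        return {"from": sender, "to": receiver, "body": body}
```

In this sketch the server only relays messages; whether trigger detection happens on the server or the terminal is left open, as the patent attributes the display steps to the terminal.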
Those skilled in the art will appreciate that the number of terminals described above may be greater or fewer. For example, the number of the terminals may be only one, or several tens or hundreds of the terminals, or more. The number of terminals and the type of the device are not limited in the embodiments of the present application.
Fig. 2 is a flowchart illustrating an information display processing method according to an exemplary embodiment of the present application. This embodiment is described using an example in which the method is applied to the first terminal 110 or the second terminal 130 in the computer system shown in fig. 1, or to another terminal in the computer system. The method includes the following steps:
step 201, displaying a chat session interface.
The chat conversation interface refers to an interface for presenting conversation messages.
Optionally, the chat session interface is an interface for displaying session messages in an instant messaging application, a social application, a payment application, a shopping application, a video application (including short-video applications), or a takeaway application. This is not limited in the embodiments of the present application.
Optionally, the chat session interface is a private chat session interface between two users, or a group chat session interface among more than two users.
Schematically, taking an instant messaging application as an example: a user account is logged in to the instant messaging application, and friend accounts having a friend relationship with the user account (in a friend relationship list or an address book) are displayed; the user selects a target friend account, and a private chat session interface between the user account and the target friend account is displayed. Illustratively, the user selects a target group from the friend relationship list or address book, and a group chat session interface is displayed. To select the target friend account, the user can click the avatar or the entry of the friend account; to select the target group, the user can click the avatar or the entry of the target group. In some embodiments, a user enters a private chat session interface between the user account and a target account by clicking the avatar of the target account in a group chat session interface; in other embodiments, the user invites another friend account through a private chat session interface between the user account and a friend account, thereby entering a group chat session interface.
As shown in fig. 3, the chat session interface 30 includes a dialog area 31 and an input control 32. The dialog area 31 is used to present at least one of the following session messages: session messages the user successfully sent and session messages the user successfully received; session messages include text messages, voice messages, expression messages, messages combining text and expressions, and the like. The input control 32 is used to receive session messages input by the user. Illustratively, session messages are displayed in message bubbles, and session messages sent by the user are distinguished from received session messages through message bubbles of different colors; in some embodiments, a message bubble corresponds to a bubble skin, which is an element for decorating the message bubble, and sent and received session messages are distinguished through bubble skins. Also displayed below the input control 32 is an expression area 33 of expressions available to the user (including dynamic expressions and static expressions), where the expression area 33 includes recently used expressions (i.e., expressions used in the most recent period of time) and all expressions. An expression selected by the user is displayed in the input control 32 and, by triggering the send control, in the chat session interface 30. Further, the user may add an expression displayed in the dialog area 31 to the expression area 33, create a new expression, and delete a saved expression.
Step 202, in response to the first user account sending a conversation message or in response to receiving a conversation message from the second user account, displaying the conversation message in the chat conversation interface.
The first user account is a user account logged in the application program, and the first user account may be an account generated by the application program according to registration information filled by a user, or may be a user account of a third-party application program. For example, the application is a social application, the first user account is an account generated by the social application according to a mobile phone number of the user, or the first user account is a mobile phone number of the user; for another example, the application is a social application, and the first user account is a user account logged in the instant messaging application, that is, the user logs in the social application through the user account logged in the instant messaging application.
The second user account and the first user account are different accounts in the same application. The second user account and the first user account may have a friend relationship in the application, or may not have a friend relationship; or the first user account follows (subscribes to) the second user account (that is, the first user account is a fan of the second user account), or the second user account follows (subscribes to) the first user account (that is, the second user account is a fan of the first user account), or the first user account and the second user account follow (subscribe to) each other, that is, they are fans of each other.
A session message refers to a message sent between user accounts for communication, and may be at least one of: a text message, an expression message, a message combining text and expressions, a file, a transfer message, a resource message (used to transfer resources from one account to another), and a business card message (used to recommend a user to others). Illustratively, session messages are displayed in message bubbles. A message bubble refers to a display element carrying a session message, typically in the shape of a bubble.
Step 203, responding to the existence of the trigger information in the session message, displaying an interactive display element corresponding to the trigger information on a session display element in the chat session interface, wherein the session display element is a display element associated with the session message.
The session display element is a display element displayed in the chat session interface and associated with the session message, such as a display element carrying the session message, a display element representing identity information, or a display element identifying an honor. Illustratively, the session display element includes at least one of the following: the message bubble of the session message (including the text, expressions, and the like in the message bubble), the user avatar corresponding to the message bubble, the user account corresponding to the message bubble, the remark name corresponding to the message bubble, and the badge identifier corresponding to the message bubble. The remark name corresponding to the message bubble may be the name of a conversation party displayed in the chat session interface; the name may be the conversation party's nickname in the application or a remark name modified by the user, or it may be the nickname or modified remark name the conversation party displays in the group chat session.
The trigger information refers to information for triggering display of an interactive display element on a conversation display element. An interactive display element refers to a display element that is capable of interacting with or has an association with the trigger information.
The trigger information includes at least one of the following types: text information containing a keyword or a key topic, an expression, information containing an instruction for controlling a virtual assistant (or chat assistant or intelligent robot), and information corresponding to a resource transfer.
For example, the trigger information is a session message sent by a user that contains a specified keyword: the session message sent to the chat window contains the keyword "happy birthday", and this session message serves as the trigger information. For another example, the trigger information is a session message with a certain topic: the session message sent to the chat window contains "#celebration birthday", and this session message serves as the trigger information. For another example, the trigger information is a system message in the chat session interface. The participants in the chat session include the user and an intelligent robot, and the intelligent robot is used to manage matters involved in the chat session, including adding users to the chat session, removing users from the chat session, interacting with users in the chat session, and the like. Illustratively, a session message sent by a user carries an instruction for controlling the intelligent robot to generate trigger information. For example, when the user sends the session message "robot, I want to see snow", the intelligent robot performs semantic analysis on the session message and generates the trigger information according to the instruction contained in its semantics; the trigger information may be: user xxx triggers the snowing function. In some embodiments, the user may also directly input an instruction in the input control and @ the robot (@ is used to prompt the target to view a message), which can likewise generate the trigger information.
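The trigger-information types in the examples above (specified keyword, topic, assistant instruction) could be distinguished by a simple classifier. The sketch below is a hypothetical illustration; the patterns, labels, and the `classify_trigger` name are assumptions, not drawn from the patent:

```python
import re

def classify_trigger(message: str) -> str:
    """Classify a session message into one of the trigger-information types
    described above, or return 'none' when no trigger information is present."""
    lowered = message.lower()
    if lowered.startswith("robot,") or lowered.startswith("@robot"):
        return "assistant_instruction"   # instruction controlling the robot
    if "happy birthday" in lowered:
        return "keyword"                 # specified keyword
    if re.search(r"#\w+", message):
        return "topic"                   # topic marker such as "#..."
    return "none"
```

A real implementation would presumably use a configurable keyword list and the semantic analysis the patent mentions rather than fixed string checks.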
The trigger information in the above embodiments is formed from text; the trigger information in the embodiments of the present application may also be an emoticon. An emoticon is a functional image that expresses an emotion and carries a certain meaning. Emoticons include static emoticons and dynamic emoticons. A static emoticon is a single static image frame, for example in the PNG (Portable Network Graphics) file format, and may also include Emoji, text, and words. A dynamic emoticon is an animation synthesized from multiple image frames, for example in the GIF (Graphics Interchange Format) file format. Optionally, a dynamic emoticon includes two parts: a dynamic body image and an animation element. The dynamic body image is the main body of the dynamic emoticon, such as a cartoon figure or the avatar of the user who captured the dynamic emoticon. The animation element embodies the animated special effect of the dynamic emoticon; it substantially conveys the animated special effect of the entire dynamic emoticon and serves as an auxiliary element that better represents it. The animation element may be a dynamic image with an animated special effect in various sizes and colors, such as a heart, a balloon, a water drop, a five-pointed star, or a character.
Optionally, the first emoticon is an emoticon the user owns in the application, an emoticon displayed in an emoticon store, or an emoticon provided by the input method. For example, the first emoticon is an emoticon displayed in the emoticon area 33 shown in fig. 3, from which the user selects the target emoticon 331. For another example, the first emoticon is an emoticon the user has not downloaded from the emoticon store; the user selects a target emoticon in the emoticon store and forwards it to the chat session interface.
In some embodiments, the first emoticon is displayed in a message bubble, where a message bubble refers to a session display element that carries a session message; in other embodiments, the first emoticon is displayed directly in the chat session interface. A message bubble carries a message sent by a user participating in the chat session: each time a user sends a message, the message is displayed in the chat session interface inside one message bubble, and one tap of the send control completes the sending of one message.
In response to the first user account sending the first emoticon, a message bubble including the first emoticon is displayed in the chat session interface; or, in response to the first user account sending the first emoticon, the first emoticon is displayed in the chat session interface; or, in response to the second user account sending the first emoticon, a message bubble including the first emoticon is displayed in the chat session interface; or, in response to the second user account sending the first emoticon, the first emoticon is displayed in the chat session interface. The display style of the message bubbles corresponding to the first user account and that of the message bubbles corresponding to the second user account may be the same or different, which is not limited in the embodiments of the present application.
In addition, the trigger information in the embodiments of the present application may also be information corresponding to a resource transfer. Illustratively, a first user account sends a virtual commodity package carrying a certain amount of virtual currency to a second user account; the virtual commodity package is displayed in the chat session interface as a red packet element, and the red packet element is the trigger information. In some embodiments, the type of resource in the virtual commodity package includes at least one of cash, virtual currency, vouchers, coupons, points, virtual pets, game accessories, and membership cards.
The interactive display element corresponding to the trigger information is displayed by taking the message bubble as a unit.
The interactive display element corresponding to the trigger information includes at least one of the following cases:
changing a color of a conversation display element;
changing a size of a conversation display element;
changing a shape of a conversation display element;
changing a display position of a conversation display element;
changing a display hierarchy of the conversation display element;
adding an additional element on the session display element;
removing part of the elements on the session display element.
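The change types listed above can be illustrated with a minimal bubble model; the field names and default values here are assumptions for this sketch, not part of this application:

```python
from dataclasses import dataclass, field

# Illustrative model of a message bubble and the change types listed above
# (color, size, shape, level, added elements, removed elements).
@dataclass
class Bubble:
    color: str = "white"
    shape: str = "rectangle"
    font_size: str = "medium"
    z_level: int = 10          # display hierarchy (higher = more opaque/on top)
    extras: list = field(default_factory=list)  # additional attached elements

def apply_change(bubble: Bubble, change: str, value=None) -> Bubble:
    """Apply one of the change types to the bubble and return it."""
    if change == "color":
        bubble.color = value
    elif change == "shape":
        bubble.shape = value
    elif change == "size":
        bubble.font_size = value
    elif change == "level":
        bubble.z_level = value
    elif change == "add_element":
        bubble.extras.append(value)
    elif change == "remove_element" and value in bubble.extras:
        bubble.extras.remove(value)
    return bubble
```

Position changes are omitted here because they depend on the surrounding message list; the same dispatch pattern would cover them.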
Illustratively, the trigger information is an expression.
The first emoticon is a dancing brush. In response to the first emoticon being sent by the first user account or received from the second user account, the application changes the color of a message bubble in the chat session interface according to the first emoticon. For example, before the first emoticon is sent the message bubble is white, and after it is sent the message bubble is blue. In some embodiments, the color of the message bubble returns to the default color after a set time interval, for example from blue back to white after 3 seconds.
The first emoticon is a magnifying glass. In response to the first emoticon being sent by the first user account or received from the second user account, the application changes the size of the text in a message bubble in the chat session interface according to the first emoticon. For example, before the first emoticon is sent the font size of the text in the message bubble is "medium large", and after it is sent the font size is "super large"; that is, the text in the message bubble is enlarged. In some embodiments, the font size of the text in the message bubble returns to the default size after a set time interval, for example back to "medium large" after 3 seconds.
The first emoticon is a spring. In response to the first emoticon being sent by the first user account or received from the second user account, the application changes the shape of a message bubble in the chat session interface according to the first emoticon; for example, before the first emoticon is sent the message bubble is rectangular, and after it is sent the message bubble is circular. In some embodiments, the shape of the message bubble returns to the default shape after a set time interval, for example back to a rectangle after 3 seconds.
The first emoticon is an upward arrow. In response to the first emoticon being sent by the first user account or received from the second user account, the application changes the display position of a message bubble in the chat session interface according to the first emoticon. For example, before the first emoticon is sent, message bubble 1 is the last message bubble displayed in the chat session interface; after the first emoticon is sent, message bubble 1 moves in the direction indicated by the arrow, that is, it moves up one position. In some embodiments, the display position of message bubble 1 returns to the original position after a set time interval, for example back to the position of the last message bubble in the chat session interface after 3 seconds.
The first emoticon is a progress bar. In response to the first emoticon being sent by the first user account or received from the second user account, the application changes the display level of a message bubble in the chat session interface according to the first emoticon. For example, before the first emoticon is sent the message bubble is completely opaque, that is, its display level is the highest; after the first emoticon is sent the message bubble is completely transparent, that is, its display level is the lowest. In some embodiments, the display level of the message bubble returns to the default level after a set time interval, for example back to a fully opaque state after 3 seconds.
The first emoticon is ketchup. In response to the first emoticon being sent by the first user account or received from the second user account, the application adds an additional element on a session display element according to the first emoticon, where the additional element is an element that does not belong to the session display element. For example, after the first emoticon is sent, a ketchup splash is displayed on the user avatar corresponding to the message bubble. In some embodiments, the additional element on the session display element is removed after a set time interval, for example the ketchup on the user avatar disappears after 3 seconds.
The first emoticon is scissors. In response to the first emoticon being sent by the first user account or received from the second user account, the application removes part of the elements on a session display element according to the first emoticon; for example, after the first emoticon is sent, the application removes the user avatar corresponding to the message bubble, or removes the hair element in that avatar. In some embodiments, the removed elements are displayed again after a set time interval; for example, after 3 seconds the user avatar corresponding to the message bubble is displayed again, or the hair element in that avatar is displayed again.
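Each of the examples above applies a temporary change that reverts after a set time interval (e.g., 3 seconds). A minimal sketch of that revert logic, using a simulated clock rather than a real UI timer so the behavior is testable (class and field names are our own assumptions):

```python
# Sketch of the "revert after a set interval" behavior: while an effect is
# active the changed style is shown; once the interval elapses, the default
# style is shown again. A real client would use a UI timer instead of a clock
# parameter.
class TimedEffect:
    def __init__(self, default, effect, duration):
        self.default = default     # style before the emoticon is sent
        self.effect = effect       # style applied by the emoticon
        self.duration = duration   # seconds before reverting
        self.started_at = None

    def trigger(self, now):
        """Record the moment the emoticon takes effect."""
        self.started_at = now

    def value(self, now):
        """Return the style to display at time `now`."""
        if self.started_at is not None and now - self.started_at < self.duration:
            return self.effect
        return self.default

# Example: bubble color turns blue for 3 seconds, then reverts to white.
color = TimedEffect(default="white", effect="blue", duration=3.0)
color.trigger(now=0.0)
```

The same class covers font size, shape, position, and display level by swapping the `default` and `effect` values.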
In one example, the first emoticon is a pistol; after the first emoticon is sent, a flower is displayed on the user avatar corresponding to the message bubble, and N flowers are displayed on the user avatars corresponding to the N message bubbles before the first emoticon, where the color of one flower differs from that of the rest. As shown in fig. 4, the interactive display elements are flowers: after the first emoticon 311 is sent, a first interactive display element 314 is displayed on the user avatar corresponding to the message bubble before the message bubble corresponding to the first emoticon 311, and a second interactive display element 315 and a third interactive display element 316 are displayed on the user avatars corresponding to the two message bubbles after it, respectively, where the second interactive display element 315 differs from the third interactive display element 316.
Illustratively, the trigger information is a keyword or a word with a topic.
The keyword is "blossom". After a session message containing "blossom" is sent, a flower is displayed on the user avatar corresponding to the message bubble, and N flowers are displayed on the user avatars corresponding to the N message bubbles before that session message, where the color of one flower differs from that of the rest.
The topic is "#blossom". After a session message including the topic is sent, a flower is displayed on the user avatar corresponding to the message bubble, and N flowers are displayed on the user avatars corresponding to the N message bubbles after that session message.
Illustratively, the trigger information is the system information displayed in the chat session interface.
An intelligent robot is also present in the chat session. A user sends the session message "Robot, I want to see snow" and @-mentions the intelligent robot; the robot performs semantic analysis on the session message and generates trigger information for triggering the snowing function. When "xxx user triggered the snowing function" is displayed in the chat session interface, snowflake elements are displayed on the message bubbles in the chat session interface.
Illustratively, the trigger information is information corresponding to the resource transfer.
The first user account sends a virtual commodity package to the second user account through the chat session. Based on the recipient of the virtual commodity package, the client displays a red packet element on the avatar corresponding to the second user account; the red packet element is the trigger information. In response to receiving an interactive operation (such as a click operation) on the red packet element, the resources in the virtual commodity package are transferred to the second user account.
In some embodiments, the interactive display element is visible only to the sender of the session message (the session message containing the trigger information). For example, when the session message is sent by the first user account, the interactive display element is visible only to the user corresponding to the first user account; likewise, when the session message is sent by the second user account, it is visible only to the user corresponding to the second user account. In other embodiments, the interactive display element is visible only to the recipient of the session message. For example, when the session message is sent from the first user account to the second user account, the interactive display element is visible only to the user corresponding to the second user account; likewise, when it is sent from the second user account to the first user account, it is visible only to the user corresponding to the first user account. In still other embodiments, the interactive display element is visible to both the sender and the recipient of the session message: regardless of whether the session message is sent from the first user account or the second user account, the interactive display element is visible to the users corresponding to both accounts.
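The three visibility modes described above can be sketched as a small predicate; the mode names and account identifiers here are illustrative assumptions, not terms from this application:

```python
# Hedged sketch of the visibility rules for an interactive display element:
# sender-only, recipient-only, or visible to both participants.
def is_visible(viewer: str, sender: str, recipient: str, mode: str) -> bool:
    """Return True if `viewer` may see the interactive display element."""
    if mode == "sender_only":
        return viewer == sender
    if mode == "recipient_only":
        return viewer == recipient
    if mode == "both":
        return viewer in (sender, recipient)
    return False  # unknown mode: fail closed
```

In a group chat the `recipient` check would generalize to membership in the recipient set, but the sender/recipient/both split stays the same.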
In some embodiments, the interactive display element is displayed on any one of the conversation display elements, such as a flower displayed on a message bubble corresponding to any one of the conversation messages; in other embodiments, the interactive display elements are displayed on message bubbles corresponding to the same user account, for example, flowers are all displayed on message bubbles corresponding to the session message sent by the first user account, or flowers are all displayed on the user avatar of the first user account, or one flower is displayed on the avatar of the first user account and the other flower is displayed on message bubbles corresponding to the session message sent by the first user account.
It should be noted that the interactive display element changes dynamically as the session display elements displayed in the chat session interface change. For example, if a user withdraws a message sent to the chat session interface, the corresponding session display element is also removed from display.
In summary, in the method provided by this embodiment, when the trigger information is displayed in the chat session interface, the interactive display element corresponding to the trigger information is displayed on a session display element, so the trigger information and the interactive display element are associated through the session display element. Because the layout and number of session display elements in the chat session interface change dynamically with the session messages, the interactive display elements displayed on the session display elements also change constantly as users generate trigger information by sending session messages, which improves the display diversity of interactive display elements in the chat session interface.
Fig. 5 shows a flowchart of an information display processing method according to another exemplary embodiment of the present application, and this embodiment is described by taking an example in which the method is applied to the first terminal 110 or the second terminal 130 in the computer system shown in fig. 1, or another terminal in the computer system. The method comprises the following steps:
step 501, displaying a chat session interface.
Step 502, in response to the first user account sending a conversation message or in response to receiving a conversation message from the second user account, displaying the conversation message in a chat conversation interface.
The implementation of step 501 refers to the implementation of step 201, and is not described herein again.
The implementation of step 502 refers to the implementation of step 202, and is not described herein again.
Step 503a, in response to the trigger information existing in the session message, adding an interactive display element corresponding to the trigger information on the session display element.
The added interactive display element is an element that does not belong to the session display element, and the following three cases are included:
it should be noted that "before the trigger information" and "after the trigger information" in the embodiment of the present application refer to temporal sequence, for example, a session message sent to a chat session before the trigger information is a session display element before the trigger, and a session message sent to the chat session after the trigger information is a session display element after the trigger information is.
First, adding interactive display elements corresponding to the trigger information on the N session display elements before the trigger information, where N is a positive integer.
Illustratively, taking the trigger information as the first emoticon, N is 4, the session display element is the user avatar corresponding to a message bubble, the first emoticon is displayed in a message bubble, and interactive display elements are displayed on the 4 message bubbles before that message bubble. As shown in fig. 6 (a), the chat session interface displays 4 users participating in the chat session; one user sends the first emoticon 311, and the user avatars corresponding to the 4 message bubbles before the first emoticon 311 display a first interactive display element 314, a second interactive display element 315, a third interactive display element 316, and a fourth interactive display element 317, respectively, where the second interactive display element 315 differs from the rest, and the third interactive display element 316 and the fourth interactive display element 317 are displayed on session display elements corresponding to the same user account.
Illustratively, taking the trigger information as text containing a keyword or a topic, N is 4 and the session display element is the user avatar corresponding to a message bubble. When a session message containing the keyword or topic is sent, a message bubble including the session message is displayed in the chat session interface, and interactive display elements are displayed on the user avatars corresponding to the 4 message bubbles before it. For example, a message bubble including the keyword "blossom" is displayed in the chat session interface, and interactive display elements are displayed on the user avatars corresponding to the 4 message bubbles before that message bubble.
Illustratively, taking the trigger information as system information, N is 4 and the session display element is the user avatar corresponding to a message bubble. When a system message is displayed in the chat session interface, interactive display elements are displayed on the user avatars corresponding to the 4 message bubbles before the system message. The system message is generated after the intelligent robot in the chat session is triggered by a session message sent by a user.
Illustratively, taking the trigger information as information corresponding to a resource transfer, N is 4 and the session display element is the message bubble corresponding to a session message. After a user sends a virtual commodity package, the client records its sending time and displays the interactive display element (a red packet element) on the 4 message bubbles before the sending time.
Second, adding interactive display elements corresponding to the trigger information on the N session display elements after the trigger information, where N is a positive integer.
Illustratively, taking the trigger information as the first emoticon, N is 4, the session display elements are the message bubbles corresponding to session messages, the first emoticon is displayed in the chat session interface, and interactive display elements are displayed on the 4 message bubbles after the first emoticon. As shown in fig. 6 (b), the first emoticon 311 is displayed in the session chat interface (some dynamic emoticons are displayed directly in the chat interface rather than inside a message bubble), and a first interactive display element 314, a second interactive display element 315, a third interactive display element 316, and a fourth interactive display element 317 are displayed on the 4 message bubbles after the first emoticon 311, respectively, where the second interactive display element 315 differs from the rest, and the third interactive display element 316 and the fourth interactive display element 317 are displayed on session display elements corresponding to the same user account.
Illustratively, taking the trigger information as text containing a keyword or a topic, N is 4 and the session display element is the user avatar corresponding to a message bubble. When a session message containing the keyword or topic is sent, a message bubble including the session message is displayed in the chat session interface, and interactive display elements are displayed on the user avatars corresponding to the 4 message bubbles after it. For example, a message bubble including the keyword "blossom" is displayed in the chat session interface, and flower elements are displayed on the user avatars corresponding to the 4 message bubbles after that message bubble.
Illustratively, taking the trigger information as system information, N is 4 and the session display element is the user avatar corresponding to a message bubble. When a system message is displayed in the chat session interface, interactive display elements are displayed on the user avatars corresponding to the 4 message bubbles after the system message. The system message is generated after the intelligent robot in the chat session is triggered by a session message sent by a user.
Illustratively, taking the trigger information as information corresponding to a resource transfer, N is 4 and the session display element is the message bubble corresponding to a session message. After a user sends a virtual commodity package, the client records its sending time and displays the interactive display element (a red packet element) on the 4 message bubbles after the sending time.
Third, adding interactive display elements corresponding to the trigger information on the m session display elements before the trigger information and on the k session display elements after it, where m + k = N, m, k, and N are positive integers, m < N, and k < N.
Illustratively, taking the trigger information as the first emoticon, N is 4, the session display elements are the message bubbles corresponding to session messages, and the first emoticon is displayed in a message bubble; interactive display elements are displayed on the user avatars corresponding to the 1 (m = 1) message bubble before the first emoticon and on the user avatars corresponding to the 3 (k = 3) message bubbles after it. As shown in fig. 6 (c), the first emoticon 311 is displayed in a message bubble, the first interactive display element 314 is displayed on the user avatar corresponding to the 1 message bubble before the first emoticon 311, and the second interactive display element 315, the third interactive display element 316, and the fourth interactive display element 317 are displayed on the user avatars corresponding to the 3 message bubbles after the first emoticon 311, respectively.
Illustratively, taking the trigger information as text containing a keyword or a topic, N is 4 and the session display element is the user avatar corresponding to a message bubble. When a session message containing the keyword or topic is sent, a message bubble including the session message is displayed in the chat session interface, an interactive display element is displayed on the user avatar corresponding to the 1 message bubble before it, and interactive display elements are displayed on the user avatars corresponding to the 3 message bubbles after it. For example, a message bubble including the keyword "blossom" is displayed in the chat session interface, a flower element is displayed on the user avatar corresponding to the 1 message bubble before that message bubble, and flower elements are displayed on the user avatars corresponding to the 3 message bubbles after it.
Illustratively, taking the trigger information as system information, N is 4 and the session display element is the user avatar corresponding to a message bubble. When a system message is displayed in the chat session interface, an interactive display element is displayed on the user avatar corresponding to the 1 message bubble before the system message, and interactive display elements are displayed on the user avatars corresponding to the 3 message bubbles after it. The system message is generated after the intelligent robot in the chat session is triggered by a session message sent by a user.
Illustratively, taking the trigger information as information corresponding to a resource transfer, N is 4 and the session display element is the message bubble corresponding to a session message. After a user sends a virtual commodity package, the client records its sending time, displays the interactive display element (a red packet element) on the 1 message bubble before the sending time, and displays interactive display elements on the 3 message bubbles after it.
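The three placement cases above (N session display elements before the trigger, N after, or m before and k after with m + k = N) reduce to choosing indices in the chronological list of bubbles. A hedged sketch, with function and parameter names of our own:

```python
# Given the index `t` of the trigger in a chronological list of `total`
# bubbles, return the indices of the bubbles that receive an interactive
# display element: `before` bubbles preceding the trigger and `after`
# bubbles following it (before + after = N). The trigger's own bubble is
# excluded, and the range is clipped at the ends of the list.
def target_indices(t: int, total: int, before: int, after: int):
    lo = max(0, t - before)
    hi = min(total, t + 1 + after)
    return [i for i in range(lo, hi) if i != t]
```

Setting `after = 0` gives the first case, `before = 0` the second, and any split with `before + after = N` the third.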
Step 503b, in response to the trigger information existing in the session message, reducing the interactive display element corresponding to the trigger information on the session display element.
The reduced interactive display element is an element belonging to the session display element, and the following three cases are included:
First, reducing interactive display elements corresponding to the trigger information on the N session display elements before the trigger information, where N is a positive integer.
Second, reducing interactive display elements corresponding to the trigger information on the N session display elements after the trigger information, where N is a positive integer.
Third, reducing interactive display elements corresponding to the trigger information on the m session display elements before the trigger information and on the k session display elements after it, where m + k = N, m, k, and N are positive integers, m < N, and k < N.
Similar to the embodiment of step 503a, the interactive display element may be part of the session display element, and the reduced interactive display element includes at least one of the following: part of the elements in the user avatar, decorative elements of the user avatar (such as an avatar frame), the text in the message bubble, the user account, the account nickname, and the identity badge.
For example, the user avatar may be a self-portrait photograph of the user, and the reduced interactive display element may be the user's hair or the user's mouth. For another example, a wreath-shaped avatar frame is displayed around the user avatar, and the reduced interactive display element may be that wreath-shaped avatar frame. In another example, a badge identifying the level of the user account is displayed around the user avatar, for example the badge "chat fellow", and the reduced interactive display element is that badge.
Step 503c, in response to the trigger information existing in the session message, changing the interactive display element corresponding to the trigger information on the session display element.
The changed interactive display element may be part of the elements belonging to the session display element, or may be an additional element corresponding to the session display element.
First, changing interactive display elements corresponding to the trigger information on the N session display elements before the trigger information, where N is a positive integer.
Second, changing interactive display elements corresponding to the trigger information on the N session display elements after the trigger information, where N is a positive integer.
Third, changing interactive display elements corresponding to the trigger information on the m session display elements before the trigger information and on the k session display elements after it, where m + k = N, m, k, and N are positive integers, m < N, and k < N.
Similar to the embodiment of step 503a, the session display elements in the chat session interface are divided into two parts: the session display elements before the trigger information and the session display elements after the trigger information.
Illustratively, the session display element is the avatar of a user participating in the chat session and the trigger information is an emoticon: the transparency of the user avatars corresponding to the 1 message bubble before the first emoticon and the 3 message bubbles after it is reduced from 50% to 0%. Illustratively, the session display element is a message bubble: the shape of the 1 message bubble before the first emoticon and the 3 message bubbles after it changes from a rectangle to a circle. Illustratively, the session display element is a message bubble and the trigger information is an emoticon: the 1 message bubble before the first emoticon and the 3 message bubbles after it are shaken in the chat session interface. It is understood that the changes applied to the session display elements before the first emoticon and those applied after it may be the same or different. For example, the 2 message bubbles before the first emoticon are shaken in the chat session interface, while the transparency of the 2 message bubbles after it is reduced from 50% to 0%. In some embodiments, when the session display element corresponding to the first emoticon is changed, the terminal used by the user also vibrates.
In the above embodiment, with the trigger information as a reference, the interactive display element is displayed on the session display elements in the context of the first emoticon, that is, on session display elements corresponding to any user account. It will be appreciated that the interactive display elements may also be displayed on session display elements corresponding to a particular user account, for example, on two session display elements corresponding to user account A before the first emoticon. The two session display elements may correspond to the same message or to different messages; that is, the interactive display element may be displayed on a message bubble and on the user avatar corresponding to that message bubble, or on two message bubbles that both correspond to user account A.
Based on the above embodiment, the user may further interact with the interactive display element, and the interaction manner is implemented by step 504:
Step 504, in response to receiving an interactive operation on the interactive display element, controlling the display speed of the interactive display element on the conversation display element.
When the terminal used by the user is a terminal with a touch display screen, such as a smart phone, a tablet computer, and the like, the interactive operation includes a single-click operation, a double-click operation (including a single-finger double-click operation and a double-finger double-click operation), a long-press operation, a drag operation, a slide operation, a hover operation, and a combination thereof.
When the terminal used by the user is a terminal connected with an external input device, such as a desktop computer, a notebook computer, and the like, the interactive operation includes an operation generated by the external input device or an input device of the terminal, for example, an operation of clicking a left mouse button by the user is an interactive operation.
Illustratively, the interactive display element is a flower displayed on the avatar of a user participating in the chat session. At an initial moment the flower is displayed as a bud; the bud opens over a period of time, after which the flower is displayed in a blooming state. While the flower is displayed as a bud, the user applies a rightward-sliding gesture on the interactive display element to control the speed at which the bud opens into a flower.
Illustratively, the interactive display element is a flower, which is displayed on the avatar of the user participating in the chat session, the flower flashing once every 2 seconds, and when the user double-clicks the flower, the flower flashing once every 0.5 seconds.
In some embodiments, in response to receiving an interactive operation on a conversation display element, a display speed of the interactive display element on the conversation display element is controlled. Illustratively, the interactive display element is a flower, the flower is displayed on the avatar of the user participating in the chat session, the flower flashes every 2 seconds, and when the user double-clicks the avatar of the user on which the flower is displayed, the flower flashes every 0.5 seconds.
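The blinking-flower examples above amount to mapping an interactive operation to a new animation interval. A minimal sketch, assuming a simple element model with a blink interval in seconds (class and method names are illustrative, not from the patent):

```python
# Hypothetical sketch: interactive operations change the display speed of the
# interactive display element, as in the 2 s -> 0.5 s blinking example.
class InteractiveElement:
    def __init__(self, blink_interval=2.0):
        self.blink_interval = blink_interval  # seconds between flashes

    def on_interaction(self, op):
        if op == "double_click":
            self.blink_interval = 0.5            # flash faster after a double-click
        elif op == "swipe_right":
            # a rightward slide halves the interval, with a floor to stay sane
            self.blink_interval = max(0.1, self.blink_interval / 2)

flower = InteractiveElement()
flower.on_interaction("double_click")
```

Per the last paragraph above, the same handler could equally be attached to the conversation display element (e.g. the avatar) rather than the flower itself.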
In summary, the method of this embodiment provides a brand-new interaction manner. By adding, removing, and changing the interactive display elements corresponding to the trigger emoticon, the trigger information and the interactive display elements are associated through the session display elements, so that when the user generates trigger information by sending a session message, the session display elements change and the interactive display elements change accordingly. This enriches the change styles of the interactive display elements and improves their display diversity in the chat session interface.
According to the method, the session display elements displayed in the chat session interface are divided by taking the trigger information as a reference, and the interactive display elements corresponding to the trigger information are displayed on part of the session display elements, so that the randomness of the interactive display elements during display is increased, the display diversity of the interactive display elements in the chat session interface is improved, and the display mode of the interactive display elements is not single any more.
In addition, the method of this embodiment allows the user to autonomously control the display speed on the conversation display element by performing interactive operations on the interactive display element, providing the user with a personalized interaction manner and a diversified chat conversation interface.
It is to be understood that the above embodiments may be implemented individually or in any combination.
In an alternative embodiment based on fig. 5, the variation of the interactive display element further includes the following forms, as shown in fig. 7:
step 701, randomly selecting P conversation display elements from N conversation display elements, wherein P is a positive integer and is less than N.
Illustratively, the method is applied to a client used by a user, and the client is a client of an instant messaging application. As shown in fig. 8, the client is installed and runs on the first terminal 110, and the instant messaging application corresponds to the server 120, which collects and forwards messages sent by clients. Illustratively, the client corresponding to the first terminal 110 is a first client, which sends an expression; the client corresponding to the second terminal 130 is a second client, which receives the expression; and the client corresponding to the third terminal 140 is a third client, which also receives the expression. The first client, the second client, and the third client are in the same group chat session. After the user selects the first expression to be sent, the first client generates a random parameter, which uniquely identifies a special effect parameter in the special effect file, so that the corresponding interactive display element is displayed on the session display element.
Step 702, displaying a first interactive display element corresponding to the trigger information on the P session display elements, and displaying a second interactive display element corresponding to the trigger information on the N-P session display elements.
Acquiring a special effect file corresponding to the trigger information according to a file identifier carried by the trigger information, wherein the special effect file comprises special effect parameters corresponding to the interactive display elements; randomly selecting a first special effect parameter and a second special effect parameter from the special effect parameters, wherein the first special effect parameter corresponds to a first interactive display element, and the second special effect parameter corresponds to a second interactive display element; performing effect rendering on the P conversation display elements according to the first special effect parameter, and displaying a first interaction display element on the P conversation display elements; and performing effect rendering on the N-P conversation display elements according to the second special effect parameter, and displaying the second interactive display element on the N-P conversation display elements.
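Steps 701–702 can be sketched as a random split of the N elements into P elements rendered with the first special effect parameter and N − P elements rendered with the second. A minimal illustration under assumed names; the string effect identifiers mimic those in Table 1:

```python
import random

# Hypothetical sketch of steps 701-702: randomly pick P of the N session
# display elements and assign them the first effect parameter; the remaining
# N - P elements get the second effect parameter.
def assign_effects(elements, p, first_effect, second_effect, rng=random):
    chosen = set(rng.sample(range(len(elements)), p))  # P distinct indices
    return {
        element: (first_effect if i in chosen else second_effect)
        for i, element in enumerate(elements)
    }

# N = 4, P = 2: two elements show the red-flower effect, two the white-flower.
plan = assign_effects(["a", "b", "c", "d"], 2, "00003A", "00001A")
```

A real client would then feed each element/parameter pair to its special effect rendering engine; here the mapping alone stands in for that rendering step.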
Taking the trigger information as an example of the expression, each expression corresponds to an expression identifier, and the expression identifier is used for uniquely identifying the expression.
The special effect file is a file configuring special effect parameters, which are the parameters used when displaying the special effect animation corresponding to the interactive display element. The session display element is rendered by a special effect rendering engine according to the special effect parameters. The special effect rendering engine is an application program for rendering special effect animations, and may be a third-party special effect rendering engine or one built into the client. A third-party special effect rendering engine means that the provider of the rendering engine and the provider of the target application program are not the same provider (for example, manufacturer or developer); the provider of the target application program can add the third-party rendering engine to the client of the target application program, thereby providing the client with special effect rendering capability.
Table 1 shows the correspondence among the expression identifier, the expression, the random parameter, the special effect parameter, and the interactive display element.
Table 1
(Table 1 is reproduced in the original publication as an image; it maps each expression identifier to its expression, random parameter, special effect parameters, and interactive display element.)
Illustratively, the identifier 2021022610380001 indicates the 0001st expression generated at 10:38 on 26/02/2021, which is the gun-firing emoticon. When the user (using the first client) chooses to send the emoticon to the chat session, the first client generates a random parameter according to the expression identifier of the gun-firing emoticon. Illustratively, the random parameter is (2, 3, 1), where "2" indicates that one session display element is selected, namely the 2nd session display element before (or after) the first emoticon; "3" represents the third special effect parameter (i.e., 00003A); and "1" represents the special effect parameter of the other session display elements in the chat session interface. Illustratively, after the gun-firing emoticon is sent, a red flower (the interactive display element corresponding to 00003A) is displayed on the 2nd session display element after the first emoticon, and the other session display elements display a white flower (the interactive display element corresponding to 00001A).
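Decoding the (2, 3, 1) random parameter can be sketched as below. The parameter table and function name are illustrative assumptions standing in for the special effect file:

```python
# Hypothetical decoding of the random parameter (2, 3, 1) described above:
# index 0 -> which element relative to the first emoticon is highlighted;
# index 1 -> the effect parameter for that element; index 2 -> the effect
# parameter for every other session display element.
EFFECTS = {1: "00001A", 2: "00002A", 3: "00003A"}  # assumed parameter table

def decode_random_parameter(param):
    offset, special_idx, default_idx = param
    return {
        "highlight_offset": offset,                # 2nd element after trigger
        "highlight_effect": EFFECTS[special_idx],  # red flower
        "default_effect": EFFECTS[default_idx],    # white flower
    }

plan = decode_random_parameter((2, 3, 1))
```

The expanded six-tuple parameter mentioned later in this embodiment would decode the same way, just with more positions (element counts, blink interval, and so on).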
It should be noted that, in this embodiment, the expression identifier and the random parameter are represented by numbers and the special effect parameter is represented by numbers and letters; the expression identifier, the random parameter, and the special effect parameter may also be of other types.
In some embodiments, because the special effect files corresponding to different expressions may be different, the terminal corresponding to the client stores the correspondence between the expression identifier and the special effect file, and the client obtains the corresponding special effect file locally according to the expression identifier. And the client selects the corresponding special effect parameters from the terminal according to the random parameters, so as to render the effect of the session display elements.
In some embodiments, a corresponding relationship between the expression identifiers and the special effect files is stored in the server, each expression corresponds to a special effect file, the client acquires the corresponding special effect file from the server according to the expression identifier, the client acquires the special effect parameters from the special effect files according to the random parameters, or the client sends the expression identifiers and the random parameters to the server, the server determines the corresponding special effect files according to the expression identifiers, acquires the corresponding special effect parameters from the special effect files according to the random parameters, and then feeds the special effect parameters back to the client.
In some embodiments, the random parameter expands the dimension according to the number of session display elements, changing animated special effects, and the like. For example, the random parameter is (2, 2, 1, 3, 0.5, 1), where the first "2" indicates that the interactive display element is displayed on 2 session display elements before the first emoticon, the second "2" indicates that the interactive display element is displayed on 2 session display elements after the first emoticon, the interactive display element corresponding to the special effect parameter "3" is displayed on the 1 st session display element after the first emoticon, the interactive display element flickers every 0.5 seconds, and the other session display elements display the interactive display element corresponding to the special effect parameter "1".
It can be understood that the special effect file corresponding to an expression may be updated. Illustratively, the special effect file corresponding to the expression is stored in the client, and the client acquires the special effect file from the server at preset time intervals or in real time and stores it locally. Illustratively, the client may update the special effect file corresponding to the expression when the user starts the client, so as to obtain the latest special effect file. Illustratively, the server may periodically push the special effect file corresponding to the expression to the client.
In summary, in the method of this embodiment, P session display elements are randomly selected from N session display elements, and the selected P elements and the remaining N - P elements are displayed with different interactive display elements, associating the different interactive display elements with the trigger information through the session display elements. When a user generates trigger information by sending a session message, the randomly selected session display elements differ according to the layout of the session display elements in the chat session interface, so the interactive display elements corresponding to the trigger information have different display styles, improving their display diversity in the chat session interface.
According to the method, the corresponding special effect parameters are obtained according to the file identification carried by the trigger information, and the session display elements are accurately subjected to effect rendering through the special effect parameters, so that the accurate interactive display elements are displayed on the session display elements.
In an alternative embodiment based on fig. 5, the variation of the interactive display element further includes the following forms, as shown in fig. 9:
step 901, randomly selecting P session display elements from N session display elements, where P is a positive integer and P is less than N.
The implementation of step 901 is referred to step 701 in fig. 7, and is not described herein again.
And 902, displaying P types of third interactive display elements corresponding to the trigger information on P types of session display elements, wherein the P types of session display elements are in one-to-one correspondence with the P types of third interactive display elements, and displaying a second interactive display element corresponding to the trigger information on N-P types of session display elements.
Acquiring a special effect file corresponding to the trigger information according to a file identifier carried by the trigger information, wherein the special effect file comprises special effect parameters corresponding to the interactive display elements; randomly selecting P third special effect parameters from the special effect parameters, wherein the P third special effect parameters correspond to P third interactive display elements; and respectively performing effect rendering on the P conversation display elements according to the P third special effect parameters, and displaying P third interactive display elements on the P conversation display elements.
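The difference from steps 701–702 is that each of the P selected elements gets its own distinct effect parameter. A minimal sketch under the same assumed naming as before:

```python
import random

# Hypothetical sketch of step 902: draw P distinct third effect parameters
# and map the P randomly chosen elements to them one-to-one; the remaining
# N - P elements share the second effect parameter.
def assign_distinct_effects(elements, p, effect_pool, second_effect, rng=random):
    chosen = rng.sample(elements, p)               # P distinct elements
    third_effects = rng.sample(effect_pool, p)     # P distinct effects
    plan = dict(zip(chosen, third_effects))        # one-to-one mapping
    for element in elements:
        plan.setdefault(element, second_effect)    # everyone else: second effect
    return plan

# N = 4, P = 2: two elements get distinct effects, two share "00001A".
plan = assign_distinct_effects(
    ["a", "b", "c", "d"], 2, ["00002A", "00003A", "00004A"], "00001A")
```

Because `random.sample` draws without replacement, the P third effects are guaranteed to be pairwise distinct, matching the one-to-one correspondence the step requires.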
The special effect file is a file configuring special effect parameters, which are the parameters used when displaying the special effect animation corresponding to the interactive display element. The session display element is rendered by a special effect rendering engine according to the special effect parameters. The special effect rendering engine is an application program for rendering special effect animations, and may be a third-party special effect rendering engine or one built into the client. A third-party special effect rendering engine means that the provider of the rendering engine and the provider of the target application program are not the same provider (for example, manufacturer or developer); the provider of the target application program can add the third-party rendering engine to the client of the target application program, thereby providing the client with special effect rendering capability.
In fig. 7, the randomly selected P session display elements all display the same interactive display element, and in this embodiment, the randomly selected P session display elements correspond to the P types of interactive display elements one to one.
Taking the example that the trigger information is the expression, in some embodiments, since the special effect files corresponding to different expressions may be different, the terminal corresponding to the client stores the correspondence between the expression identifier and the special effect file, and the client obtains the corresponding special effect file locally according to the expression identifier. And the client selects the corresponding special effect parameters from the terminal according to the random parameters, so as to render the effect of the session display elements.
In some embodiments, a corresponding relationship between an expression identifier and a special effect file is stored in a server, each expression corresponds to a special effect file, a client obtains the corresponding special effect file from the server according to the expression identifier, the client obtains a special effect parameter from the special effect file according to a random parameter, or the client sends the expression identifier and the random parameter to the server, the server determines the corresponding special effect file according to the expression identifier, obtains the corresponding special effect parameter from the special effect file according to the random parameter, and feeds the special effect parameter back to the client.
It can be understood that, since P different interactive display elements are displayed on the P session display elements, the dimension of the random parameter generated by the client according to the expression identifier is increased.
In some embodiments, the P interactive display elements gradually change according to their time interval from the sending time of the first emoticon. For example, N equals 4, P equals 2, and the two randomly selected conversation display elements are one conversation display element before the first emoticon and one after it. As shown in fig. 10 (a), the conversation display element is the avatar of the user, the first emoticon is the gun-firing emoticon, N equals 4, and P equals 2. In the chat session interface, two session display elements exist before the first emoticon and two after it; the two randomly selected session display elements are the two closest to the first emoticon, that is, the element displaying the second interactive display element 315 and the element displaying the third interactive display element 316. The transparency of the second interactive display element 315 is 50%, and the transparency of the third interactive display element 316 is 0%; that is, the interactive display element corresponding to the conversation message with the earlier sending time gradually fades as time passes.
Due to the randomness of the interactive display elements when displayed, in other embodiments the interactive display elements may be displayed on any conversation display element that satisfies the temporal relationship with the first emoticon. As shown in fig. 10 (b), among the session display elements before the first emoticon, the one closest to the first emoticon is selected to display the interactive display element, and among the session display elements after the first emoticon, the last one displayed in the chat session interface is selected; that is, the session display element on which the second interactive display element 315 is displayed and the session display element on which the fourth interactive display element 317 is displayed.
In summary, in the method of this embodiment, P session display elements are randomly selected from N session display elements; distinct interactive display elements are displayed on the P selected elements, and a further interactive display element is displayed on the remaining N - P elements, associating the different interactive display elements with the trigger information through multiple session display elements. When a user generates trigger information by sending a session message, the randomly selected session display elements, and the interactive display elements displayed on them, differ according to the layout of the session display elements in the chat session interface, so the interactive display elements corresponding to the trigger information have different display styles, improving their display diversity in the chat session interface.
According to the method, the corresponding special effect parameters are obtained according to the file identification carried by the trigger information, and the session display elements are accurately subjected to effect rendering through the special effect parameters, so that the accurate interactive display elements are displayed on the session display elements.
It is to be understood that the above embodiments may be implemented individually or in any combination.
Fig. 11 shows a flowchart of an information display processing method according to another exemplary embodiment of the present application. The embodiment is described by taking the method as an example applied to the first terminal 110 or the second terminal 130 in the computer system shown in fig. 1, or other terminals in the computer system. The method comprises the following steps:
step 1101, displaying a chat session interface.
Step 1102, displaying the conversation message in the chat conversation interface in response to the first user account sending the conversation message or in response to receiving the conversation message from the second user account.
The implementation of step 1101 refers to the implementation of step 201, and is not described herein again.
The implementation of step 1102 is referred to the implementation of step 202, and is not described herein again.
Step 1103, in response to the existence of the trigger information in the session message, displaying the special effect animation corresponding to the trigger information.
Schematically, take an expression as an example of the trigger information. When the first expression is displayed, a special effect animation corresponding to the first expression is played in the chat session interface, and the style of the special effect animation includes at least one of the following:
changing the special effect animation corresponding to the color of the first expression;
changing the special effect animation corresponding to the size of the first expression;
changing the special effect animation corresponding to the shape of the first expression;
changing the special effect animation corresponding to the display position of the first expression;
and special effect animation is generated by the action corresponding to the first expression.
Illustratively, the first expression is the gun-firing expression, a static expression displayed in a message bubble. The client switches the scene of the gun-firing expression from black to gold, or displays the expression in an enlarged manner, or switches the gun in the expression to another gun, or moves the display position of the expression to the middle of the chat session interface, or plays a special effect animation generated by the action corresponding to the expression. These special effect animations can be implemented independently or in any combination.
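The five animation styles above can be modeled as independent directives that combine freely. A minimal sketch with assumed names, illustrating the "independently or in any combination" point:

```python
# Hypothetical sketch: the special-effect animation styles listed above,
# modeled as composable directives (color, size, shape, position, action).
ANIMATION_STYLES = {"color", "size", "shape", "position", "action"}

def build_animation(styles):
    """Validate and normalize a combination of animation styles."""
    unknown = set(styles) - ANIMATION_STYLES
    if unknown:
        raise ValueError(f"unsupported styles: {sorted(unknown)}")
    return sorted(set(styles))  # deduplicated, order-independent combination

# e.g. enlarge the emoticon and play its action animation together
combo = build_animation(["size", "action"])
```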
After the special effect animation corresponding to the first expression is played, with the first expression as a reference, interactive display elements are displayed on a first part of session display elements corresponding to the session message content above the first expression, and on a second part of session display elements corresponding to the session message content below the first expression.
In one example, as shown in fig. 10 (a), with the first emoticon 311 as a reference, interactive display elements, that is, a first interactive display element 314 and a second interactive display element 315, are displayed on a first part of conversation display elements corresponding to the above-mentioned conversation message content of the first emoticon, and interactive display elements, that is, a third interactive display element 316 and a fourth interactive display element 317, are displayed on a second part of conversation display elements corresponding to the below-mentioned conversation message content of the first emoticon.
As shown in fig. 3, a first emoticon 311 is displayed in the chat session interface 30; the first emoticon is the gun-firing emoticon. The client enlarges the first emoticon and plays the special effect animation generated by its corresponding action. As shown in fig. 12, the first emoticon 311 changes into a large-size gun-firing emoticon 312, a flower element 313 is displayed at the muzzle of the gun-firing emoticon 312, and the flower element 313 is an interactive display element.
Step 1104, in response to the display progress of the special effect animation reaching a designated progress, displaying an interactive display element corresponding to the trigger information on a session display element in the chat session interface.
As shown in fig. 12, in some embodiments, the flower element 313 at the muzzle of the gun-firing emoticon 312 is in an unopened bud state; the special effect animation displayed in the chat session interface shows the bud gradually blooming into a flower. The process of the bud changing into a blooming flower is the display progress of the special effect animation; schematically, when the flower blooms, the corresponding interactive display element (flower) is displayed on the conversation display element.
In other embodiments, the flower element 313 at the muzzle of the gun-firing emoticon 312 is a flower in an open state; the special effect animation displayed in the chat session interface shows the petals gradually withering, and the process of the open flower changing into the withered state is the display progress of the special effect animation. Illustratively, the withered state is indicated by the petals falling off the flower, and when the last petal falls, the corresponding interactive display element (flower) is displayed on the conversation display element.
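In both variants, step 1104 reduces to a threshold check on animation progress. A minimal sketch, with the progress scale and designated value (1.0 for the fully-bloomed or last-petal state) as assumptions:

```python
# Hypothetical sketch of step 1104: show the interactive display element only
# once the special-effect animation's progress reaches the designated value.
def should_show_element(progress, designated=1.0):
    """progress: 0.0 (animation start) .. 1.0 (bud fully bloomed /
    last petal fallen). Returns whether to display the element now."""
    return progress >= designated

# sampled at three points of the animation
states = [should_show_element(p) for p in (0.0, 0.5, 1.0)]
```

An animation framework would invoke such a check from its per-frame progress callback rather than polling fixed samples.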
In summary, in the method of this embodiment, by playing the special effect animation corresponding to the trigger information, the transition between the trigger information and the displayed interactive display element becomes more natural, enriching the diversity of the interactive display elements during display.
Based on the above embodiment, the display form of the interactive display element further includes the following form:
step 1105, in response to the first time interval having the conversation message displayed in the chat conversation interface again, displaying the interactive display element on the conversation display element corresponding to the conversation message.
The moment at which the trigger information is displayed in the chat session interface is a first moment. When a new session message is displayed in the chat session interface at any moment after that, its display moment is a second moment. The client judges whether the interval between the first moment and the second moment is within the first time interval, and in response to the interval being within the first time interval, displays an interactive display element on the session display element corresponding to the session message.
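The first-moment/second-moment comparison can be sketched as below; timestamps in seconds and the function name are illustrative assumptions:

```python
# Hypothetical sketch of step 1105: a newly displayed message gets the
# interactive display element only if it appears within the first time
# interval after the trigger information.
def within_first_interval(trigger_time, message_time, first_interval):
    """trigger_time: first moment; message_time: second moment (both in
    seconds). True if the new message should be decorated."""
    return 0 <= message_time - trigger_time <= first_interval

decorate = within_first_interval(100.0, 104.0, first_interval=5.0)  # in window
late = within_first_interval(100.0, 110.0, first_interval=5.0)      # too late
```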
In some embodiments, the interactive display element is the same element as the one displayed on the previous session display element; in other embodiments, it is a different element. For example, as shown in fig. 10 (b), a new conversation message is generated after the conversation display element corresponding to the third interactive display element 316, and the time interval between the conversation message and the first emoticon 311 is smaller than the first time interval. A fourth interactive display element 317, also a flower element, is displayed on the conversation display element corresponding to the newly generated conversation message; alternatively, a fifth interactive display element, a heart (love) element, is displayed on that conversation display element.
In response to the conversation message being displayed again in the chat conversation interface within the first time interval, and the conversation display element corresponding to the conversation message belonging to the N conversation display elements, an interactive display element is displayed on the conversation display element corresponding to the conversation message, where N is an integer greater than 2.
Illustratively, N is 4, and based on the trigger information, the client obtains two session display elements upward, obtains two session display elements downward, and displays the interactive display elements on the 4 session display elements.
As shown in fig. 10 (b), a new session message is generated after the session display element corresponding to the third interactive display element 316 (the new session message is the session message corresponding to the fourth interactive display element 317), and the session display element corresponding to the new session message belongs to the 4 (N = 4) session display elements, so the fourth interactive display element 317 is displayed on the session display element corresponding to the session message.
In summary, in the method of this embodiment, the client decides whether to render a newly generated conversation message by checking the time interval between that message and the trigger information (i.e., whether the display condition is met). The interactive display elements displayed on the newly generated conversation display elements thus blend into the existing interactive display elements, which preserves the uniformity and visual appeal of the chat conversation interface while also increasing the diversity of interactive display patterns within it.
Based on the above embodiment, the display form of the interactive display element further includes the following form:
Step 1106: in response to the first user account sending a first session message and receiving a second session message from the second user account, or in response to the first user account receiving a first session message from the second user account and sending a second session message, the first session message and the second session message are displayed in the chat session interface.
When the first user account and the second user account send session messages to the chat session window, the corresponding session messages are displayed in the chat session interface; the first user account and the second user account participate in the same chat session.
Step 1107, in response to the presence of the trigger information in the first session message and the second session message, displaying an interactive display element corresponding to the trigger information on the session display element.
Wherein the first session message and the second session message are the same session message or the first session message and the second session message are different session messages.
In response to the first user account sending a first expression and receiving a second expression from the second user account, or in response to the first user account receiving a second expression from the second user account and sending a first expression, an interactive display element associated with the first expression and the second expression is displayed on the conversation display element.
Wherein the first expression and the second expression are the same expression, or the first expression and the second expression are different expressions.
Taking the case where the first expression and the second expression are different as an example: when two users send two different expressions at the same time, an interactive display element associated with both expressions is displayed on the conversation display element.
As shown in fig. 13, the first user sends a first emoticon 311, a gun-firing emoticon; a first interactive display element 314 and a second interactive display element 315, both love elements, are displayed on a conversation display element before the first emoticon 311. The second user sends a second expression 318, an archery expression. According to the association relationship between the first expression 311 and the second expression 318, the client displays a seventh interactive display element 320 on the conversation display elements before and after the second expression 318; the seventh interactive display element 320 is an arrow-pierced element, i.e., the element associated with the first expression 311 and the second expression 318.
It should be noted that the client stores the association relationships of the interactive display elements corresponding to the emoticons; an association relationship specifies which interactive display element should be displayed on the conversation display elements when multiple emoticons are displayed in the chat conversation interface. In some embodiments, the interactive display elements corresponding to the first expression and the second expression are merged into a new interactive display element; alternatively, the interactive display element corresponding to the first expression cancels out the one corresponding to the second expression, or vice versa. In other embodiments, the interactive display element corresponding to the first expression and the one corresponding to the second expression are displayed in sequence.
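The association relationship stored in the client can be sketched as an unordered-pair lookup table; the key names, element names, and table contents below are illustrative assumptions, not values specified by the patent:

```python
from typing import Optional

# Hypothetical association table stored on the client, mapping an unordered
# pair of expressions to the combined interactive display element to show.
ASSOCIATIONS = {
    frozenset({"gun_firing", "archery"}): "arrow_pierced",
}

def combined_element(expr_a: str, expr_b: str) -> Optional[str]:
    """Look up the interactive display element associated with a pair of
    expressions. None means no association is stored, in which case each
    expression's own element would be displayed in turn."""
    return ASSOCIATIONS.get(frozenset({expr_a, expr_b}))
```

Because the key is a `frozenset`, the lookup is order-independent: the same combined element is found regardless of which user sent which expression first.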
It is understood that step 1105 may be performed before steps 1106 and 1107, after them, or simultaneously with them.
In summary, in the method of this embodiment, by displaying the interactive display elements corresponding to the trigger information on the session display elements, rich expressions can trigger more interactive display elements, so that a variety of interactive display elements are shown on the display elements of the chat session interface.
In some embodiments, a light interactive message is displayed in the chat session interface in response to receiving an interactive operation on the interactive display element. Illustratively, the light interactive message is a message sent by the first user account to the second user account. Its message content comprises: a first field corresponding to the first user account, an action description field representing the action the first user account performs on the second user account, and a second field of the second user account.
The action description field indicates that the first user account performs an action on the second user account. The actions simulate user interactions in a real environment, such as: clapping, hugging, touching, kicking, poking, bumping, showing affection, and the like.
In some embodiments, the first field is a name of the first user account, the action description field is a preset field of the server, and the second field is a name of the second user account. In other embodiments, at least one of the first field, the second field, and the action description field is a custom field pre-edited by a user for the light interaction message.
In some embodiments, all or part of the characters in the first field corresponding to the first user account are pre-defined by the first user; and all or part of characters in a second field corresponding to the second user account are pre-defined by the second user.
In some embodiments, the action description field is pre-customized by the first user or the second user. When both the first user and the second user have customized action description fields, one of the two is selected according to a priority rule. For example, the sender of the light interaction message has a higher priority than its recipient.
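Assembling the three fields with the example priority rule (sender over recipient) can be sketched as follows; the function and parameter names are illustrative assumptions:

```python
from typing import Optional

def build_light_message(sender_name: str,
                        receiver_name: str,
                        default_action: str,
                        sender_custom: Optional[str] = None,
                        receiver_custom: Optional[str] = None) -> str:
    """Assemble the three fields of a light interactive message: the first
    (sender) field, the action description field, and the second (receiver)
    field. When both users have customized the action description, the
    sender's version is chosen, per the example priority rule above; when
    neither has, the server's preset field is used."""
    action = sender_custom or receiver_custom or default_action
    return f"{sender_name} {action} {receiver_name}"
```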
When triggered, the light interactive message is displayed in the chat session interface. Optionally, the light interactive message may also trigger an avatar animation on the avatar of the second user account in the chat session interface; optionally, the avatar animation is a shaking animation of the avatar.
After the interactive display element is displayed on the conversation display element, the user may perform an interactive operation on it. For example, a flower (interactive display element) is displayed on a user's avatar; when the user taps the flower, a light interactive message is displayed in the chat conversation interface, with the content: user A stroked user B's flower.
In some embodiments, in response to receiving an interactive operation on an interactive display element, a special effect animation associated with the interactive display element is displayed on the conversation display element. Illustratively, when the interactive display element is already displayed on the session display element, the user may apply an interactive operation to it, so that a corresponding special effect animation is displayed on the session display element. For example, flowers (interactive display elements) are displayed on the user avatars; when a user double-taps a flower, the avatar carrying that flower plays a shaking animation, or the avatars of all user accounts carrying such flowers play a shaking animation.
In some embodiments, in response to receiving an interactive operation on the interactive display element, a resource corresponding to the interactive display element is obtained. Illustratively, a user may apply an interactive operation to an interactive display element already displayed on the session display element to obtain resources from it. For example, a red envelope (interactive display element) is displayed on the user avatar; when the user taps the red envelope, a red envelope claim page is displayed, and the user obtains cash from it. It is understood that the resource may also be virtual currency, game coins, credits, gift certificates, virtual pets, and the like.
Fig. 14 is a flowchart illustrating a method for rendering an interactive display element by a client according to an exemplary embodiment of the present application, where this embodiment is described by taking as an example that the method is applied to the first terminal 110 or the second terminal 130 in the computer system shown in fig. 1, or other terminals in the computer system. The method comprises the following steps:
in step S1, two random parameters (k, c) are read from the message.
Illustratively, the information display processing method is applied to the client, and the trigger information is taken to be an expression. After a user selects a target expression, the target expression is displayed in a chat control in the chat session interface, and the client generates two random parameters (k, c): illustratively, k indicates that the k-th session display element relative to the first expression is randomly selected, and c represents the special effect parameter of the interactive display element displayed on that k-th session display element. In the following, the conversation display element is a message bubble and the interactive display element is a flower.
Illustratively, the random parameter k is randomly selected from [-N, N-1]. Taking the first expression as the reference, N denotes the number of context conversation display elements on each side of the first expression. With N equal to 4 and the first expression as the origin, the conversation display elements before the first expression are indexed -1, -2, -3, -4 (sorted from near to far by display time relative to the first expression), and the conversation display elements after it are indexed 0, 1, 2, 3 (likewise sorted from near to far).
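The random selection of k from [-N, N-1] under the index convention just described can be sketched as follows; the function name is illustrative:

```python
import random

def pick_k(n: int) -> int:
    """Randomly select k from [-N, N-1]. Negative values index the session
    display elements before the first expression (-1 is the nearest one),
    and 0..N-1 index the elements after it (0 is the nearest one)."""
    return random.randint(-n, n - 1)  # randint bounds are inclusive
```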
And step S2, rendering the bursting animation.
The client renders the bursting animation, which is the special effect animation corresponding to the expression; for the gun-firing expression in the above embodiment, for example, the corresponding special effect animation displays a flower at the muzzle.
In step S3, N session display elements are acquired upward and downward, respectively.
Taking the first expression as the reference, with N = 4, a total of 8 conversation display elements are rendered, upward and downward.
In step S4, of the 2N session display elements, a first interactive display element is displayed on the kth session display element, and second interactive display elements are displayed on the remaining session display elements.
Illustratively, if k is equal to 1, then according to the convention of step S1, an interactive display element is displayed on the session display element corresponding to index -1.
Step S5: have the 2N session display elements all been rendered?
The client determines whether the 2N session display elements have been rendered, that is, whether the interactive display elements have been displayed on the corresponding session display elements. If the 2N session display elements have been rendered, the process goes to step S8; otherwise, it proceeds to step S6.
Step S6: has the expression been displayed in the chat session interface for longer than the set time interval?
The client judges whether the time interval between the moment the expression was displayed in the chat session interface and the display moment of the unrendered session display elements exceeds the set time interval. If it does, the process goes to step S8; otherwise, it proceeds to step S7.
In step S7, the message receiving interface is monitored for newly generated session messages.
If the set time interval has not been exceeded, the client monitors whether the message receiving interface has newly generated conversation messages; if a new conversation message is generated, the conversation display element corresponding to it is rendered.
Step S8: end.
The rendering flow for the session display elements is then complete.
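The flow of steps S3-S8 above can be sketched as a single function, assuming the random parameters (k, c) were already read from the message (S1) and the burst animation rendered (S2). All names are illustrative, and the animation/monitoring machinery is reduced to plain data for clarity:

```python
def render_flow(k, c, n, on_screen, new_messages, set_interval):
    """Minimal sketch of steps S3-S8. `on_screen` lists the indices of
    session display elements already in the interface; `new_messages` is a
    list of (elapsed_time, index) pairs for messages generated afterwards.
    Indices follow the convention above: -N..-1 before the expression,
    0..N-1 after it. Returns a map from element index to the interactive
    display element rendered on it."""
    rendered = {}

    def render(idx):
        # S4: the k-th element gets the first interactive display element
        # with special effect parameter c; the rest get the second one.
        rendered[idx] = ("first", c) if idx == k else ("second", None)

    for idx in on_screen:                  # S3/S4: render what is on screen
        render(idx)
    for elapsed, idx in new_messages:      # S5-S7: handle later messages
        if len(rendered) >= 2 * n:         # S5: all 2N elements rendered
            break
        if elapsed > set_interval:         # S6: set time interval exceeded
            break
        render(idx)                        # S7: render new session message
    return rendered                        # S8: end
```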
It should be noted that the rendering process of the session display elements may also be implemented on a server: the server specifies that, among the 2N session display elements, different interactive display elements are displayed on the session display elements corresponding to the message identifiers of P session messages.
The interactive display elements in the embodiments of the present application are displayed according to the conversation display elements; that is, the conversation display elements on which the interactive display elements are displayed may all correspond to the same user account.
The following are embodiments of an apparatus of the present application that may be used to perform embodiments of the methods of the present application. For details which are not disclosed in the device embodiments of the present application, reference is made to the method embodiments of the present application.
Fig. 15 is a block diagram of an information display processing apparatus according to an exemplary embodiment of the present application, where the apparatus includes:
a display module 1510 configured to display a chat session interface;
the display module 1510 is configured to send a session message in response to a first user account, or receive a session message from a second user account, or display a session message in a chat session interface;
the displaying module 1510 is configured to, in response to the trigger information existing in the session message, display an interactive display element corresponding to the trigger information on a session display element in the chat session interface, where the session display element is a display element associated with the session message.
In an alternative embodiment, the apparatus includes a processing module 1520;
The processing module 1520 is configured to add an interactive display element corresponding to the trigger information to the session display element; or remove an interactive display element corresponding to the trigger information from the session display element; or change an interactive display element corresponding to the trigger information on the session display element.
In an alternative embodiment, the session display element includes at least one of the following elements:
a message bubble comprising a conversation message;
a user avatar corresponding to the message bubble;
the user account corresponding to the message bubble;
and the remark name corresponding to the message bubble.
In an optional embodiment, the display module 1510 is configured to display interactive display elements corresponding to the trigger information on the N session display elements before the trigger information; or on the N session display elements after the trigger information; or on m session display elements before the trigger information and k session display elements after it; where m + k = N, m, k and N are positive integers, m < N, and k < N.
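The m + k = N split just described can be sketched as an index computation; the function name and index convention (-1 is the nearest element before the trigger, 0 the nearest after) are illustrative assumptions:

```python
def window_indices(m: int, k: int) -> list:
    """Indices of the session display elements that receive the interactive
    display element: m elements before the trigger information and k after
    it, with m + k = N. The all-before and all-after cases correspond to
    (m=N, k=0) and (m=0, k=N) respectively."""
    return list(range(-m, 0)) + list(range(0, k))
```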
In an alternative embodiment, the processing module 1520 is configured to randomly select P session display elements from the N session display elements, where P is a positive integer and P < N; the display module 1510 is configured to display a first interactive display element corresponding to the trigger information on the P session display elements, and display a second interactive display element corresponding to the trigger information on the N-P session display elements.
In an alternative embodiment, the apparatus includes an acquisition module 1530;
the obtaining module 1530 is configured to obtain, according to the file identifier carried in the trigger information, a special effect file corresponding to the trigger information, where the special effect file includes a special effect parameter corresponding to an interactive display element;
the processing module 1520, configured to randomly select a first special effect parameter and a second special effect parameter from the special effect parameters, where the first special effect parameter corresponds to a first interactive display element, and the second special effect parameter corresponds to a second interactive display element;
the display module 1510 is configured to perform effect rendering on the P session display elements according to the first special effect parameter, and display the first interactive display element on the P session display elements;
the display module 1510 is configured to perform effect rendering on the N-P session display elements according to the second special effect parameter, and display the second interactive display element on the N-P session display elements.
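The P / N-P split described in the preceding paragraphs — randomly choosing P of the N session display elements, drawing a first and a second special effect parameter from the special effect file, and rendering accordingly — can be sketched as follows. The function and parameter names are illustrative assumptions, and the special effect file is reduced to a plain list of parameters:

```python
import random

def assign_elements(session_elements, p, effect_params):
    """Randomly choose P of the N session display elements; draw a first
    and a second special effect parameter from the effect file's
    parameters; show the first interactive display element on the P chosen
    elements and the second on the remaining N-P elements."""
    chosen = set(random.sample(session_elements, p))
    first_param, second_param = random.sample(effect_params, 2)
    return {
        e: ("first", first_param) if e in chosen else ("second", second_param)
        for e in session_elements
    }
```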
In an alternative embodiment, the processing module 1520 is configured to randomly select P session display elements from the N session display elements, where P is a positive integer and P < N;
the display module 1510 is configured to display P types of third interactive display elements corresponding to the trigger information on the P session display elements, where the P session display elements correspond one-to-one to the P types of third interactive display elements, and to display a second interactive display element corresponding to the trigger information on the N-P session display elements.
In an optional embodiment, the obtaining module 1530 is configured to obtain, according to a file identifier carried in the trigger information, a special effect file corresponding to the trigger information, where the special effect file includes special effect parameters corresponding to the interactive display elements;
the processing module 1520, configured to randomly select P third special effect parameters from the special effect parameters, where the P third special effect parameters correspond to P types of third interactive display elements;
the display module 1510 is configured to perform effect rendering on the P session display elements according to the P third special effect parameters, and display the P third interactive display elements on the P session display elements.
In an alternative embodiment, the display module 1510 is configured to, in response to displaying the conversation message again in the chat conversation interface within the first time interval, display the interactive display element on the conversation display element corresponding to the conversation message.
In an optional embodiment, the display module 1510 is configured to display a special effect animation corresponding to the trigger information.
In an alternative embodiment, the special effects animation includes at least one of the following styles:
changing the special effect animation corresponding to the color of the trigger information;
changing the special effect animation corresponding to the size of the trigger information;
changing the special effect animation corresponding to the shape of the trigger information;
changing the special effect animation corresponding to the display position of the trigger information;
and triggering special effect animation generated by the action corresponding to the information.
In an optional embodiment, the display module 1510 is configured to, in response to that the display progress of the special effect animation reaches a specified progress, display an interactive display element corresponding to the trigger information on a conversation display element in the chat conversation interface.
In an alternative embodiment, the display module 1510 is configured to control a display speed of the interactive display element on the session display element in response to receiving the interactive operation on the interactive display element.
In an optional embodiment, the display module 1510 is configured to display a light interactive message in the chat session interface in response to receiving the interactive operation on the interactive display element, where the light interactive message is a message sent by the first user account to the second user account; wherein the message content of the light interaction message comprises: the first field corresponding to the first user account, the action description field used for representing the action executed by the first user account on the second user account, and the second field of the second user account.
In an alternative embodiment, the display module 1510 is configured to control display of a special effect animation associated with the interactive display element on the conversation display element in response to receiving the interactive operation on the interactive display element.
In an optional embodiment, the display module 1510 is configured to, in response to receiving an interactive operation on the interactive display element, obtain a resource corresponding to the interactive display element.
In an optional embodiment, the display module 1510 is configured to display the first session message and the second session message in the chat session interface in response to the first user account sending the first session message and receiving the second session message from the second user account, or in response to the first user account receiving the first session message from the second user account and sending the second session message; and, in response to trigger information existing in the first session message and the second session message, to display an interactive display element corresponding to the trigger information on the session display element;
wherein the first session message and the second session message are the same session message or the first session message and the second session message are different session messages.
In an alternative embodiment, the interactive display elements include at least one of:
changing a color of a conversation display element;
changing a size of a conversation display element;
changing a shape of a conversation display element;
changing a display position of a conversation display element;
changing a display hierarchy of the conversation display element;
adding an additional element on the session display element;
part of the elements on the session display element are cleared.
Fig. 16 shows a block diagram of a computer device 1600 provided in an exemplary embodiment of the present application. The computer device 1600 may be a portable mobile terminal, such as a smartphone, a tablet computer, an MP3 player (Moving Picture Experts Group Audio Layer III), or an MP4 player (Moving Picture Experts Group Audio Layer IV). The computer device 1600 may also be referred to by other names, such as user equipment or portable terminal.
Generally, computer device 1600 includes: a processor 1601, and a memory 1602.
Processor 1601 may include one or more processing cores, such as a 4-core processor, an 8-core processor, and so on. The processor 1601 may be implemented in at least one hardware form of a DSP (Digital Signal Processing), an FPGA (Field-Programmable Gate Array), and a PLA (Programmable Logic Array). Processor 1601 may also include a main processor and a coprocessor, where the main processor is a processor for Processing data in an awake state, and is also referred to as a Central Processing Unit (CPU); a coprocessor is a low power processor for processing data in a standby state. In some embodiments, the processor 1601 may be integrated with a GPU (Graphics Processing Unit), which is responsible for rendering and drawing the content that the display screen needs to display. In some embodiments, the processor 1601 may further include an AI (Artificial Intelligence) processor for processing computing operations related to machine learning.
Memory 1602 may include one or more computer-readable storage media, which may be tangible and non-transitory. The memory 1602 may also include high-speed random access memory, as well as non-volatile memory, such as one or more magnetic disk storage devices, flash memory storage devices. In some embodiments, a non-transitory computer readable storage medium in the memory 1602 is used to store at least one instruction for execution by the processor 1601 to implement the information display processing method provided in embodiments of the present application.
In some embodiments, computer device 1600 may also optionally include: peripheral interface 1603 and at least one peripheral. Specifically, the peripheral device includes: at least one of a radio frequency circuit 1604, a touch screen display 1605, a camera assembly 1606, audio circuitry 1607, a positioning assembly 1608, and a power supply 1609.
Peripheral interface 1603 can be used to connect at least one I/O (Input/Output) related peripheral to processor 1601 and memory 1602. In some embodiments, processor 1601, memory 1602, and peripheral interface 1603 are integrated on the same chip or circuit board; in some other embodiments, any one or two of the processor 1601, the memory 1602, and the peripheral interface 1603 may be implemented on separate chips or circuit boards, which is not limited by this embodiment.
The Radio Frequency circuitry 1604 is used to receive and transmit RF (Radio Frequency) signals, also known as electromagnetic signals. The radio frequency circuitry 1604 communicates with communication networks and other communication devices via electromagnetic signals. The radio frequency circuit 1604 converts electrical signals into electromagnetic signals to be transmitted, or converts received electromagnetic signals into electrical signals. Optionally, the radio frequency circuit 1604 includes: an antenna system, an RF transceiver, one or more amplifiers, a tuner, an oscillator, a digital signal processor, a codec chipset, a subscriber identity module card, etc. The radio frequency circuit 1604 may communicate with other terminals via at least one wireless communication protocol. The wireless communication protocols include, but are not limited to: the world wide web, metropolitan area networks, intranets, generations of mobile communication networks (2G, 3G, 4G, and 5G), Wireless local area networks, and/or WiFi (Wireless Fidelity) networks. In some embodiments, the rf circuit 1604 may also include NFC (Near Field Communication) related circuits, which are not limited in this application.
The touch display 1605 is used to display a UI (User Interface). The UI may include graphics, text, icons, video, and any combination thereof. The touch display 1605 also has the ability to capture touch signals on or over the surface of the touch display 1605. The touch signal may be input to the processor 1601 as a control signal for processing. The touch display 1605 is used to provide virtual buttons and/or a virtual keyboard, also referred to as soft buttons and/or a soft keyboard. In some embodiments, the touch display 1605 may be one, providing the front panel of the computer device 1600; in other embodiments, the touch display 1605 can be at least two, each disposed on a different surface of the computer device 1600 or in a folded design; in other embodiments, the touch display 1605 may be a flexible display disposed on a curved surface or on a folded surface of the computer device 1600. Even the touch display screen 1605 may be arranged in a non-rectangular irregular pattern, i.e., a shaped screen. The touch Display 1605 may be made of LCD (Liquid Crystal Display), OLED (Organic Light-Emitting Diode), or other materials.
Camera assembly 1606 is used to capture images or video. Optionally, camera assembly 1606 includes a front camera and a rear camera. Generally, a front camera is used for realizing video call or self-shooting, and a rear camera is used for realizing shooting of pictures or videos. In some embodiments, the number of the rear cameras is at least two, and each of the rear cameras is any one of a main camera, a depth-of-field camera and a wide-angle camera, so that the main camera and the depth-of-field camera are fused to realize a background blurring function, and the main camera and the wide-angle camera are fused to realize panoramic shooting and VR (Virtual Reality) shooting functions. In some embodiments, camera assembly 1606 can also include a flash. The flash lamp can be a monochrome temperature flash lamp or a bicolor temperature flash lamp. The double-color-temperature flash lamp is a combination of a warm-light flash lamp and a cold-light flash lamp, and can be used for light compensation at different color temperatures.
The audio circuit 1607 is used to provide an audio interface between a user and the computer device 1600. The audio circuitry 1607 may include a microphone and a speaker. The microphone is used for collecting sound waves of a user and the environment, converting the sound waves into electric signals, and inputting the electric signals to the processor 1601 for processing or inputting the electric signals to the radio frequency circuit 1604 for voice communication. For stereo capture or noise reduction purposes, the microphones may be multiple and located at different locations on the computer device 1600. The microphone may also be an array microphone or an omni-directional pick-up microphone. The speaker is used to convert electrical signals from the processor 1601 or the radio frequency circuit 1604 into sound waves. The loudspeaker can be a traditional film loudspeaker or a piezoelectric ceramic loudspeaker. When the speaker is a piezoelectric ceramic speaker, the speaker can be used for purposes such as converting an electric signal into a sound wave audible to a human being, or converting an electric signal into a sound wave inaudible to a human being to measure a distance. In some embodiments, the audio circuit 1607 may also include a headphone jack.
The location component 1608 is used to locate the current geographic location of the computer device 1600 for navigation or LBS (Location Based Service). The location component 1608 may be a positioning component based on the GPS (Global Positioning System) of the United States, the BeiDou system of China, or the Galileo system of the European Union.
Power supply 1609 is used to supply power to the various components in computer device 1600. Power supply 1609 may be an alternating current supply, a direct current supply, a disposable battery, or a rechargeable battery. When power supply 1609 includes a rechargeable battery, the rechargeable battery may be a wired rechargeable battery or a wireless rechargeable battery. A wired rechargeable battery is charged through a wired line, and a wireless rechargeable battery is charged through a wireless coil. The rechargeable battery may also support fast-charge technology.
In some embodiments, computer device 1600 also includes one or more sensors 1610. The one or more sensors 1610 include, but are not limited to: acceleration sensor 1611, gyro sensor 1612, pressure sensor 1613, fingerprint sensor 1614, optical sensor 1615, and proximity sensor 1616.
The acceleration sensor 1611 may detect the magnitude of acceleration on three coordinate axes of a coordinate system established with respect to the computer device 1600. For example, the acceleration sensor 1611 may be used to detect the components of gravitational acceleration on the three coordinate axes. The processor 1601 may control the touch display screen 1605 to display the user interface in a landscape view or a portrait view according to the gravitational acceleration signal collected by the acceleration sensor 1611. The acceleration sensor 1611 may also be used to collect motion data for games or user activity.
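As an illustration of the landscape/portrait decision just described, the sketch below picks an orientation from the gravity components reported on the device's x and y axes. This is a simplified, hypothetical example (the function name and thresholding rule are the author's assumptions; the patent does not specify an algorithm):

```python
def choose_orientation(gx: float, gy: float) -> str:
    """Pick a UI orientation from accelerometer gravity components (m/s^2).

    gx: gravity component along the device's short edge (x axis)
    gy: gravity component along the device's long edge (y axis)

    When gravity projects mostly onto the y axis, the device is held
    upright, so the UI is drawn in portrait; otherwise landscape.
    """
    return "portrait" if abs(gy) >= abs(gx) else "landscape"
```

For example, a device held upright reports gravity almost entirely on the y axis, so `choose_orientation(0.5, 9.7)` yields `"portrait"`.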
The gyro sensor 1612 may detect the body orientation and rotation angle of the computer device 1600, and may cooperate with the acceleration sensor 1611 to collect the user's 3D motions on the computer device 1600. Based on the data collected by the gyro sensor 1612, the processor 1601 may implement the following functions: motion sensing (for example, changing the UI according to a tilt operation by the user), image stabilization during shooting, game control, and inertial navigation.
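Gyroscope/accelerometer cooperation of the kind described above is commonly realized with a complementary filter, which integrates the fast but drifting gyroscope rate and corrects it with the slow but drift-free accelerometer angle. The sketch below shows one update step; it is an illustrative assumption by the editor, not a technique claimed by the patent:

```python
def complementary_filter(angle: float, gyro_rate: float,
                         accel_angle: float, dt: float,
                         alpha: float = 0.98) -> float:
    """One update step of a complementary filter for a tilt angle.

    angle:       previous fused angle estimate (degrees)
    gyro_rate:   angular rate from the gyroscope (degrees/second)
    accel_angle: angle derived from accelerometer gravity components (degrees)
    dt:          time since the last update (seconds)
    alpha:       blend factor; close to 1 trusts the gyro short-term.
    """
    # Integrate the gyro for responsiveness, blend in the accelerometer
    # angle to cancel long-term gyro drift.
    return alpha * (angle + gyro_rate * dt) + (1.0 - alpha) * accel_angle
```

With `alpha = 0.98`, a 10 deg/s rotation over 0.1 s moves the estimate by roughly 0.98 degrees while the accelerometer slowly pulls it back toward its own reading.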
The pressure sensor 1613 may be disposed on the side frame of the computer device 1600 and/or beneath the touch display screen 1605. When the pressure sensor 1613 is disposed on the side frame, it can detect the user's grip signal on the computer device 1600, and left/right-hand recognition or shortcut operations can be performed based on the grip signal. When the pressure sensor 1613 is disposed beneath the touch display screen 1605, operability controls on the UI can be controlled according to the user's pressure operations on the touch display screen 1605. The operability controls include at least one of a button control, a scroll bar control, an icon control, and a menu control.
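The patent does not specify how left/right-hand recognition works; one of many possible heuristics is to compare the total pressure registered on the two side frames, as in the hypothetical sketch below (function name, inputs, and rule are all the editor's assumptions):

```python
def detect_grip_hand(left_pressures: list, right_pressures: list) -> str:
    """Guess the gripping hand from pressure readings (arbitrary units)
    sampled along the left and right side frames of the device.

    Heuristic only: the side carrying more total pressure is assumed to
    be the palm side, i.e. the side of the gripping hand. A production
    implementation would also use contact count and contact shape.
    """
    return "left" if sum(left_pressures) > sum(right_pressures) else "right"
```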
The fingerprint sensor 1614 is used to collect the user's fingerprint, so that the user's identity can be recognized from the collected fingerprint. Upon recognizing that the user's identity is a trusted identity, the processor 1601 authorizes the user to perform related sensitive operations, including unlocking the screen, viewing encrypted information, downloading software, making payments, changing settings, and the like. The fingerprint sensor 1614 may be disposed on the front, back, or side of the computer device 1600. When a physical button or vendor logo is provided on the computer device 1600, the fingerprint sensor 1614 may be integrated with the physical button or vendor logo.
The optical sensor 1615 is used to collect the ambient light intensity. In one embodiment, the processor 1601 may control the display brightness of the touch display screen 1605 based on the ambient light intensity collected by the optical sensor 1615: when the ambient light intensity is high, the display brightness of the touch display screen 1605 is increased; when the ambient light intensity is low, the display brightness is decreased. In another embodiment, the processor 1601 may also dynamically adjust the shooting parameters of the camera assembly 1606 based on the ambient light intensity collected by the optical sensor 1615.
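The light-to-brightness mapping described above can be sketched as a clamped linear interpolation; the curve shape, range, and parameter names below are illustrative assumptions, not taken from the patent:

```python
def display_brightness(lux: float, min_b: int = 10, max_b: int = 255,
                       max_lux: float = 1000.0) -> int:
    """Map ambient light intensity (lux) to a display brightness level.

    Linearly interpolates between min_b (in darkness) and max_b (at or
    above max_lux), clamping the input so the result stays in range.
    """
    frac = min(max(lux / max_lux, 0.0), 1.0)
    return round(min_b + frac * (max_b - min_b))
```

Real devices typically use a logarithmic or piecewise curve instead of a linear one, since human brightness perception is nonlinear.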
The proximity sensor 1616, also referred to as a distance sensor, is typically disposed on the front of the computer device 1600. The proximity sensor 1616 is used to collect the distance between the user and the front of the computer device 1600. In one embodiment, when the proximity sensor 1616 detects that the distance between the user and the front of the computer device 1600 is gradually decreasing, the processor 1601 controls the touch display screen 1605 to switch from the bright-screen state to the dark-screen state; when the proximity sensor 1616 detects that the distance is gradually increasing, the processor 1601 controls the touch display screen 1605 to switch from the dark-screen state to the bright-screen state.
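The bright/dark switching logic above amounts to a small state machine over a window of distance readings. The sketch below is a minimal illustration under the editor's own assumptions (strictly monotonic readings mean "approaching" or "receding"); the patent does not define the trend-detection rule:

```python
def update_screen_state(state: str, distances: list) -> str:
    """Update the screen state from a short window of proximity readings.

    state:     current screen state, "bright" or "dark"
    distances: recent distance samples (e.g. in cm), oldest first

    Strictly decreasing distances => device approaching the face => dark.
    Strictly increasing distances => device moving away => bright.
    """
    if len(distances) < 2:
        return state  # not enough samples to infer a trend
    pairs = list(zip(distances, distances[1:]))
    approaching = all(b < a for a, b in pairs)
    receding = all(b > a for a, b in pairs)
    if state == "bright" and approaching:
        return "dark"
    if state == "dark" and receding:
        return "bright"
    return state
```

Requiring a consistent trend over several samples, rather than reacting to a single reading, avoids flicker from sensor noise.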
Those skilled in the art will appreciate that the configuration shown in FIG. 16 does not constitute a limitation on computer device 1600, which may include more or fewer components than shown, combine some components, or employ a different arrangement of components.
An embodiment of the present application further provides a terminal, including: a processor and a memory, the memory having stored therein at least one instruction, at least one program, a set of codes, or a set of instructions, which is loaded and executed by the processor to implement the information display processing method in the above embodiments.
Embodiments of the present application also provide a computer-readable storage medium having stored thereon a computer program, which, when executed by a processor, implements the information display processing method in the above-described embodiments.
Embodiments of the present application also provide a computer program product or a computer program, where the computer program product or the computer program includes computer instructions, and the computer instructions are stored in a computer-readable storage medium. A processor of the computer device reads the computer instructions from the computer-readable storage medium, and the processor executes the computer instructions, so that the computer device performs the information display processing method as in the above-described embodiments.
It should be understood that reference herein to "a plurality" means two or more. "and/or" describes the association relationship of the associated objects, meaning that there may be three relationships, e.g., a and/or B, which may mean: a exists alone, A and B exist simultaneously, and B exists alone. The character "/" generally indicates that the former and latter associated objects are in an "or" relationship.
It will be understood by those skilled in the art that all or part of the steps of the above embodiments may be implemented by hardware, or by a program instructing the relevant hardware, and the program may be stored in a computer-readable storage medium, which may be a read-only memory, a magnetic disk, an optical disk, or the like.
The present application is intended to cover various modifications, alternatives, and equivalents, which may be included within the spirit and scope of the present application.

Claims (19)

1. An information display processing method, characterized by comprising:
displaying a chat session interface;
displaying a session message in the chat session interface in response to a first user account sending the session message or in response to receiving the session message from a second user account;
and in response to trigger information existing in the session message, displaying an interactive display element corresponding to the trigger information on a session display element in the chat session interface, wherein the session display element is a display element associated with the session message.
2. The method of claim 1, wherein displaying an interactive display element corresponding to the trigger information on a session display element in the chat session interface comprises:
adding an interactive display element corresponding to the trigger information on the session display element;
or,
removing an interactive display element corresponding to the trigger information on the session display element;
or,
changing an interactive display element corresponding to the trigger information on the session display element.
3. The method of claim 2, wherein the session display element comprises at least one of:
a message bubble containing the session message;
a user avatar corresponding to the message bubble;
a user account corresponding to the message bubble;
and a remark name corresponding to the message bubble.
4. The method of any of claims 1 to 3, further comprising:
displaying interactive display elements corresponding to the trigger information on N session display elements preceding the trigger information;
or,
displaying interactive display elements corresponding to the trigger information on N session display elements following the trigger information;
or,
displaying interactive display elements corresponding to the trigger information on m session display elements preceding the trigger information, and displaying interactive display elements corresponding to the trigger information on k session display elements following the trigger information;
wherein m + k = N, m, k and N are positive integers, m < N, and k < N.
5. The method of claim 4, further comprising:
randomly selecting P session display elements from the N session display elements, wherein P is a positive integer and P < N;
and displaying a first interactive display element corresponding to the trigger information on the P session display elements, and displaying a second interactive display element corresponding to the trigger information on the remaining N-P session display elements.
6. The method according to claim 5, wherein the displaying a first interactive display element corresponding to the trigger information on the P session display elements and a second interactive display element corresponding to the trigger information on N-P session display elements comprises:
acquiring a special effect file corresponding to the trigger information according to a file identifier carried by the trigger information, wherein the special effect file comprises special effect parameters corresponding to the interactive display elements;
randomly selecting a first special effect parameter and a second special effect parameter from the special effect parameters, wherein the first special effect parameter corresponds to the first interactive display element, and the second special effect parameter corresponds to the second interactive display element;
performing effect rendering on the P session display elements according to the first special effect parameter, and displaying the first interactive display element on the P session display elements;
and performing effect rendering on the N-P session display elements according to the second special effect parameter, and displaying the second interactive display element on the N-P session display elements.
7. The method of claim 4, further comprising:
randomly selecting P session display elements from the N session display elements, wherein P is a positive integer and P < N;
displaying P third interactive display elements corresponding to the trigger information on the P session display elements, the P session display elements corresponding one-to-one to the P third interactive display elements; and displaying a second interactive display element corresponding to the trigger information on the remaining N-P session display elements.
8. The method of claim 7, wherein the displaying P third interactive display elements corresponding to the trigger information on the P session display elements comprises:
acquiring a special effect file corresponding to the trigger information according to a file identifier carried by the trigger information, wherein the special effect file comprises special effect parameters corresponding to the interactive display elements;
randomly selecting P third special effect parameters from the special effect parameters, wherein the P third special effect parameters correspond to the P third interactive display elements;
and performing effect rendering on the P session display elements according to the P third special effect parameters, and displaying the P third interactive display elements on the P session display elements.
9. The method of any of claims 1 to 3, further comprising:
and in response to the session message being displayed again in the chat session interface within a first time interval, displaying the interactive display element on the session display element corresponding to the session message.
10. The method of any of claims 1 to 3, wherein after displaying the trigger information in the chat session interface, the method further comprises:
and displaying the special effect animation corresponding to the trigger information.
11. The method of claim 10, further comprising:
and responding to the display progress of the special effect animation to reach the designated progress, and displaying an interactive display element corresponding to the trigger information on a session display element in the chat session interface.
12. The method of any of claims 1 to 3, further comprising:
in response to receiving an interactive operation on the interactive display element, displaying a light interaction message in the chat session interface, wherein the light interaction message is a message sent by the first user account to the second user account;
wherein the message content of the light interaction message comprises: a first field corresponding to the first user account, an action description field representing the action performed by the first user account on the second user account, and a second field corresponding to the second user account.
13. The method of any of claims 1 to 3, further comprising:
and in response to receiving an interactive operation on the interactive display element, controlling the session display element to display a special effect animation associated with the interactive display element.
14. The method of any of claims 1 to 3, further comprising:
and in response to receiving an interactive operation on the interactive display element, acquiring a resource corresponding to the interactive display element.
15. The method of any of claims 1 to 3, further comprising:
and in response to receiving an interactive operation on the interactive display element, controlling the display speed of the interactive display element on the session display element.
16. The method of any of claims 1 to 3, further comprising:
displaying a first session message and a second session message in the chat session interface in response to receiving the first session message from the first user account and receiving the second session message from the second user account, or in response to receiving the first session message from the second user account and the second session message from the first user account;
in response to the trigger information existing in the first session message and the second session message, displaying an interactive display element corresponding to the trigger information on the session display element;
wherein the first session message and the second session message are the same session message, or the first session message and the second session message are different session messages.
17. An information display processing apparatus characterized by comprising:
the display module is used for displaying a chat session interface;
the display module is used for responding to a session message sent by a first user account or responding to the session message received from a second user account, and displaying the session message in the chat session interface;
the display module is configured to, in response to the presence of the trigger information in the session message, display an interactive display element corresponding to the trigger information on a session display element in the chat session interface, where the session display element is a display element associated with the session message.
18. A terminal, characterized in that the terminal comprises: a processor and a memory, the memory having stored therein at least one instruction, at least one program, a set of codes, or a set of instructions, the at least one instruction, the at least one program, the set of codes, or the set of instructions being loaded and executed by the processor to implement the information display processing method according to any one of claims 1 to 16.
19. A computer-readable storage medium, characterized in that a computer program is stored thereon, which when executed by a processor implements the information display processing method according to any one of claims 1 to 16.
CN202110226477.1A 2021-03-01 2021-03-01 Information display processing method, device, terminal and storage medium Pending CN114995924A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110226477.1A CN114995924A (en) 2021-03-01 2021-03-01 Information display processing method, device, terminal and storage medium


Publications (1)

Publication Number Publication Date
CN114995924A true CN114995924A (en) 2022-09-02

Family

ID=83018066

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110226477.1A Pending CN114995924A (en) 2021-03-01 2021-03-01 Information display processing method, device, terminal and storage medium

Country Status (1)

Country Link
CN (1) CN114995924A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115604213A (en) * 2022-09-30 2023-01-13 Vivo Mobile Communication Co., Ltd. (CN) Interaction method and device and electronic equipment


Similar Documents

Publication Publication Date Title
US11494993B2 (en) System and method to integrate content in real time into a dynamic real-time 3-dimensional scene
CN111882309B (en) Message processing method, device, electronic equipment and storage medium
CN112672176B (en) Interaction method, device, terminal, server and medium based on virtual resources
CN114205324B (en) Message display method, device, terminal, server and storage medium
US20120327091A1 (en) Gestural Messages in Social Phonebook
CN116086483A (en) Generating personalized map interfaces with enhanced icons
CN112258241A (en) Page display method, device, terminal and storage medium
CN110139143B (en) Virtual article display method, device, computer equipment and storage medium
CN111242682B (en) Article display method
CN112181573A (en) Media resource display method, device, terminal, server and storage medium
CN113709022A (en) Message interaction method, device, equipment and storage medium
CN112417180A (en) Method, apparatus, device and medium for generating album video
CN112870697B (en) Interaction method, device, equipment and medium based on virtual relation maintenance program
CN114995924A (en) Information display processing method, device, terminal and storage medium
CN113965539A (en) Message sending method, message receiving method, device, equipment and medium
CN116993432A (en) Virtual clothes information display method and electronic equipment
US20240004456A1 (en) Automated configuration of augmented and virtual reality avatars for user specific behaviors
CN114327197B (en) Message sending method, device, equipment and medium
CN111726697B (en) Multimedia data playing method
CN114968021A (en) Message display method, device, equipment and medium
CN113873270A (en) Game live broadcast method, device, system, electronic equipment and storage medium
CN116578204A (en) Information flow advertisement display method, device, equipment and storage medium
JP7312975B1 (en) Terminal device control program, terminal device, terminal device control method, server device control program, server device, and server device control method
CN114942803A (en) Message display method, device, equipment and medium
CN114392565A (en) Virtual photographing method, related device, equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
REG Reference to a national code

Ref country code: HK

Ref legal event code: DE

Ref document number: 40073928

Country of ref document: HK