CN117850946A - Interaction method, device, equipment and storage medium - Google Patents


Info

Publication number
CN117850946A
Authority
CN
China
Prior art keywords
interactive
elements
interactive element
expression
target
Prior art date
Legal status
Pending
Application number
CN202410031127.3A
Other languages
Chinese (zh)
Inventor
罗曼珺
王卿羽
方昂翔
汤理围
李秋婷
张雯
朱源
褚珂
林文峰
张逸尘
冀新宇
Current Assignee
Beijing Zitiao Network Technology Co Ltd
Original Assignee
Beijing Zitiao Network Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Beijing Zitiao Network Technology Co Ltd
Priority to CN202410031127.3A
Publication of CN117850946A

Abstract

Embodiments of the present disclosure relate to interaction methods, apparatuses, devices, and storage media. The method proposed herein comprises: displaying a plurality of interactive elements input by a user in an input field; providing at least one candidate interactive element, the at least one candidate interactive element being determined based on a set of interactive elements among the plurality of interactive elements, the set of interactive elements comprising at least an expression element; and, based on a selection of a target interactive element from the at least one candidate interactive element, replacing the set of interactive elements with the target interactive element. In this way, embodiments of the present disclosure enable a user to obtain a fused interactive element by combining multiple interactive elements, thereby enriching the interactive elements available to the user.

Description

Interaction method, device, equipment and storage medium
Technical Field
Example embodiments of the present disclosure relate generally to the field of computers and, more particularly, to interaction methods, apparatuses, devices, and computer-readable storage media.
Background
With the development of computer technology, the internet has become an important platform for information interaction. In the process of information interaction through the internet, various types of interactive elements (such as emoticons and the like) have become important media for social expression and information exchange. It is desirable to be able to use richer and more diverse interactive elements in the interaction process.
Disclosure of Invention
In a first aspect of the present disclosure, an interaction method is provided. The method comprises: displaying a plurality of interactive elements input by a user in an input field; providing at least one candidate interactive element, the at least one candidate interactive element being determined based on a set of interactive elements among the plurality of interactive elements, the set of interactive elements comprising at least an expression element; and, based on a selection of a target interactive element from the at least one candidate interactive element, replacing the set of interactive elements with the target interactive element.
In a second aspect of the present disclosure, an interaction method is provided. The method comprises the following steps: presenting an editing window based on the first selection of the first interactive element; presenting the selected first interactive element in the editing window; and presenting a fused interactive element in the editing window based on the second selection of the at least one second interactive element, the fused interactive element being determined based on the first interactive element and the at least one second interactive element.
In a third aspect of the present disclosure, an interaction device is provided. The device comprises: a display module configured to display a plurality of interactive elements input by a user in an input field; a providing module configured to provide at least one candidate interactive element, the at least one candidate interactive element being determined based on a set of interactive elements among the plurality of interactive elements; and a replacement module configured to replace the set of interactive elements with a target interactive element based on a selection of the target interactive element from the at least one candidate interactive element.
In a fourth aspect of the present disclosure, an electronic device is provided. The apparatus comprises at least one processing unit; and at least one memory coupled to the at least one processing unit and storing instructions for execution by the at least one processing unit. The instructions, when executed by at least one processing unit, cause the apparatus to perform the method of the first or second aspect.
In a fifth aspect of the present disclosure, a computer-readable storage medium is provided. The computer readable storage medium has stored thereon a computer program executable by a processor to implement the method of the first or second aspect.
It should be understood that what is described in this section of the disclosure is not intended to limit key features or essential features of the embodiments of the disclosure, nor is it intended to limit the scope of the disclosure. Other features of the present disclosure will become apparent from the following description.
Drawings
The above and other features, advantages and aspects of embodiments of the present disclosure will become more apparent by reference to the following detailed description when taken in conjunction with the accompanying drawings. In the drawings, wherein like or similar reference numerals denote like or similar elements, in which:
FIG. 1 illustrates a schematic diagram of an example environment in which embodiments in accordance with the present disclosure may be implemented;
FIGS. 2A-2D illustrate example interfaces according to some embodiments of the present disclosure;
FIGS. 3A-3B illustrate example interfaces according to some embodiments of the present disclosure;
FIGS. 4A-4D illustrate example interfaces according to some embodiments of the present disclosure;
FIG. 5 illustrates a flow chart of an example interaction process, according to some embodiments of the present disclosure;
FIG. 6 illustrates a flow chart of an example interaction process, according to some embodiments of the present disclosure;
FIG. 7 illustrates a schematic block diagram of an example interaction device, according to some embodiments of the disclosure; and
FIG. 8 illustrates a block diagram of an electronic device capable of implementing various embodiments of the present disclosure.
Detailed Description
Embodiments of the present disclosure will be described in more detail below with reference to the accompanying drawings. While certain embodiments of the present disclosure have been illustrated in the accompanying drawings, it is to be understood that the present disclosure may be embodied in various forms and should not be construed as limited to the embodiments set forth herein, but rather, these embodiments are provided so that this disclosure will be more thorough and complete. It should be understood that the drawings and embodiments of the present disclosure are for illustration purposes only and are not intended to limit the scope of the present disclosure.
It should be noted that any section/subsection headings provided herein are not limiting. Various embodiments are described throughout this document, and any type of embodiment may be included under any section/subsection. Furthermore, the embodiments described in any section/subsection may be combined in any manner with any other embodiment described in the same section/subsection and/or in a different section/subsection.
In describing embodiments of the present disclosure, the term "comprising" and its variants should be taken as open-ended, i.e., "including, but not limited to". The term "based on" should be understood as "based at least in part on". The term "one embodiment" or "the embodiment" should be understood as "at least one embodiment". The term "some embodiments" should be understood as "at least some embodiments". The terms "first," "second," and the like may refer to different or the same objects. Other explicit and implicit definitions may also be included below.
Embodiments of the present disclosure may relate to user data, the acquisition and/or use of data, and the like. These aspects all follow corresponding legal and related regulations. In embodiments of the present disclosure, all data collection, acquisition, processing, forwarding, use, etc. is performed with knowledge and confirmation by the user. Accordingly, in implementing the embodiments of the present disclosure, the user should be informed of the type of data or information, the range of use, the use scenario, etc. that may be involved and obtain the authorization of the user in an appropriate manner according to the relevant laws and regulations. The particular manner of notification and/or authorization may vary depending on the actual situation and application scenario, and the scope of the present disclosure is not limited in this respect.
In the present description and embodiments, where personal information is processed, the processing is performed on the premise of a valid legal basis (for example, the consent of the personal information subject, or necessity for the performance of a contract), and only within the prescribed or agreed scope. If a user refuses to allow processing of personal information other than that necessary for basic functions, the user's use of those basic functions is not affected.
In the process of information interaction of people through the internet, interactive elements such as expressions and the like are important carriers for people to exchange information. However, in the conventional interactive process, it is difficult for people to participate in the process of producing interactive elements, so that the interactive elements that can be used are relatively limited.
Embodiments of the present disclosure provide an interaction scheme. According to this scheme, a plurality of interactive elements input by a user can be displayed in an input field. Further, at least one candidate interactive element may be provided, the at least one candidate interactive element being determined based on a set of interactive elements among the plurality of interactive elements, the set of interactive elements comprising at least an expression element. Accordingly, based on a selection of a target interactive element from the at least one candidate interactive element, the set of interactive elements may be replaced with the target interactive element.
In this way, embodiments of the present disclosure can support a user to obtain a fused interactive element by combining multiple interactive elements, thereby enriching the interactive elements available to the user.
Various example implementations of the scheme are described in further detail below in conjunction with the accompanying drawings.
Example Environment
FIG. 1 illustrates a schematic diagram of an example environment 100 in which embodiments of the present disclosure may be implemented. As shown in fig. 1, an example environment 100 may include an electronic device 110.
In this example environment 100, an electronic device 110 may be running an application 120 that supports interface interactions. The application 120 may be any suitable type of application for interface interaction, examples of which may include, but are not limited to: video applications, social applications, or other suitable applications. The user 140 may interact with the application 120 via the electronic device 110 and/or its attached device.
In the environment 100 of fig. 1, if the application 120 is in an active state, the electronic device 110 may present an interface 150 for supporting interface interactions through the application 120.
In some embodiments, the electronic device 110 communicates with the server 130 to enable provisioning of services for the application 120. The electronic device 110 may be any type of mobile terminal, fixed terminal, or portable terminal, including a mobile handset, desktop computer, laptop computer, notebook computer, netbook computer, tablet computer, media computer, multimedia tablet, palmtop computer, portable gaming terminal, VR/AR device, personal communication system (Personal Communication System, PCS) device, personal navigation device, personal digital assistant (Personal Digital Assistant, PDA), audio/video player, digital camera/video camera, positioning device, television receiver, radio broadcast receiver, electronic book device, gaming device, or any combination of the preceding, including accessories and peripherals for these devices, or any combination thereof. In some embodiments, electronic device 110 is also capable of supporting any type of interface to the user (such as "wearable" circuitry, etc.).
The server 130 may be an independent physical server, a server cluster or distributed system formed by a plurality of physical servers, or a cloud server providing cloud services, cloud databases, cloud computing, cloud functions, cloud storage, network services, cloud communication, middleware services, domain name services, security services, content distribution networks, and basic cloud computing services such as big data and artificial intelligence platforms. Server 130 may include, for example, a computing system/server, such as a mainframe, an edge computing node, or a computing device in a cloud environment. The server 130 may provide background services for applications 120 in the electronic device 110 that support virtual scenes.
A communication connection may be established between server 130 and electronic device 110. The communication connection may be established by wired means or wireless means. The communication connection may include, but is not limited to, a bluetooth connection, a mobile network connection, a universal serial bus (Universal Serial Bus, USB) connection, a wireless fidelity (Wireless Fidelity, wiFi) connection, etc., as embodiments of the disclosure are not limited in this respect. In embodiments of the present disclosure, the server 130 and the electronic device 110 may implement signaling interactions through a communication connection therebetween.
It should be understood that the structure and function of the various elements in environment 100 are described for illustrative purposes only and are not meant to suggest any limitation as to the scope of the disclosure.
Some example embodiments of the present disclosure will be described below with continued reference to the accompanying drawings.
Example interactions
An example interaction procedure according to an embodiment of the present disclosure will be described below with reference to the accompanying drawings.
Fig. 2A-2D illustrate example interfaces 200A-200D according to some embodiments of the present disclosure. The interfaces 200A-200D may be provided by the electronic device 110 shown in fig. 1.
The following embodiments describe example processes of the present disclosure using expression elements as examples of interactive elements. Such expression elements may include, for example, but are not limited to: image expressions, voice expressions, kaomoji (text emoticons), and the like. In addition to expression elements, interactive elements may include, for example, character elements, audio elements, image elements, and the like.
As shown in fig. 2A, in interface 200A, electronic device 110 may provide an input field 206. In some examples, the input field 206 may be any suitable type of control for inputting information. The input field 206 may be provided by an appropriate application, such as an input method or the currently active application.
As shown, the electronic device 110 may display an expression panel 202, for example. The expression panel 202 may, for example, provide a collection of expressions 204 that are currently available to the user. As an example, the expression set 204 may include an expression that the user has added or a default expression of the application, e.g., emoji expression.
As shown in fig. 2A, the electronic device 110 may receive a user selection of an expression in the expression set 204 and may display the corresponding expression in the input field 206 accordingly. For example, the electronic device 110 may display a plurality of expressions, namely expression 208, expression 210, and expression 212, entered by the user via the expression set 204.
In some embodiments, the electronic device 110 may display at least one candidate expression. In some examples, the candidate expression, also referred to as a "fused expression," may be determined by combining a corresponding plurality of expressions, e.g., fused expression 214, fused expression 216, and fused expression 218. As an example, such fused expressions 214-218 may be determined based on a plurality of expressions (e.g., expression 210 and expression 212) of the expressions input by the user.
In some embodiments, a model may be utilized to generate a fused expression corresponding to the plurality of expressions. For example, the electronic device 110 and/or the server 130 may provide the model with expressions to be combined, e.g., expression 210 and expression 212.
Further, the electronic device 110 and/or the server 130 may obtain a new fused expression generated by the model based on the expression 210 and the expression 212, e.g., the fused expression 214.
In some embodiments, the electronic device 110 and/or the server 130 may also instruct the model to fuse multiple expressions to be combined according to a particular pattern. For example, the model may be instructed to generate a new fused expression by combining visual information of multiple expressions. For example, the model may be instructed to generate a new fused expression by combining the visual content of expression 210 and expression 212.
As an example, the model may be instructed to generate a new fused expression by fusing the semantic information of the expressions. For example, the model may be instructed to adjust the visual content of the expression 210 according to the semantic information of the expression 212, so that the new fused expression can embody the semantic information of both the expression 210 and the expression 212.
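As a rough illustration of instructing a model to fuse expressions according to a particular pattern, one might imagine an interface like the following. This is entirely hypothetical: the disclosure does not specify a model API, and `model`, `fuse_with_model`, and the instruction strings are illustrative stand-ins.

```python
def fuse_with_model(model, expressions, mode="visual"):
    """Ask a generative model to fuse expressions according to a mode.

    'visual' combines the expressions' visual content; 'semantic' adjusts
    one expression's visuals to also convey the others' meaning.
    `model` is any callable taking (expressions, instruction) -- a
    placeholder for whatever generation backend is actually used.
    """
    instructions = {
        "visual": "Combine the visual content of these expressions.",
        "semantic": "Adjust the first expression's visuals to also convey "
                    "the semantics of the others.",
    }
    return model(expressions, instructions[mode])
```

A caller would supply its own backend as `model`; the sketch only shows how a fusion mode could be mapped to an instruction.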
In some embodiments, the electronic device 110 and/or the server 130 may generate one or more fused expressions for a particular expression combination. For example, the fused expressions 214-218 may be generated based on the same expression combination (e.g., expression 210 and expression 212).
In some embodiments, the fused expression 214 through the fused expression 218 may be generated in real-time based on the expression 210 and the expression 212. Alternatively, the electronic device 110 and/or the server 130 may pre-construct a fused expression library corresponding to the plurality of expression combinations, and may determine one or more fused expressions corresponding to the current expression combination from the fused expression library, e.g., fused expression 214 to fused expression 218.
In some examples, the expression combinations may consider, for example, the order in which expressions were entered. For example, the resulting fused expression of "expression 210 and expression 212" may be different from the resulting fused expression of "expression 212 and expression 210". Alternatively, the types of expressions to be combined may be considered regardless of the order in which they are input.
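The pre-built library lookup and the order-sensitivity choice described above can be sketched as follows. This is a minimal illustration with hypothetical expression identifiers; the patent does not specify a data structure.

```python
# Pre-built library mapping expression combinations to fused expressions.
# Keys are tuples when input order matters, frozensets when it does not.
# All identifiers are hypothetical placeholders.

# Order-sensitive library: ("laugh", "cry") and ("cry", "laugh") differ.
ordered_library = {
    ("laugh", "cry"): ["laugh_cry_1", "laugh_cry_2"],
    ("cry", "laugh"): ["cry_laugh_1"],
}

# Order-insensitive library: one key serves both input orders.
unordered_library = {
    frozenset({"laugh", "cry"}): ["laugh_cry_any_order"],
}

def lookup_fused(expressions, order_sensitive=True):
    """Return candidate fused expressions for a combination, or []."""
    if order_sensitive:
        return ordered_library.get(tuple(expressions), [])
    return unordered_library.get(frozenset(expressions), [])
```

With `order_sensitive=True`, swapping the two input expressions can yield a different set of candidates; with `order_sensitive=False`, both orders map to the same entry.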
In some embodiments, the fused expressions 214-218 may also be expressions made by a designer, for example.
In some embodiments, the electronic device 110 may determine the expression 210 and the expression 212 to be combined from the input content in the input field 206. In some embodiments, in the case that the last input content is an expression, the electronic device 110 may determine the expression 210 and the expression 212 to be combined according to the input order of the expressions in the input content. For example, the electronic device 110 may determine the last consecutive two or three expressions in the input content as the expressions to be combined.
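The rule of taking the last two or three consecutive expressions from the input content might be sketched as follows, assuming the input content is tokenized so that expression tokens are distinguishable from characters. All names are illustrative; the disclosure does not prescribe an implementation.

```python
def trailing_expressions(tokens, is_expression, max_count=3):
    """Collect up to max_count consecutive expression tokens at the end
    of the input content. Returns [] if the input does not end with at
    least two consecutive expressions (a single expression cannot be
    combined)."""
    run = []
    for token in reversed(tokens):
        if not is_expression(token):
            break  # a non-expression token ends the trailing run
        run.append(token)
        if len(run) == max_count:
            break
    run.reverse()  # restore input order
    return run if len(run) >= 2 else []
```

For example, if the input content ends with a character, no combination candidates are produced, matching the behavior where providing fused expressions stops when the newest input is a character.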
In some embodiments, the electronic device 110 may also detect consecutive expressions in the input content that can be combined, regardless of whether such expressions are at the end of the input content. Taking fig. 2A as an example, in the case where it is determined that the expression 208 and the expression 210 can be combined, the electronic device 110 may provide a set of fused expressions determined based on the expression 208 and the expression 210.
In some embodiments, the electronic device 110 may also determine the expressions to be combined based on user operation. For example, the electronic device 110 may receive a user selection of the expressions 210 and 212 to determine that the expressions to be combined include the expressions 210 and 212.
In some embodiments, the electronic device 110 may also update the provided fusion expression in real-time according to changes in the input content in the input field 206. For example, in the case where the newly input content is a character, the electronic device 110 may stop providing the fusion expression, for example. As another example, in the event that an input of a new expression is received, the electronic device 110 may also provide a fused expression corresponding to the new expression combination.
For example, where the user entered the expressions 208 and 210, the electronic device 110 may provide a set of fused expressions determined based on the expressions 208 and 210. Further, in the event that the user further inputs expression 212, electronic device 110 may provide a set of fused expressions determined based on expression 210 and expression 212.
As another example, in the event that the user deletes expression 212, such as by deleting control 220, electronic device 110 can, for example, re-provide the set of fused expressions determined based on expression 208 and expression 210.
In some embodiments, to improve the efficiency with which the user uses fused expressions, the electronic device 110 may also display hint information based on the expressions that the user has entered, to help the user know which expressions can be combined with the entered expressions to obtain a fused expression. For example, in the event that the user inputs the expression 210, the electronic device 110 may prompt that the expression 212 can be combined with the expression 210 to obtain a corresponding fused expression.
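Such a hint could be derived from the same library of known combinations, e.g. by looking up which partners pair with an entered expression. A minimal sketch with hypothetical identifiers:

```python
# Hypothetical library of known-combinable expression pairs.
combinable_pairs = {("laugh", "cry"), ("smile", "heart"), ("cry", "laugh")}

def combinable_partners(expression):
    """Return the set of expressions that can be combined with the
    given expression, in either position of a pair."""
    partners = set()
    for first, second in combinable_pairs:
        if first == expression:
            partners.add(second)
        elif second == expression:
            partners.add(first)
    return partners
```

The device could then highlight the returned partners in the expression panel after the user enters an expression.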
In some embodiments, the electronic device 110 may also receive a user selection of a provided fused expression and may replace the corresponding plurality of expressions in the input field 206 with the selected fused expression. For example, taking fig. 2A and 2B as an example, in the case where the user selects the fused expression 214, the electronic device 110 may replace the expression 210 and the expression 212 in the input field 206 with the fused expression 214.
In this manner, embodiments of the present disclosure can support the user in combining existing expressions to obtain new expressions, thereby enriching the expressions available to the user and improving the expression interaction experience.
In some embodiments, electronic device 110 may also support further combinations of fused expressions 214, for example. For example, if the electronic device 110 determines that the expression 208 can be further combined with the fused expression 214, the electronic device 110 may also provide a new set of fused expressions determined based on the expression 208 and the fused expression 214, for example.
In some embodiments, the electronic device 110 may also send the acquired fusion expression 214. As an example, upon receiving a selection of the send portal 222, the electronic device 110 may send a message generated based on the fused expression 214. Taking fig. 2B as an example, the message may include, for example, an expression 208 and a fused expression 214.
In some embodiments, such messages may be sent to any suitable application or service. For example, such messages may be entered into any suitable application or service, such as a local document, an online document, an instant messaging service, a comment service, a private messaging service, and the like.
In some embodiments, taking fig. 2C as an example, such a message 224 may be sent to a live interface as a comment message.
In some embodiments, as shown in fig. 2D, the electronic device 110 may also update the historical expression list 226 based on the user's selection of the fused expression 214. Such a list of historical expressions 226 may display a set of historical fused expressions used by the user, each of which may be determined based on a corresponding plurality of expressions.
For example, after a fused expression 214 is selected, the fused expression 214 may be added to the historical expression list 226. In some embodiments, the historical expression list 226 may maintain a predetermined number of fused expressions, and when the number of historical fused expressions used by the user exceeds the predetermined number, the electronic device 110 may, for example, replace the fused expression that was used earliest in the list with a newly used fused expression.
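The bounded history list could behave like the following minimal sketch: a move-to-front list with a fixed capacity, where the earliest-used entry is evicted once the capacity is exceeded. The class name, capacity, and eviction details are assumptions based on the description above.

```python
class FusedExpressionHistory:
    """Keep the most recently used fused expressions, newest first,
    evicting the earliest-used entry once capacity is exceeded."""

    def __init__(self, capacity=8):
        self.capacity = capacity
        self.items = []  # index 0 = most recently used

    def record_use(self, fused_expression):
        if fused_expression in self.items:
            self.items.remove(fused_expression)  # re-used: move to front
        self.items.insert(0, fused_expression)
        if len(self.items) > self.capacity:
            self.items.pop()  # drop the earliest-used entry
```

Displaying `items` in order then yields the history list arranged by recency of use, as described for the expression panel.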
In some embodiments, the historical expression list 226 may be displayed in an expression panel, for example, and may be arranged according to the order of use of the fused expressions, for example.
In this manner, embodiments of the present disclosure make it convenient for the user to input fused expressions.
Although reference is made above to obtaining a fusion expression based on multiple expressions, it should be understood that embodiments of the present disclosure may also support fusion of other types of interactive elements. Such interactive elements may also include character elements, audio elements, image elements, etc., so that fusion of expression elements with other types of interactive elements may be supported.
For example, the electronic device 110 may obtain corresponding fused interactive elements, e.g., expression elements, character elements, audio elements, image elements, etc., based on entered characters and expressions. For example, the user may enter the character "cry" and further enter a laughing expression. Fusion can then be performed based on, for example, the semantic information of the character and the semantic and/or visual information of the expression, yielding, for example, a fused expression that conveys crying and laughing at the same time.
An example interaction process will be described with further reference to fig. 3A and 3B. Fig. 3A-3B illustrate example interfaces 300A and 300B according to some embodiments of the present disclosure. Interface 300A and interface 300B may be provided by electronic device 110 shown in fig. 1.
As shown in fig. 3A, interface 300A may correspond, for example, to an interface to which the messages discussed above are sent, e.g., a live interface. It should be appreciated that interface 300A may correspond to a client of a user (e.g., user a) sending message 310 or may correspond to a client of another user, such as another user of a live room.
As shown in fig. 3A, a message 310 may be generated and sent based on the process discussed above, and the message 310 may include a fused expression 320. Further, upon receiving a selection of the fused expression 320, the electronic device 110 may display a window 330 as shown in fig. 3B.
As shown in fig. 3B, the electronic device 110 may display target information 340 in the window 330 to indicate that the fused expression 320 is determined based on the two expressions shown. Further, the electronic device 110 may also provide a send portal 350 for sending the fused expression 320, and may input the fused expression 320 into a corresponding input component (e.g., a comment input component of a live room) based on a selection of the send portal 350.
Additionally, as shown in FIG. 3B, the electronic device 110 may also display one or more other fused expressions, such as fused expression 360-1 and fused expression 360-2, in the window 330. Such a fused expression 360-1 and fused expression 360-2 may be, for example, generated or shared by other users. Illustratively, the electronic device 110 may also present corresponding user information in association with the fused expression 360-1 or the fused expression 360-2.
Further, the electronic device 110 may also quickly input by selecting the fused expression 360-1 or the fused expression 360-2, for example.
In this manner, embodiments of the present disclosure also enable other users to view and use such fused expressions, thereby improving expression interaction efficiency.
Similarly, other types of converged interactive elements may also support similar interactive processes, for example, to support users viewing the converged process of such converged interactive elements, and to quickly share such converged interactive elements.
In some embodiments, embodiments of the present disclosure may also support a user to obtain a fused interactive element in other suitable ways. Fig. 4A-4D illustrate example interfaces 400A-400D according to some embodiments of the present disclosure. Interfaces 400A-400D may be provided by electronic device 110 shown in fig. 1. The example interaction process continues with the expressive element as an example of an interaction element.
As shown in fig. 4A, the electronic device 110 may present an expression panel 405 at an interface 400A, such expression panel 405 may include a set of expressions 410 available to a user.
Further, as shown in fig. 4B, the electronic device 110 may receive a user operation on the expression 420 and may present an expression edit window 415. For example, the electronic device 110 may receive a long press operation of the expression 420 by the user and display the expression edit window 415 accordingly.
Further, the electronic device 110 may move the expression 420 from the initial position into the expression editing window 415 based on a drag operation of the user on the expression 420. Accordingly, the electronic device 110 may display the selected expression 420 in the expression editing window.
Further, the electronic device 110 may also receive a user selection of one or more other expressions. For example, the electronic device 110 may receive a user's operation of dragging the expression 425 to the expression editing window 415, and may display the expression 425 in the expression editing window 415 accordingly, thereby displaying the expression 420 and the expression 425 simultaneously.
In some embodiments, as shown in fig. 4D, the electronic device 110 may display a fused expression 430 determined based on the expression 420 and the expression 425 in the expression editing window 415.
In some embodiments, the electronic device 110 may automatically display the fused expression 430, for example, after a predetermined period of time elapses without receiving further user input. As another example, the electronic device 110 may provide the fused expression 430 determined based on the multiple expressions added in the expression editing window 415 in response to a user request.
In some examples, before the fused expression 430 is provided, the electronic device 110 may also receive a user operation on other expressions and further add them to the expression editing window 415 to obtain a corresponding fused expression.
In some examples, the electronic device 110 may also support further combining the fused expression 430 with other expressions. For example, the electronic device 110 may support a user operation on other expressions to obtain a new fused expression corresponding to the fused expression 430 and the selected other expressions.
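The add-then-fuse-then-recombine behavior of the editing window described above can be sketched as follows. This is a minimal illustrative sketch: `ExpressionEditor` and the `fuse` callback are assumed names, not identifiers from the disclosure, and string concatenation stands in for the real fusion backend.

```python
# Illustrative sketch only: `ExpressionEditor` and `fuse` are hypothetical
# names; string concatenation stands in for the real fusion backend.

class ExpressionEditor:
    """Collects expressions dragged into the editing window and fuses them."""

    def __init__(self, fuse):
        self._fuse = fuse      # pluggable fusion backend (e.g., a model call)
        self.expressions = []  # expressions currently shown in the window

    def add(self, expression):
        """Corresponds to dragging an expression into the window."""
        self.expressions.append(expression)

    def fused(self):
        """Return the fused expression once at least two have been added."""
        if len(self.expressions) < 2:
            return None        # nothing to fuse yet
        return self._fuse(self.expressions)


toy_fuse = lambda exprs: "+".join(exprs)

editor = ExpressionEditor(fuse=toy_fuse)
editor.add("smile")
editor.add("fire")
result = editor.fused()  # "smile+fire"

# A fused result can itself be combined further, as described above.
editor2 = ExpressionEditor(fuse=toy_fuse)
editor2.add(result)
editor2.add("star")
new_result = editor2.fused()  # "smile+fire+star"
```

Feeding the fused result back in as an ordinary expression mirrors the re-combination behavior described above.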
In some embodiments, as shown in fig. 4D, the electronic device 110 may also provide a send portal in association with the fused expression 430. Further, upon receiving a selection of the send portal, the electronic device 110 may input the fused expression 430 into an associated target component, e.g., a comment input component. Alternatively, the electronic device 110 may send the fused expression directly, for example, as a comment message in a live interface.
Accordingly, the electronic device 110 may stop displaying the expression editing window 415 based on the selection of the send portal or a cancel portal.
In some embodiments, the interactive elements used for fusion may also include, for example, image elements, audio elements, character elements, and the like. The electronic device 110 may support fusion between interactive elements of the same type, or may also support fusion between interactive elements of different types. For example, the user may obtain a new fused expression or a new fused character by combining one expression with one or more characters through a drag operation.
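As an illustration of same-type versus cross-type fusion, the compatibility check below is a sketch under assumed rules; the disclosure does not specify which type pairs are fusable, so both the `can_fuse` name and the rule set are hypothetical.

```python
# Hypothetical compatibility rule: every type fuses with itself; the
# SAME_TYPE_ONLY set marks types assumed to fuse only with their own kind.
SAME_TYPE_ONLY = {"audio"}

def can_fuse(type_a, type_b):
    """Return True if two interactive element types may be fused."""
    if type_a == type_b:
        return True
    return type_a not in SAME_TYPE_ONLY and type_b not in SAME_TYPE_ONLY

same_type = can_fuse("expression", "expression")   # same-type fusion
cross_type = can_fuse("expression", "character")   # cross-type fusion
```

A real implementation might instead query the fusion model itself for combinability, as the embodiments describing the "second set of interactive elements" suggest.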
In this manner, embodiments of the present disclosure can support users in more freely exploring fusion between interactive elements, thereby enhancing the interest of the interaction.
Example procedure
Fig. 5 illustrates a flowchart of an example interaction process 500 according to some embodiments of the present disclosure. The process 500 may be implemented at the electronic device 110. Process 500 is described below with reference to fig. 1.
As shown, at block 510, the electronic device 110 displays a plurality of interactive elements entered by a user in an input field.
At block 520, the electronic device 110 provides at least one candidate interactive element, the at least one candidate interactive element being determined based on a set of interactive elements of the plurality of interactive elements, the set of interactive elements including at least an expression element.
At block 530, the electronic device 110 replaces a set of interactive elements of the plurality of interactive elements with a target interactive element based on a selection of the target interactive element from the at least one candidate interactive element.
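The replacement step of block 530 can be sketched as follows. This is an illustrative sketch: `replace_with_target` is an assumed name, and a contiguous run of elements stands in for the set of interactive elements being fused.

```python
def replace_with_target(input_elements, fusable_run, target):
    """Replace a contiguous run of fusable elements with the selected
    target interactive element, preserving the order of everything else."""
    if not fusable_run:
        return list(input_elements)  # nothing to replace
    out, i, n = [], 0, len(fusable_run)
    while i < len(input_elements):
        if input_elements[i:i + n] == fusable_run:
            out.append(target)  # emit the target once for the whole run
            i += n
        else:
            out.append(input_elements[i])
            i += 1
    return out

field = ["hi", "😀", "🔥", "!"]
new_field = replace_with_target(field, ["😀", "🔥"], "😀🔥")
# new_field == ["hi", "😀🔥", "!"]
```

The surrounding elements in the input field keep their positions; only the fused-away run collapses into the single target element.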
In some embodiments, the set of interactive elements includes at least one of: a first set of interactive elements determined based on a display order of the plurality of interactive elements; a second set of interactive elements, of the plurality of interactive elements, determined to be capable of being combined to generate a fused interactive element; or a third set of interactive elements indicated by a user operation.
In some embodiments, the at least one candidate interactive element is generated based on the following process: providing the set of interactive elements to a model; and obtaining, as the candidate interactive element, a combined interactive element generated by the model based on the set of interactive elements.
In some embodiments, the model generates the combined interactive element based on visual and/or semantic information of the set of interactive elements.
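The model-based candidate generation can be sketched as below. The feature dictionary and the callable `model` interface are assumptions: the disclosure only states that visual and/or semantic information of the set is used, without defining a model API.

```python
# Sketch with assumed names: the disclosure does not define the model API.

def candidate_interactive_elements(elements, model):
    """Gather visual/semantic features of the set and ask the model for
    combined interactive elements to offer as candidates."""
    features = {
        "visual": [e.get("image") for e in elements],     # e.g. bitmaps
        "semantic": [e.get("meaning") for e in elements], # e.g. text labels
    }
    return model(features)

# Stand-in model: joins the semantic labels into one fused label.
toy_model = lambda f: ["-".join(f["semantic"])]

candidates = candidate_interactive_elements(
    [{"image": None, "meaning": "laugh"},
     {"image": None, "meaning": "cry"}],
    toy_model,
)
# candidates == ["laugh-cry"]
```

In practice the model could return several candidates, from which the user selects the target interactive element as in block 530.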
In some embodiments, the at least one candidate interactive element is a first set of candidate interactive elements, and the process 500 further comprises: in response to the target interactive element, after the replacement, being combinable with at least one other interactive element of the plurality of interactive elements, providing a second set of candidate interactive elements, the second set of candidate interactive elements being determined based on the target interactive element and the at least one other interactive element.
In some embodiments, process 500 further comprises: sending a message generated based on the target interactive element.
In some embodiments, sending the message generated based on the target interactive element includes: sending, in a live interface, a comment message generated based on the target interactive element.
In some embodiments, the message is displayed at a client of a target user, the client being configured to: present target information based on a predetermined operation of the target user on the target interactive element in the message, wherein the target interactive element is determined based on the set of interactive elements.
In some embodiments, the client is further configured to: present a first send portal; and input the target interactive element in an input component of the client based on a selection of the first send portal.
In some embodiments, process 500 further comprises: updating a historical interactive element list based on the selection of the target interactive element, the historical interactive element list including at least one historical fused interactive element used by the current user, wherein the at least one historical fused interactive element is determined based on a corresponding plurality of interactive elements.
In some embodiments, process 500 further comprises: displaying the historical interactive element list in an interactive element panel.
In some embodiments, process 500 further comprises: inputting a target fused interactive element into the input field based on a selection of the target fused interactive element from the at least one historical fused interactive element.
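The history-list maintenance in the embodiments above can be sketched as a most-recently-used list. The `update_history` name and the length cap are assumptions; the disclosure does not specify an ordering or size policy.

```python
def update_history(history, fused_element, max_len=30):
    """Move the selected fused element to the front of the history list,
    removing any older duplicate and capping the list length."""
    history = [h for h in history if h != fused_element]  # drop duplicate
    history.insert(0, fused_element)                      # most recent first
    return history[:max_len]

h = update_history(["a+b", "c+d"], "c+d")
# h == ["c+d", "a+b"]
```

Displaying this list in the interactive element panel then lets the user re-input a previously fused element with one selection.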
Fig. 6 illustrates a flowchart of an example interaction process 600 according to some embodiments of the present disclosure. The process 600 may be implemented at the electronic device 110. Process 600 is described below with reference to fig. 1.
As shown in fig. 6, at block 610, the electronic device 110 presents an editing window based on a first selection of a first interactive element.
At block 620, the electronic device 110 presents the selected first interactive element in the editing window.
At block 630, the electronic device 110 presents a fused interactive element in the editing window based on the second selection of the at least one second interactive element, the fused interactive element being determined based on the first interactive element and the at least one second interactive element.
In some embodiments, the process 600 further comprises: inputting the fused interactive element into a target component, or sending the fused interactive element, based on a selection of a second send portal in the editing window.
In some embodiments, presenting the selected first interactive element in the editing window includes: in response to the first interactive element being moved from the first position into the editing window, the selected first interactive element is presented in the editing window.
In some embodiments, the second selection comprises: moving the at least one second interactive element from the second position into the editing window such that the editing window displays the first interactive element and the at least one second interactive element.
In some embodiments, the first interactive element, the second interactive element, or the fused interactive element comprises at least one of: expression elements, character elements, image elements, audio elements.
Example apparatus and device
Embodiments of the present disclosure also provide corresponding apparatus for implementing the above-described methods or processes. Fig. 7 illustrates a schematic block diagram of an example interaction device 700, according to some embodiments of the disclosure. The apparatus 700 may be implemented as or included in the electronic device 110. The various modules/components in apparatus 700 may be implemented in hardware, software, firmware, or any combination thereof.
As shown in fig. 7, the apparatus 700 includes a display module 710 configured to display a plurality of interactive elements input by a user in an input field; a providing module 720 configured to provide at least one candidate interactive element, the at least one candidate interactive element being determined based on a set of interactive elements of the plurality of interactive elements, the set of interactive elements including at least an expression element; and a replacement module 730 configured to replace the set of interactive elements of the plurality of interactive elements with a target interactive element based on a selection of the target interactive element from the at least one candidate interactive element.
In some embodiments, the set of interactive elements includes at least one of: a first set of interactive elements determined based on a display order of the plurality of interactive elements; a second set of interactive elements, of the plurality of interactive elements, determined to be capable of being combined to generate a fused interactive element; or a third set of interactive elements indicated by a user operation.
In some embodiments, the at least one candidate interactive element is generated based on the following process: providing the set of interactive elements to a model; and obtaining, as the candidate interactive element, a combined interactive element generated by the model based on the set of interactive elements.
In some embodiments, the model generates the combined interactive element based on visual and/or semantic information of the set of interactive elements.
In some embodiments, the at least one candidate interactive element is a first set of candidate interactive elements, and the providing module 720 is further configured to: in response to the target interactive element, after the replacement, being combinable with at least one other interactive element of the plurality of interactive elements, provide a second set of candidate interactive elements, the second set of candidate interactive elements being generated based on the target interactive element and the at least one other interactive element.
In some embodiments, the apparatus 700 further comprises a sending module configured to: send a message generated based on the target interactive element.
In some embodiments, the sending module is further configured to: send, in a live interface, a comment message generated based on the target interactive element.
In some embodiments, the message is displayed at a client of a target user, the client being configured to: present target information based on a predetermined operation of the target user on the target interactive element in the message, wherein the target interactive element is determined based on the set of interactive elements.
In some embodiments, the client is further configured to: present a first send portal; and input the target interactive element in an input component of the client based on a selection of the first send portal.
In some embodiments, the apparatus 700 further comprises an update module configured to: update a historical interactive element list based on the selection of the target interactive element, the historical interactive element list including at least one historical fused interactive element used by the current user, wherein the at least one historical fused interactive element is determined based on a corresponding plurality of interactive elements.
In some embodiments, the apparatus 700 further comprises an interactive element viewing module configured to: display the historical interactive element list in an interactive element panel.
In some embodiments, the apparatus 700 further comprises an interactive element input module configured to: input a target fused interactive element into the input field based on a selection of the target fused interactive element from the at least one historical fused interactive element.
Fig. 8 illustrates a block diagram of an electronic device 800 in which one or more embodiments of the disclosure may be implemented. It should be understood that the electronic device 800 illustrated in fig. 8 is merely exemplary and should not be construed as limiting the functionality and scope of the embodiments described herein. The electronic device 800 illustrated in fig. 8 may be used to implement the electronic device 110 of fig. 1.
As shown in fig. 8, the electronic device 800 is in the form of a general-purpose electronic device. Components of electronic device 800 may include, but are not limited to, one or more processors or processing units 810, memory 820, storage device 830, one or more communication units 840, one or more input devices 850, and one or more output devices 860. The processing unit 810 may be a real or virtual processor and is capable of performing various processes according to programs stored in the memory 820. In a multiprocessor system, multiple processing units execute computer-executable instructions in parallel to increase the parallel processing capabilities of electronic device 800.
Electronic device 800 typically includes multiple computer storage media. Such a medium may be any available media that is accessible by electronic device 800, including, but not limited to, volatile and non-volatile media, removable and non-removable media. The memory 820 may be volatile memory (e.g., registers, cache, random Access Memory (RAM)), non-volatile memory (e.g., read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), flash memory), or some combination thereof. Storage device 830 may be a removable or non-removable medium and may include machine-readable media such as flash drives, magnetic disks, or any other medium that may be used to store information and/or data and that may be accessed within electronic device 800.
The electronic device 800 may further include additional removable/non-removable, volatile/nonvolatile storage media. Although not shown in fig. 8, a magnetic disk drive for reading from or writing to a removable, nonvolatile magnetic disk (e.g., a "floppy disk") and an optical disk drive for reading from or writing to a removable, nonvolatile optical disk may be provided. In these cases, each drive may be connected to a bus (not shown) by one or more data medium interfaces. Memory 820 may include a computer program product 825 having one or more program modules configured to perform the various methods or acts of the various embodiments of the present disclosure.
The communication unit 840 enables communication with other electronic devices through a communication medium. Additionally, the functionality of the components of the electronic device 800 may be implemented in a single computing cluster or in multiple computing machines capable of communicating over a communications connection. Thus, the electronic device 800 may operate in a networked environment using logical connections to one or more other servers, a network Personal Computer (PC), or another network node.
The input device 850 may be one or more input devices such as a mouse, keyboard, trackball, etc. The output device 860 may be one or more output devices such as a display, speakers, printer, etc. The electronic device 800 may also communicate with one or more external devices (not shown), such as storage devices, display devices, etc., with one or more devices that enable a user to interact with the electronic device 800, or with any device (e.g., network card, modem, etc.) that enables the electronic device 800 to communicate with one or more other electronic devices, as desired, via the communication unit 840. Such communication may be performed via an input/output (I/O) interface (not shown).
According to an exemplary implementation of the present disclosure, a computer-readable storage medium having stored thereon computer-executable instructions, wherein the computer-executable instructions are executed by a processor to implement the method described above is provided. According to an exemplary implementation of the present disclosure, there is also provided a computer program product tangibly stored on a non-transitory computer-readable medium and comprising computer-executable instructions that are executed by a processor to implement the method described above.
Various aspects of the present disclosure are described herein with reference to flowchart illustrations and/or block diagrams of methods, apparatus, devices, and computer program products implemented according to the disclosure. It will be understood that each block of the flowchart illustrations and/or block diagrams, and combinations of blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer-readable program instructions.
These computer readable program instructions may be provided to a processing unit of a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processing unit of the computer or other programmable data processing apparatus, create means for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks. These computer readable program instructions may also be stored in a computer readable storage medium that can direct a computer, programmable data processing apparatus, and/or other devices to function in a particular manner, such that the computer readable medium having the instructions stored therein includes an article of manufacture including instructions which implement the function/act specified in the flowchart and/or block diagram block or blocks.
The computer readable program instructions may be loaded onto a computer, other programmable data processing apparatus, or other devices to cause a series of operational steps to be performed on the computer, other programmable apparatus or other devices to produce a computer implemented process such that the instructions which execute on the computer, other programmable apparatus or other devices implement the functions/acts specified in the flowchart and/or block diagram block or blocks.
The flowcharts and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various implementations of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of instructions, which comprises one or more executable instructions for implementing the specified logical function(s). In some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The foregoing description of implementations of the present disclosure has been provided for illustrative purposes, is not exhaustive, and is not limited to the implementations disclosed. Many modifications and variations will be apparent to those of ordinary skill in the art without departing from the scope and spirit of the various implementations described. The terminology used herein was chosen in order to best explain the principles of each implementation, the practical application, or the improvement of technology in the marketplace, or to enable others of ordinary skill in the art to understand each implementation disclosed herein.

Claims (20)

1. An interaction method, comprising:
displaying a plurality of interactive elements input by a user in an input field;
providing at least one candidate interactive element, the at least one candidate interactive element being determined based on a set of interactive elements of the plurality of interactive elements, the set of interactive elements including at least an expression element; and
based on a selection of a target interactive element of the at least one candidate interactive element, replacing the set of interactive elements of the plurality of interactive elements with the target interactive element.
2. The method of claim 1, wherein the set of interactive elements comprises at least one of:
a first set of interactive elements determined based on a display order of the plurality of interactive elements;
a second set of interactive elements, of the plurality of interactive elements, determined to be capable of being combined to generate a fused interactive element; or
a third set of interactive elements indicated by a user operation.
3. The method of claim 1, wherein the at least one candidate interactive element is generated based on:
providing the set of interactive elements to a model; and
obtaining, as the candidate interactive element, a combined interactive element generated by the model based on the set of interactive elements.
4. The method of claim 3, wherein the model generates the combined interactive element based on visual and/or semantic information of the set of interactive elements.
5. The method of claim 1, wherein the at least one candidate interactive element is a first set of candidate interactive elements, the method further comprising:
providing a second set of candidate interactive elements in response to the target interactive element, after the replacement, being combinable with at least one other interactive element of the plurality of interactive elements, the second set of candidate interactive elements being generated based on the target interactive element and the at least one other interactive element.
6. The method of claim 1, further comprising:
and sending a message generated based on the target interactive element.
7. The method of claim 6, wherein sending a message generated based on the target interactive element comprises:
sending, in a live interface, a comment message generated based on the target interactive element.
8. The method of claim 7, wherein the message is displayed at a client of a target user, the client configured to:
presenting target information based on a predetermined operation of the target user on the target interactive element in the message, wherein the target interactive element is determined based on the set of interactive elements.
9. The method of claim 8, wherein the client is further configured to:
presenting a first send portal; and
inputting the target interactive element in an input component of the client based on the selection of the first send portal.
10. The method of claim 1, further comprising:
updating a historical interactive element list based on the selection of the target interactive element, the historical interactive element list including at least one historical fused interactive element used by the current user, wherein the at least one historical fused interactive element is determined based on a corresponding plurality of interactive elements.
11. The method of claim 10, further comprising:
displaying the historical interactive element list in an interactive element panel.
12. The method of claim 10, further comprising:
inputting a target fused interactive element into the input field based on a selection of the target fused interactive element from the at least one historical fused interactive element.
13. An interaction method, comprising:
presenting an editing window based on the first selection of the first interactive element;
presenting the selected first interactive element in the editing window; and
based on a second selection of at least one second interactive element, a fused interactive element is presented in the editing window, the fused interactive element being determined based on the first interactive element and the at least one second interactive element.
14. The method of claim 13, further comprising:
inputting the fused interactive element into a target component, or sending the fused interactive element, based on a selection of a second send portal in the editing window.
15. The method of claim 13, wherein presenting the selected first interactive element in the editing window comprises:
in response to the first interactive element being moved from a first position into the editing window, presenting the selected first interactive element in the editing window.
16. The method of claim 13, wherein the second selecting comprises:
moving the at least one second interactive element from a second position into the editing window, such that the editing window displays the first interactive element and the at least one second interactive element.
17. The method of claim 13, wherein the first interactive element, the second interactive element, or the fused interactive element comprises at least one of:
expression elements, character elements, image elements, audio elements.
18. An interaction device, comprising:
a display module configured to display a plurality of interactive elements input by a user in an input field;
a providing module configured to provide at least one candidate interactive element, the at least one candidate interactive element being determined based on a set of interactive elements of the plurality of interactive elements, the set of interactive elements including at least an expression element; and
a replacement module configured to replace the set of interactive elements of the plurality of interactive elements with a target interactive element based on a selection of the target interactive element from the at least one candidate interactive element.
19. An electronic device, comprising:
at least one processing unit; and
at least one memory coupled to the at least one processing unit and storing instructions for execution by the at least one processing unit, which when executed by the at least one processing unit, cause the electronic device to perform the method of any one of claims 1 to 12 or 13 to 17.
20. A computer readable storage medium having stored thereon a computer program executable by a processor to implement the method of any of claims 1 to 12 or 13 to 17.
CN202410031127.3A 2024-01-08 2024-01-08 Interaction method, device, equipment and storage medium Pending CN117850946A (en)

Publications (1)

Publication Number Publication Date
CN117850946A 2024-04-09



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination