CN111580724A - Information interaction method, equipment and storage medium - Google Patents

Information interaction method, equipment and storage medium

Info

Publication number
CN111580724A
Authority
CN
China
Prior art keywords
interaction
information
interactive
interface
target
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202010598558.XA
Other languages
Chinese (zh)
Other versions
CN111580724B (en)
Inventor
张艳军
林晓鑫
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Tencent Technology Shenzhen Co Ltd
Original Assignee
Tencent Technology Shenzhen Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tencent Technology Shenzhen Co Ltd filed Critical Tencent Technology Shenzhen Co Ltd
Priority to CN202010598558.XA
Publication of CN111580724A
Application granted
Publication of CN111580724B
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F 3/0482 Interaction with lists of selectable items, e.g. menus
    • G06F 3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F 3/04847 Interaction techniques to control parameter settings, e.g. interaction with sliders or dials
    • G06F 3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q 50/00 Information and communication technology [ICT] specially adapted for implementation of business processes of specific business sectors, e.g. utilities or tourism
    • G06Q 50/01 Social networking

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Business, Economics & Management (AREA)
  • Health & Medical Sciences (AREA)
  • Computing Systems (AREA)
  • Economics (AREA)
  • General Health & Medical Sciences (AREA)
  • Human Resources & Organizations (AREA)
  • Marketing (AREA)
  • Primary Health Care (AREA)
  • Strategic Management (AREA)
  • Tourism & Hospitality (AREA)
  • General Business, Economics & Management (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

An embodiment of the application provides an information interaction method, a device, and a storage medium. The method includes: receiving an interaction trigger operation applied to an interaction trigger control corresponding to a target object on a first interaction interface, and displaying an interaction trigger interface in response to the interaction trigger operation; receiving an editing operation applied to an editing control on the interaction trigger interface, and sending first interaction information to the target object in response to the editing operation; receiving interaction confirmation information sent by the target object for the first interaction information, and displaying a second interaction interface based on the interaction confirmation information; on the second interaction interface, receiving a first interaction operation applied to a first interaction control, and sending second interaction information to the target object in response to the first interaction operation; and displaying a target interaction result corresponding to the target object on a result display interface based on the second interaction information. The embodiments of the application can improve the diversity of interaction modes.

Description

Information interaction method, equipment and storage medium
Technical Field
The present application relates to information processing technology in the field of computer applications, and in particular to an information interaction method, a device, and a storage medium.
Background
With the development of information technology, information interaction has become widely used. Generally, to increase users' interest in and enthusiasm for information interaction, social platforms provide multiple interaction modes, of which anonymous interaction is one.
Generally, when anonymous interaction is implemented, interaction information is sent anonymously by a sending object to a receiving object, which then receives the interaction information. In this process, however, only the sending object sends interaction information anonymously, and the interaction mode is single.
Disclosure of Invention
The embodiments of the application provide an information interaction method, an information interaction device, and a storage medium, which can improve the diversity of interaction modes.
The technical scheme of the embodiment of the application is realized as follows:
An embodiment of the application provides an information interaction method, which includes:
receiving an interaction trigger operation applied to an interaction trigger control corresponding to a target object on a first interaction interface, and displaying an interaction trigger interface in response to the interaction trigger operation;
receiving an editing operation applied to an editing control on the interaction trigger interface, and sending first interaction information to the target object in response to the editing operation;
receiving interaction confirmation information sent by the target object for the first interaction information, and displaying a second interaction interface based on the interaction confirmation information;
receiving a first interaction operation applied to a first interaction control on the second interaction interface, and sending second interaction information to the target object in response to the first interaction operation;
and displaying a target interaction result corresponding to the target object on a result display interface based on the second interaction information, thereby completing the interaction with the target object.
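The five sender-side steps above can be illustrated as a small event-driven sketch. This is not an implementation from the patent: every class, method, and interface name below is a hypothetical stand-in for the interfaces, controls, and messages recited in the claims.

```python
# Hypothetical sketch of the sender-side flow; all names are illustrative
# stand-ins for the interfaces, controls, and messages described above.

class SenderTerminal:
    def __init__(self, network):
        self.network = network                      # stand-in for the server link
        self.interface = "first_interaction_interface"
        self.target = None

    def on_interaction_trigger(self, target_object):
        # The interaction trigger control for a target object was operated:
        # display the interaction trigger interface.
        self.target = target_object
        self.interface = "interaction_trigger_interface"

    def on_edit(self, first_interaction_info):
        # An editing operation on the editing control: send the first
        # interaction information to the target object.
        self.network.send(self.target, first_interaction_info)

    def on_confirmation(self, confirmation_info):
        # The target object confirmed: display the second interaction interface.
        self.interface = "second_interaction_interface"

    def on_first_interaction(self, second_interaction_info):
        # A first interaction operation on the first interaction control:
        # send the second interaction information to the target object.
        self.network.send(self.target, second_interaction_info)

    def on_result(self, target_result):
        # Display the target interaction result on the result display
        # interface, completing the interaction.
        self.interface = "result_display_interface"
        return f"{self.target}: {target_result}"
```

In this sketch each claim step is one callback, and the interface transitions mirror the order in which the claim recites them; the actual division into interfaces and controls would depend on the application.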
An embodiment of the present application further provides an information interaction method, including:
receiving first interaction information sent by a sending object, and displaying the first interaction information on a fourth interaction interface;
for the displayed first interaction information, receiving an interaction confirmation operation applied to an interaction confirmation control, and, in response to the interaction confirmation operation, acquiring interaction confirmation information and sending the interaction confirmation information to the sending object;
displaying a fifth interaction interface based on the interaction confirmation information;
receiving second interaction information sent by the sending object, and displaying the second interaction information on the fifth interaction interface;
and displaying a target interaction result corresponding to the sending object on a result display interface based on the second interaction information, thereby completing the interaction with the sending object.
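The receiving-side counterpart can be sketched in the same illustrative style; again, every name below is an assumption rather than terminology from the patent.

```python
# Hypothetical sketch of the receiving-side flow; all names are illustrative.

class ReceiverTerminal:
    def __init__(self, network):
        self.network = network          # stand-in for the server link
        self.interface = None
        self.pending = None

    def on_first_interaction_info(self, sender, info):
        # First interaction information arrives: display it on the
        # fourth interaction interface.
        self.interface = "fourth_interaction_interface"
        self.pending = (sender, info)

    def on_confirm(self):
        # The interaction confirmation control was operated: acquire the
        # confirmation information, send it back to the sending object,
        # and display the fifth interaction interface.
        sender, info = self.pending
        confirmation = ("confirmed", info)
        self.network.send(sender, confirmation)
        self.interface = "fifth_interaction_interface"
        return confirmation

    def on_second_interaction_info(self, info):
        # Second interaction information is shown on the fifth interface.
        self.shown = info

    def on_result(self, target_result):
        # Display the target interaction result, completing the interaction.
        self.interface = "result_display_interface"
        return target_result
```

Note that the receiver only transitions to the fifth interface after it has actively confirmed, which is the point the claim uses to distinguish this flow from one-way anonymous messaging.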
An embodiment of the application provides an information interaction method, which includes:
receiving first interaction information sent by a sending object to a target object, and sending the first interaction information to the target object;
receiving interaction confirmation information sent by the target object for the first interaction information, and sending the interaction confirmation information to the sending object;
receiving second interaction information sent by the sending object for the interaction confirmation information, and sending the second interaction information to the target object;
determining a target interaction result of the sending object and the target object based on the second interaction information;
and sending the target interaction result to the sending object and the target object.
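The service-side flow above is a relay plus a result computation. The sketch below is a hypothetical illustration only; in particular, the rule used to determine the target interaction result is a placeholder, since the claim leaves that computation abstract.

```python
# Hypothetical sketch of the service-side relay; names are illustrative and
# the result rule is a placeholder for whatever the application defines.

class InteractionServer:
    def __init__(self):
        self.pushed = []   # (recipient, message) pairs, stand-in for pushes

    def relay_first_info(self, sender, target, info):
        # Forward the first interaction information to the target object.
        self.pushed.append((target, ("first", sender, info)))

    def relay_confirmation(self, sender, target, confirmation):
        # Forward the target object's confirmation back to the sending object.
        self.pushed.append((sender, ("confirmation", target, confirmation)))

    def relay_second_info(self, sender, target, info):
        # Forward the second interaction information, determine the target
        # interaction result from it, and push the result to both parties.
        self.pushed.append((target, ("second", sender, info)))
        result = self._determine_result(info)
        self.pushed.append((sender, ("result", result)))
        self.pushed.append((target, ("result", result)))
        return result

    def _determine_result(self, second_info):
        # Placeholder rule: merely record what the result is based on.
        return {"based_on": second_info}
```

The key structural point the claim makes is visible here: the server, not either terminal, determines the target interaction result and distributes it symmetrically to both parties.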
An embodiment of the application provides a first information interaction device, including:
an interaction trigger module, configured to receive an interaction trigger operation applied to an interaction trigger control corresponding to the target object on the first interaction interface, and display the interaction trigger interface in response to the interaction trigger operation;
the interaction trigger module being further configured to receive an editing operation applied to an editing control on the interaction trigger interface, and send first interaction information to the target object in response to the editing operation;
an interaction module, configured to receive interaction confirmation information sent by the target object for the first interaction information, and display a second interaction interface based on the interaction confirmation information;
the interaction module being further configured to receive a first interaction operation applied to a first interaction control on the second interaction interface, and send second interaction information to the target object in response to the first interaction operation;
and a first display module, configured to display a target interaction result corresponding to the target object on a result display interface based on the second interaction information, thereby completing the interaction with the target object.
An embodiment of the application provides a second information interaction device, including:
an information display module, configured to receive first interaction information sent by a sending object and display the first interaction information on a fourth interaction interface;
an information sending module, configured to, for the displayed first interaction information, receive an interaction confirmation operation applied to an interaction confirmation control, and, in response to the interaction confirmation operation, acquire interaction confirmation information and send the interaction confirmation information to the sending object;
an interaction starting module, configured to display a fifth interaction interface based on the interaction confirmation information;
an information receiving module, configured to receive second interaction information sent by the sending object and display the second interaction information on the fifth interaction interface;
and a second display module, configured to display a target interaction result corresponding to the sending object on a result display interface based on the second interaction information, thereby completing the interaction with the sending object.
An embodiment of the present application provides a service apparatus, including:
the information pushing module is used for receiving first interaction information sent to a target object by a sending object and sending the first interaction information to the target object;
the information pushing module is further configured to receive interaction confirmation information sent by the target object for the first interaction information, and send the interaction confirmation information to the sending object;
the information processing module is used for receiving second interaction information sent by the sending object aiming at the interaction confirmation information and sending the second interaction information to the target object;
the information processing module is further configured to determine a target interaction result between the sending object and the target object based on the second interaction information;
the information processing module is further configured to send the target interaction result to the sending object and the target object.
An embodiment of the present application provides a first information interaction device, including:
a first memory for storing executable instructions; and
a first processor configured to, when executing the executable instructions stored in the first memory, implement the information interaction method applied to the first information interaction device provided in the embodiments of the present application.
An embodiment of the present application provides a second information interaction device, including:
a second memory for storing executable instructions; and
a second processor configured to, when executing the executable instructions stored in the second memory, implement the information interaction method applied to the second information interaction device provided in the embodiments of the present application.
An embodiment of the present application provides a service device, including:
a third memory for storing executable instructions; and
a third processor configured to, when executing the executable instructions stored in the third memory, implement the information interaction method applied to the service device provided in the embodiments of the present application.
An embodiment of the application provides a computer-readable storage medium storing executable instructions which, when executed by the first processor, implement the information interaction method applied to the first information interaction device; or, when executed by the second processor, implement the information interaction method applied to the second information interaction device; or, when executed by the third processor, implement the information interaction method applied to the service device.
The embodiments of the application have the following beneficial effects: the sending object can send the first interaction information to the target object to trigger an interaction, and can continue to interact with the target object after the interaction is triggered, so that a final target interaction result corresponding to the target object is obtained and displayed. In other words, the interaction between the sending object and the target object is realized through multiple rounds of interaction, and the corresponding target interaction result can also be visualized; an interaction mode that combines multi-round interaction with a visible interaction result is thereby realized, which can improve the diversity of interaction modes.
Drawings
FIG. 1 is a diagram of an exemplary anonymous interaction mode;
FIG. 2 is an alternative architecture diagram of an information interaction system provided by an embodiment of the present application;
FIG. 3 is a schematic structural diagram of a terminal in FIG. 2 provided by an embodiment of the present application;
FIG. 4a is a schematic structural diagram of another terminal in FIG. 2 provided by an embodiment of the present application;
FIG. 4b is a schematic structural diagram of a server in FIG. 2 provided by an embodiment of the present application;
FIG. 5 is a schematic flow chart of an alternative information interaction method provided by an embodiment of the present application;
FIG. 6 is a schematic diagram of an exemplary first interaction interface provided by an embodiment of the present application;
FIG. 7 is a schematic diagram of an exemplary interaction trigger interface provided by an embodiment of the present application;
FIG. 8 is a schematic diagram of an exemplary fourth interaction interface provided by an embodiment of the present application;
FIG. 9 is a schematic diagram of an exemplary second interaction interface provided by an embodiment of the present application;
FIG. 10 is a schematic diagram of an exemplary result display interface provided by an embodiment of the present application;
FIG. 11a is a schematic diagram of another exemplary first interaction interface provided by an embodiment of the present application;
FIG. 11b is a schematic diagram of an exemplary trigger schedule provided by an embodiment of the present application;
FIG. 12 is a schematic flow chart of another alternative information interaction method provided by an embodiment of the present application;
FIG. 13 is a schematic diagram of an exemplary information interaction flow provided by an embodiment of the present application;
FIG. 14 is a schematic diagram of another exemplary first interaction interface provided by an embodiment of the present application;
FIG. 15 is a schematic diagram of another exemplary interaction trigger interface provided by an embodiment of the present application;
FIG. 16 is a schematic diagram of an exemplary method for acquiring confession information provided by an embodiment of the present application;
FIG. 17 is a diagram illustrating an exemplary display of a confession message provided by an embodiment of the present application;
FIG. 18 is a schematic diagram of an exemplary dedicated live room interface provided by an embodiment of the present application;
FIG. 19 is a schematic diagram of an exemplary terminal processing flow provided by an embodiment of the present application;
FIG. 20 is a schematic diagram of another exemplary terminal processing flow provided by an embodiment of the present application;
FIG. 21 is a schematic diagram of an exemplary logic architecture provided by an embodiment of the present application.
Detailed Description
To make the objectives, technical solutions, and advantages of the present application clearer, the present application is described in further detail below with reference to the accompanying drawings. The described embodiments should not be considered as limiting the present application, and all other embodiments obtained by a person of ordinary skill in the art without creative effort fall within the protection scope of the present application.
In the following description, reference is made to "some embodiments" which describe a subset of all possible embodiments, but it is understood that "some embodiments" may be the same subset or different subsets of all possible embodiments, and may be combined with each other without conflict.
In the following description, the terms "first/second/third/fourth/fifth" are used merely to distinguish between similar objects and do not represent a particular ordering of the objects. It is to be understood that "first/second/third/fourth/fifth" may be interchanged in a particular order or sequence where permissible, so that the embodiments of the application described herein can be practiced in an order other than that illustrated or described herein.
Unless defined otherwise, all technical and scientific terms used in the examples of this application have the same meaning as commonly understood by one of ordinary skill in the art to which this application belongs. The terminology used in the embodiments of the present application is for the purpose of describing the embodiments of the present application only and is not intended to be limiting of the present application.
Before the embodiments of the present application are described in further detail, the terms and expressions used in the embodiments are explained as follows.
1) Live broadcast: a technology in which data of a broadcasting party is collected by certain equipment, compressed through a series of processing steps such as video coding into a video stream that can be viewed and transmitted, and output to a user side for viewing.
2) Control: a component displayed on an interface that can trigger specific processing via touch, such as a button, a link, an input box, a tab, an icon, or a selection box.
Generally, when anonymous interaction is implemented, interaction information is sent anonymously by a sending object to a receiving object, which then receives it. For example, referring to FIG. 1, FIG. 1 is a diagram of an exemplary anonymous interaction mode. As shown in FIG. 1, an interactive application runs and displays an interaction notification interface 1-1, on which a control "anonymous secret-crush list" 1-11 is displayed; when the control 1-11 is clicked, the interaction notification interface 1-1 jumps to an interaction viewing interface 1-2. When the receiving object receives the anonymous interaction information, prompt information 1-21 is displayed on the interaction viewing interface 1-2: "A friend secretly likes you! Would you like to try to guess who it is? Please check among the following contacts." The interface also displays a list 1-22 of objects to be interacted with ("selected crushes (5)": a heart-shaped control + crush object 1-221, control + crush object 1-222, control + crush object 1-223, control + crush object 1-224, control + crush object 1-225) and a friend list 1-23 ("friend 1-231, ……"). At this point the receiving object, acting in turn as a sending object, sends anonymous information by selecting an object in the list 1-22 and/or the friend list 1-23, thereby realizing the interaction. In other words, in this anonymous interaction the user can only select an object related to the sending object, such as a friend, to send the interaction information; the interaction mode is thus single, of limited interest, and restricted to the sending and receiving of information.
Based on this, embodiments of the present application provide an information interaction method, apparatus, device, and storage medium that can improve the diversity of interaction modes. Exemplary applications of the first information interaction device, the second information interaction device, and the service device provided in the embodiments of the present application are described below, for the case in which the first information interaction device and the second information interaction device are both implemented as terminals and the service device is implemented as a server.
Referring to FIG. 2, FIG. 2 is an alternative architecture diagram of an information interaction system provided by an embodiment of the present application. To support an information interaction application, in the information interaction system 100, a terminal 400 (the first information interaction device, hereinafter referred to as the first interaction device) and a terminal 200 (the second information interaction device, hereinafter referred to as the second interaction device) are connected to a server 300 (the service device) through a network 500, where the network 500 may be a wide area network, a local area network, or a combination of the two.
The terminal 400 is configured to: receive an interaction trigger operation applied to the interaction trigger control corresponding to the target object on the first interaction interface, and display the interaction trigger interface in response to the interaction trigger operation; receive an editing operation applied to the editing control on the interaction trigger interface, and send first interaction information to the target object in response to the editing operation; receive interaction confirmation information sent by the target object for the first interaction information, and display a second interaction interface based on the interaction confirmation information; on the second interaction interface, receive a first interaction operation applied to the first interaction control, and send second interaction information to the target object in response to the first interaction operation; and display a target interaction result corresponding to the target object on a result display interface based on the second interaction information, thereby completing the interaction with the target object.
The terminal 200 is configured to: receive first interaction information sent by the sending object, and display the first interaction information on a fourth interaction interface; for the displayed first interaction information, receive an interaction confirmation operation applied to the interaction confirmation control, and, in response, acquire interaction confirmation information and send it to the sending object; display a fifth interaction interface based on the interaction confirmation information; receive second interaction information sent by the sending object, and display it on the fifth interaction interface; and display a target interaction result corresponding to the sending object on a result display interface based on the second interaction information, thereby completing the interaction with the sending object.
The server 300 is configured to provide data services to the terminal 400 and the terminal 200, enabling transmission of the first interaction information, the interaction confirmation information, and the second interaction information between the terminal 400 and the terminal 200.
Referring to FIG. 3, FIG. 3 is a schematic structural diagram of the terminal in FIG. 2 provided by an embodiment of the present application. As shown in FIG. 3, the terminal 400 includes: at least one first processor 410, a first memory 450, at least one first network interface 420, and a first user interface 430. The various components in the terminal 400 are coupled together by a first bus system 440. It is understood that the first bus system 440 is used to enable connection and communication between these components. In addition to the data bus, the first bus system 440 includes a power bus, a control bus, and a status signal bus; for clarity of illustration, however, the various buses are labeled as the first bus system 440 in FIG. 3.
The first processor 410 may be an integrated circuit chip with signal processing capabilities, such as a general-purpose processor, a digital signal processor (DSP), another programmable logic device, a discrete gate or transistor logic device, or discrete hardware components; the general-purpose processor may be a microprocessor or any conventional processor.
The first user interface 430 includes one or more first output devices 431, including one or more speakers and/or one or more visual display screens, that enable the presentation of media content. The first user interface 430 also includes one or more first input devices 432, including user interface components that facilitate user input, such as a keyboard, mouse, microphone, touch screen display, camera, other input buttons and controls.
The first memory 450 includes volatile memory or nonvolatile memory, and may include both. The nonvolatile memory may be a read-only memory (ROM), and the volatile memory may be a random access memory (RAM). The first memory 450 described in the embodiments of the present application is intended to comprise any suitable type of memory. The first memory 450 optionally includes one or more storage devices physically located remote from the first processor 410.
In some embodiments, the first memory 450 is capable of storing data to support various operations, examples of which include programs, modules, and data structures, or subsets or supersets thereof, as exemplified below.
A first operating system 451 including system programs for processing various basic system services and performing hardware-related tasks, such as a framework layer, a core library layer, a driver layer, etc., for implementing various basic services and processing hardware-based tasks;
a first network communication module 452 for communicating with other computing devices via one or more (wired or wireless) first network interfaces 420, exemplary first network interfaces 420 including: Bluetooth, Wireless Fidelity (Wi-Fi), Universal Serial Bus (USB), and the like;
a first display module 453 for enabling presentation of information (e.g., user interfaces for operating peripherals and displaying content and information) via one or more first output devices 431 (e.g., display screens, speakers, etc.) associated with the first user interface 430;
a first input processing module 454 for detecting one or more user inputs or interactions from one of the one or more first input devices 432 and translating the detected inputs or interactions.
In some embodiments, the first information interaction device provided in the embodiments of the present application may be implemented in software, and fig. 3 illustrates the first information interaction device 455 stored in the first memory 450, which may be software in the form of programs and plug-ins, and includes the following software modules: an interaction triggering module 4551, an interaction module 4552, a first presentation module 4553 and an identity updating module 4554, the functions of which will be described below.
Referring to fig. 4a, fig. 4a is a schematic structural diagram of another terminal in fig. 2 according to an embodiment of the present disclosure; as shown in fig. 4a, the terminal 200 includes: at least one second processor 210, a second memory 250, at least one second network interface 220, and a second user interface 230. The various components in the terminal 200 are coupled together by a second bus system 240. It is understood that the second bus system 240 is used to enable connection communications between these components. The second bus system 240 includes a power bus, a control bus, and a status signal bus in addition to the data bus. However, for clarity of illustration, the various buses are labeled as the second bus system 240 in fig. 4a.
The second processor 210 may be an integrated circuit chip having signal processing capabilities, such as a general purpose processor, a digital signal processor, or other programmable logic devices, discrete gate or transistor logic devices, discrete hardware components, etc., wherein the general purpose processor may be a microprocessor or any conventional processor, etc.
The second user interface 230 includes one or more second output devices 231 that enable the presentation of media content, including one or more speakers and/or one or more visual displays. The second user interface 230 also includes one or more second input devices 232, including user interface components that facilitate user input, such as a keyboard, a mouse, a microphone, a touch screen display, a camera, and other input buttons and controls.
The second memory 250 includes volatile memory, non-volatile memory, or both. The non-volatile memory may be a read-only memory, and the volatile memory may be a random access memory. The second memory 250 described in the embodiments herein is intended to comprise any suitable type of memory. The second memory 250 optionally includes one or more storage devices physically located remote from the second processor 210.
In some embodiments, the second memory 250 is capable of storing data to support various operations, examples of which include programs, modules, and data structures, or subsets or supersets thereof, as exemplified below.
A second operating system 251 including system programs for processing various basic system services and performing hardware-related tasks, such as a framework layer, a core library layer, a driver layer, etc., for implementing various basic services and processing hardware-based tasks;
a second network communication module 252 for communicating with other computing devices via one or more (wired or wireless) second network interfaces 220, exemplary second network interfaces 220 including: Bluetooth, Wi-Fi, Universal Serial Bus, and the like;
a second display module 253 to enable presentation of information (e.g., a user interface for operating peripherals and displaying content and information) via one or more second output devices 231 (e.g., a display screen, speakers, etc.) associated with the second user interface 230;
a second input processing module 254 for detecting one or more user inputs or interactions from one of the one or more second input devices 232 and translating the detected inputs or interactions.
In some embodiments, the second information interaction device provided in the embodiments of the present application may be implemented in software, and fig. 4a illustrates the second information interaction device 255 stored in the second memory 250, which may be software in the form of programs and plug-ins, and includes the following software modules: an information display module 2551, an information sending module 2552, an interactive opening module 2553, an information receiving module 2554, a second display module 2555 and a setting module 2556, the functions of which will be described below.
Referring to fig. 4b, fig. 4b is a schematic structural diagram of a server in fig. 2 according to an embodiment of the present disclosure; as shown in fig. 4b, the server 300 includes: at least one third processor 310, a third memory 350, at least one third network interface 320, and a third user interface 330. The various components in server 300 are coupled together by a third bus system 340. It will be appreciated that the third bus system 340 is used to enable connection communications between these components. The third bus system 340 includes a power bus, a control bus, and a status signal bus in addition to the data bus. However, for clarity of illustration, the various buses are labeled as the third bus system 340 in fig. 4b.
The third processor 310 may be an integrated circuit chip having signal processing capabilities, such as a general purpose processor, a digital signal processor, or other programmable logic devices, discrete gate or transistor logic devices, discrete hardware components, etc., wherein the general purpose processor may be a microprocessor or any conventional processor, etc.
The third user interface 330 includes one or more third output devices 331 that enable the presentation of media content, including one or more speakers and/or one or more visual display screens. The third user interface 330 also includes one or more third input devices 332, including user interface components that facilitate user input, such as a keyboard, a mouse, a microphone, a touch screen display, a camera, and other input buttons and controls.
The third memory 350 includes volatile memory, non-volatile memory, or both. The non-volatile memory may be a read-only memory, and the volatile memory may be a random access memory. The third memory 350 described in the embodiments herein is intended to comprise any suitable type of memory. The third memory 350 optionally includes one or more storage devices physically located remote from the third processor 310.
In some embodiments, the third memory 350 is capable of storing data, examples of which include programs, modules, and data structures, or a subset or superset thereof, to support various operations, as exemplified below.
A third operating system 351 including system programs for processing various basic system services and performing hardware-related tasks, such as a framework layer, a core library layer, a driver layer, and the like, for implementing various basic services and processing hardware-based tasks;
a third network communication module 352 for communicating with other computing devices via one or more (wired or wireless) third network interfaces 320, exemplary third network interfaces 320 including: Bluetooth, Wi-Fi, Universal Serial Bus, and the like;
a third display module 353 for enabling presentation of information (e.g., a user interface for operating peripherals and displaying content and information) via one or more third output devices 331 (e.g., a display screen, speakers, etc.) associated with the third user interface 330;
a third input processing module 354 for detecting one or more user inputs or interactions from one of the one or more third input devices 332 and translating the detected inputs or interactions.
In some embodiments, the service device provided by the embodiments of the present application may be implemented in software, and fig. 4b illustrates a service device 355 stored in the third memory 350, which may be software in the form of programs and plug-ins, and includes the following software modules: an information push module 3551 and an information processing module 3552, functions of which will be described below.
In other embodiments, the first information interaction device, the second information interaction device, and the service device provided in this embodiment may be implemented in hardware. As an example, each may be a processor in the form of a hardware decoding processor that is programmed to execute the information interaction method provided in this embodiment; for instance, the processor in the form of a hardware decoding processor may be implemented by one or more Application Specific Integrated Circuits (ASICs), DSPs, Programmable Logic Devices (PLDs), Complex Programmable Logic Devices (CPLDs), Field Programmable Gate Arrays (FPGAs), or other electronic components.
The information interaction method provided by the embodiment of the present application will be described below with reference to exemplary applications and implementations of the terminal and the server provided by the embodiment of the present application.
Referring to fig. 5, fig. 5 is an optional flowchart of an information interaction method provided in the embodiment of the present application, and the method is applied to the first interaction device and the second interaction device, and will be described with reference to the steps shown in fig. 5.
S501, receiving an interaction triggering operation acted on an interaction triggering control corresponding to a target object by first interaction equipment on a first interaction interface, responding to the interaction triggering operation, and displaying the interaction triggering interface.
In the embodiment of the application, the sending object runs the information interaction application on the first interaction device, and the first interaction interface is displayed on the first interaction device. A plurality of objects to be interacted, and a touch control corresponding to each object to be interacted, are displayed on the first interaction interface. When the sending object selects one object from the objects to be interacted for interaction and touches the touch control corresponding to the selected object, the first interaction device receives the interaction triggering operation acting on the interaction trigger control corresponding to the target object; the first interaction device then responds to the interaction triggering operation and displays the interaction triggering interface, so as to trigger information interaction through the interaction triggering interface.
It should be noted that the sending object is the object that initiates the information interaction; the first interaction interface is an entry interface for information interaction of the sending object, and at least the interaction trigger control corresponding to the target object is displayed on the first interaction interface. As shown in fig. 6, an "anonymous whitelist" button 6-12 (interactive trigger control) corresponding to an anchor 6-11 (target object) is included on a whitelist square 6-1 (first interactive interface). The target object is the object to be interacted determined by the sending object, that is, the receiving object, such as any anchor, any friend, any concerned object, or any object in the platform, and may also be multiple objects. The interactive trigger control is a touch control used for triggering interaction with the target object, such as an "anonymous whitelist" button, a "chat" button, or a touch interactive icon; and the interactive trigger operation is an operation for triggering the interactive trigger control, such as clicking, double clicking, long pressing, or sliding.
In addition, the interaction triggering interface is an interface for triggering interaction with the target object, and when the first interaction device displays the interaction triggering interface, the interaction triggering interface may be popped up on the first interaction interface, or jump from the first interaction interface to the interaction triggering interface, or display the interaction triggering interface in a designated area on the first interaction interface, and so on.
Illustratively, referring to fig. 7, when the "anonymous whitelist" button 6-12 in fig. 6 is clicked (interactive trigger operation), the interaction trigger interface 7-1 is presented below the whitelist square 6-1, and the whitelist square 6-1 is blurred.
S502, the first interaction device receives editing operation acted on the editing control on the interaction triggering interface, responds to the editing operation and sends first interaction information to the target object.
In the embodiment of the application, an editing control for editing interactive information is displayed on the displayed interaction triggering interface. When the sending object sends interactive information to the target object through the editing control, the first interactive device receives the editing operation acting on the editing control; the first interactive device then responds to the editing operation, acquires the interactive information, namely the first interaction information, and sends the first interaction information to the interactive server, so that the interactive server sends the first interaction information to the target object in a first sending mode, and the interaction between the sending object and the target object is triggered through the sending of the first interaction information.
It should be noted that the interactive server, i.e., the server 300 in fig. 2, is the server corresponding to the information interaction application. The interaction triggering interface at least includes the editing control, and the editing control includes at least one touch control, such as an input box and a button, an input box alone, or a selection control and a button; the editing operation is an operation for triggering the editing control, such as clicking, double-clicking, long-pressing, or sliding. The first sending mode is the mode in which the first interaction information is sent, examples of which include an anonymous sending mode, a broadcast sending mode, and a single-object sending mode. The first interaction information is information for triggering interaction between the sending object and the target object, and may be at least one of text, video, audio, picture, and attachment information.
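The packaging of the first interaction information together with a sending mode, as described above, can be sketched as follows. This is a minimal illustrative sketch: the function, field, and mode names are assumptions for exposition, not part of the disclosure.

```python
# Hypothetical sketch of assembling the first interaction information with a
# sending mode before it is forwarded to the interactive server.
ANONYMOUS, BROADCAST, SINGLE_OBJECT = "anonymous", "broadcast", "single_object"

def build_first_interaction_message(sender_id, target_ids, content, send_mode):
    """Assemble the message the first interactive device sends to the server."""
    if send_mode not in (ANONYMOUS, BROADCAST, SINGLE_OBJECT):
        raise ValueError(f"unsupported sending mode: {send_mode}")
    return {
        "targets": list(target_ids),   # the target object may be one or many
        "content": content,            # text, video, audio, picture, attachment
        "send_mode": send_mode,
        # In the anonymous sending mode the sender identity is withheld from
        # the target object; the server still keeps it for routing replies.
        "sender": None if send_mode == ANONYMOUS else sender_id,
        "routed_sender": sender_id,
    }
```

For example, an anonymous message to a single anchor would carry `sender=None` toward the target while the server retains the routing identity.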
S503, the second interactive device receives the first interactive information sent by the sending object, and the first interactive information is displayed on the fourth interactive interface.
In the embodiment of the application, when a sending object sends first interaction information to a target object through first interaction equipment in a first sending mode, second interaction equipment receives the first interaction information sent by the sending object; therefore, the target object runs the information interaction application on the second interaction device, and when the fourth interaction interface is displayed on the second interaction device, the first interaction information is displayed on the fourth interaction interface in a first display mode, so that the target object can acquire the first interaction information.
It should be noted that the fourth interactive interface refers to an interface where the target object performs information interaction, and may be an interface similar to the first interactive interface, a video playing interface, a live broadcast interface, or other interactive interfaces. The first display mode is a display mode corresponding to the first sending mode, for example, when the first sending mode is an anonymous sending mode, the first display mode is a mode of anonymously displaying the first interaction information; when the first sending mode is a broadcast sending mode, the first display mode is a display mode for displaying the first interactive information and determining that the first interactive information is a broadcast message; and when the first sending mode is a single-object sending mode, the first display mode is a display mode in which the sending object sends the first interaction information to the target object.
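The correspondence between the first sending mode and the first display mode described above amounts to a simple lookup, which can be sketched as follows; the mode names and return values are illustrative assumptions, not part of the disclosure.

```python
# Hypothetical mapping from the first sending mode to the first display mode
# used by the second interactive device on the fourth interactive interface.
def display_mode_for(send_mode: str) -> str:
    """Pick how the second interactive device renders received information."""
    mapping = {
        "anonymous": "anonymous",   # display the information without revealing the sender
        "broadcast": "broadcast",   # display the information marked as a broadcast message
        "single_object": "direct",  # display sender-to-target delivery
    }
    if send_mode not in mapping:
        raise ValueError(f"unknown sending mode: {send_mode}")
    return mapping[send_mode]
```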
S504, the second interaction device receives interaction confirmation operation acting on the interaction confirmation control according to the displayed first interaction information, responds to the interaction confirmation operation, obtains interaction confirmation information, and sends the interaction confirmation information to the sending object.
In this embodiment of the application, when the second interactive device displays the first interaction information on the fourth interactive interface, a control corresponding to the first interaction information is also displayed. The interactive confirmation control is the control touched by the target object among the controls corresponding to the first interaction information, and includes an interactive control (for example, an "accept whitelist" button, a "wait for a heart" button, a chat icon control, a voice icon control, a remind-me-later control, and the like) or an interaction rejection control (for example, a "reject whitelist" button, a rejection icon control, and the like). The target object operates on the interactive confirmation control according to the displayed first interaction information, and the second interactive device receives the interactive confirmation operation acting on the interactive confirmation control. The second interactive device then responds to the interactive confirmation operation and obtains the reply information of the target object to the sending object for the first interaction information, namely the interactive confirmation information; the interactive confirmation information is then sent to the interactive server, so that the interactive server sends it to the first interaction device.
It should be noted that the interactive confirmation operation is an operation for triggering the interactive confirmation control, such as clicking, double-clicking, long-pressing, or sliding. The interactive confirmation information corresponds to the interactive confirmation control: when the interactive confirmation control is the interactive control, the interactive confirmation information is information confirming the interaction; for example, when the interactive control is the "accept whitelist" button, the interactive confirmation information is information for accepting the whitelist and performing the interaction, and when the interactive control is the "wait for a heart" button, the interactive confirmation information is information for performing subsequent interaction. When the interactive confirmation control is the interaction rejection control, the interactive confirmation information is information rejecting the interaction, and the subsequent interactive process is not executed.
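The control-to-reply correspondence above can be sketched as a small dispatcher; the control identifiers and the shape of the returned dictionary are illustrative assumptions, not part of the disclosure.

```python
# Hypothetical mapping from the touched interactive confirmation control to
# the interactive confirmation information it produces on the second device.
def confirmation_info_for(control: str) -> dict:
    """Return the reply semantics for a touched confirmation control."""
    if control in {"accept_whitelist", "chat_icon", "voice_icon"}:
        # Interactive controls: the target object confirms the interaction now.
        return {"accept": True, "defer": False}
    if control == "wait_for_heart":
        # Also an interactive control, but signalling subsequent interaction.
        return {"accept": True, "defer": True}
    if control in {"reject_whitelist", "reject_icon"}:
        # Rejection control: the subsequent interactive process is not executed.
        return {"accept": False, "defer": False}
    raise ValueError(f"unknown confirmation control: {control}")
```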
Referring to fig. 8, fig. 8 is a schematic diagram of an exemplary fourth interactive interface provided in an embodiment of the present application. As shown in fig. 8, a letter icon 8-11 and a text 8-12 (e.g., "You have received an anonymous whitelist!"; the letter icon 8-11 and the text 8-12 are collectively referred to as the first interaction information) are displayed on the video playing interface 8-1 (fourth interactive interface), and an "accept whitelist" button 8-13 and a "wait for a heart" button 8-14 (the button 8-13 and the button 8-14 are collectively referred to as the interactive confirmation control) corresponding to the letter icon 8-11 and the text 8-12 are also displayed.
And S505, the first interaction equipment receives interaction confirmation information sent by the target object aiming at the first interaction information, and displays a second interaction interface based on the interaction confirmation information.
In the embodiment of the application, after the target object sends the interactive confirmation information to the sending object through the second interactive device, the sending object also receives the interactive confirmation information through the first interactive device; at this time, the first interactive device identifies the interactive confirmation information, and if the interactive confirmation information is information that refuses the interaction, the first interactive device displays the interactive confirmation information to inform the sending object, and ends the interactive process. And if the interactive confirmation information is information for confirming the interaction, the first interactive equipment displays the second interactive interface so that the sending object interacts with the target object through the second interactive interface.
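The branching of the first interactive device on the received interactive confirmation information can be sketched as follows. The recorder class and interface names are illustrative assumptions introduced only so the sketch is self-contained.

```python
# Hypothetical sketch of S505: the first interactive device identifies the
# interactive confirmation information and either displays the second
# interactive interface or shows the refusal and ends the flow.
class InterfaceRecorder:
    """Minimal stand-in for the terminal's display layer."""
    def __init__(self):
        self.shown = []

    def show(self, interface_name: str) -> None:
        self.shown.append(interface_name)

def handle_confirmation_info(confirmation: dict, ui: InterfaceRecorder) -> str:
    """Branch on whether the target object confirmed or refused the interaction."""
    if confirmation.get("accept"):
        ui.show("second_interactive_interface")
        return "continue"
    # The refusal is displayed to inform the sending object; the flow ends.
    ui.show("refusal_notice")
    return "ended"
```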
It should be noted that, when the first interaction device displays the second interaction interface based on the interaction confirmation information, the first interaction device may jump from the first interaction interface to the second interaction interface, may also display the second interaction interface on the first interaction interface, may also display the intermediate interaction interface according to the interaction confirmation information, and display the second interaction interface again according to the interaction operation performed by the sending object and the target object on the intermediate interaction interface, and the like, which is not specifically limited in this embodiment of the present application.
S506, the first interaction equipment receives first interaction operation acted on the first interaction control on the second interaction interface, responds to the first interaction operation, and sends second interaction information to the target object.
In the embodiment of the application, after the first interactive device displays the second interactive interface, the sending object can operate on the second interactive interface to realize secondary interaction with the target object; here, a first interactive control is displayed on the second interactive interface, and when the sending object touches the first interactive control to interact with the target object, the first interactive device receives a first interactive operation acting on the first interactive control; at this time, the first interactive device responds to the first interactive operation to obtain second interactive information, and the second interactive information is sent to the interactive server, so that the interactive server sends the second interactive information to the second interactive device in a second sending mode.
It should be noted that the first interactive control includes at least one touch-controllable control, for example, including an input box and a button, an input box, a selection control and a button, and the like; and the first interactive operation is an operation for triggering the first interactive control, such as clicking, double-clicking, long-pressing or sliding. The second transmission method is, for example, an anonymous transmission method, a broadcast transmission method, a single object transmission method, or the like, and may be the same as or different from the first transmission method; the second interactive information may be at least one of text, video, audio, picture, and attachment information, and may also be notification information (for example, a live video of the sending object sharing the target object). In addition, second interactive information can be displayed on the second interactive interface.
For example, referring to fig. 9, fig. 9 is a schematic diagram of an exemplary second interactive interface provided in the embodiment of the present application. As shown in fig. 9, the second interactive interface is a dedicated live broadcast room interface 9-1, on which an image 9-11 and an interaction area 9-12 corresponding to the target object are displayed; the interaction area 9-12 displays interaction information 9-121 (e.g., notices that the dedicated live broadcast room is opened, that the whitelist time is #, that the bullet screen function is unlocked after 52 minutes ###, and a bullet screen prompt, i.e., the second interaction information). The dedicated live broadcast room interface 9-1 also displays an input box 9-13, a gift icon 9-14, and other icons 9-15 (first interactive control); when the sending object touches at least one of the input box 9-13, the gift icon 9-14, and the other icons 9-15 (first interactive control), the first interactive operation is received.
And S507, the first interaction equipment displays a target interaction result corresponding to the target object on a result display interface based on the second interaction information, so that the interaction with the target object is completed.
In the embodiment of the application, after the first interaction device sends the second interaction information to the target object, the corresponding result display interface is further displayed based on the second interaction information, so that the target interaction result of the sending object and the target object is displayed on the result display interface. Here, when the first interaction device sends the second interaction information to the target object through the interactive server, the interactive server may further determine, based on the second interaction information, whether the sending object and the target object satisfy the interaction condition (for example, whether the number of interactions exceeds a preset number, or whether a value corresponding to the interactions exceeds a preset value). When the sending object and the target object satisfy the interaction condition, the interactive server sends information to the first interaction device, so that the first interaction device displays the result display interface and displays the target interaction result of the sending object and the target object on the result display interface; at this time, the interaction between the sending object and the target object is completed.
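The server-side interaction condition check can be sketched as follows. Note one assumption: the disclosure lists the criteria (count threshold, value threshold) without fixing how they combine, so requiring both simultaneously here is an illustrative choice.

```python
# Hypothetical server-side check of the interaction condition; combining the
# two thresholds with "and" is an assumption, not stated in the disclosure.
def interaction_condition_met(interaction_count: int, interaction_value: int,
                              preset_count: int, preset_value: int) -> bool:
    """True when the sender/target pair qualifies for the result display interface."""
    return interaction_count > preset_count and interaction_value > preset_value
```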
It should be noted that the result display interface may be a newly skipped interface, may also be an interface displayed on the second interactive interface, may also be an interface skipped through an intermediate interface, and in addition, the result display interface and the first interactive interface may implement interface switching through a tab; the target interaction result refers to an interaction result between the sending object and the target object, for example, information such as CP (pairing) combination, interaction level, and the like.
Illustratively, referring to fig. 10, fig. 10 is a schematic diagram of an exemplary results presentation interface provided by an embodiment of the present application; as shown in FIG. 10, the result display interface is an interaction result ranking list interface 10-1, and in the interaction result ranking list interface 10-1, the target interaction result 10-11 includes a ranking 10-111, an avatar combination 10-112 and an interaction result value 10-113.
And S508, displaying a fifth interactive interface by the second interactive equipment based on the interactive confirmation information.
It should be noted that after the target object clicks the interactive confirmation control on the fourth interactive interface, on one hand, the second interactive device sends the obtained interactive confirmation information to the sending object through the interactive server, and on the other hand, the second interactive device displays a new interactive interface, that is, a fifth interactive interface, based on the interactive confirmation information, so as to implement continuous interaction between the target object and the sending object through the displayed fifth interactive interface.
S509, the second interaction device receives second interaction information sent by the sending object, and the second interaction information is displayed on the fifth interaction interface.
In the embodiment of the application, the second interaction information sent by the sending object to the target object through the second interactive interface is received by the second interactive device and displayed on the fifth interactive interface. Here, the second interactive device displays the second interaction information in a second display mode; the second display mode corresponds to the second sending mode. For example, when the second sending mode is the anonymous sending mode, the second display mode is a mode of anonymously displaying the second interaction information; when the second sending mode is the broadcast sending mode, the second display mode is a mode of displaying the second interaction information marked as a broadcast message; and when the second sending mode is the single-object sending mode, the second display mode is a mode of displaying that the sending object sends the second interaction information to the target object.
And S510, the second interaction equipment displays a target interaction result corresponding to the sending object on a result display interface based on the second interaction information, so that the interaction with the sending object is completed.
In the embodiment of the application, the first interactive device sends the second interaction information to the target object through the interactive server; when the interactive server determines, based on the second interaction information, that the sending object and the target object satisfy the interaction condition, the interactive server sends information to the second interactive device in addition to the first interactive device, so that the second interactive device displays the result display interface and displays, on the result display interface, the target interaction result corresponding to the sending object. At this time, the interaction between the sending object and the target object is completed.
It should be noted that the interfaces for displaying the target interaction result corresponding to the sending object and the target object in the second interactive device and the first interactive device are the same, and may of course be different.
It can be understood that the sending object can send the first interaction information to the target object in the first sending mode to trigger interaction, and can continue to interact with the target object after the interaction is triggered, so as to obtain and display a final target interaction result corresponding to the target object. That is to say, the interaction between the sending object and the target object is realized through multiple kinds of interaction, and the corresponding target interaction result can be displayed, so that an interaction mode with multiple interaction types and interaction results is realized; the interaction modes are diversified, and the diversity of the interaction modes can be improved. Meanwhile, the activity of users in the information interaction application can be improved, user interaction is stimulated, and the interaction effect is improved.
In the embodiment of the application, S501 can be realized through S5011-S5013; that is, the first interactive device receives the interactive trigger operation acting on the interactive trigger control corresponding to the target object on the first interactive interface, responds to the interactive trigger operation, and displays the interactive trigger interface, including S5011 to S5013, which are described below.
S5011, displaying a list of objects to be interacted by the first interaction device on the first interaction interface in a mode of combining the objects and the trigger control.
In the embodiment of the application, when the first interactive device displays the first interactive interface, the first interactive device interacts with the interaction server: the interaction server pulls each object to be interacted with from the database and returns it to the first interactive device, and the first interactive device displays the acquired objects to be interacted with in a list, with each object shown as a combination of the object and a trigger control. At this point, the display of the list of objects to be interacted with on the first interactive interface is realized.
It should be noted that the interaction server may also pull each object to be interacted from the cache (e.g., redis cache), and only when each object to be interacted does not exist in the cache, pull each object from the database; in addition, when each object to be interacted does not exist in the cache, each object to be interacted can be pulled from the database through the computing service, each object to be interacted is stored in the cache, and then each object to be interacted is pulled from the cache by the interaction server.
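The cache-then-database fallback described here is the cache-aside pattern. The following is a minimal sketch under the assumption that the cache and the database can both be modeled as dictionaries; the key name and function name are illustrative, not from the source:

```python
def pull_objects_to_interact(cache: dict, database: dict) -> list:
    """Cache-aside lookup for the list of objects to be interacted with.

    `cache` stands in for the redis cache and `database` for the backing
    store described in the text; both are plain dicts for illustration.
    """
    key = "objects_to_interact"
    objects = cache.get(key)
    if objects is None:
        # Cache miss: the computing service pulls each object to be
        # interacted with from the database and stores it in the cache,
        # after which the interaction server reads it from the cache.
        objects = database.get(key, [])
        cache[key] = objects
    return objects
```

On a miss the result is written back to the cache, so subsequent pulls are served from the cache alone, matching the behavior described for the computing service.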
In addition, the number of objects included in the list of objects to be interacted with may be preset, for example, 50. Moreover, the pull condition for pulling each object to be interacted with may be at least one of an object of interest (e.g., a person of interest), a historically searched object, an associated object (e.g., a guardian anchor) and a recommended object (e.g., a platform-recommended anchor or a platform-recommended interaction object). When the pulled objects to be interacted with come from multiple sources, they may be combined according to a preset ratio to obtain the preset number of objects, and the preset number of objects form the list of objects to be interacted with; for example, when the pulled objects to be interacted with include guardian anchors, persons of interest, search history and platform-recommended anchors, the preset number of objects may be determined from the pulled objects according to the ratio "guardian anchor : person of interest : search history : platform-recommended anchor = 5:2:2:1" to form the list of objects to be interacted with.
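The ratio-based combination can be sketched as follows; the 5:2:2:1 weights and the total of 50 follow the example above, while the function name and source keys are illustrative assumptions:

```python
def build_interaction_list(pulled: dict, ratio: dict, total: int) -> list:
    """Combine objects pulled from several sources into one list of at
    most `total` entries according to a preset ratio (e.g. 5:2:2:1).

    `pulled` maps a source name to its pulled objects; `ratio` maps the
    same source names to integer weights.
    """
    weight_sum = sum(ratio.values())
    result = []
    for source, weight in ratio.items():
        # Integer share of the total allotted to this source.
        quota = total * weight // weight_sum
        result.extend(pulled.get(source, [])[:quota])
    return result[:total]
```

With weights 5:2:2:1 and a total of 50, the shares come out to 25, 10, 10 and 5 exactly; for weights that do not divide evenly, the rounding shortfall would presumably be back-filled from a preferred source in a production implementation.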
S5012, the first interaction device receives interaction triggering operation acting on the interaction triggering control corresponding to the target object in the displayed list of the objects to be interacted.
It should be noted that, because the list of objects to be interacted with is displayed on the first interaction interface, and each object in the list is shown as a combination of the object and its corresponding trigger control, when the sending object selects an object from the displayed list to perform information interaction and touches the trigger control corresponding to the selected object, the first interaction device determines the selected object, that is, the target object, and also obtains the interaction trigger operation acting on the trigger control corresponding to the target object.
Here, the trigger control corresponding to the target object is an interactive trigger control.
S5013, the first interactive device responds to the interactive triggering operation, and displays an interactive triggering interface in a preset interactive triggering area on the first interactive interface.
In the embodiment of the application, the first interactive device responds to the received interactive trigger operation to trigger the display of the interactive trigger interface. Here, the interactive trigger interface is displayed in a preset interactive trigger area on the first interactive interface, as shown in fig. 7.
In addition, in the embodiment of the present application, S501 may also be implemented by the following steps: the method comprises the steps of receiving an object searching operation acted on an object searching control on a first interactive interface, responding to the object searching operation, displaying an interactive trigger control comprising a target object and a target object on the first interactive interface, receiving an interactive trigger operation acted on the interactive trigger control corresponding to the target object, responding to the interactive trigger operation, and displaying an interactive trigger interface. That is, the sending object may also display an interactive trigger interface for triggering interaction with the target object by searching for the target object on the first interactive interface.
Referring to fig. 11a, fig. 11a is a schematic diagram of another exemplary first interactive interface provided by an embodiment of the present application; as shown in fig. 11a, a to-be-interacted object list 11-11 including a recommended anchor, a king of whiteware and a star of tomorrow is displayed on a whiteware square 11-1 (a first interactive interface), and an object search control 11-12 is also displayed; thus, the sending object may select a target object from the list of objects to be interacted with 11-11 or determine a target object for interaction through the object search control 11-12.
It can be understood that, when determining the target object for interaction, the sending object may determine the target object by searching, and may also determine the target object from a list composed of any objects; therefore, in the embodiment of the application, the target object can be any object, the interaction selectivity is wide, no limitation is caused, and the interaction effect is good.
In the embodiment of the application, the editing control comprises an information editing control, an attachment adding control and an information sending control, and the editing operation comprises an information editing operation, an attachment selecting operation and an information sending operation; at this time, S502 can be realized through S5021-S5024; that is, the first interactive device receives, on the interaction trigger interface, the editing operation applied to the editing control, and sends, in response to the editing operation, first interaction information to the target object, including S5021-S5024, which are described below.
S5021, the first interaction equipment receives information editing operation acted on the information editing control on the interaction triggering interface, responds to the information editing operation, and displays editing information in the information display area.
It should be noted that, an information editing control (e.g., an information input box, an information input area, etc.) is displayed on the interactive trigger interface, and when the sending object performs touch control on the information editing control to edit a text (e.g., input text information, select text information, etc.), the first interactive device also receives an information editing operation (e.g., an input operation of inputting text information, a click operation of selecting text information, etc.) applied to the information editing control; at this time, the first interactive device responds to the information editing operation, obtains text information edited by the sending object through the information editing control, and displays the editing information on the information display area of the interactive triggering interface, so that the sending object can know the text information edited by the sending object.
S5022, the first interaction equipment receives attachment selection operation acted on the attachment adding control, responds to the attachment selection operation and displays a list of attachments to be selected.
It should be noted that an accessory adding control (for example, a "add a gift" link, a "select a gift" button, etc.) is also displayed on the interactive trigger interface, and when the sending object performs touch control (for example, clicking the link or the button, etc.) on the accessory adding control to add the accessory, the first interactive device also receives an accessory selecting operation (for example, clicking the link or the button, etc.) acting on the accessory adding control; at this time, the first interactive device responds to the accessory selection operation, interacts with the interactive server, enables the interactive server to pull the selectable accessories and return the selectable accessories to the first interactive device, and the first interactive device displays the selectable accessories in a list form, so that the display of the accessory list to be selected is completed.
S5023, the first interaction equipment receives the accessory determining operation acted on the target accessory on the displayed accessory list to be selected, responds to the accessory determining operation, and displays the target accessory in the accessory display area.
It should be noted that, when the sending object selects an accessory from each accessory in the displayed list of accessories to be selected by clicking or voice, the first interaction device also receives an accessory determination operation acting on the target accessory, so as to respond to the accessory determination operation, obtain the accessory selected by the sending object, that is, the target accessory, and display the target accessory in the accessory display area of the interaction trigger interface, so that the sending object knows the accessory selected by itself.
S5024, the first interactive equipment receives information sending operation acted on the information sending control, responds to the information sending operation, combines the editing information and/or the target accessory into first interactive information, and sends the first interactive information to the target object.
It should be noted that an information sending control (for example, a "send" button, a "whitetouch" link, etc.) is also displayed on the interaction triggering interface, and when the sending object performs touch control (for example, clicking, long-pressing, etc.) on the information sending control for the displayed editing information and the displayed target attachment, the first interaction device also receives an information sending operation (for example, clicking, long-pressing, etc.) acting on the information sending control; at this time, the first interactive device responds to the information sending operation, combines the editing information and/or the target accessory into first interactive information, and sends the first interactive information to the target object in a first sending mode.
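Assembling the first interaction information from the editing information and/or the target accessory can be sketched as follows; the field names are assumptions for illustration, not the application's actual payload format:

```python
def build_first_interaction_info(edit_text=None, target_accessory=None,
                                 send_mode="anonymous"):
    """Combine the editing information and/or the target accessory into a
    first-interaction-information payload sent in the first sending mode.

    All field names ("text", "accessory", "send_mode") are hypothetical.
    """
    if edit_text is None and target_accessory is None:
        # S5024 combines "the editing information and/or the target
        # accessory", so at least one of the two must be present.
        raise ValueError("need editing information or a target accessory")
    info = {"send_mode": send_mode}
    if edit_text is not None:
        info["text"] = edit_text
    if target_accessory is not None:
        info["accessory"] = target_accessory
    return info
```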
In embodiment S5023 of the present application, the first interactive device displays a target accessory in the accessory display area in response to an accessory determination operation, which includes S50231-S50235, and the following steps are respectively described.
S50231, the first interactive device responds to the accessory determining operation and obtains current value information and target value information corresponding to the target accessory.
It should be noted that the attachment corresponds to the value information, and when the sending object selects the attachment to send to the target object, the first interaction device obtains the current value information corresponding to the sending object and the target value information corresponding to the target attachment by interacting with the interaction server, and determines the processing to be executed in response to the attachment determination operation by comparing the current value information and the target value information.
Here, the current value information and the target value information may be virtual wealth, virtual resources, and the like.
S50232, when the current value information is larger than or equal to the target value information, the first interaction device updates the current value information into the value difference between the current value information and the target value information, and displays the target attachment in the attachment display area.
In the embodiment of the application, when the first interactive device compares the current value information with the target value information, if the current value information is greater than or equal to the target value information, it indicates that the current value information of the sending object is sufficient and the target accessory can be selected; therefore, the first interactive device interacts with the interaction server and subtracts the target value information from the current value information, that is, the current value information is updated to the value difference between the current value information and the target value information; at the same time, the target accessory is displayed in the accessory display area of the interactive trigger interface.
S50233, when the current value information is smaller than the target value information, the first interaction device jumps to a value increasing interface.
In addition, in the embodiment of the application, when the first interactive device compares the current value information with the target value information, if the current value information is smaller than the target value information, it indicates that the current value information of the sending object is insufficient and the target accessory cannot be selected; therefore, the first interactive device may jump directly to the value increasing interface, or may ask the sending object whether to jump to the value increasing page and jump to the value increasing interface according to the operation of the sending object.
It should be noted that the value adding interface is an interface for increasing the current value information of the sending object, for example, a recharging interface and a wallet recharging page.
S50234, the first interaction device receives value increasing operation acting on the value increasing control on the value increasing interface, responds to the value increasing operation, and updates the current value information into the value sum of the current value information and the increased value information.
It should be noted that a value increase control (for example, a determination control corresponding to the increased value information) is displayed on the value increase interface, and when the sending object touches the value increase control to increase the value information, the first interaction device also receives a value increase operation acting on the value increase control; at this time, the first interactive device triggers value increase processing on the current value information in response to the value increase operation, thereby updating the current value information to the value sum of the current value information and the increased value information.
S50235, when the updated current value information is larger than or equal to the target value information, the first interaction device updates the updated current value information into the value difference between the updated current value information and the target value information, and displays the target attachment in the attachment display area.
In the embodiment of the application, after the value increase processing of the sending object is completed, the first interactive device compares the updated current value information with the target value information; when the updated current value information is greater than or equal to the target value information, the updated current value information of the sending object is sufficient to select the target accessory. Therefore, the first interactive device interacts with the interaction server and subtracts the target value information from the updated current value information, that is, the updated current value information is updated to the value difference between the updated current value information and the target value information; at the same time, the target accessory is displayed in the accessory display area of the interactive trigger interface.
Here, when the updated current value information is smaller than the target value information, the process continues to jump to the value increasing interface, and processing steps similar to S50234 and S50235 are performed, which is not described herein again in this embodiment of the present application.
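Steps S50231-S50235 amount to a check-deduct-or-top-up loop. A minimal sketch follows, with `top_up` as a hypothetical callback standing in for the value increasing interface:

```python
def confirm_accessory(current_value: int, target_value: int, top_up=None):
    """Check whether the sending object's current value information covers
    the target accessory's value; deduct on success, otherwise request a
    value increase.

    `top_up` is a hypothetical callback for the value increasing
    interface: it returns the value information added, or 0 if the
    sending object declines.  Returns (accessory_granted, remaining_value).
    """
    while current_value < target_value:
        added = top_up() if top_up else 0
        if added <= 0:
            # Insufficient value and no increase: accessory not shown.
            return False, current_value
        # S50234: current value becomes the sum of itself and the increase.
        current_value += added
    # S50232/S50235: deduct the target value and display the accessory.
    return True, current_value - target_value
```

The loop mirrors the text's behavior of jumping back to the value increasing interface as many times as needed until the updated current value covers the target value.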
In this embodiment S505, the first interactive device displays the second interactive interface based on the interactive confirmation information, which includes S5051-S5055, and the following steps are described separately.
S5051, when the interactive confirmation information is continuous interactive information, the first interactive equipment displays the trigger progress corresponding to the target object in the interactive object area on the first interactive interface.
It should be noted that the interaction confirmation information is continuous interaction information or agreement interaction information, where the continuous interaction information is information that triggers subsequent interaction through continuous interaction, and the agreement interaction information is information that directly triggers subsequent interaction.
Here, when the interaction confirmation information is the continuous interaction information, it indicates that the sending object needs to continuously interact with the target object to perform subsequent interaction; therefore, the first interactive equipment displays the trigger progress corresponding to the target object in the interactive object area on the first interactive interface, so that the sending object can know the progress of subsequent interaction through the displayed trigger progress; the trigger progress refers to the progress of triggering the subsequent interaction with the target object, and the display form of the trigger progress is, for example, a progress bar, a current value/target value, and the like. It is easy to know that the trigger progress of the initial display is 0.
Exemplarily, referring to fig. 11b, fig. 11b is a schematic diagram illustrating an exemplary trigger progress provided by an embodiment of the present application; as shown in fig. 11b, an interacted object list 11-21 is displayed on the first interaction interface 11-2, and a target object avatar 11-22, a trigger progress 11-23, prompt information 11-24 ("knock difference 2.5 ten thousand heart movement value") corresponding to the trigger progress 11-23 and an add-interacted-object control 11-25 are displayed in the interacted object list 11-21. In addition, a list 11-22 of objects to be interacted with, including a recommended anchor and a whiteware king, is displayed on the first interaction interface 11-2, and an object search control 11-26 is also displayed.
S5052, the first interactive device receives continuous interactive operation acting on the displayed triggering progress, responds to the continuous interactive operation, and displays a continuous interactive interface comprising the triggering progress.
In the embodiment of the application, the displayed trigger progress is touchable; thus, when the sending object touches the displayed trigger progress in order to continue interacting with the target object, the first interaction device receives a continuous interaction operation acting on the displayed trigger progress. At this time, the first interaction device responds to the continuous interaction operation and displays a continuous interaction interface (for example, an interface for watching a live video of the target object), so that the sending object continues to interact with the target object through the continuous interaction interface, thereby triggering the subsequent interaction.
It should be noted that the continuous interactive interface may be an interface displayed on the first interactive interface, or may be a new interface skipped from the first interactive interface; when the continuous interactive interface is the interface displayed on the first interactive interface, the trigger progress is still displayed on the first interactive interface; and when the continuous interactive interface is a new interface jumped from the first interactive interface, the continuous interactive interface also shows a trigger progress.
S5053, the first interactive device receives second interactive operation acting on the second interactive control on the continuous interactive interface, responds to the second interactive operation, and sends third interactive information to the target object.
It should be noted that a second interactive control (for example, a message sending control, an attachment selection control, a sharing control, and the like) is displayed on the continuous interaction interface, and when the sending object touches the second interactive control on the continuous interaction interface to continue interacting with the target object, the first interactive device also receives the second interaction operation; at this time, the first interactive device responds to the second interaction operation, obtains third interaction information, and sends the third interaction information to the interaction server, so that the interaction server sends the third interaction information to the target object in the second sending mode, and the trigger progress is advanced through the sending of the third interaction information.
Here, the second interactive control includes at least one touchable control, for example at least one of a text entry box, an attachment selection link, a video push control, and the like; the second interaction operation is at least one of a video browsing operation, an attachment sending operation and a video pushing operation, and may be an operation for triggering the second interactive control, such as clicking, double-clicking, long-pressing or sliding; the third interaction information is interaction information for advancing the trigger progress, and may be at least one of text, video, audio, picture, accessory and other information.
S5054, the first interactive equipment updates the trigger progress according to the third interactive information, and when the updated trigger progress reaches a preset trigger position, the interface entry control is displayed.
In the embodiment of the application, the first interactive device interacts with the interaction server to obtain the progress value (for example, a "whitelist value") corresponding to the interaction information within a preset period (for example, 1 minute), and then updates the trigger progress according to the progress value; here, the interaction information within the preset period belongs to the third interaction information. As the trigger progress is continuously updated according to the third interaction information, if the updated trigger progress reaches the preset trigger position, the subsequent interaction between the sending object and the target object is triggered, and the interface entry control is displayed.
It should be noted that, when the updated trigger progress reaches the preset trigger position, the first interaction device interacts with the interaction server, so that the interaction server creates new interaction data (for example, whitelisted live broadcast room), the first interaction device obtains the new interaction data created by the interaction server, and displays an interface entry control (for example, an address link, a page jump button, and the like) on the continuous interaction interface, so that the new interaction data is displayed through the touch interface entry control, and subsequent interaction is performed.
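The progress-update logic of S5054 can be sketched as follows; the per-period progress values and the trigger position are plain integers here, and the names are illustrative:

```python
def update_trigger_progress(progress: int, period_values, trigger_position: int):
    """Advance the trigger progress by the per-period progress values
    (e.g. per-minute values derived from the third interaction
    information) and report whether the preset trigger position is hit.

    Returns (final_progress, triggered).
    """
    for value in period_values:
        progress += value
        if progress >= trigger_position:
            # Trigger position reached: the server would create the new
            # interaction data (e.g. a live room) and the client would
            # display the interface entry control.
            return progress, True
    return progress, False
```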
In addition, the first interactive device can also display third interactive information on the continuous interactive interface so that the sending object can acquire the interactive information between the sending object and the target object.
It is easy to know that, when the trigger progress has not reached the preset trigger position, if the sending object re-enters the first interactive interface, the triggered interaction object is still pulled from the cache or the database through the interaction server, and the trigger progress corresponding to the target object is displayed on the first interactive interface.
In the embodiment of the application, in the process of generating the third interactive information, the sending object and the first interactive device may also end the information interaction process.
S5055, the first interactive equipment receives an entering operation acted on the interface entrance control, responds to the entering operation and displays a second interactive interface.
It should be noted that, when the sending object performs touch control on the displayed interface entry control, the first interaction device also receives an entry operation acting on the interface entry control, responds to the entry operation, and jumps from the continuous interaction interface to the second interaction interface; the second interactive interface is an interface for interaction between the sending object and the interactive object.
Here, when the interaction confirmation information is the agreement interaction information, it indicates that the target object agrees to interact with the transmission object, so that the first interaction device displays the interface entry control on the first interaction interface, receives an entry operation acting on the interface entry control, and displays the second interaction interface in response to the entry operation.
Accordingly, in the embodiment of the present application, S508 may be implemented by S5081 or S5082, where:
and S5081, when the interaction confirmation information is continuous interaction information, the second interaction equipment plays the real-time video stream on the fourth interaction interface, receives third interaction information sent by the sending object in the real-time video stream playing process, displays the third interaction information on the fourth interaction interface, and displays the fifth interaction interface based on the third interaction information.
It should be noted that, when the interaction confirmation information is the continuous interaction information, the second interaction device continues to execute processing on the fourth interaction interface on the target object side; when the target object is a main broadcast, playing the real-time video stream on a fourth interactive interface, receiving third interactive information sent by the sending object in the real-time video stream playing process, and displaying the third interactive information on the fourth interactive interface; here, the second interactive device interacts with the interactive server based on the third interactive information to display the fifth interactive interface, and interacts with the sending object through the fifth interactive interface. In addition, the real-time video stream is the video information generated by live broadcasting of the target object.
And S5082, when the interaction confirmation information is the agreement interaction information, the second interaction equipment displays a fifth interaction interface.
It should be noted that, when the target object touches the agreement interaction control (for example, the "accept whiteout" button) for the displayed first interaction information, the second interaction device directly displays the fifth interaction interface, so as to directly interact with the sending object through the fifth interaction interface.
It can be understood that when the sending object triggers the interaction with the target object, the sending object triggers the subsequent interaction by continuing the interaction, so that a layer-by-layer progressive interaction mode is realized, the interaction mode is novel, and the interaction effect is improved. In addition, the embodiment of the application combines interaction and live broadcast, so that the diversity of interaction modes is realized.
In the present embodiment, S5082 is followed by S5083 and S5084; that is, when the interaction confirmation information is the agreement interaction information, and after the second interaction device displays the fifth interaction interface, the information interaction method further includes S5083 and S5084, which are described below.
S5083, the second interaction device displays a second preset time schedule on the fifth interaction interface.
It should be noted that, when the interaction confirmation information is the agreement interaction information, the interaction server triggers a countdown service, so that the second interaction device displays a second preset time schedule on the fifth interaction interface, and timing is realized through the second preset time schedule.
In addition, when the interactive server starts the countdown service, the interactive server also informs the sending object through the first interactive device: interaction with the target object has been triggered; for example, the triggered interactive address information is pushed to the sending object through a short message and notification bar information.
And S5084, when the second preset time progress reaches a second preset time point and the sending object is in a non-entering state, the second interactive equipment closes the fifth interactive interface and displays the fourth interactive interface.
It should be noted that the second preset time point is the ending time point of waiting for the sending object to participate in the triggered interaction; the non-entered state refers to a state in which the sending object has not joined the triggered interaction. In addition, when the second interaction device closes the fifth interaction interface and displays the fourth interaction interface, the interaction server destroys the processing information related to the triggered interaction, such as the data corresponding to the opened live broadcast room and the processes related to the triggered interaction.
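The countdown decision of S5083-S5084 can be sketched as a pure function; the interface names follow the text, while the timestamp representation is an illustrative assumption:

```python
def countdown_check(deadline: float, now: float, sender_entered: bool) -> str:
    """Decide which interface the second interaction device shows once the
    countdown service fires.  `deadline` is the second preset time point.
    """
    if now >= deadline and not sender_entered:
        # The sending object never joined: close the fifth interface,
        # destroy the triggered interaction data, fall back to the
        # fourth interface.
        return "fourth_interface"
    return "fifth_interface"
```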
In this embodiment, after S5054, in which the first interactive device updates the trigger progress according to the third interaction information, the information interaction method further includes S5056 and S5057, which are described below.
S5056, when the updated trigger progress reaches the preset trigger position, the first interaction device sends interface entry information to other objects, so that the other objects display a third interaction interface through the interface entry information in a preset interaction mode.
It should be noted that, when subsequent interaction between the sending object and the target object is triggered, the first interaction device sends interface entry information to other objects through the interaction server by interacting with the interaction server, so that the other objects display the third interaction interface through the interface entry information in a preset interaction manner.
Here, the other objects are objects except the sending object and the target object in all objects in the information interaction system corresponding to the information interaction application; the interface entry information is entry information of other objects participating in subsequent interaction between the sending object and the target object, for example, address information participating in the subsequent interaction between the sending object and the target object; the third interactive interface is an interface which is displayed by other object sides and is related to the subsequent interaction between the sending object and the target object; the preset interaction mode is a mode in which other objects participate in subsequent interaction between the sending object and the target object, such as a watching participation mode, a comment participation mode, and a watching and gift participation mode.
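The set of "other objects" can be expressed as a simple filter over all objects in the information interaction system; a one-function sketch (the helper name is hypothetical):

```python
def other_objects(all_objects, sending_object, target_object):
    # all objects in the information interaction system except the
    # sending object and the target object
    return [o for o in all_objects if o not in (sending_object, target_object)]
```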
It is easy to know that the second interactive interface, the third interactive interface and the fifth interactive interface are interfaces respectively displayed on the sides of the sending object, the other objects and the target object at the same time and related to the subsequent interaction of the sending object and the target object.
S5057, the first interactive equipment displays other objects on the second interactive interface based on the interactive participation operation of the other objects on the interface entry information.
It should be noted that when other objects show the third interactive interface through the interface entry information, other objects are displayed on the second interactive interface; and other objects are also displayed on the fifth interactive interface.
In addition, in S504 of this embodiment, after the second interactive device responds to the interactive confirmation operation and obtains the interactive confirmation information, the information interaction method further includes: when the interaction confirmation information is agreement interaction information, the second interaction equipment sends interface entry information to other objects so that the other objects display a third interaction interface through the interface entry information in a preset interaction mode; and displaying other objects on the fifth interactive interface based on the interactive participation operation of the other objects on the interface entrance information.
It should be noted that, when the target object touches the agreement interaction control (for example, the "accept the whitelist" button) for the displayed first interaction information, the second interaction device also issues a global notification to the other objects, so that the other objects can participate in the interaction between the sending object and the target object.
It can be understood that, because the sending object and the target object trigger the interaction and when the subsequent interaction is triggered, other objects can also participate in, a multi-object participation mode is realized, the interaction mode is diversified, and the interaction effect can be improved.
In the embodiment of the application, a first preset time progress and an interaction progress are also displayed on the second interaction interface; at this time, S507 may be implemented by S5071-S5074; that is, the first interactive device displays the target interaction result corresponding to the target object on the result display interface based on the second interaction information, so as to complete the interaction with the target object, including S5071-S5074, which are described below.
And S5071, the first interactive device updates the interactive progress based on the second interactive information.
It should be noted that, the first interactive device periodically updates the interaction progress based on the interaction information (belonging to the second interaction information) of the sending object and the target object. Here, the interaction progress is a progress of the transmission object interacting with the target object to form an interaction combination (team).
It should be further noted that, when other objects are also displayed on the second interactive interface, information corresponding to participation operations executed by the other objects through the third interactive interface may also be used to update the interactive progress; for example, the popularity value and the gift value corresponding to other objects can also be used to update the interaction progress.
And S5072, when the first preset time progress reaches a first preset time point and the updated interaction progress reaches a preset combination position, the first interaction equipment determines a target interaction result successfully combined with the target object, and displays a result display interface.
It should be noted that the first interaction device is preset with a time period (for example, 52 minutes), the end of which corresponds to the first preset time point on the time progress bar, and a combination position is preset on the interaction progress bar. When the first preset time progress reaches the first preset time point, the time is up; if the updated interaction progress has also reached the preset combination position, it is determined that the sending object and the target object are successfully combined, so that a target interaction result of the successful combination of the sending object and the target object is obtained; at this time, the display of the result display interface is triggered.
In addition, when the first preset time progress reaches the first preset time point, if the updated interaction progress does not reach the preset combination position, the combination failure of the sending object and the target object is determined, so that a target interaction result of the combination failure of the sending object and the target object is obtained, and the information interaction process is ended.
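The success/failure decision described above can be condensed into one function; a minimal sketch, assuming numeric progress values (all names are illustrative):

```python
def determine_target_result(time_progress, first_time_point,
                            interaction_progress, combination_position):
    """Sketch of S5072 and its failure branch: once the first preset time
    point is reached, compare the updated interaction progress with the
    preset combination position."""
    if time_progress < first_time_point:
        return None                     # time is not up yet; keep interacting
    if interaction_progress >= combination_position:
        return "combination_success"    # trigger the result display interface
    return "combination_failure"        # end the information interaction flow
```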
In the embodiment of the application, the preset combination position is a position corresponding to information set in real time by the second interactive device according to the target object. That is, before the second interactive device displays the fifth interactive interface, the information interaction method further includes S5081 and S5082, which are described below.
S5081, the second interactive device displays an interaction value setting interface.
It should be noted that, before the fifth interactive interface is displayed by the second interactive device, the interactive value setting interface is displayed first, so that the target object performs setting of the interactive value through the interactive value setting interface.
S5082, the second interactive device receives a setting operation acting on the interaction value setting control on the interaction value setting interface.
In the embodiment of the application, an interaction value setting control (for example, an interaction value input box or an interaction value selection box, a "ok" button, etc.) is displayed on the interaction value setting interface, and when the target object performs touch control on the interaction value setting control to set the interaction value, the second interaction device also receives a setting operation acting on the interaction value setting control.
Correspondingly, the second interactive device displays a fifth interactive interface, including: the second interactive device responds to the setting operation, obtains a value to be interacted, and displays a fifth interactive interface including a preset combination position based on the value to be interacted. Here, the value to be interacted is the interaction value set by the target object through the interaction value setting control.
And S5073, calculating a target interaction value corresponding to the second interaction information by the first interaction equipment based on a preset calculation rule, and determining a target display position of the target interaction result on the result display interface according to the target interaction value.
In this embodiment of the application, the first interaction device is preset with a preset calculation rule, in which different types of interaction information in the second interaction information correspond to different interaction values (for example, sending a gift worth 1 corresponds to an interaction value of 1, and watching one round of the anchor's live broadcast corresponds to an interaction value of 1, etc.), and the rule is used for calculating the target interaction value corresponding to the second interaction information. Therefore, the first interaction device can determine the target interaction value corresponding to the second interaction information based on the preset calculation rule; when the result display interface is a ranking list of interaction results, the display position of the target interaction result, namely the target display position, is determined by the target interaction value.
In addition, the first interaction device can also determine a target interaction value by combining interaction values corresponding to the participation interaction information of other objects.
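The "preset calculation rule" can be sketched as a lookup table mapping interaction types to interaction values. The weights below only mirror the examples in the text (gift value 1 counts as 1, one watched round counts as 1); everything else is an assumption:

```python
# illustrative rule: each event kind maps to an interaction value per unit
PRESET_RULE = {"gift_value": 1, "watch_round": 1}

def target_interaction_value(events):
    """Sum the interaction values for second-interaction-information events,
    optionally including other objects' participation events."""
    return sum(PRESET_RULE.get(kind, 0) * count for kind, count in events)
```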
S5074, the first interaction device displays the target interaction result at the target display position of the result display interface, thereby completing the interaction with the target object.
In the embodiment of the application, after the first interaction device determines the target display position of the target interaction result on the result display interface, the target interaction result is displayed at the target display position of the result display interface.
In this embodiment of the application, after S5072, in which the first interaction device determines the target interaction result of successful combination with the target object, the information interaction method further includes:
S5075, updating both the first interaction identifier of the sending object and the second interaction identifier of the target object to the preset interaction identifier.
That is, after the sending object and the target object are successfully combined, the first interaction device is further configured to update the first interaction identifier of the sending object and the second interaction identifier of the target object, and perform the next information interaction according to the updated first interaction identifier and the updated second interaction identifier; that is, in the next information interaction, neither the sending object nor the target object can perform information interaction any more. Here, the preset interactive mark refers to a mark corresponding to a successful combination.
Accordingly, in the embodiment S501 of the present application, the first interactive device responds to the interactive trigger operation to display an interactive trigger interface, which includes S5014 and S5015, and the steps are described below.
S5014, the first interaction device responds to the interaction triggering operation and obtains a first interaction identifier corresponding to the sending object and a second interaction identifier of the target object.
In the embodiment of the application, before the interaction between the sending object and the target object is triggered, a first interaction identifier corresponding to the sending object and a second interaction identifier of the target object are obtained, so as to judge whether the interaction between the sending object and the target object is triggered according to the first interaction identifier and the second interaction identifier.
S5015, when the first interactive identification and the second interactive identification are not matched with the preset interactive identification, the first interactive equipment displays an interactive trigger interface.
It should be noted that, when the first interaction identifier and the second interaction identifier are not matched with the preset interaction identifier, it indicates that the sending object and the target object are not successfully interacted, and interaction can be performed, so that the first interaction device displays an interaction trigger interface, so that the sending object and the target object interact through the interaction trigger interface. And when the first interactive identification and/or the second interactive identification are matched with the preset interactive identification, the sending object and the target object cannot interact, and at the moment, the prompt information is displayed, and the information interaction process is ended.
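The identifier check of S5014 and S5015 amounts to a simple gate; a minimal sketch (the identifier values are hypothetical):

```python
PRESET_INTERACTION_ID = "combined"  # identifier meaning "already successfully combined"

def can_show_trigger_interface(first_id, second_id):
    # the interaction trigger interface is displayed only when neither
    # identifier matches the preset interaction identifier
    return PRESET_INTERACTION_ID not in (first_id, second_id)
```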
In the embodiment of the present application, referring to fig. 12, fig. 12 is a schematic view illustrating another optional flow chart of an information interaction method provided in the embodiment of the present application; as shown in fig. 12:
S1201, the service device receives first interaction information sent by the sending object to the target object, and sends the first interaction information to the target object;
S1202, the service device receives interaction confirmation information sent by the target object for the first interaction information, and sends the interaction confirmation information to the sending object;
S1203, the service device receives second interaction information sent by the sending object for the interaction confirmation information, and sends the second interaction information to the target object;
S1204, the service device determines a target interaction result of the sending object and the target object based on the second interaction information;
S1205, the service device sends the target interaction result to the sending object;
S1206, the service device sends the target interaction result to the target object.
In this embodiment of the application, before the service device determines the target interaction result between the sending object and the target object based on the second interaction information, the information interaction method further includes: the service device acquires the value to be interacted set by the target object.
Correspondingly, the service device determines a target interaction result between the sending object and the target object based on the second interaction information, and the method comprises the following steps: the service equipment determines an interaction value corresponding to the second interaction information according to the corresponding relation between the interaction information and the interaction value to obtain a target interaction value; when the target interaction value is larger than or equal to the value to be interacted, generating target combination information; and acquiring other combined information, and inserting the target combined information into other combined information based on the target interaction value to obtain a target interaction result.
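The server-side step — compare the target interaction value with the value to be interacted, generate combination information, and insert it among the other combination information by value — can be sketched as follows (names are illustrative; a descending ranking is assumed):

```python
import bisect

def make_target_result(target_value, value_to_interact, other_combinations):
    # no combination information is generated when the target interaction
    # value does not reach the value set by the target object
    if target_value < value_to_interact:
        return None, other_combinations
    entry = {"value": target_value}
    # other_combinations is assumed sorted by value in descending order;
    # insert the new combination so the ranking stays descending
    keys = [-c["value"] for c in other_combinations]
    pos = bisect.bisect_left(keys, -target_value)
    ranking = other_combinations[:pos] + [entry] + other_combinations[pos:]
    return entry, ranking
```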
In this embodiment of the application, the service device determines, according to a correspondence between the interaction information and the interaction value, an interaction value corresponding to the second interaction information, and obtains a target interaction value, including: the service equipment receives fourth interaction information sent to the target object by other objects; and taking the interaction value corresponding to the second interaction information and the interaction value corresponding to the fourth interaction information as target interaction values according to the corresponding relation between the interaction information and the interaction values.
In this embodiment of the application, before the service device receives first interaction information sent by the sending object to the target object and sends the first interaction information to the target object, the method further includes: the service equipment acquires objects corresponding to all preset object types based on a preset object type set; combining the objects corresponding to the preset object categories into an initial object set to be interacted; selecting an object set to be interacted from the initial object set to be interacted based on the preset object proportion and the preset object quantity; and sending the set of objects to be interacted to the first interaction equipment so that the first interaction equipment displays the set of objects to be interacted in a list form.
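The pre-step of assembling the object set to be interacted can be sketched as category-wise sampling; how the preset object proportion and the preset object number combine is an assumption here:

```python
import random

def build_interaction_list(objects_by_category, proportions, total):
    """Sketch: gather objects per preset category, then sample a set to be
    interacted according to a per-category proportion and a total number."""
    selected = []
    for category, objs in objects_by_category.items():
        k = min(len(objs), round(proportions.get(category, 0) * total))
        selected.extend(random.sample(objs, k))  # sample without replacement
    return selected[:total]
```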
Next, an exemplary application of the embodiment of the present application in a practical application scenario will be described.
Referring to fig. 13, fig. 13 is a schematic diagram illustrating an exemplary information interaction flow provided by an embodiment of the present application; as shown in fig. 13, the exemplary information interaction process includes:
and S1301, displaying an interface (a first interactive interface) corresponding to the whitelisted square, wherein a main play list (an object list to be interacted) and a whitelisted main play list (information displayed in an interacted object area) which can be whitelisted by a user are displayed on the interface corresponding to the whitelisted square.
It should be noted that, when a user (a sending object) enters a whitepack square from a home page of an information interaction application through a terminal (a first interaction device), the first interaction device also displays an interface corresponding to the whitepack square; the whitelisted anchor list is a list corresponding to an anchor for which the user has sent whitelisted information (first interaction information).
Referring to fig. 14, a main playlist 14-11 whiteable for the user and a whited main playlist 14-12 are displayed on the interface 14-1 corresponding to the whitespace; the anchor list 14-11 for the user to form the user's whitelist includes recommended anchors ("anchor 1", "anchor 2", "anchor 3", and more) and whitelist queens ("anchor 4", "anchor 5", "anchor 6", and more), wherein each anchor is displayed in a manner of combining an anchor avatar, anchor correspondence information (number of whites, nickname, signature, and the like), and an "anonymous whitelist" button. In addition, the main broadcasting can be searched for whitewashing through a search icon 14-13 on the interface 14-1 corresponding to the whitewashing square.
S1302, receiving a click operation (interaction trigger operation) on the "anonymous whitelist" button (interaction trigger control) corresponding to "anchor 2" (the target object) in the anchor list available for the user to whitelist, responding to the click operation by blurring the interface corresponding to the whitelist square, and displaying a whitelist information editing interface (interaction trigger interface) below the interface corresponding to the whitelist square.
Referring to fig. 15, a whitelist information editing interface 15-1 is displayed below (in a preset interaction trigger area) the interface 15-2 corresponding to the whitelist square, where the whitelist information editing interface 15-1 includes a title 15-11: the avatar of "anchor 2" and "write a whitelist message to anchor 2"; it further includes a text editing field 15-12 (message editing control), on which a prompt message (not shown in the figure) can be displayed, a link 15-13 (attachment adding control) for "adding a gift", a "send anonymous whitelist" button 15-14 (message sending control), and a "rules" link 15-15. In addition, the "send anonymous whitelist" button 15-14 may be displayed in a non-touchable form when no text has been edited and no gift has been selected.
S1303, sending the whitelist information (first interaction information) to "anchor 2" through the whitelist information editing interface.
When the text "Guess who I am; I am the one who misses you in your live room every day and finally worked up the courage to express my feelings to you" is entered in the text editing field 15-12 in fig. 15, and the gift is selected, referring to fig. 16, the text (editing information) is displayed in a region 16-1 (information display region) corresponding to the text editing field 15-12 of the whitelist information editing interface 15-1, and the selected gift "gift icon, one microphone, may you become famous soon" is displayed in a region 16-2 (attachment display region) corresponding to the "add a gift" link 15-13 of the whitelist information editing interface 15-1; at this time, the display form of the "send anonymous whitelist" button 15-14 changes to a touchable form. As can be readily seen, the whitelist information includes the text and the gift.
S1304, when "anchor 2" clicks the "accept the whitelist" button (interaction confirmation control), displaying the interface (second interactive interface) corresponding to the dedicated live broadcast room.
S1305, when "anchor 2" clicks the "wait for the pop" button (interaction confirmation control), adding "anchor 2" to the whitelisted anchor list on the interface corresponding to the whitelist square; receiving interaction information (third interaction information) between the user and "anchor 2", and displaying the interface corresponding to the dedicated live broadcast room when the heartbeat value (the value corresponding to the trigger progress) corresponding to the interaction information reaches the preset heartbeat value (the value corresponding to the preset trigger position).
Referring to fig. 17, after receiving the whitelist information sent by the user, "anchor 2" displays the whitelist information through a pop-up box 17-11 on a live broadcast room interface 17-1 (a second interactive interface); a whitelist icon 17-12 and a text 17-13 are displayed on the pop-up box 17-11: "Congratulations, you have received an anonymous whitelist message!", together with an "accept the whitelist" button 17-17 and a "wait for the pop" button 17-14. A whitelist identifier 17-15 and corresponding prompt information 17-16 are also displayed on the live broadcast room interface 17-1: "The received whitelist message is here". The live broadcast room interface 17-1 also displays the live broadcast picture, live broadcast interaction information, and other related live broadcast information.
When the "accept-to-speak" button is clicked by the "anchor 2", referring to fig. 18, on the terminal used by the user, a dedicated live-broadcast-room interface 18-1 is displayed, and the dedicated live-broadcast-room interface 18-1 includes a live-broadcast video screen 18-11, a countdown 18-12 (a first preset time schedule), a team progress bar 18-13 (an interaction schedule), a gift icon 18-14, a text edit box 18-15, a sharing icon 18-16, audience information 18-17 (other objects), and other information.
And S1306, according to the interaction information (second interaction information) between the user and the "anchor 2" on the interface corresponding to the dedicated live broadcast room, determining success (combination success) or failure (combination failure) of the user and the "anchor 2" interaction team.
In the following, an exemplary application provided by the embodiments of the present application is continued. Referring to fig. 19, fig. 19 is a schematic diagram illustrating an exemplary terminal processing flow provided in an embodiment of the present application; as shown in fig. 19, the exemplary terminal processing flow includes:
S1901, starting; that is, the user starts the information interaction whitelist process through the terminal 400.
S1902, the whitelist square (first interactive interface) is displayed.
S1903, the selected anchor (target object) is acquired.
S1904, judging whether the user and/or the selected anchor have already formed a team; that is, judging whether the first interaction identifier of the sending object and/or the second interaction identifier of the target object match the preset interaction identifier; if yes, execute S1928; if no, execute S1905.
S1905, acquiring the document (editing information) and the gift (target attachment).
S1906, judging whether the balance (current value information) is sufficient; that is, judging whether the balance is greater than or equal to the price of the selected gift (target value information); if yes, execute S1907; if no, execute S1908.
S1907, anonymously (first sending mode) sending the whitelist message (first interaction information) including the text and the gift to the selected anchor; execute S1909.
That is, after the anchor is selected, the whitelist message and the gift identifier indicating the gift to be given are transmitted to the server 300, and the server 300 delivers the whitelist message and the gift to be given to the terminal 200.
S1908, displaying a recharging interface (value increasing interface) and recharging; the balance is recharged through the recharging interface, the recharging is completed, and S1906 is executed.
S1909, judging whether the selected anchor accepts the whitelist; if yes, execute S1910; if no, execute S1912.
S1910, judging whether the user is in an online state, and if so, executing S1911; when no, S1916 is executed.
It should be noted that, after the anchor clicks "accept the whitelist", the server 300 starts a 15-minute timing service and a 52-minute timing service, so as to determine whether the user enters the dedicated live broadcast room within 15 minutes, and to count the whitelist value within 52 minutes.
S1911, displaying an exclusive live broadcast room entrance (interface entrance control); execution proceeds to S1915.
S1912, displaying the heartbeat value progress bar (trigger progress).
S1913, continuing to interact; that is, the third interaction information is acquired through continued interaction.
S1914, when the heartbeat value progress bar is full (the updated trigger progress reaches the preset trigger position), the heart knock occurs; S1911 and S1918 are executed.
S1915, when the second preset time is up, judging whether a special live broadcast room interface is displayed or not; when no, execute S1928; when yes, S1919 is executed.
S1916, sending a notification, so as to inform the user that the selected anchor has accepted the whitelist.
S1917, changing the state to an online state; s1911 is executed.
S1918, notifying other users and the anchor; execution proceeds to S1919.
S1919, displaying the other users and the anchor (other objects), a whitelist progress bar (interaction progress), a chat countdown (first preset time progress), a romantic background, and a congratulation animation on the dedicated live broadcast room interface; S1927 and S1920 are executed, or S1921 is executed.
S1920, carrying out interactive processing (sending second interactive information) such as gift sending, live broadcast sharing and live broadcast watching through a dedicated live broadcast room interface; s1921 or S1922 is performed.
S1921, giving up; s1928 is performed.
S1922, judging whether the first preset time is reached; if so, perform S1923; when no, S1920 and S1927 are performed.
S1923, acquiring the whitelist value (target interaction value).
S1924, judging whether the whitelist value is greater than the preset value (the value corresponding to the preset combination position); if yes, execute S1925; if no, execute S1928.
S1925, determining that the team formation is successful.
S1926, displaying the team on the ranking list interface (result display interface).
S1927, a calling notification is sent to other users and the anchor; to enable other users and anchor to present and watch; s1923 is performed.
S1928, ending; that is, the terminal 400 ends the information interaction whitelisting process.
In the following, an exemplary application provided by the embodiments of the present application is continued. Referring to fig. 20, fig. 20 is a schematic diagram of another exemplary terminal processing flow provided in the embodiment of the present application; as shown in fig. 20, the exemplary terminal processing flow includes:
S2001, starting; that is, the user starts the information interaction process through the terminal 200.
S2002, popping up the whitelist message (displaying the first interaction information in the first display mode); S2003 or S2006 is executed.
S2003, determining to accept the form (when the interaction confirmation information is the agreement interaction information); s2004 or S2005 is executed.
S2004, notifying other users and the anchor; s2008 is executed.
S2005, acquiring a set preset value (a value to be interacted); s2008 is executed.
S2006, determining to wait for the pop (when the interaction confirmation information is the continued-interaction information).
S2007, popping the heart; s2005 is performed.
S2008, displaying the other objects, the whitelist progress bar, the chat countdown, the romantic background, and the congratulation animation on the dedicated live broadcast room interface (fifth interactive interface).
S2009, judging whether the first preset time is reached; when so, S2010 is executed; and if not, continuously judging whether the first preset time is reached or not.
S2010, acquiring the whitelist value (target interaction value).
S2011, judging whether the whitelist value is greater than the preset value (the value corresponding to the preset combination position); if yes, execute S2012; if no, execute S2014.
S2012, determining that the team formation is successful.
S2013, displaying the team on a ranking list interface (result display interface).
S2014, finishing; that is, the terminal 200 ends the information interaction process.
In the following, an exemplary application provided by the embodiments of the present application is continued. To implement the information interaction flow shown in fig. 19 and fig. 20, referring to fig. 21, fig. 21 is a schematic diagram of an exemplary logic architecture provided by an embodiment of the present application; as shown in fig. 21, the interactive server includes a push service module 21-1, an interactive service module 21-2, a statistical clearing self-driven module 21-3, a database 21-4 and a cache 21-5. Wherein:
the push service module 21-1 is used for indicating a module for pushing the table message to the user and the anchor, maintaining a push list and pushing the table message in a first-in first-out sequence; the system comprises a first interaction device, a second interaction device and a server, wherein the first interaction device is used for pushing a table white message (first interaction information) of a user (a sending object) to a terminal (second interaction device) corresponding to a main broadcast (a target object); the system is also used for acquiring a dedicated live broadcast room from the interactive service module 21-2 and pushing information corresponding to the dedicated live broadcast room to a user; and is further configured to push the knock result to the terminal when the knock (the updated trigger progress reaches the preset trigger position) occurs, so as to display the second interactive interface in the terminal 400 and the fifth interactive interface in the terminal 200.
The interactive service module 21-2 is a service module for the terminal to call, and is used for acquiring, from the cache 21-5, the list of anchors available for the user to confess to and the list of anchors already confessed to. The data corresponding to these two lists includes the identifier of the confessed anchor, the dedicated live broadcast room, the confession value (value to be interacted) of the current confession live broadcast room, the time at which the anchor clicked "wait for the light burst", the time at which the anchor clicked "accept the confession", the state of whether the heartbeat value is full, and the like. The module is also used for managing the dedicated live broadcast room: when the confessed anchor accepts the confession or a light burst occurs, the dedicated live broadcast room is generated according to the anchor identifier, the user identifier and the timestamp information, the data of the dedicated live broadcast room is stored in the database 21-4, and the dedicated live broadcast room is destroyed when team formation succeeds. The module is further configured to obtain the team list from the cache 21-5, send the team list to the terminal for presentation, and periodically (for example, every other day) update the team list and store it in the cache 21-5. When the team list is updated, the data of users whose team formation has succeeded is read directly from the database, and the total score used for sorting is calculated as 30% of the popularity value plus 70% of the confession value.
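The 30%/70% weighting used to order the team list can be written down directly. The field names below are our own assumptions for illustration; only the weights come from the text.

```python
def team_list_sort_key(entry: dict) -> float:
    """Total score for ranking: 30% of the popularity value plus
    70% of the confession value, as described for the team list update."""
    return 0.3 * entry["popularity_value"] + 0.7 * entry["confession_value"]


def sort_team_list(entries: list[dict]) -> list[dict]:
    # Highest combined score first, as presented on the ranking list interface.
    return sorted(entries, key=team_list_sort_key, reverse=True)
```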
The statistics and settlement self-driven module 21-3 is used for counting the heartbeat value and the confession value from the data in the cache 21-5, so as to instruct the push service module 21-1 to push information according to the counted heartbeat value and confession value; when the heartbeat value and confession value data do not exist in the cache 21-5, the module reads the database 21-4 and stores the heartbeat value and confession value data from the database 21-4 into the cache 21-5. The module is also used for triggering a countdown service for a first preset time (for example, 15 minutes) and a second preset time (for example, 52 minutes).
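The fallback from the cache 21-5 to the database 21-4 described above is a standard read-through cache pattern. Below is a sketch with plain dictionaries standing in for the cache and the database; the class and key names are our assumptions.

```python
class StatsModule:
    """Sketch of module 21-3: read heartbeat/confession values from the
    cache, falling back to the database and back-filling the cache on a miss."""

    def __init__(self, cache: dict, database: dict) -> None:
        self.cache = cache        # stands in for cache 21-5
        self.database = database  # stands in for database 21-4

    def get_value(self, key: str) -> int:
        if key in self.cache:                 # fast path: cache hit
            return self.cache[key]
        value = self.database.get(key, 0)     # miss: read the database
        self.cache[key] = value               # back-fill the cache
        return value
```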
The database 21-4 is connected with the cache 21-5, and the data in the corresponding data table includes the user identifier, the list of anchors available for the user to confess to, the list of confessed anchors, the team list, the heartbeat value, the confession value, and the like.
The cache 21-5 is used for caching the list of anchors available for the user to confess to, the list of confessed anchors, the team list, and the heartbeat and confession values.
It will be appreciated that, through anonymous confession, a user can confess to an anchor; a surprise is thus brought to the anchor while the user is spared the embarrassment of being refused, which improves the interaction effect. In addition, by combining with live broadcasting, the interaction advances layer by layer: not only do the anchor and the user participate, but other users also join the interaction as onlookers, so the interaction modes are diversified. Moreover, since the interaction corresponds to an interaction result, the interaction frequency can be increased, the intimacy and mutual affection between the user and the anchor are improved, the anchor is helped to increase income, and the user's participation and confession experience are improved.
Continuing with the exemplary structure, provided by the embodiments of the present application, of the first information interaction device 455 implemented as software modules, in some embodiments, as shown in fig. 3, the software modules stored in the first information interaction device 455 of the first memory 450 may include:
the interaction triggering module 4551 is configured to receive, on the first interaction interface, an interaction triggering operation that acts on an interaction triggering control corresponding to the target object, and display the interaction triggering interface in response to the interaction triggering operation;
the interaction triggering module 4551 is further configured to receive, on the interaction triggering interface, an editing operation applied to an editing control, respond to the editing operation, and send first interaction information to the target object in a first sending manner;
the interaction module 4552 is configured to receive interaction confirmation information sent by the target object for the first interaction information, and display a second interaction interface based on the interaction confirmation information;
the interaction module 4552 is further configured to receive, on the second interaction interface, a first interaction operation that acts on the first interaction control, and send, in response to the first interaction operation, second interaction information to the target object in a second sending manner;
the first display module 4553 is configured to display a target interaction result corresponding to the target object on a result display interface based on the second interaction information, so as to complete interaction with the target object.
In this embodiment of the application, the interaction triggering module 4551 is further configured to display a list of objects to be interacted on the first interaction interface in a manner that the objects are combined with a triggering control; receiving the interaction triggering operation acted on the interaction triggering control corresponding to the target object in the displayed list of the objects to be interacted; and responding to the interaction triggering operation, and displaying the interaction triggering interface in a preset interaction triggering area on the first interaction interface.
In the embodiment of the application, the editing control comprises an information editing control, an attachment adding control and an information sending control, and the editing operation comprises an information editing operation, an attachment selecting operation and an information sending operation; the interactive triggering module 4551 is further configured to receive, on the interactive triggering interface, the information editing operation that acts on the information editing control, respond to the information editing operation, and display editing information in an information display area; receiving the accessory selection operation acted on the accessory adding control, responding to the accessory selection operation, and displaying a list of accessories to be selected; receiving an accessory determining operation acted on a target accessory on the displayed list of accessories to be selected, responding to the accessory determining operation, and displaying the target accessory in an accessory display area; receiving the information sending operation acted on the information sending control, responding to the information sending operation, combining the editing information and/or the target accessory into the first interactive information, and sending the first interactive information to the target object.
In this embodiment of the application, the interaction triggering module 4551 is further configured to respond to the accessory determining operation, and obtain current value information and target value information corresponding to the target accessory; and when the current value information is larger than or equal to the target value information, updating the current value information into the value difference between the current value information and the target value information, and displaying the target attachment in the attachment display area.
In this embodiment of the application, the interaction triggering module 4551 is further configured to jump to a value increasing interface when the current value information is smaller than the target value information; receiving a value increasing operation acted on a value increasing control on the value increasing interface, responding to the value increasing operation, and updating the current value information into the value sum of the current value information and the increased value information; and when the updated current value information is larger than or equal to the target value information, updating the updated current value information into the value difference between the updated current value information and the target value information, and displaying the target attachment in the attachment display area.
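The two branches described in the preceding paragraphs (deduct the target attachment's value when the current balance covers it, otherwise apply a top-up from the value increasing interface and retry) can be summarized as follows. This is a sketch; the function and parameter names are our assumptions.

```python
def settle_attachment(current: int, target: int, top_up: int = 0) -> tuple[bool, int]:
    """Return (attachment_displayed, new_balance).

    If the current value information covers the target value information,
    deduct the difference and display the attachment; otherwise add the
    increased value information (from the value increasing interface) and
    check once more."""
    if current >= target:
        return True, current - target   # deduct and display the target attachment
    current += top_up                   # value increasing operation
    if current >= target:
        return True, current - target   # updated value now covers the target
    return False, current               # still insufficient; nothing displayed
```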
In this embodiment of the application, the interaction module 4552 is further configured to, when the interaction confirmation information is continuous interaction information, display a trigger progress corresponding to the target object in an interacted object region on the first interaction interface; receiving continuous interaction operation acting on the displayed trigger progress, responding to the continuous interaction operation, and displaying a continuous interaction interface comprising the trigger progress; receiving a second interaction operation acted on a second interaction control on the continuous interaction interface, responding to the second interaction operation, and sending third interaction information to the target object, wherein the second interaction operation is at least one of video browsing operation, attachment sending operation and video pushing operation; updating the trigger progress according to the third interaction information, and displaying an interface entry control when the updated trigger progress reaches a preset trigger position; and receiving an entering operation acted on the interface entrance control, responding to the entering operation, and displaying the second interactive interface.
In this embodiment of the application, the interaction module 4552 is further configured to send interface entry information to other objects when the updated trigger progress reaches the preset trigger position, so that the other objects display a third interaction interface through the interface entry information in a preset interaction manner; and displaying the other objects on the second interactive interface based on the interactive participation operation of the other objects on the interface entrance information.
In the embodiment of the application, a first preset time progress and an interaction progress are also displayed on the second interaction interface; the first display module 4553 is further configured to update the interaction progress based on the second interaction information; when the first preset time progress reaches a first preset time point and the updated interaction progress reaches a preset combination position, determining the target interaction result successfully combined with the target object, and displaying the result display interface; calculating a target interaction value corresponding to the second interaction information based on a preset calculation rule, and determining a target display position of the target interaction result on the result display interface according to the target interaction value; and displaying the target interaction result at the target display position of the result display interface so as to finish the interaction with the target object.
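The ranking step above (compute a target interaction value by a preset calculation rule, then derive the target display position on the result display interface from it) can be sketched as follows. The patent does not specify the calculation rule itself, so a simple sum is used here purely as a placeholder.

```python
def target_display_position(second_interaction_values: list[int],
                            other_results: list[int]) -> int:
    """Placeholder rule: the target interaction value is the sum of the
    second-interaction values; the display position is its 1-based rank
    among the other results (a higher value earns a better position)."""
    target_value = sum(second_interaction_values)
    higher = sum(1 for v in other_results if v > target_value)
    return higher + 1
```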
In this embodiment of the application, the first information interaction apparatus 455 further includes an identifier updating module 4554, configured to update both the first interaction identifier of the sending object and the second interaction identifier of the target object to be preset interaction identifiers.
In this embodiment of the application, the interaction triggering module 4551 is further configured to respond to the interaction triggering operation, and obtain the first interaction identifier corresponding to the sending object and the second interaction identifier of the target object; and when the first interactive identification and the second interactive identification are not matched with the preset interactive identification, displaying the interactive trigger interface.
Continuing with the exemplary structure, provided in the embodiments of the present application, of the second information interaction device 255 implemented as software modules, in some embodiments, as shown in fig. 4a, the software modules stored in the second information interaction device 255 of the second memory 250 may include:
the information display module 2551 is configured to receive first interaction information sent by a sending object, and display the first interaction information in a first display manner on a fourth interaction interface;
an information sending module 2552, configured to receive, for the displayed first interaction information, an interaction confirmation operation that acts on an interaction confirmation control, obtain, in response to the interaction confirmation operation, interaction confirmation information, and send the interaction confirmation information to the sending object;
an interaction starting module 2553, configured to display a fifth interaction interface based on the interaction confirmation information;
an information receiving module 2554, configured to receive second interaction information sent by the sending object, and display the second interaction information in a second preset display manner on the fifth interaction interface;
a second display module 2555, configured to display a target interaction result corresponding to the sending object on a result display interface based on the second interaction information, so as to complete interaction with the sending object.
In this embodiment of the application, the interaction starting module 2553 is further configured to display the fifth interaction interface when the interaction confirmation information is agreement interaction information; and when the interaction confirmation information is continuous interaction information, playing a real-time video stream on the fourth interaction interface, receiving third interaction information sent by the sending object in the real-time video stream playing process, displaying the third interaction information on the fourth interaction interface, and displaying the fifth interaction interface based on the third interaction information.
In this embodiment of the application, the information sending module 2552 is further configured to send interface entry information to other objects when the interaction confirmation information is agreement interaction information, so that the other objects display a third interaction interface through the interface entry information in a preset interaction manner; and displaying the other objects on the fifth interactive interface based on the interactive participation operation of the other objects on the interface entry information.
In this embodiment of the application, the interaction starting module 2553 is further configured to display a second preset time schedule on the fifth interaction interface; and when the second preset time progress reaches a second preset time point and the sending object is in a non-entering state, closing the fifth interactive interface and displaying the fourth interactive interface.
In this embodiment, the second information interaction device 255 further includes a setting module 2556, configured to display an interaction value setting interface; and receiving a setting operation acted on the interactive value setting control on the interactive value setting interface.
In this embodiment of the application, the interaction starting module 2553 is further configured to respond to the setting operation, obtain a value to be interacted, and display the fifth interaction interface including the preset combination position based on the value to be interacted.
Continuing with the exemplary structure of the service device 355 implemented as software modules provided by the embodiments of the present application, in some embodiments, as shown in fig. 4b, the software modules stored in the service device 355 of the third memory 350 may include:
the information pushing module 3551 is configured to receive first interaction information sent by a sending object to a target object, and send the first interaction information to the target object;
the information pushing module 3551 is further configured to receive interaction confirmation information sent by the target object for the first interaction information, and send the interaction confirmation information to the sending object;
an information processing module 3552, configured to receive second interaction information sent by the sending object for the interaction confirmation information, and send the second interaction information to the target object;
the information processing module 3552 is further configured to determine a target interaction result between the sending object and the target object based on the second interaction information;
the information processing module 3552 is further configured to send the target interaction result to the sending object and the target object.
The embodiment of the application provides a computer-readable storage medium storing executable instructions which, when executed by a first processor, cause the first processor to execute the information interaction method applied to the first interaction device provided by the embodiment of the application; or, when executed by a second processor, cause the second processor to execute the information interaction method applied to the second interaction device provided by the embodiment of the application; or, when executed by a third processor, cause the third processor to execute the information interaction method applied to the service device provided by the embodiment of the application; for example, the information interaction method shown in fig. 5.
According to the information interaction method applied to the first interaction device, the second interaction device, or the service device provided by the embodiments of the present application, an embodiment of the present application provides a computer program product or a computer program, which includes computer instructions stored in a computer-readable storage medium. The processor of a computer device reads the computer instructions from the computer-readable storage medium and executes them, so that the computer device executes the information interaction method provided in the various optional implementations applied to the first interaction device, the second interaction device, or the service device.
In some embodiments, the computer-readable storage medium may be a memory such as FRAM, ROM, PROM, EPROM, EEPROM, flash memory, magnetic surface memory, optical disc, or CD-ROM; or may be any device including one of, or any combination of, the above memories.
In some embodiments, executable instructions may be written in any form of programming language (including compiled or interpreted languages), in the form of programs, software modules, scripts or code, and may be deployed in any form, including as a stand-alone program or as a module, component, subroutine, or other unit suitable for use in a computing environment.
By way of example, executable instructions may correspond, but do not necessarily have to correspond, to files in a file system, and may be stored in a portion of a file that holds other programs or data, such as in one or more scripts in a HyperText Markup Language (HTML) document, in a single file dedicated to the program in question, or in multiple coordinated files (e.g., files that store one or more modules, sub-programs, or portions of code).
By way of example, executable instructions may be deployed to be executed on one computing device or on multiple computing devices at one site or distributed across multiple sites and interconnected by a communication network.
In summary, according to the embodiments of the present application, the sending object can send the first interaction information to the target object to trigger interaction, and can continue to interact with the target object after the interaction is triggered, so as to obtain and display a final target interaction result corresponding to the target object. That is to say, the interaction between the sending object and the target object is achieved through multiple rounds of interaction, and the corresponding target interaction result is also displayed; an interaction mode that combines multiple rounds of interaction with an interaction result is therefore realized, and the diversity of interaction modes is improved.
The above description is only an example of the present application, and is not intended to limit the scope of the present application. Any modification, equivalent replacement, and improvement made within the spirit and scope of the present application are included in the protection scope of the present application.

Claims (15)

1. An information interaction method, comprising:
receiving an interaction triggering operation acted on an interaction triggering control corresponding to a target object on a first interaction interface, responding to the interaction triggering operation, and displaying an interaction triggering interface;
receiving an editing operation acted on an editing control on the interaction triggering interface, responding to the editing operation, and sending first interaction information to the target object;
receiving interaction confirmation information sent by the target object aiming at the first interaction information, and displaying a second interaction interface based on the interaction confirmation information;
receiving a first interactive operation acted on a first interactive control on the second interactive interface, responding to the first interactive operation, and sending second interactive information to the target object;
and displaying a target interaction result corresponding to the target object on a result display interface based on the second interaction information, so as to finish the interaction with the target object.
2. The method of claim 1, wherein the editing controls comprise an information editing control, an attachment adding control, and an information sending control, and the editing operations comprise an information editing operation, an attachment selecting operation, and an information sending operation;
the receiving, on the interaction trigger interface, an editing operation applied to an editing control, and sending first interaction information to the target object in response to the editing operation includes:
receiving the information editing operation acted on the information editing control on the interactive triggering interface, responding to the information editing operation, and displaying editing information in an information display area;
receiving the accessory selection operation acted on the accessory adding control, responding to the accessory selection operation, and displaying a list of accessories to be selected;
receiving an accessory determining operation acted on a target accessory on the displayed list of accessories to be selected, responding to the accessory determining operation, and displaying the target accessory in an accessory display area;
receiving the information sending operation acted on the information sending control, responding to the information sending operation, combining the editing information and/or the target accessory into the first interactive information, and sending the first interactive information to the target object.
3. The method of claim 2, wherein displaying the target attachment in an attachment display area in response to the attachment determination operation comprises:
responding to the accessory determining operation, and acquiring current value information and target value information corresponding to the target accessory;
when the current value information is larger than or equal to the target value information, updating the current value information into the value difference between the current value information and the target value information, and displaying the target attachment in the attachment display area;
when the current value information is smaller than the target value information, jumping to a value increasing interface;
receiving a value increasing operation acted on a value increasing control on the value increasing interface, responding to the value increasing operation, and updating the current value information into the value sum of the current value information and the increased value information;
and when the updated current value information is larger than or equal to the target value information, updating the updated current value information into the value difference between the updated current value information and the target value information, and displaying the target attachment in the attachment display area.
4. The method of any of claims 1 to 3, wherein presenting a second interactive interface based on the interactive confirmation information comprises:
when the interaction confirmation information is continuous interaction information, displaying a trigger progress corresponding to the target object in an interacted object area on the first interaction interface;
receiving continuous interaction operation acting on the displayed trigger progress, responding to the continuous interaction operation, and displaying a continuous interaction interface comprising the trigger progress;
receiving a second interaction operation acted on a second interaction control on the continuous interaction interface, responding to the second interaction operation, and sending third interaction information to the target object, wherein the second interaction operation is at least one of video browsing operation, attachment sending operation and video pushing operation;
updating the trigger progress according to the third interaction information, and displaying an interface entry control when the updated trigger progress reaches a preset trigger position;
and receiving an entering operation acted on the interface entrance control, responding to the entering operation, and displaying the second interactive interface.
5. The method of claim 4, wherein after the updating the trigger schedule according to the third interaction information, the method further comprises:
when the updated trigger progress reaches the preset trigger position, sending interface entry information to other objects so that the other objects display a third interactive interface through the interface entry information in a preset interactive mode;
and displaying the other objects on the second interactive interface based on the interactive participation operation of the other objects on the interface entrance information.
6. The method according to any one of claims 1 to 3, wherein a first preset time progress and an interaction progress are further displayed on the second interactive interface;
the displaying the target interaction result corresponding to the target object on a result display interface based on the second interaction information so as to complete the interaction with the target object includes:
updating the interaction progress based on the second interaction information;
when the first preset time progress reaches a first preset time point and the updated interaction progress reaches a preset combination position, determining the target interaction result successfully combined with the target object, and displaying the result display interface;
calculating a target interaction value corresponding to the second interaction information based on a preset calculation rule, and determining a target display position of the target interaction result on the result display interface according to the target interaction value;
and displaying the target interaction result at the target display position of the result display interface so as to finish the interaction with the target object.
7. An information interaction method, comprising:
receiving first interaction information sent by a sending object, and displaying the first interaction information on a fourth interaction interface;
aiming at the displayed first interaction information, receiving an interaction confirmation operation acted on an interaction confirmation control, responding to the interaction confirmation operation, acquiring interaction confirmation information, and sending the interaction confirmation information to the sending object;
displaying a fifth interactive interface based on the interactive confirmation information;
receiving second interaction information sent by the sending object, and displaying the second interaction information on the fifth interaction interface;
and displaying a target interaction result corresponding to the sending object on a result display interface based on the second interaction information, so as to finish the interaction with the sending object.
8. The method of claim 7, wherein displaying a fifth interactive interface based on the interactive confirmation information comprises:
when the interaction confirmation information is agreement interaction information, displaying the fifth interaction interface;
and when the interaction confirmation information is continuous interaction information, playing a real-time video stream on the fourth interaction interface, receiving third interaction information sent by the sending object in the real-time video stream playing process, displaying the third interaction information on the fourth interaction interface, and displaying the fifth interaction interface based on the third interaction information.
9. The method according to claim 7 or 8, wherein after displaying the fifth interactive interface when the interaction confirmation information is agreement interaction information, the method further comprises:
displaying a second preset time progress on the fifth interactive interface;
and when the second preset time progress reaches a second preset time point and the sending object is in a non-entering state, closing the fifth interactive interface and displaying the fourth interactive interface.
10. The method of claim 7 or 8, wherein before presenting the fifth interactive interface, the method further comprises:
displaying an interaction value setting interface;
receiving a setting operation acted on an interaction value setting control on the interaction value setting interface;
the display fifth interactive interface includes:
and responding to the setting operation, acquiring a value to be interacted, and displaying the fifth interaction interface comprising a preset combination position based on the value to be interacted.
11. An information interaction method, comprising:
receiving first interaction information sent to a target object by a sending object, and sending the first interaction information to the target object;
receiving interaction confirmation information sent by the target object aiming at the first interaction information, and sending the interaction confirmation information to the sending object;
receiving second interaction information sent by the sending object aiming at the interaction confirmation information, and sending the second interaction information to the target object;
determining a target interaction result of the sending object and the target object based on the second interaction information;
and sending the target interaction result to the sending object and the target object.
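Claim 11 describes a server-side relay: the service device forwards each message between the sending object and the target object, then derives a target interaction result and sends it to both parties. The sketch below models that four-step flow with an in-memory stand-in for the service device; all class, method, and message names are assumptions for illustration, and the result rule is a placeholder.

```python
from collections import deque


class Server:
    """Minimal in-memory stand-in for the service device."""

    def __init__(self):
        self.inbox = {}   # messages waiting at the server, keyed by peer
        self.outbox = {}  # messages delivered by the server, keyed by peer

    def receive(self, peer):
        return self.inbox[peer].popleft()

    def send(self, peer, message):
        self.outbox.setdefault(peer, []).append(message)

    def compute_result(self, second_info):
        # Placeholder rule: derive the result from the second info.
        return {"result": second_info}


def relay_interaction(server, sender, target):
    # Step 1: first interaction info flows sender -> server -> target.
    first_info = server.receive(sender)
    server.send(target, first_info)

    # Step 2: interaction confirmation flows target -> server -> sender.
    confirmation = server.receive(target)
    server.send(sender, confirmation)

    # Step 3: second interaction info flows sender -> server -> target.
    second_info = server.receive(sender)
    server.send(target, second_info)

    # Step 4: the target interaction result is computed from the second
    # interaction info and sent to both the sender and the target.
    result = server.compute_result(second_info)
    server.send(sender, result)
    server.send(target, result)
    return result


server = Server()
server.inbox = {"alice": deque(["hello", "rock"]), "bob": deque(["accept"])}
result = relay_interaction(server, "alice", "bob")
```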
12. A first information interaction device, comprising:
a first memory for storing executable instructions;
a first processor, configured to implement the method of any one of claims 1 to 6 when executing the executable instructions stored in the first memory.
13. A second information interaction device, comprising:
a second memory for storing executable instructions;
a second processor, configured to implement the method of any one of claims 7 to 10 when executing the executable instructions stored in the second memory.
14. A service device, comprising:
a third memory for storing executable instructions;
a third processor, configured to implement the method of claim 11 when executing the executable instructions stored in the third memory.
15. A computer-readable storage medium having stored thereon executable instructions which, when executed, cause a first processor to implement the information interaction method of any one of claims 1 to 6; or cause a second processor to implement the information interaction method of any one of claims 7 to 10; or cause a third processor to implement the information interaction method of claim 11.
CN202010598558.XA 2020-06-28 2020-06-28 Information interaction method, equipment and storage medium Active CN111580724B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010598558.XA CN111580724B (en) 2020-06-28 2020-06-28 Information interaction method, equipment and storage medium


Publications (2)

Publication Number Publication Date
CN111580724A true CN111580724A (en) 2020-08-25
CN111580724B CN111580724B (en) 2021-12-10

Family

ID=72124112

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010598558.XA Active CN111580724B (en) 2020-06-28 2020-06-28 Information interaction method, equipment and storage medium

Country Status (1)

Country Link
CN (1) CN111580724B (en)

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20120236103A1 (en) * 2011-02-23 2012-09-20 Supyo, Inc. Platform for pseudo-anonymous video chat with intelligent matching of chat partners
CN103634681A (en) * 2013-11-29 2014-03-12 腾讯科技(成都)有限公司 Method, device, client end, server and system for live broadcasting interaction
CN105916045A (en) * 2016-05-11 2016-08-31 乐视控股(北京)有限公司 Interactive live broadcast method and device
US20170116666A1 (en) * 2015-10-23 2017-04-27 Inspired Logic, Inc. Social interaction enabling system
US20180309960A1 (en) * 2007-11-21 2018-10-25 Skype Ireland Technologies Holdings High Quality Multimedia Transmission from a Mobile Device for Live and On-Demand Viewing
CN108809799A (en) * 2017-04-28 2018-11-13 腾讯科技(深圳)有限公司 Method for sending information, method for information display, apparatus and system
US20190297360A1 (en) * 2017-04-18 2019-09-26 Tencent Technology (Shenzhen) Company Ltd Data live streaming method, and related device and system
CN110933453A (en) * 2019-12-05 2020-03-27 广州酷狗计算机科技有限公司 Live broadcast interaction method and device, server and storage medium
CN111182318A (en) * 2019-12-17 2020-05-19 北京达佳互联信息技术有限公司 Contribution score generation method and device in live broadcast, electronic equipment and storage medium
CN111277849A (en) * 2020-02-11 2020-06-12 腾讯科技(深圳)有限公司 Image processing method and device, computer equipment and storage medium

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
YANGYANGYANH1: "How to confess anonymously on Tantan" (探探怎么匿名表白), https://jingyan.baidu.com/article/f3ad7d0fc15e2f48c3345bd9.html *

Cited By (20)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11889143B2 (en) 2020-09-09 2024-01-30 Beijing Zitiao Network Technology Co., Ltd. Video file processing method and apparatus, electronic device, and computer storage medium
CN112040330B (en) * 2020-09-09 2021-12-07 北京字跳网络技术有限公司 Video file processing method and device, electronic equipment and computer storage medium
CN112040330A (en) * 2020-09-09 2020-12-04 北京字跳网络技术有限公司 Video file processing method and device, electronic equipment and computer storage medium
CN112312153A (en) * 2020-10-29 2021-02-02 腾讯科技(深圳)有限公司 Live broadcast interaction realization method and computer readable storage medium
CN112312153B (en) * 2020-10-29 2021-07-16 腾讯科技(深圳)有限公司 Live broadcast interaction realization method and computer readable storage medium
CN112561632B (en) * 2020-12-08 2022-04-22 北京达佳互联信息技术有限公司 Information display method, device, terminal and storage medium
CN112561632A (en) * 2020-12-08 2021-03-26 北京达佳互联信息技术有限公司 Information display method, device, terminal and storage medium
CN114764363B (en) * 2020-12-31 2023-11-24 上海擎感智能科技有限公司 Prompting method, prompting device and computer storage medium
CN114764363A (en) * 2020-12-31 2022-07-19 上海擎感智能科技有限公司 Prompting method, prompting device and computer storage medium
CN113115114B (en) * 2021-03-02 2022-12-27 北京达佳互联信息技术有限公司 Interaction method, device, equipment and storage medium
CN113115114A (en) * 2021-03-02 2021-07-13 北京达佳互联信息技术有限公司 Interaction method, device, equipment and storage medium
WO2022213287A1 (en) * 2021-04-06 2022-10-13 百果园技术(新加坡)有限公司 Live broadcast interaction method and apparatus, live broadcast server, terminal, and storage medium
WO2022227623A1 (en) * 2021-04-30 2022-11-03 北京达佳互联信息技术有限公司 Method for displaying publish progress, and electronic device
CN113126852A (en) * 2021-05-18 2021-07-16 腾讯科技(深圳)有限公司 Dynamic message display method, related device, equipment and storage medium
CN115396131A (en) * 2021-05-20 2022-11-25 拉扎斯网络科技(上海)有限公司 Communication method, communication device, electronic equipment and computer-readable storage medium
CN113867593A (en) * 2021-10-18 2021-12-31 北京字跳网络技术有限公司 Interaction method, device, electronic equipment and storage medium
WO2023065997A1 (en) * 2021-10-18 2023-04-27 北京字跳网络技术有限公司 Interaction method and apparatus, electronic device, and storage medium
CN114125566A (en) * 2021-12-29 2022-03-01 阿里巴巴(中国)有限公司 Interaction method and system and electronic equipment
CN114125566B (en) * 2021-12-29 2024-03-08 阿里巴巴(中国)有限公司 Interaction method, interaction system and electronic equipment
CN114979686A (en) * 2022-04-29 2022-08-30 北京达佳互联信息技术有限公司 Live broadcast interaction method and device, electronic equipment and storage medium

Also Published As

Publication number Publication date
CN111580724B (en) 2021-12-10

Similar Documents

Publication Publication Date Title
CN111580724B (en) Information interaction method, equipment and storage medium
US11388120B2 (en) Parallel messaging apparatus and method thereof
CN109005417B (en) Live broadcast room entering method, system, terminal and device for playing game based on live broadcast
US11526818B2 (en) Adaptive task communication based on automated learning and contextual analysis of user activity
CN112333086B (en) Service method and device based on chat group and electronic equipment
CN112367528B (en) Live broadcast interaction method and computer equipment
WO2020236341A1 (en) Providing consistent interaction models in communication sessions
KR102249501B1 (en) Method, system, and computer program for providing ruputation badge for video chat
WO2021244257A1 (en) Song processing method and apparatus, electronic device, and readable storage medium
CN113179416B (en) Live content rebroadcasting method and related equipment
CN112422405B (en) Message interaction method and device and electronic equipment
CN113568545A (en) Comment content display method, terminal and storage medium
CN112748974A (en) Information display method, device, equipment and storage medium based on session
CN110781998A (en) Recommendation processing method and device based on artificial intelligence
CN112287220B (en) Session group pushing method, device, equipment and computer readable storage medium
CN114390299B (en) Song requesting method, apparatus, device and computer readable storage medium
CN112187624A (en) Message reply method and device and electronic equipment
KR20210142515A (en) Method, apparatus and computer program for providing bidirectional interaction broadcasting service with viewer participation
CN115079892A (en) Information display method, device and equipment based on graphic identification and storage medium
CN114513480B (en) Group chat-based information processing method, device, equipment and computer storage medium
CN110855554B (en) Content aggregation method and device, computer equipment and storage medium
CN114779998A (en) Information display method, device and equipment based on graphic identification and storage medium
CN116248966A (en) Live broadcast interaction method and device, electronic equipment and storage medium
CN115268724A (en) Information statistical method, device, equipment and storage medium
CN112799748A (en) Expression element display method, device and equipment and computer readable storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
REG Reference to a national code
Ref country code: HK
Ref legal event code: DE
Ref document number: 40027468
Country of ref document: HK

GR01 Patent grant