CN117119205A - Object interaction method, device, equipment and storage medium - Google Patents

Object interaction method, device, equipment and storage medium

Info

Publication number
CN117119205A
Authority
CN
China
Prior art keywords
communication interface
picture
interaction
virtual resources
displaying
Prior art date
Legal status
Pending
Application number
CN202210527088.7A
Other languages
Chinese (zh)
Inventor
潘艾婧
Current Assignee
Tencent Technology Shenzhen Co Ltd
Original Assignee
Tencent Technology Shenzhen Co Ltd
Priority date
Filing date
Publication date
Application filed by Tencent Technology Shenzhen Co Ltd
Priority to CN202210527088.7A
Publication of CN117119205A
Legal status: Pending

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N 21/20 Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N 21/21 Server components or server architectures
    • H04N 21/218 Source of audio or video content, e.g. local disk arrays
    • H04N 21/2187 Live feed
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N 21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N 21/43 Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N 21/431 Generation of visual interfaces for content selection or interaction; Content or additional data rendering
    • H04N 21/4312 Generation of visual interfaces for content selection or interaction; Content or additional data rendering involving specific graphical features, e.g. screen layout, special fonts or colors, blinking icons, highlights or animations
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N 21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N 21/47 End-user applications
    • H04N 21/478 Supplemental services, e.g. displaying phone caller identification, shopping application
    • H04N 21/4788 Supplemental services, e.g. displaying phone caller identification, shopping application communicating with other users, e.g. chatting
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N 21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N 21/47 End-user applications
    • H04N 21/488 Data services, e.g. news ticker
    • H04N 21/4884 Data services, e.g. news ticker for displaying subtitles

Abstract

The embodiments of the application disclose an object interaction method, device, equipment and storage medium. The method includes: displaying a first picture in a communication interface of a first object during communication between the first object and M second objects, the first picture including N virtual resources to be selected by the first object; displaying a second picture in a communication interface of each of the M second objects, so that at least one second object interacting with the first object performs a triggering operation on the second picture, the picture content of the second picture being the same as that of the first picture; and updating the communication interface of the first object according to the triggering operations performed by the at least one second object on the second picture, the updated communication interface displaying the heat of the N virtual resources in the form of a thermodynamic diagram. The embodiments of the application can improve the interaction efficiency between the first object and the at least one second object and intuitively reflect the intentions of the at least one second object.

Description

Object interaction method, device, equipment and storage medium
Technical Field
The present application relates to the field of computer technologies, and in particular to an object interaction method, device, equipment, and storage medium.
Background
Currently, in an interactive scenario (for example, a live-streaming scenario) involving a first object and one or more second objects, if the first object (for example, a host) wants to learn the opinions of each second object (for example, a viewer) before making a decision about a virtual resource, the first object can initiate a vote and collect the opinions of the second objects that participate in it; alternatively, each second object can interact with the first object by sending bullet comments (barrage), so that the first object learns the opinions and ideas of each second object from those comments. However, the feedback obtained through conventional means such as bullet comments and votes is often not intuitive enough. Accordingly, how to interact efficiently and intuitively has become a research focus.
Disclosure of Invention
The embodiments of the application provide an object interaction method, device, equipment and storage medium, which can improve the interaction efficiency between a first object and at least one second object; that is, they can both efficiently convey the intentions of the at least one second object to the first object and reflect those intentions intuitively.
In one aspect, an embodiment of the present application provides an object interaction method, where the method includes:
displaying a first picture in a communication interface of a first object during communication between the first object and M second objects, the first picture including N virtual resources to be selected by the first object, where M and N are positive integers;
displaying a second picture in a communication interface of each of the M second objects, so that at least one second object interacting with the first object performs a triggering operation on the second picture, where the picture content of the first picture is the same as the picture content of the second picture; and
updating the communication interface of the first object according to the triggering operations performed by the at least one second object on the second picture, where the updated communication interface displays the heat of the N virtual resources in the form of a thermodynamic diagram so as to prompt the first object to perform resource selection; when the number of triggering operations performed on the j-th virtual resource among the N virtual resources is higher than the number of triggering operations performed on the other virtual resources, the heat of the j-th virtual resource is higher than the heat of the other virtual resources, where j ∈ [1, N].
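The relative-heat relation above can be illustrated with a minimal, hypothetical sketch (TypeScript is used only for illustration; the type and function names are assumptions, not part of the disclosure): trigger operations are tallied per candidate virtual resource, and the resource with the largest tally is treated as having the highest heat.

```typescript
// Hypothetical tallying of trigger operations per virtual resource; the heat of
// a resource is taken to be proportional to its trigger count.
type VirtualResourceId = string;

interface TriggerEvent {
  resourceId: VirtualResourceId; // which of the N virtual resources was triggered
  secondObjectId: string;        // which second object performed the operation
}

function computeHeat(events: TriggerEvent[]): Map<VirtualResourceId, number> {
  const counts = new Map<VirtualResourceId, number>();
  for (const e of events) {
    counts.set(e.resourceId, (counts.get(e.resourceId) ?? 0) + 1);
  }
  return counts;
}

// The j-th resource with more trigger operations than the others ends up with
// the highest heat value.
function hottestResource(heat: Map<VirtualResourceId, number>): VirtualResourceId | null {
  let best: VirtualResourceId | null = null;
  let bestCount = -1;
  for (const [id, count] of heat) {
    if (count > bestCount) {
      best = id;
      bestCount = count;
    }
  }
  return best;
}
```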
In another aspect, an embodiment of the present application provides another object interaction method, where the method includes:
displaying a second picture in a communication interface of a second object during communication between the second object and a first object, where the picture content of the second picture is the same as the picture content of a first picture displayed in a communication interface of the first object, the first picture including N virtual resources to be selected by the first object, and M and N are positive integers; and
in a state in which the second object is interacting with the first object, responding to a triggering operation of the second object on the second picture by notifying the terminal device of the first object to update the communication interface of the first object according to the triggering operation performed by the second object;
where the updated communication interface displays the heat of the N virtual resources in the form of a thermodynamic diagram so as to prompt the first object to perform resource selection; when the number of triggering operations performed on the j-th virtual resource among the N virtual resources is higher than the number of triggering operations performed on the other virtual resources, the heat of the j-th virtual resource is higher than the heat of the other virtual resources, where j ∈ [1, N].
In another aspect, an embodiment of the present application provides an object interaction apparatus, including:
a first output unit, configured to display a first picture in a communication interface of a first object during communication between the first object and M second objects, the first picture including N virtual resources to be selected by the first object, where M and N are positive integers;
the first output unit being further configured to display a second picture in a communication interface of each of the M second objects, so that at least one second object interacting with the first object performs a triggering operation on the second picture, where the picture content of the first picture is the same as the picture content of the second picture; and
a first processing unit, configured to update the communication interface of the first object according to the triggering operations performed by the at least one second object on the second picture, where the updated communication interface displays the heat of the N virtual resources in the form of a thermodynamic diagram so as to prompt the first object to perform resource selection; when the number of triggering operations performed on the j-th virtual resource among the N virtual resources is higher than the number of triggering operations performed on the other virtual resources, the heat of the j-th virtual resource is higher than the heat of the other virtual resources, where j ∈ [1, N].
In another aspect, an embodiment of the present application provides another object interaction apparatus, including:
a second output unit, configured to display a second picture in a communication interface of a second object during communication between the second object and a first object, where the picture content of the second picture is the same as the picture content of a first picture displayed in a communication interface of the first object, the first picture including N virtual resources to be selected by the first object, and M and N are positive integers; and
a second processing unit, configured to respond, when the second object is in an interaction state with the first object, to a triggering operation of the second object on the second picture by notifying the terminal device of the first object to update the communication interface of the first object according to the triggering operation performed by the second object;
where the updated communication interface displays the heat of the N virtual resources in the form of a thermodynamic diagram so as to prompt the first object to perform resource selection; when the number of triggering operations performed on the j-th virtual resource among the N virtual resources is higher than the number of triggering operations performed on the other virtual resources, the heat of the j-th virtual resource is higher than the heat of the other virtual resources, where j ∈ [1, N].
In another aspect, an embodiment of the present application provides a computer device including a processor and a memory, where the memory is configured to store a computer program;
in one embodiment, the computer program when executed by the processor performs the steps of:
displaying a first picture in a communication interface of a first object during communication between the first object and M second objects, the first picture including N virtual resources to be selected by the first object, where M and N are positive integers;
displaying a second picture in a communication interface of each of the M second objects, so that at least one second object interacting with the first object performs a triggering operation on the second picture, where the picture content of the first picture is the same as the picture content of the second picture; and
updating the communication interface of the first object according to the triggering operations performed by the at least one second object on the second picture, where the updated communication interface displays the heat of the N virtual resources in the form of a thermodynamic diagram so as to prompt the first object to perform resource selection; when the number of triggering operations performed on the j-th virtual resource among the N virtual resources is higher than the number of triggering operations performed on the other virtual resources, the heat of the j-th virtual resource is higher than the heat of the other virtual resources, where j ∈ [1, N].
In another embodiment, the computer program when executed by the processor performs the steps of:
displaying a second picture in a communication interface of a second object during communication between the second object and a first object, where the picture content of the second picture is the same as the picture content of a first picture displayed in a communication interface of the first object, the first picture including N virtual resources to be selected by the first object, and M and N are positive integers; and
in a state in which the second object is interacting with the first object, responding to a triggering operation of the second object on the second picture by notifying the terminal device of the first object to update the communication interface of the first object according to the triggering operation performed by the second object;
where the updated communication interface displays the heat of the N virtual resources in the form of a thermodynamic diagram so as to prompt the first object to perform resource selection; when the number of triggering operations performed on the j-th virtual resource among the N virtual resources is higher than the number of triggering operations performed on the other virtual resources, the heat of the j-th virtual resource is higher than the heat of the other virtual resources, where j ∈ [1, N].
In another aspect, embodiments of the present application provide a computer storage medium storing a computer program;
in one embodiment, the computer program is adapted to be loaded by a processor and to perform the steps of:
displaying a first picture in a communication interface of a first object during communication between the first object and M second objects, the first picture including N virtual resources to be selected by the first object, where M and N are positive integers;
displaying a second picture in a communication interface of each of the M second objects, so that at least one second object interacting with the first object performs a triggering operation on the second picture, where the picture content of the first picture is the same as the picture content of the second picture; and
updating the communication interface of the first object according to the triggering operations performed by the at least one second object on the second picture, where the updated communication interface displays the heat of the N virtual resources in the form of a thermodynamic diagram so as to prompt the first object to perform resource selection; when the number of triggering operations performed on the j-th virtual resource among the N virtual resources is higher than the number of triggering operations performed on the other virtual resources, the heat of the j-th virtual resource is higher than the heat of the other virtual resources, where j ∈ [1, N].
In another embodiment, the computer program is adapted to be loaded by a processor and to perform the steps of:
displaying a second picture in a communication interface of a second object during communication between the second object and a first object, where the picture content of the second picture is the same as the picture content of a first picture displayed in a communication interface of the first object, the first picture including N virtual resources to be selected by the first object, and M and N are positive integers; and
in a state in which the second object is interacting with the first object, responding to a triggering operation of the second object on the second picture by notifying the terminal device of the first object to update the communication interface of the first object according to the triggering operation performed by the second object;
where the updated communication interface displays the heat of the N virtual resources in the form of a thermodynamic diagram so as to prompt the first object to perform resource selection; when the number of triggering operations performed on the j-th virtual resource among the N virtual resources is higher than the number of triggering operations performed on the other virtual resources, the heat of the j-th virtual resource is higher than the heat of the other virtual resources, where j ∈ [1, N].
In another aspect, embodiments of the present application provide a computer program product comprising a computer program which, when executed by a processor, implements any of the above-mentioned object interaction methods.
According to the embodiments of the application, during communication between the first object and the M second objects, a first picture can be displayed in the communication interface of the first object and a second picture can be displayed in the communication interface of each second object, where the first picture includes N virtual resources to be selected by the first object and the picture content of the second picture is the same as that of the first picture. The terminal device of at least one second object interacting with the first object can then respond to a triggering operation performed by the corresponding second object on the second picture; correspondingly, the terminal device of the first object can update the communication interface of the first object according to the triggering operations performed by the at least one second object on the second picture, the updated communication interface displaying the heat of the N virtual resources in the form of a thermodynamic diagram. This improves the interaction efficiency between the first object and the at least one second object and intuitively feeds back to the first object how the at least one second object would like the N virtual resources to be selected. Therefore, the embodiments of the application enable the first object to obtain the feedback of at least one second object more efficiently, accurately and intuitively, and make the interaction process more engaging. In addition, the embodiments of the application allow the second object to actively participate in the interaction with the first object and thereby contribute to the first object's decision on selecting among the N virtual resources, which can increase the participation of the second object in the interaction process, give the second object a more immersive experience, and improve object stickiness (i.e., user stickiness). Moreover, for a second object who cannot interact with the first object by voice (for example, a hearing- or speech-impaired user), a simple triggering operation performed silently provides a smoother and barrier-free experience.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present application, the drawings required for describing the embodiments are briefly introduced below. Obviously, the drawings in the following description show only some embodiments of the present application, and a person skilled in the art may derive other drawings from them without creative effort.
Fig. 1a is a system architecture diagram of a communication system according to an embodiment of the present application;
Fig. 1b is a system architecture diagram of another communication system according to an embodiment of the present application;
Fig. 2 is a flowchart of an object interaction method according to an embodiment of the present application;
Fig. 3a is a schematic diagram of displaying a second picture sent to a terminal device of a second object according to an embodiment of the present application;
Fig. 3b is a schematic diagram of displaying a thermodynamic diagram according to an embodiment of the present application;
Fig. 3c is a schematic diagram of another way of displaying a thermodynamic diagram according to an embodiment of the present application;
Fig. 3d is a schematic diagram of a screen indicating that a target virtual resource has been selected according to an embodiment of the present application;
Fig. 3e is a schematic diagram of displaying a decision prompt according to an embodiment of the present application;
Fig. 3f is a schematic diagram of another way of displaying a decision prompt according to an embodiment of the present application;
Fig. 3g is another schematic diagram of displaying a thermodynamic diagram according to an embodiment of the present application;
Fig. 3h is yet another schematic diagram of displaying a thermodynamic diagram according to an embodiment of the present application;
Fig. 3i is a schematic diagram of canceling the display of a thermodynamic diagram according to an embodiment of the present application;
Fig. 3j is a schematic diagram of closing the interaction mode according to an embodiment of the present application;
Fig. 4 is a flowchart of another object interaction method according to an embodiment of the present application;
Fig. 5a is a schematic diagram of a smoothing process according to an embodiment of the present application;
Fig. 5b is a schematic flowchart of generating a thermodynamic diagram according to an embodiment of the present application;
Fig. 5c is a schematic diagram of a mapping relationship between colors and thermal color values according to an embodiment of the present application;
Fig. 5d is a schematic diagram of displaying a thermodynamic diagram in a target area according to an embodiment of the present application;
Fig. 5e is a schematic diagram of another way of displaying a thermodynamic diagram in a target area according to an embodiment of the present application;
Fig. 6 is a flowchart of another object interaction method according to an embodiment of the present application;
Fig. 7a is a schematic diagram of displaying status prompt information according to an embodiment of the present application;
Fig. 7b is a schematic diagram of another way of displaying status prompt information according to an embodiment of the present application;
Fig. 7c is a schematic diagram of yet another way of displaying status prompt information according to an embodiment of the present application;
Fig. 7d is a schematic diagram of canceling the display of interaction prompt information according to an embodiment of the present application;
Fig. 7e is a schematic diagram of displaying interaction prompt information according to an embodiment of the present application;
Fig. 7f is a schematic diagram of a visual highlighting according to an embodiment of the present application;
Fig. 7g is a schematic diagram of another visual highlighting according to an embodiment of the present application;
Fig. 7h is a schematic diagram of updating a communication interface of a second object according to an embodiment of the present application;
Fig. 7i is a schematic diagram of another way of updating a communication interface of a second object according to an embodiment of the present application;
Fig. 8 is a flowchart of another object interaction method according to an embodiment of the present application;
Fig. 9a is a schematic structural diagram of an object interaction apparatus according to an embodiment of the present application;
Fig. 9b is a schematic structural diagram of another object interaction apparatus according to an embodiment of the present application;
Fig. 10 is a schematic structural diagram of a computer device according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the accompanying drawings in the embodiments of the present application.
The embodiments of the application involve a communication system. Specifically, the communication system may include at least: a terminal device 11 of a first object, terminal devices 12 of M second objects, and a server 13, where M is a positive integer. The terminal device 11 of the first object is the terminal device used by the first object, and the first object is the object (i.e., the user) responsible for initiating the communication; the terminal device 12 of a second object is the terminal device used by that second object, and a second object is an object that views pictures related to the first object after the communication has been initiated. The terminal devices mentioned herein may include, but are not limited to: smartphones, tablet computers, notebook computers, desktop computers, smart watches, smart voice interaction devices, smart home appliances, vehicle-mounted terminals, aircraft, and the like. A wide variety of client applications (APPs) may run on a terminal, such as live-streaming clients, video playback clients, social clients, browser clients, information feed clients, education clients, and so on.
The server 13 is a service device that can establish communication between the terminal device 11 of the first object and the terminal device 12 of each second object, thereby providing each terminal with various services such as an information interaction service and a live-streaming service. The server 13 may be an independent physical server; in this case, the system architecture of the communication system can be seen in Fig. 1a. Alternatively, the server 13 may be a server cluster or a distributed system formed by a plurality of physical servers; for example, the server 13 may include a social server providing the information interaction service and a communication server providing the communication initiation service, in which case the system architecture of the communication system may be as shown in Fig. 1b. Still alternatively, the server 13 may be a cloud server providing basic cloud computing services such as cloud services, cloud databases, cloud computing, cloud functions, cloud storage, network services, cloud communication, middleware services, domain name services, security services, CDNs (Content Delivery Networks), and big data and artificial intelligence platforms.
It should be noted that the above-mentioned communication system may be applied to a live-streaming scenario or a conference scenario; that is, the above-mentioned communication process may refer to a live-streaming process or a conference process, which is not limited in the present application. Live streaming refers to an information network release mode in which information is produced and released synchronously, on site, as an event occurs and develops, with a bidirectional information flow. It will be appreciated that in a live-streaming scenario the first object mentioned above may be a main broadcasting object (i.e., the anchor responsible for the live broadcast), while a second object may be a viewer object (i.e., an object viewing the live picture of the main broadcasting object); in a conference scenario, the first object may be a conference hosting object (i.e., the object responsible for hosting the conference), while a second object is an object that participates in the conference and learns its content. For convenience of explanation, the live-streaming scenario is taken as an example below. The specific type of live-streaming scenario is not limited in the embodiments of the application; for example, it may be a game live-streaming scenario (a scenario in which the first object live-streams while playing a game), a singing live-streaming scenario (a scenario in which the first object sings during the live broadcast), or an e-commerce live-streaming scenario (a scenario in which the first object displays or recommends item resources during the live broadcast), and so on; the application is not limited in this regard.
Based on the above communication system, the embodiments of the application provide an object interaction scheme to improve the interaction efficiency between the first object and at least one of the M second objects and to intuitively feed back the intentions of the at least one second object to the first object. In a specific implementation, the general principle of the object interaction scheme is as follows: while the first object communicates with the M second objects, the terminal device 11 of the first object may display a first picture in the communication interface of the first object and send a second picture to the terminal device 12 of each second object through the server 13, so that the terminal device 12 of each second object can display the second picture in the communication interface of the corresponding second object and at least one second object interacting with the first object can perform a triggering operation on the second picture. Here, interaction refers to a process of mutually dependent actions between the first object and a second object in which information is conveyed by language or other means, and the picture content of the first picture is the same as the picture content of the second picture. Accordingly, the terminal device 12 of each of the at least one second object may respond to the triggering operation of the corresponding second object on the second picture and, through the server 13, notify the terminal device 11 of the first object to update the communication interface of the first object according to the triggering operation performed by the corresponding second object; a sketch of this relay flow is given below.
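As a purely illustrative sketch of the relay flow described above (assuming a WebSocket-style channel between each terminal and the server 13; the message shape and field names are assumptions, not the patent's protocol), trigger operations reported by the second objects' terminals are forwarded by the server to the first object's terminal, which then updates its communication interface:

```typescript
// Assumed message shape for the relay flow; a WebSocket is used only as an
// illustrative transport between each terminal and the server.
interface TriggerMessage {
  kind: "trigger";
  secondObjectId: string; // which second object performed the trigger operation
  x: number;              // normalized position (0..1) within the second picture
  y: number;
}

// Second object's terminal: report a trigger operation on the second picture.
function reportTrigger(ws: WebSocket, secondObjectId: string, x: number, y: number): void {
  const msg: TriggerMessage = { kind: "trigger", secondObjectId, x, y };
  ws.send(JSON.stringify(msg));
}

// First object's terminal: receive the relayed triggers and update the interface.
function listenForTriggers(ws: WebSocket, onTrigger: (t: TriggerMessage) => void): void {
  ws.onmessage = (ev) => {
    const msg = JSON.parse(ev.data as string) as TriggerMessage;
    if (msg.kind === "trigger") onTrigger(msg);
  };
}
```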
Therefore, the object interaction scheme provided by the embodiments of the application aims to give the first object and each second object a more direct and instant way to interact: at least one second object interacting with the first object performs triggering operations on the second picture, and the communication interface of the first object is updated according to those triggering operations, so that the picture content the at least one second object triggered on is fed back to the first object intuitively. This improves the interaction efficiency between the first object and the at least one second object, strengthens the communication between them, and thereby improves object stickiness (i.e., user stickiness).
Based on the above description, an embodiment of the present application proposes an object interaction method, which may be performed by the terminal device of the first object in the above-mentioned communication system, or by a live-streaming client running in the terminal device of the first object. For convenience of explanation, the following description takes the terminal device of the first object as the executing entity; referring to Fig. 2, the object interaction method may include the following steps S201 to S203:
S201, displaying a first picture in a communication interface of the first object during communication between the first object and M second objects; the first picture includes N virtual resources to be selected by the first object, and M and N are positive integers.
It should be appreciated that the first object may communicate with the M second objects about different content. Accordingly, the first picture is generated according to at least one behavior performed by the first object during the communication, and that behavior differs depending on the scene in which the first object is located; such scenes include, but are not limited to, game scenes, singing scenes, merchandise recommendation scenes, and the like.
For example, if the first object is a main broadcasting object performing live streaming in a game live-streaming scene, the at least one behavior performed by the first object during the live game includes the behavior of the first object triggering operations on virtual resources in the game interface; in this case, the first picture may include a game picture involved in the first object's gameplay, and the N virtual resources include the game resources involved in the first object's gameplay.
For another example, if the first object is a main broadcasting object performing live streaming in a singing live-streaming scene, the at least one behavior performed by the first object during the live broadcast includes the behavior of the first object singing in the live-streaming room; in this case, the first picture may include a picture displayed on the screen of the terminal device of the first object during the singing, and the N virtual resources may include singing resources involved in the singing, such as song options and sound-effect options. Alternatively, the first picture may include a picture obtained by the terminal device of the first object by collecting environmental information on the first object side during the live broadcast.
For another example, if the first object is a main broadcasting object performing live streaming in an e-commerce live-streaming scene, the at least one behavior performed by the first object during the live broadcast includes the behavior of the first object displaying or recommending item resources in the live-streaming room; in this case, the first picture may include a picture obtained by the terminal device of the first object by collecting environmental information on the first object side during the live broadcast, or a display picture that the first object produces as display content for the commodities to be recommended by shooting, drawing, or the like; correspondingly, the N virtual resources include display information of the item resources that the first object wants to recommend to the M second objects.
S202, displaying a second picture in the communication interface of each of the M second objects, so that at least one second object interacting with the first object performs a triggering operation on the second picture; the interaction referred to here may specifically be called instant interaction (or real-time interaction).
The picture content of the first picture is the same as the picture content of the second picture; the first picture and the second picture may be the same picture or different pictures, which is not limited here. For example, when the first object is a main broadcasting object performing live streaming in an e-commerce live-streaming scene, the first picture may, as described above, be a picture obtained by collecting environmental information on the first object side; in this case the second picture and the first picture are the same picture. That is, the terminal device of the first object may collect the environmental information on the first object side and use the collected picture as both the first picture and the second picture. As another example, when the first object is a main broadcasting object performing live streaming in a game live-streaming scene, the first picture may be a game picture displayed on the screen of the terminal device of the first object; in this case the second picture and the first picture are different pictures, and specifically the second picture may be a live picture obtained by the terminal device of the first object by capturing (e.g., screen-recording) the first picture. That is, the terminal device of the first object may record the first picture displayed in the communication interface of the first object in real time to obtain the second picture and send the second picture to the terminal devices of the second objects through the server; accordingly, the terminal device of each second object may receive the second picture and display it in the communication interface of the corresponding second object, as shown in Fig. 3a. It should be noted that Fig. 3a only illustrates one transmission process and display manner of the second picture, which is not limited by the present application.
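One possible way to obtain a live capture of the first picture in a browser environment is sketched below (an assumption introduced for illustration; the patent does not prescribe any particular capture API):

```typescript
// Assumed browser capture path (not prescribed by the patent): obtain a live
// MediaStream of the first picture so it can be relayed as the second picture.
async function captureFirstPicture(): Promise<MediaStream> {
  // getDisplayMedia prompts for a screen/window and returns its video stream.
  return navigator.mediaDevices.getDisplayMedia({ video: true, audio: false });
}
// The resulting stream could then be fed into a WebRTC connection or another
// streaming pipeline that the server forwards to each second object's terminal.
```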
It should be noted that, if the second picture is a live picture obtained by the terminal device of the first object by capturing (e.g., screen-recording) the first picture, the terminal device of the first object may additionally collect environmental information on the first object side and send it to the terminal devices of the second objects through the server, where the environmental information is used to reflect, for example, the appearance or body language of the first object. Correspondingly, the terminal device of each second object may display the environmental information in the communication interface of the corresponding second object; it should be understood that any second object may choose to display or not display the above-mentioned environmental information, which is not limited by the present application.
Further, the terminal device of the first object may, in response to an interaction mode opening operation detected in the communication interface of the first object, cause interaction prompt information to be displayed in the communication interface of each second object. The interaction prompt information is used to ask whether the corresponding second object will participate in the interaction with the first object; after any second object performs a confirmation operation on the interaction prompt information, that second object is determined to participate in the interaction with the first object. It can be understood that after the terminal device of the first object detects the interaction mode opening operation, the server can send the interaction prompt information to the terminal devices of the second objects; correspondingly, after receiving the interaction prompt information, the terminal device of each second object can display it in the communication interface of the corresponding second object.
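A server-side sketch of this prompt-and-confirm step might look as follows (the class, method names, and session model are assumptions introduced for illustration):

```typescript
// Server-side sketch (all names are assumptions): broadcast the interaction
// prompt to every second object and track which ones confirm participation.
class InteractionSession {
  private participants = new Set<string>();

  // Called when the first object's terminal reports the opening operation.
  broadcastPrompt(
    secondObjectIds: string[],
    send: (secondObjectId: string, payload: object) => void
  ): void {
    for (const id of secondObjectIds) {
      send(id, { kind: "interaction-prompt", text: "Join the interaction with the host?" });
    }
  }

  // Called when a second object performs the confirmation operation on the prompt.
  onConfirmation(secondObjectId: string, confirmed: boolean): void {
    if (confirmed) this.participants.add(secondObjectId);
  }

  isParticipating(secondObjectId: string): boolean {
    return this.participants.has(secondObjectId);
  }
}
```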
It should be noted that the interaction mode opening operation includes, but is not limited to: a triggering operation on an interaction component in the communication interface of the first object, an operation of inputting a voice password for opening the interaction mode, a pressing operation on the communication interface of the first object, a continuous clicking operation (i.e., an operation in which the number of clicks within a preset clicking time reaches a click threshold), an operation of inputting a preset gesture, and the like; the application does not limit the specific form of the interaction mode opening operation. The pressing operation may be a long-press operation (i.e., an operation whose pressing duration exceeds a duration threshold) or a short-press operation (i.e., an operation whose pressing duration is less than or equal to the duration threshold), which is not limited in the present application. In addition, the preset gesture input on the communication interface of the first object can be set according to service requirements, and its specific content is not limited. Correspondingly, the terminal device of the first object can detect the interaction mode opening operation in real time; if the operation is detected, it opens the interaction mode and updates the communication interface of the first object to obtain an updated communication interface, an example of which is shown in the interface schematic referred to later (see Fig. 3b).
In one specific implementation, the terminal device of the first object may display an interaction component in the communication interface of the first object; when the interaction component is detected to be triggered, it may be determined that the interaction mode opening operation has been detected in the communication interface of the first object. In this case, the interaction mode opening operation refers to a triggering operation on the interaction component in the communication interface of the first object; the triggering operation here may be a pressing operation or a continuous clicking operation, which is not limited in the present application. The interaction component may also be called an instant interaction button or an instant interaction reminder button. It should be understood that while the terminal device of the first object displays the instant interaction reminder button on the first object side, a normal picture is displayed in the communication interface of the second object; for example, if the first object is live streaming in a game live-streaming scene, the normal live game picture (i.e., the second picture) displayed in the communication interface of the second object is as shown in Fig. 3a.
In another specific implementation, the terminal device of the first object may determine that the interaction mode opening operation has been detected in the communication interface of the first object in response to a pressing operation or a continuous clicking operation on the communication interface of the first object; that is, upon detecting a pressing operation or a continuous clicking operation on the communication interface of the first object, it may be determined that the interaction mode opening operation has been detected.
In still another specific implementation, when the terminal device of the first object detects a preset gesture input in the communication interface of the first object, it may be determined that the interaction mode opening operation has been detected; or, when detecting an operation of inputting the voice password for opening the interaction mode, the terminal device of the first object may determine that the interaction mode opening operation has been detected in the communication interface of the first object, and so on.
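Two of the opening operations listed above, the long press and the continuous clicking, could be detected roughly as follows (thresholds and event wiring are assumptions; the patent leaves these values to the implementation):

```typescript
// Assumed thresholds and event wiring; only the long-press and continuous-click
// variants of the opening operation are sketched here.
const LONG_PRESS_MS = 500;   // assumed duration threshold
const CLICK_WINDOW_MS = 800; // assumed preset clicking time
const CLICK_THRESHOLD = 3;   // assumed click threshold

function watchLongPress(el: HTMLElement, openInteractionMode: () => void): void {
  let pressedAt = 0;
  el.addEventListener("pointerdown", () => { pressedAt = Date.now(); });
  el.addEventListener("pointerup", () => {
    // a pressing duration longer than the threshold counts as a long press
    if (Date.now() - pressedAt > LONG_PRESS_MS) openInteractionMode();
  });
}

function watchContinuousClicks(el: HTMLElement, openInteractionMode: () => void): void {
  const clicks: number[] = [];
  el.addEventListener("click", () => {
    const now = Date.now();
    clicks.push(now);
    // keep only the clicks that fall within the preset clicking time
    while (clicks.length > 0 && now - clicks[0] > CLICK_WINDOW_MS) clicks.shift();
    if (clicks.length >= CLICK_THRESHOLD) openInteractionMode();
  });
}
```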
S203, updating the communication interface of the first object according to the triggering operations performed by the at least one second object on the second picture, where the updated communication interface displays the heat of the N virtual resources in the form of a thermodynamic diagram so as to prompt the first object to perform resource selection.
When the number of triggering operations performed on the j-th virtual resource among the N virtual resources is higher than the number of triggering operations performed on the other virtual resources, the heat of the j-th virtual resource is higher than the heat of the other virtual resources, where j ∈ [1, N].
It should be noted that the thermodynamic diagram can intuitively reflect the position distribution of the triggering operations and thereby the heat of the virtual resource at each position. A thermodynamic diagram refers to an image obtained by highlighting, within the first picture, each divided region with a display color determined by the number of triggering operations corresponding to that region, i.e., an image that displays the density of data activity in a region in a specially highlighted form. The depth of the display color of any divided region in the thermodynamic diagram is related to the number of triggering operations corresponding to that region. Specifically, the thermal color value of a divided region is a value describing the depth of the display color in the thermodynamic diagram and is proportional to the number of triggering operations corresponding to that region; the larger the maximum value of the interval range in which a region's thermal color value falls, the darker the display color of that region in the thermodynamic diagram, and conversely the lighter its display color. Correspondingly, the darker the display color of the divided region in which a virtual resource is located, the higher the heat of that virtual resource, and the lighter the display color, the lower its heat; in other words, the heat of a virtual resource is positively correlated with the depth of the display color of its divided region in the thermodynamic diagram. Optionally, the thermal color value of a divided region may be the number of triggering operations corresponding to that region, or data obtained by mapping that number, which is not limited in the present application.
It can be understood that if the number of triggering operations corresponding to a first divided region is greater than the number corresponding to a second divided region, the thermal color value of the first divided region is greater than that of the second divided region. Correspondingly, if the maximum value of the interval range in which the thermal color value of the first divided region falls is greater than the maximum value of the interval range in which the thermal color value of the second divided region falls, the display color of the first divided region in the thermodynamic diagram is darker than that of the second divided region; if the two maximum values are equal, i.e., the two thermal color values fall in the same interval range, the display colors of the two divided regions in the thermodynamic diagram have the same depth.
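One way to realize the region division and color mapping described above is sketched below (the grid size and the yellow-to-red ramp are assumptions): the picture is divided into a grid of regions, the trigger operations are counted per region, and each count is mapped to a heat color whose depth grows with the count.

```typescript
// Assumed grid of divided regions; counts[row][col] is the number of trigger
// operations whose (normalized) position falls in that region.
const GRID_W = 32;
const GRID_H = 18;

function binTriggers(points: { x: number; y: number }[]): number[][] {
  const counts = Array.from({ length: GRID_H }, () => new Array<number>(GRID_W).fill(0));
  for (const p of points) {
    const col = Math.min(GRID_W - 1, Math.floor(p.x * GRID_W)); // x, y in 0..1
    const row = Math.min(GRID_H - 1, Math.floor(p.y * GRID_H));
    counts[row][col] += 1;
  }
  return counts;
}

// Map a region's count to an RGBA color: the larger the count, the deeper and
// more opaque the color, so hotter regions stand out.
function heatColor(count: number, maxCount: number): [number, number, number, number] {
  const t = maxCount > 0 ? count / maxCount : 0; // heat value proportional to the count
  return [255, Math.round(255 * (1 - t)), 0, Math.round(200 * t)];
}
```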
In a specific implementation, the terminal device of the first object may acquire a thermodynamic diagram generated according to the triggering operations performed by the at least one second object on the second picture and display the thermodynamic diagram in the communication interface of the first object. Correspondingly, the terminal device of the first object may superimpose the thermodynamic diagram as a mask layer on the first picture in the communication interface of the first object, or output a sub-page in the communication interface of the first object and display the second picture and the thermodynamic diagram superimposed within that sub-page.
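Displaying the thermodynamic diagram as a mask layer could, for example, be done by stacking a semi-transparent canvas above the element that shows the first picture (a sketch under that assumption; element handling and styling are illustrative):

```typescript
// Illustrative overlay (names are assumptions): draw the per-region counts as a
// semi-transparent mask on a canvas positioned above the first picture.
function drawHeatOverlay(canvas: HTMLCanvasElement, counts: number[][]): void {
  const ctx = canvas.getContext("2d");
  if (!ctx || counts.length === 0) return;
  const rows = counts.length;
  const cols = counts[0].length;
  const maxCount = Math.max(1, ...counts.flat());
  const cellW = canvas.width / cols;
  const cellH = canvas.height / rows;
  ctx.clearRect(0, 0, canvas.width, canvas.height);
  for (let r = 0; r < rows; r++) {
    for (let c = 0; c < cols; c++) {
      const t = counts[r][c] / maxCount; // 0..1, proportional to the trigger count
      if (t === 0) continue;             // leave cold regions fully transparent
      ctx.fillStyle = `rgba(255, ${Math.round(255 * (1 - t))}, 0, ${0.8 * t})`;
      ctx.fillRect(c * cellW, r * cellH, cellW, cellH);
    }
  }
}

// Usage note: the canvas would be absolutely positioned over the picture element
// so that the heat mask is superimposed on the first picture.
```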
For example, when the first object is a main broadcasting object performing live streaming in a game live-streaming scene, the N virtual resources 31 include the game resources involved in the first object's gameplay, so the thermodynamic diagram may be used to reflect the heat of each game resource; that is, the darker the display color of the region in which a game resource is located in the thermodynamic diagram 30, the higher the heat of that game resource. When the communication interface of the first object is updated according to the triggering operations performed by the at least one second object on the second picture, a schematic diagram in which the thermodynamic diagram 30 is used as a mask layer is shown in Fig. 3b, and a schematic diagram in which a sub-page 34 is output is shown in Fig. 3c. Correspondingly, the first object can intuitively learn, from the display colors of the thermodynamic diagram, the distribution of the numbers of triggering operations performed by the second objects and thus identify the target virtual resource with the highest heat among the N virtual resources, namely the virtual resource the at least one second object most wants selected. For instance, if the display color of the divided region in which virtual resource 311 is located is the darkest, this indicates that the at least one second object expects the first object to select virtual resource 311, so the first object understands the intention of each second object more intuitively and efficiently.
On this basis, the terminal device of the first object may further select a target virtual resource from the N virtual resources according to the updated communication interface, where the target virtual resource is the virtual resource with the highest heat among the N virtual resources, and display, in the communication interface of the first object, a screen indicating that the target virtual resource has been selected. It should be understood that the terminal device of the first object may determine the target virtual resource from the N virtual resources according to the thermodynamic diagram displayed in the updated communication interface and then select it. Alternatively, the terminal device of the first object may determine the target virtual resource from the N virtual resources according to the updated communication interface and output a decision prompt in the communication interface of the first object, where the decision prompt is used to prompt the first object to select the target virtual resource from the N virtual resources, and so on.
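Determining the target virtual resource from the thermodynamic data could be sketched as follows (the grid layout and the per-resource bounding boxes are assumptions): the per-region counts falling inside each virtual resource's on-screen area are summed, and the resource with the largest sum is selected.

```typescript
// Assumed mapping from virtual resources to grid regions: each resource covers a
// rectangle of cells in the same grid used for the thermodynamic diagram.
interface ResourceRegion {
  resourceId: string;
  colStart: number; colEnd: number; // [colStart, colEnd)
  rowStart: number; rowEnd: number; // [rowStart, rowEnd)
}

function selectTargetResource(counts: number[][], regions: ResourceRegion[]): string | null {
  let best: string | null = null;
  let bestHeat = -1;
  for (const region of regions) {
    let heat = 0;
    for (let r = region.rowStart; r < region.rowEnd; r++) {
      for (let c = region.colStart; c < region.colEnd; c++) {
        heat += counts[r]?.[c] ?? 0;
      }
    }
    if (heat > bestHeat) {
      best = region.resourceId; // the virtual resource with the highest heat
      bestHeat = heat;
    }
  }
  return best;
}
```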
The decision prompt may be a text prompt or a highlighting prompt, which is not limited in the present application; highlighting prompts include, but are not limited to, a highlighted-display prompt, an identifier-based highlighting prompt, and the like. Optionally, the terminal device of the first object may display the screen indicating that the target virtual resource has been selected when it detects a selection operation on the N virtual resources, or output the decision prompt when it detects a decision prompt operation on the communication interface of the first object, and so on.
For example, the terminal device of the first object may display, in the communication interface of the first object, a screen indicating that the target virtual resource (e.g., virtual resource 311) has been selected, as shown in Fig. 3d. Correspondingly, the terminal device of the first object may output a decision prompt in the communication interface of the first object; when the decision prompt is a text prompt 351, a schematic diagram of outputting the decision prompt may be as shown in Fig. 3e, and when the decision prompt is a highlighting prompt 352, it may be as shown in Fig. 3f.
Accordingly, assuming that the first object is live streaming in a game live-streaming scene: when the interaction mode opening operation is a triggering operation on the interaction component 32 in the communication interface of the first object, a schematic diagram of updating the communication interface of the first object may be as shown in Fig. 3b; when the interaction mode opening operation is a pressing operation on the communication interface of the first object, it may be as shown in Fig. 3g; and when the interaction mode opening operation is an operation of inputting a preset gesture, taking a V gesture as the preset gesture for illustration, it may be as shown in Fig. 3h.
Further, when it is determined that the interaction mode opening operation has been detected in the communication interface of the first object, the terminal device of the first object may display a target element in the communication interface of the first object, where the target element includes a switch button for turning the thermodynamic diagram on or off, and the switch button is in the on state. The thermodynamic diagram is displayed when the switch button is in the on state, and the display of the thermodynamic diagram is canceled after the switch button is switched from the on state to the off state. It can be appreciated that if the switch button for turning the thermodynamic diagram on or off is in the on state, the terminal device of the first object may cancel the display of the thermodynamic diagram when it detects that this switch button has been triggered. Optionally, the target element may further include a switch identifier describing the state of the switch button, i.e., whether the switch button is in the on state or the off state.
Optionally, the target element may further include a switch button for turning the decision prompt on or off, where the decision prompt is displayed when this switch button is in the on state and its display is canceled when the switch button is switched from the on state to the off state. On this basis, when this switch button is in the off state, the terminal device of the first object may, upon detecting a triggering operation on it, determine the target virtual resource and output the decision prompt in the communication interface of the first object.
It can be understood that, since the interaction mode opening operation may be a triggering operation on the interaction component in the communication interface of the first object, the terminal device of the first object may display the target element in the communication interface of the first object when it detects that the interaction component has been triggered. In this case, the terminal device of the first object may determine the display position of the interaction component in the communication interface of the first object and display the target element at that display position, so as to cover or replace the interaction component with the target element. It should be noted that the terminal device of the first object may also display the target element at a display position other than that of the interaction component, which is not limited in the present application.
Optionally, after the first object starts the interaction mode, the terminal device of the first object may display the interaction duration in the communication interface of the first object; the target element further includes an interaction mode closing component (i.e., a stop-interaction button), and the interaction duration is displayed in a display area of the interaction mode closing component. Correspondingly, if the thermodynamic diagram is displayed in the communication interface of the first object, the terminal device of the first object may cancel the display of the thermodynamic diagram and close the interaction mode when detecting that the interaction mode closing component is triggered; if a decision prompt is displayed in the communication interface of the first object, the terminal device of the first object may cancel the display of the decision prompt and close the interaction mode when detecting that the interaction mode closing component is triggered. It should be noted that the above-mentioned interaction duration may refer to the difference between the system time at which the interaction mode opening operation was detected and the current system time, or may refer to the difference between the system time at which a confirmation operation on the interaction prompt information from any second object was received after the interaction mode was started and the current system time. It should be appreciated that the first object may cancel the instant interaction and turn off the display of the thermodynamic diagram at any time.
For example, assume that the first object is live in a live game scene, the communication interface of the first object is updated by displaying the thermodynamic diagram, and the communication interface of the first object includes the interaction component 32. When the interaction component 32 is detected to be triggered, the terminal device of the first object may display the target element 33 at the display position of the interaction component; the target element 33 may include a switch button 331 for turning on or off the thermodynamic diagram and an interaction mode closing component 332, and a schematic diagram of displaying the target element 33 is shown in fig. 3b. When the switch button 331 for turning on or off the thermodynamic diagram is in the on state, that is, the switch identifier 333 of the switch button 331 indicates that the switch button 331 is in the on state, and it is detected that the switch button 331 is triggered, a schematic diagram of cancelling the display of the thermodynamic diagram 30 is shown in fig. 3i, and the switch identifier 333 at this time may indicate that the switch button 331 is in the off state. When the interaction mode closing component 332 is triggered, a schematic diagram of closing the interaction mode is shown in fig. 3j.
It should be noted that fig. 3b to fig. 3j are merely exemplary schematic views of displaying, in the communication interface of the first object, a thermodynamic diagram, a screen for indicating that the target virtual resource has been selected, a decision prompt, or the like, and the present application is not limited thereto. For example, in fig. 3b to 3j, the display position of the target element 33 is the same as the display position of the interaction component 32; however, in other embodiments, the display position of the target element 33 may be different from the display position of the interaction component 32. As another example, in fig. 3d to 3f, after a screen for indicating that the target virtual resource has been selected or a decision prompt is displayed in the communication interface of the first object, the terminal device of the first object may cancel the display of the thermodynamic diagram, in which case the switch button for turning on or off the thermodynamic diagram may be switched from the on state to the off state, and so on.
According to the embodiment of the application, in the communication process of the first object and the M second objects, a first picture is displayed in the communication interface of the first object, and a second picture is displayed in the communication interface of each second object, so that at least one second object that interacts with the first object performs a triggering operation on the second picture; the first picture includes N virtual resources to be selected by the first object, and the picture content of the second picture is the same as the picture content of the first picture. Then, the communication interface of the first object is updated according to the triggering operation performed by the at least one second object in the second picture, and the updated communication interface is used for displaying the heat of the N virtual resources in thermodynamic diagram form, which improves the interaction efficiency between the first object and the at least one second object and intuitively feeds back the preference of the at least one second object regarding resource selection among the N virtual resources. Therefore, the embodiment of the application enables the first object to obtain the feedback opinion of the at least one second object more efficiently, accurately and intuitively, and increases the interest of the interaction process, thereby improving object stickiness (i.e., user stickiness).
Based on the above description, the embodiments of the present application further provide a more specific object interaction method, which may be performed by the terminal device of the first object in the above-mentioned communication system, or by a client with a live broadcast function running in the terminal device of the first object. For convenience of explanation, the following description takes the terminal device of the first object executing the object interaction method as an example; referring to fig. 4, the object interaction method may include the following steps S401 to S404:
S401, displaying a first picture in a communication interface of a first object in the communication process of the first object and M second objects; the first picture includes: N virtual resources to be selected by the first object; M and N are positive integers.
S402, displaying a second picture in a communication interface of each second object in the M second objects so that at least one second object interacting with the first object executes a triggering operation on the second picture; wherein, the picture content of the first picture is the same as the picture content of the second picture.
S403, acquiring a thermodynamic diagram generated according to the triggering operation executed by at least one second object in the second picture.
The thermodynamic diagram is used for reflecting the heat of each of the N virtual resources. It should be noted that, when acquiring the thermodynamic diagram generated according to the triggering operation performed by the at least one second object in the second picture, the terminal device of the first object may receive a thermodynamic diagram generated by the server according to the triggering operation performed by the at least one second object in the second picture; that is, the server may first generate the thermodynamic diagram according to the triggering operation performed by the at least one second object in the second picture, and the terminal device of the first object may then receive the thermodynamic diagram sent by the server. Alternatively, the terminal device of the first object may itself generate the thermodynamic diagram according to the triggering operation performed by the at least one second object in the second picture; in this case, the terminal device of the first object may receive, through the server, the information of each triggering operation sent by the terminal device of the corresponding second object. The present application is not limited in this regard.
In a specific implementation, when the thermodynamic diagram is generated according to the trigger operation performed by the at least one second object in the second picture, trigger position information generated according to that trigger operation may be acquired, where one piece of trigger position information may be used to indicate the trigger position of the corresponding trigger operation; the communication interface of the first object is divided into a plurality of divided areas, and the trigger position information in each divided area is counted separately to obtain the number of trigger operations corresponding to each divided area. In this case, the display color of each divided area may be determined according to the number of trigger operations corresponding to that divided area, and the depth of the display color of any divided area is positively correlated with the number of trigger operations corresponding to that divided area, that is, the depth of the display color remains unchanged or deepens as the number of trigger operations corresponding to the divided area increases; the corresponding divided areas are then highlighted with their display colors to obtain the thermodynamic diagram, where the heat of any virtual resource is positively correlated with the depth of the display color of the divided area in which that virtual resource is located. It should be noted that each divided area may be a rectangular small area, that is, the communication interface of the first object may be adaptively divided into a matrix of a plurality of rectangular small areas; specifically, the communication interface of the first object may be divided according to a preset length and a preset width to obtain a matrix of (communication interface length / preset length) × (communication interface width / preset width) divided areas. The present application does not limit the preset length and the preset width; for example, the preset length may be 10px (pixels) or 8px, the preset width may be 10px or 6px, and the preset length and the preset width may be the same or different. It is understood that when the units of the preset length and the preset width are pixels, the units of the communication interface length and the communication interface width are also pixels.
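Purely as an illustration (not part of the embodiment), the following Python sketch shows how a communication interface might be divided into a matrix of rectangular small areas according to a preset length and width, and how trigger position information might be counted per divided area; all identifiers (such as count_triggers, preset_w, preset_h) are hypothetical names chosen for this sketch.

```python
import numpy as np

def count_triggers(interface_w, interface_h, preset_w, preset_h, trigger_positions):
    """Divide the interface into (interface_h // preset_h) x (interface_w // preset_w)
    rectangular small areas and count how many trigger positions fall into each area."""
    cols = max(1, interface_w // preset_w)   # communication interface length / preset length
    rows = max(1, interface_h // preset_h)   # communication interface width  / preset width
    counts = np.zeros((rows, cols), dtype=int)
    for x, y in trigger_positions:
        col = min(int(x) // preset_w, cols - 1)   # clamp positions on the right/bottom edge
        row = min(int(y) // preset_h, rows - 1)
        counts[row, col] += 1
    return counts

# e.g. a 1080 x 720 px interface with 10 px x 10 px areas gives a 72 x 108 matrix
counts = count_triggers(1080, 720, 10, 10, [(15, 22), (14, 25), (600, 300)])
```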
It should be understood that, after the trigger position information generated according to the trigger operation performed by the at least one second object in the second picture is acquired, each piece of trigger position information may be mapped to the corresponding divided area, so that the number of trigger operations corresponding to each divided area is obtained according to the trigger position information in that divided area. Specifically, when the trigger position information corresponds to the trigger operations one by one, that is, one piece of trigger position information is generated each time a trigger operation is performed, a summation operation may be performed on the trigger position information in each divided area to obtain the number of trigger operations corresponding to that divided area; when any piece of trigger position information also carries the number of trigger operations acting on the corresponding trigger position, that is, the trigger position information can also be used to indicate the number of trigger operations acting on that trigger position, a summation operation may be performed on the numbers of trigger operations indicated by the trigger position information in each divided area to obtain the number of trigger operations corresponding to that divided area. It should be noted that, in the present application, the trigger position information may also be referred to as a coordinate value, and when the trigger operation performed by the second object on the second picture is a click operation, the number of trigger operations may also be referred to as the number of clicks; in this case, when each coordinate value carries the click-count information of the second object, the collected coordinate values of the clicked areas may be mapped into the divided matrix, and the number of clicks corresponding to each coordinate value is then accumulated into the matrix cell (i.e., the divided area) to which that coordinate value belongs, so as to obtain the number of clicks of each divided area. Optionally, after the number of trigger operations corresponding to each divided area is obtained, these numbers may be used to generate a data matrix, and the display color of each divided area may be determined based on the data in the data matrix to generate the thermodynamic diagram.
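Where each coordinate value additionally carries the number of clicks acting on that position, the accumulation described above could be sketched as follows; again, the function and parameter names are hypothetical and introduced only for illustration.

```python
import numpy as np

def accumulate_click_counts(rows, cols, preset_w, preset_h, position_records):
    """position_records: iterable of (x, y, n_clicks), where n_clicks is the number of
    trigger operations reported for that coordinate value."""
    counts = np.zeros((rows, cols), dtype=int)
    for x, y, n_clicks in position_records:
        col = min(int(x) // preset_w, cols - 1)
        row = min(int(y) // preset_h, rows - 1)
        counts[row, col] += n_clicks      # accumulate into the divided area the coordinate belongs to
    return counts
```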
Further, in the process of determining the display color of each divided area according to the number of triggering operations corresponding to each divided area, the number of triggering operations corresponding to each divided area may be mapped to a preset range for describing the thermal color value to obtain the thermal color value of each divided area, so that the display color of each divided area is determined according to the thermal color value of each divided area. In this case, the data matrix formed by the number of trigger operations corresponding to each divided region may be converted into a color matrix map including the thermal color values of each divided region, so that the display color of each divided region is determined from the color matrix map. It should be noted that the preset range for describing the thermal color value may be set according to actual requirements or may be set empirically, which is not limited by the present application. For example, when the preset range for describing the thermal color value is [0,1], the number of trigger operations corresponding to each divided region may be mapped into [0,1], that is, the number of trigger operations corresponding to each divided region may be normalized to be mapped to the thermal color value of the corresponding range, thereby obtaining the color matrix diagram.
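A minimal sketch of mapping the per-area trigger counts into a preset range of [0, 1] (i.e., turning the data matrix into a color matrix); normalizing by the maximum count is one possible choice, assumed here only for illustration.

```python
import numpy as np

def to_color_matrix(counts):
    """Normalize per-area trigger counts to thermal color values in [0, 1]."""
    counts = np.asarray(counts, dtype=float)
    peak = counts.max()
    if peak == 0:                      # no trigger operations yet
        return np.zeros_like(counts)
    return counts / peak
```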
In a specific implementation, when the display color of each divided area is determined according to the thermal color value of each divided area, the thermal color value of each divided area may first be smoothed (i.e., subjected to a convolution operation) to obtain a smoothed thermal color value of each divided area, and the display color of each divided area is then determined according to the smoothed thermal color value. In other words, the color matrix diagram may be smoothed (i.e., convolved) to obtain a smoothed color matrix diagram, so that the display color of each divided area is determined according to the smoothed color matrix diagram. It should be noted that the smoothing process here may calculate the average value between the thermal color value of each divided area (i.e., rectangular small area) and the thermal color values of the adjacent divided areas, so as to soften the edges of each area of the thermodynamic diagram and generate a smoother thermodynamic diagram.
For example, as shown in fig. 5a, in the process of smoothing the divided area 51, the thermal color value of the divided area 51 and the thermal color values of the 8 adjacent divided areas may be weighted and summed to obtain the smoothed thermal color value of the divided area 51. Assuming that the weights corresponding to the thermal color values of the divided areas are the same and the sum of the weights is 1, the average value of the thermal color value of the divided area 51 and the thermal color values of the 8 adjacent divided areas may be calculated, and the obtained average value is taken as the smoothed thermal color value of the divided area 51; in this case, the smoothed thermal color value of the divided area 51 is approximately equal to 0.54, that is, the value of the divided area 51 in the smoothed color matrix diagram may be 0.54. It should be noted that fig. 5a illustrates the smoothing process only by way of example, which is not limited by the embodiment of the present application; for example, in other embodiments, the number of divided areas adjacent to the divided area 51 may be 15 or 16.
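A sketch of this smoothing step: each divided area takes the average of its own thermal color value and those of its 8 adjacent areas. Border handling (here, replicating edge values) is an implementation assumption not specified by the embodiment.

```python
import numpy as np

def smooth_color_matrix(color_matrix):
    """3x3 mean filter over the color matrix (average of a cell and its 8 neighbours)."""
    m = np.asarray(color_matrix, dtype=float)
    padded = np.pad(m, 1, mode="edge")          # replicate border values
    out = np.zeros_like(m)
    rows, cols = m.shape
    for r in range(rows):
        for c in range(cols):
            out[r, c] = padded[r:r + 3, c:c + 3].mean()
    return out
```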
In this case, a specific process of generating the thermodynamic diagram may be as shown in fig. 5b; it should be understood that fig. 5b only exemplarily illustrates a specific process of generating the thermodynamic diagram, and the present application is not limited in this regard. For example, the number of trigger operations corresponding to each divided area may be normalized without generating a data matrix; as another example, the smoothing may not be performed, and the thermodynamic diagram may be generated directly from the thermal color values of the divided areas (i.e., from the color matrix diagram), and so on.
Further, when the display color of each divided area is determined according to the smoothed thermal color value of each divided area, the display color of each divided area may be determined based on the mapping relationship between the smoothed thermal color values and the display colors. It should be understood that the smoothing may also not be performed, and the display color of each divided area may be determined directly according to the thermal color value of that divided area, that is, based on the mapping relationship between the thermal color values and the display colors; the present application is not limited in this regard.
For example, assume that the preset range for describing the thermal color values is [0, 1], and assume that the display color of a divided area is color A when its thermal color value is in the interval [0, 0.3], color B when its thermal color value is in the interval (0.3, 0.6], and color C when its thermal color value is in the interval (0.6, 1], where the depth of color C is greater than the depth of color B and the depth of color B is greater than the depth of color A. Further assume that the communication interface of the first object is divided into 12 divided areas; the smoothed thermal color value and the corresponding display color of each divided area are respectively shown in fig. 5c. It should be noted that fig. 5c only exemplarily represents the mapping relationship between the display colors and the thermal color values, and the application is not limited thereto; for example, in other embodiments, the number of display color classes in the thermodynamic diagram may be 4 or 5, and so on.
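A sketch of the interval-based mapping from (smoothed) thermal color values to display colors, using the thresholds and color labels of the example above; the values 0.3 and 0.6 and the labels are taken from that example and are not fixed by the embodiment.

```python
def display_color(thermal_value):
    """Map a thermal color value in [0, 1] to a display color label."""
    if thermal_value <= 0.3:
        return "color A"              # lightest
    if thermal_value <= 0.6:
        return "color B"
    return "color C"                  # deepest

# hypothetical smoothed thermal color values for a 2 x 2 portion of the interface
smoothed = [[0.10, 0.54], [0.35, 0.80]]
colors = [[display_color(v) for v in row] for row in smoothed]
# colors == [['color A', 'color B'], ['color B', 'color C']]
```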
In another specific implementation, the trigger position information generated according to the trigger operation performed by the at least one second object in the second picture may be acquired, a target area in the communication interface of the first object may be determined, and each piece of trigger position information may be filtered according to the target area to obtain the target trigger position information located in the target area; the target area is divided into a plurality of divided areas, and the target trigger position information in each divided area is counted separately to obtain the number of trigger operations corresponding to each divided area. Correspondingly, the display color of each divided area is determined according to the number of trigger operations corresponding to that divided area, and the corresponding divided areas are highlighted with their display colors to obtain the thermodynamic diagram. In this case, the thermodynamic diagram is used to reflect the heat of the virtual resources in the target area.
In determining the target area in the communication interface of the first object, the terminal device of the first object may respond to the area selection operation for the communication interface of the first object, and may set the area indicated by the area selection operation as the target area. Accordingly, when the terminal device of the first object detects a sliding operation with respect to the communication interface of the first object, it may be determined that an area selection operation is detected in the communication interface of the first object, and an area surrounded by a sliding track indicated by the sliding operation may be taken as a target area; the terminal device of the first object may also determine that the region selection operation is detected in the communication interface of the first object when the pressing operation or the continuous clicking operation with respect to the communication interface of the first object is detected, in which case a region selection frame may be displayed, the region selection frame may be moved, and the size of the region selection frame may be modified, and the terminal device of the first object may regard a region included in the region selection frame as a target region, which is not limited in the present application. It should be appreciated that upon detection of the region selection operation, a corresponding thermodynamic diagram may be displayed on the target region in the communication interface of the first object to update the communication interface of the first object, and the updated communication interface is illustrated in the interface schematic diagram mentioned later (e.g., fig. 5 d).
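A sketch of filtering the trigger position information by a rectangular target area (for instance, the region enclosed by the region selection box); representing the target area as an axis-aligned rectangle is an assumption made only for this illustration.

```python
def filter_by_target_area(trigger_positions, left, top, right, bottom):
    """Keep only the trigger positions (x, y) that fall inside the target area."""
    return [(x, y) for x, y in trigger_positions
            if left <= x <= right and top <= y <= bottom]

# hypothetical target area selected in a 1080 x 720 px interface
target_positions = filter_by_target_area([(15, 22), (600, 300), (900, 650)],
                                          left=500, top=250, right=950, bottom=700)
# target_positions == [(600, 300), (900, 650)]
```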
S404, displaying the thermodynamic diagram in the communication interface of the first object.
For example, assuming that the first object is live in a live game scene, when the region selection operation is a sliding operation, the terminal device of the first object may take the region enclosed by the sliding track 52 indicated by the sliding operation as the target region, and a schematic diagram of displaying the thermodynamic diagram is shown in fig. 5d; when a pressing operation or a continuous clicking operation on the communication interface of the first object is detected, it may be determined that a region selection operation is detected in the communication interface of the first object, and the region selection box 53 is displayed to determine the target region, whereupon a schematic diagram of displaying the thermodynamic diagram is shown in fig. 5e. It should be understood that fig. 5d and fig. 5e are merely exemplary schematic diagrams of displaying the thermodynamic diagram, and the application is not limited in this regard.
It should be noted that the above description about steps S403 and S404 only exemplarily illustrates a manner in which the terminal device of the first object uses thermodynamic diagrams to update the communication interface of the first object, and the specific manner of updating the communication interface of the first object is not limited by the present application. In other embodiments, the terminal device of the first object may also update the communication interface of the first object with a screen for indicating that the target virtual resource has been selected, may also update the communication interface of the first object with a decision hint, and so on.
Specifically, if the communication interface of the first object is updated by displaying a screen for indicating that the target virtual resource has been selected, the terminal device of the first object may determine the target virtual resource from the N virtual resources according to a trigger operation performed by at least one second object in the second screen; then, a target virtual resource is selected, and a screen indicating that the target virtual resource has been selected is displayed in the communication interface of the first object. Correspondingly, if the communication interface of the first object is updated by displaying the decision prompt, the terminal device of the first object can determine the target virtual resource from the N virtual resources according to the triggering operation executed by at least one second object in the second picture; and outputting a decision prompt at the communication interface of the first object, wherein the decision prompt is used for prompting the first object to select a target virtual resource in the N virtual resources.
When the target virtual resource is determined from the N virtual resources according to the trigger operation performed by the at least one second object in the second picture, the terminal device of the first object may acquire the thermodynamic diagram generated according to that trigger operation and determine the target virtual resource from the N virtual resources according to the thermodynamic diagram, that is, the virtual resource located in the area with the deepest display color in the thermodynamic diagram is taken as the target virtual resource. Alternatively, the terminal device of the first object may acquire a target identifier generated according to the trigger operation performed by the at least one second object in the second picture, and determine the target virtual resource from the N virtual resources according to the target identifier; the execution subject that generates the target identifier according to that trigger operation may be the terminal device of the first object or the server, which is not limited in the present application. When the execution subject that generates the target identifier is the server, the server may generate the target identifier according to the trigger operation performed by the at least one second object in the second picture and transmit the target identifier to the terminal device of the first object. Alternatively, the terminal device of the first object may directly determine the target virtual resource from the N virtual resources according to the trigger operation performed by the at least one second object in the second picture; in this case, the terminal device of the first object may receive, through the server, each piece of trigger position information sent by the terminal device of the corresponding second object, count the number of trigger operations corresponding to each divided area according to the trigger position information, and take the virtual resource located in the divided area with the largest number of trigger operations as the target virtual resource; or it may calculate the thermal color value of each divided area according to the number of trigger operations corresponding to that divided area and take the virtual resource located in the divided area with the largest thermal color value as the target virtual resource, and so on. The application is not limited to the specific implementation of determining the target virtual resource.
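A sketch of one of the alternatives described above: take the divided area with the largest thermal color value (i.e., the deepest display color) and look up the virtual resource displayed there. The resource_at lookup is a hypothetical callback introduced for this sketch, not an interface from the embodiment.

```python
import numpy as np

def pick_target_resource(color_matrix, resource_at, preset_w, preset_h):
    """Return the virtual resource located in the divided area with the highest
    thermal color value; resource_at(x, y) maps a pixel position to a resource."""
    m = np.asarray(color_matrix)
    row, col = np.unravel_index(np.argmax(m), m.shape)
    center_x = col * preset_w + preset_w // 2    # centre pixel of the hottest area
    center_y = row * preset_h + preset_h // 2
    return resource_at(center_x, center_y)
```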
According to the embodiment of the application, during the communication process of the first object and the M second objects, the first picture can be displayed in the communication interface of the first object, and the second picture can be displayed in the communication interface of each second object, so that at least one second object that interacts with the first object can perform a triggering operation on the second picture. Then, the thermodynamic diagram generated according to the triggering operation performed by the at least one second object in the second picture can be acquired and displayed in the communication interface of the first object; the thermodynamic diagram can be used for reflecting the heat of each virtual resource, thereby intuitively feeding back the preference of the at least one second object regarding resource selection among the N virtual resources. Therefore, the embodiment of the application enables the first object to obtain the feedback opinion of the at least one second object more efficiently, accurately and intuitively, that is, the feedback is intuitive and instant; communication between the first object and the at least one second object can be realized in a more intuitive and understandable manner, and the sense of immersion and the interest in the interaction process are increased, thereby improving object stickiness.
Based on the above description, the embodiments of the present application further provide an object interaction method, which may be performed by the terminal device of the second object in the above-mentioned communication system, or by a client with a live-viewing function running in the terminal device of the second object. For convenience of explanation, the following description takes the terminal device of the second object executing the object interaction method as an example; referring to fig. 6, the object interaction method may include the following steps S601-S602:
S601, displaying a second picture in a communication interface of the second object in the process of communication between the second object and the first object; the picture content of the second picture is the same as the picture content of the first picture displayed in the communication interface of the first object; the first picture includes: N virtual resources to be selected by the first object; M and N are positive integers.
The specific implementation manner of step S601 is the same as that of displaying the second screen in the communication interface of the second object in the above embodiment, and the disclosure is not repeated here.
It should be noted that, the terminal device of the second object may receive the interaction prompt information, where the interaction prompt information may be sent by the terminal device of the first object to the terminal device of the second object through the server, or may be sent by the server to the terminal device of the second object after the server receives the sending notification of the terminal device of the first object.
Further, after receiving the interaction prompt information, the terminal device of the second object may display the interaction prompt information in the communication interface of the second object, where the interaction prompt information is used for prompting whether the corresponding second object participates in the interaction with the first object; then, in response to a confirmation operation for the interaction prompt information, state prompt information is output in the communication interface of the second object, where the state prompt information is used for prompting that the second object is in a state of interacting with the first object. It should be noted that the display position of the state prompt information may be the same as or different from the display position of the interaction prompt information, which is not limited in the present application; it can be understood that, when the display position of the state prompt information is the same as the display position of the interaction prompt information, the terminal device of the second object may, in response to the confirmation operation for the interaction prompt information, cover or replace the interaction prompt information with the state prompt information.
The confirmation operation includes, but is not limited to: a triggering operation for a confirmation component in the communication interface of the second object, an operation of inputting a voice password for confirming the interaction prompt information, a pressing operation or continuous clicking operation on the communication interface of the second object, an operation of inputting a preset gesture, and the like. It should be noted that the specific content of the preset gesture is not limited in the present application.
In a specific implementation, the terminal device of the second object may display a confirmation component in the communication interface of the second object; when the confirmation component is detected to be triggered, it may be determined that a confirmation operation is detected in the communication interface of the second object. Optionally, the interaction prompt information may be displayed through a popup window, and the terminal device of the second object may display the confirmation component in the popup window; the confirmation component may also be referred to in the present application as a "go to participate" button. In this case, after the interaction prompt information is received, a popup window may be popped up on the communication interface of the second object to inquire whether the second object participates in the interaction, which may also be referred to as instant interaction. The second object may then trigger (e.g., click or press) the participation button to enter the interaction with the first object, and the popup window may be updated to a state of participating in the interaction.
For example, assuming that the first object is live in a live game scene, the terminal device of the second object may display the interaction prompt information 71 in the communication interface of the second object. When the confirmation operation is a triggering operation for the confirmation component 72 in the communication interface of the second object, a schematic diagram of displaying the state prompt information 73 is shown in fig. 7a; when the confirmation operation is a pressing operation, a schematic diagram of displaying the state prompt information 73 is shown in fig. 7b; when the confirmation operation is an operation of inputting a preset gesture, a schematic diagram of displaying the state prompt information 73 is shown in fig. 7c. It should be noted that fig. 7a to 7c are merely exemplary schematic diagrams of displaying the state prompt information 73, which is not limited by the present application; for example, the state prompt information 73 shown in fig. 7a to 7b is located in the display area outside the second picture, but in other embodiments, the state prompt information 73 may also be suspended above the second picture, and so on.
Further, after the interaction prompt information is displayed, if no confirmation operation for the interaction prompt information is detected within a preset duration, the display of the interaction prompt information is cancelled in the communication interface of the second object; that is, the terminal device of the second object may start a countdown from the preset duration after displaying the interaction prompt information, and cancel the display of the interaction prompt information when the countdown ends. Alternatively, if a closing operation for the interaction prompt information is detected, the display of the interaction prompt information is cancelled in the communication interface of the second object; specifically, the terminal device of the second object may display a button for closing the countdown (i.e., a closing component for the interaction prompt information) in the communication interface of the second object, and when this button is detected to be triggered, the terminal device of the second object may determine that a closing operation for the interaction prompt information is detected. Optionally, when the interaction prompt information and the button for closing the countdown are displayed in the popup window, the button for closing the countdown may be displayed in the upper right corner of the popup window; in this case, the second object may click this button to close the popup window before the countdown ends, or the popup window automatically disappears when the countdown ends, so that the display of the interaction prompt information is cancelled.
For example, assuming that the first object is live in a live game scene, when the closing operation is a triggering operation for the button 74 for closing the countdown, a schematic diagram of cancelling the display of the interaction prompt information 71 is shown in fig. 7d. Note that the closing operation may also refer to a pressing operation or a continuous clicking operation on the communication interface of the second object, an operation of inputting a preset gesture, an operation of inputting a voice password for cancelling the display of the interaction prompt information, and the like; the application is not limited in this regard.
Optionally, if the display of the interaction prompt information has been cancelled and the second object is in a state of not interacting with the first object, a confirmation recovery component may be displayed in the communication interface of the second object; when a triggering operation for the confirmation recovery component is detected, the interaction prompt information and the confirmation component may be displayed in the communication interface of the second object, or only the confirmation component may be displayed in the communication interface of the second object, and so on. For example, assuming that the first object is live in a live game scene, when a triggering operation for the confirmation recovery component 75 is detected, the interaction prompt information 71 and the confirmation component 72 may be displayed in the communication interface of the second object, and a schematic diagram of displaying the interaction prompt information 71 is shown in fig. 7e.
S602, in a state in which the second object is interacting with the first object, in response to a triggering operation of the second object on the second picture, notifying the terminal device of the first object to update the communication interface of the first object according to the triggering operation performed by the second object.
The updated communication interface is used for displaying the heat of the N virtual resources in thermodynamic diagram form, so as to prompt the first object to perform resource selection; when the number of trigger operations performed on the j-th virtual resource of the N virtual resources is higher than the number of trigger operations performed on the other virtual resources of the N virtual resources, the heat of the j-th virtual resource is higher than the heat of the other virtual resources of the N virtual resources, where j ∈ [1, N].
It should be noted that the terminal device of the second object may, in response to the triggering operation of the second object on the second picture, perform visual highlighting on the operation position (i.e., the trigger position) of the triggering operation performed by the second object in the second picture; the visual highlighting includes at least one of: highlight display, and marking the operation position with a preset identifier. The triggering operation here includes, but is not limited to, a click operation, a press operation, and the like; the preset identifier may be a text identifier or an image identifier, which is not limited in the present application. In this case, the visual highlighting may be used to indicate that the second object has triggered successfully; accordingly, each time the terminal device of the second object performs the visual highlighting to feed back the trigger result to the second object, the fed-back data (i.e., the above-mentioned trigger position information) is recorded by the terminal device of the second object and uploaded to the server (i.e., the background) for storage. Further, before performing the visual highlighting, the terminal device of the second object may determine whether the operation position of the triggering operation is in the allowed trigger area; correspondingly, if the operation position is in the allowed trigger area, the operation position of the triggering operation performed by the second object in the second picture is visually highlighted; if the operation position is not in the allowed trigger area, the operation position is not visually highlighted. The allowed trigger area may refer to the display area of the second picture, to the display area in the second picture corresponding to the above-mentioned target area, or to the display area in the second picture corresponding to the area of the communication interface of the first object other than the interaction mode closing component.
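A sketch of the check described above on the side of the second object's terminal device: the operation position is only visually highlighted (and reported) if it lies within the allowed trigger area. The highlight and record_and_upload callbacks are hypothetical placeholders for the client's UI and reporting logic, not names from the embodiment.

```python
def handle_trigger(x, y, allowed_area, highlight, record_and_upload):
    """allowed_area: (left, top, right, bottom) rectangle of the allowed trigger area."""
    left, top, right, bottom = allowed_area
    if left <= x <= right and top <= y <= bottom:
        highlight(x, y)              # visual highlighting, e.g. with a preset identifier
        record_and_upload(x, y)      # trigger position information uploaded to the server
        return True
    return False                     # outside the allowed trigger area: no highlighting
```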
For example, assuming that the first object is live in a live game scene, when the visual highlighting includes highlight display, a schematic diagram of the visual highlighting is shown in fig. 7f; when the visual highlighting includes marking the operation position with the preset identifier 76, a schematic diagram of the visual highlighting is shown in fig. 7g. Based on this, the second object transitions from a passive viewing state to actively participating in the game of the first object, which can help the first object make decisions and lets the second object experience the feeling of playing the game together with the first object. It should be understood that fig. 7f and fig. 7g are merely exemplary schematic views of the visual highlighting, and the present application is not limited in this regard.
It should be noted that, during the communication between the second object and the first object, if the first object closes the interaction mode, the terminal device of the second object may receive a closing notification and exit the interaction with the first object; that is, the terminal device of the second object may display, in the communication interface of the second object, the picture shown before participating in the interaction, and the second object may continue to watch the second picture. Optionally, after receiving the closing notification, the terminal device of the second object may further display interaction mode closing information; the interaction mode closing information may be displayed through a toast (a message prompt box), through a popup window, or through a bullet screen, which is not limited in the present application.
In the embodiment of the present application, the terminal device of the second object may further exit the interaction with the first object in response to an interaction mode exit operation for the communication interface of the second object. Specifically, the terminal device of the second object may display an exit component in the communication interface of the second object; when the exit component is detected to be triggered, it is determined that an interaction mode exit operation is detected in the communication interface of the second object. Optionally, the exit component may be displayed in a prompt element, and the prompt element may further include the state prompt information described above, that is, the terminal device of the second object may display the exit component and the state prompt information in the same area. It should be noted that the interaction mode exit operation may also refer to a pressing operation or a continuous clicking operation on the communication interface of the second object, an operation of inputting a voice password for exiting the interaction, an operation of inputting a preset gesture, and so on; the application is not limited in this regard.
For example, assuming that the first object is live in the live game scene, after the first object closes the interaction mode, the terminal device of the second object may exit the interaction with the first object and display the interaction mode closing information 77 in the communication interface of the second object; a schematic diagram of updating the communication interface of the second object in this way is shown in fig. 7h. When the exit component 78 in the communication interface of the second object is detected to be triggered, a schematic diagram of updating the communication interface of the second object is shown in fig. 7i. It should be noted that fig. 7h and fig. 7i are merely exemplary schematic diagrams of updating the communication interface of the second object, which is not limited by the present application.
In the embodiment of the application, during the communication between the second object and the first object, a second picture can be displayed in the communication interface of the second object, where the picture content of the second picture is the same as the picture content of the first picture displayed in the communication interface of the first object, and the first picture includes the N virtual resources to be selected by the first object. Then, in a state in which the second object is interacting with the first object, in response to a triggering operation of the second object on the second picture, the terminal device of the first object is notified to update the communication interface of the first object according to the triggering operation performed by the second object, and the updated communication interface is used for displaying the heat of the N virtual resources in thermodynamic diagram form so as to prompt the first object to perform resource selection. Based on this, the second object can actively participate in the interaction with the first object and provide the first object with decision support for resource selection among the N virtual resources. Therefore, the embodiment of the application can improve the participation of the second object in the interaction process and give the second object a more immersive experience; in addition, especially for a second object who cannot interact with the first object through voice (such as a hearing- or speech-impaired person), interacting with the first object through a simple triggering operation in a silent state provides a smoother and unobstructed experience, thereby improving object stickiness.
As can be seen from the above description, the object interaction method provided by the embodiment of the present application can be applied in different application scenarios, such as a live game scenario, a live singing scenario, and a commodity recommendation live broadcast scenario (i.e., live e-commerce broadcast). The following takes the live game scenario as an example and further describes the application process of the above-mentioned object interaction method with reference to the flowchart shown in fig. 8:
When a first object (i.e., an anchor object) wants to broadcast gameplay, it can select a target game to broadcast, and then M second objects (i.e., audience objects) can watch the live pictures of the first object, where M is a positive integer. In the live broadcast process, that is, in the communication process of the first object and the M second objects, the terminal device of the first object (i.e., the anchor front end) can display a first picture in the communication interface of the first object (i.e., the screen or front-end page), where the first picture is a game picture involved in the gameplay of the first object and includes the game resources involved in that gameplay; the terminal device of the first object can capture the first picture displayed in its communication interface to obtain a second picture, and send the second picture to the terminal devices of the second objects through the server (i.e., the background system); correspondingly, after receiving the second picture, the terminal device of each second object can display the second picture in the communication interface of the corresponding second object.
Correspondingly, the terminal device of the first object can display an interaction component (i.e., a "start instant interaction" button) in the communication interface of the first object, and can detect in real time the triggering operation of the first object on the interaction component. If the triggering operation is detected, that is, if the first object is detected to click on the interaction component in the communication interface of the first object, the terminal device of the first object can submit an interaction initiation request to the server, and the server can then return to the terminal device of the first object a result indicating that the initiation request succeeded. In this case, the server may initiate a request to participate in the instant interaction (i.e., the interaction prompt information) to the terminal devices of each of the M second objects, that is, to all the terminal devices of the second objects that are watching the live broadcast.
Based on this, the terminal device of each second object can display a popup window in the communication interface of the corresponding second object to inquire whether the corresponding second object participates in the instant interaction, with a confirmation component (i.e., a confirmation button) attached beside the inquiry; the terminal device of each second object may then detect in real time the triggering operation of the corresponding second object on the confirmation component, and if the terminal device of any second object detects that this second object clicks the confirmation component to confirm participation in the interaction, that terminal device may update the communication interface of this second object to an in-progress state of the interaction. In this state, the second object may trigger any trigger area (i.e., trigger position) that it thinks the first object should click on. Further, after the terminal device of any second object confirms that this second object participates in the interaction, it can detect in real time the triggering operation performed by this second object, so as to obtain the specific coordinate value and the number of triggers of the trigger area of the triggering operation performed in the communication interface.
Further, the terminal device of each of the at least one second object participating in the interaction with the first object may send the corresponding coordinate values and trigger counts to the server; the server may aggregate the coordinate values and trigger counts with which all participating second objects trigger their communication interfaces, visualize the data in the form of a thermodynamic diagram, and return the thermodynamic diagram generated in real time to the terminal device of the first object, where the thermodynamic diagram may show the position distribution of the triggering operations through the depth of the displayed colors, so as to reflect the heat of the game resources in each area. Accordingly, after the terminal device of the first object receives the thermodynamic diagram, the thermodynamic diagram may be overlaid on the communication interface of the first object in the form of a semitransparent cover layer to be displayed to the first object. Alternatively, after the interaction mode is started, the terminal device of the first object may generate, according to the real-time trigger data transmitted by the server, a thermodynamic diagram of the number of triggers performed by the second objects on the communication interface of the first object, so that the first object can intuitively see the distribution of the number of triggering operations performed by the current second objects through the colors of the thermodynamic diagram.
As can be seen from the above description, when the object interaction method provided by the embodiment of the present application is applied to a live game scenario, the position distribution of the triggering operations performed by the at least one second object participating in the interaction with the first object may be displayed in real time, in the form of a thermodynamic diagram, in the communication interface of the first object. The first object may thus view the areas that the second objects want to trigger in a more intuitive thermodynamic diagram form, and may directly determine those areas through the color shades of the thermodynamic diagram, thereby knowing in real time the game resources that the at least one second object wants to select; this effectively improves the interaction efficiency between the first object and the at least one second object and increases the interest. In addition, the at least one second object participating in the interaction with the first object can actively participate in the live broadcast to help the first object make trigger decisions, which can improve the participation of the second objects and improve user stickiness.
Based on the above description of the related embodiments of the object interaction method, in one aspect, the embodiment of the present application further provides an object interaction device, where the object interaction device may be a computer program (including program code) running in a computer device, and the computer device here refers to the terminal device of the first object; as shown in fig. 9a, the object interaction device may include a first output unit 901 and a first processing unit 902. The object interaction device may perform the object interaction method shown in fig. 2 or fig. 4, that is, the object interaction device may operate the following units:
a first output unit 901, configured to display a first picture in a communication interface of a first object in a communication process of the first object and M second objects; the first picture includes: N virtual resources to be selected by the first object; M and N are positive integers;
the first output unit 901 is further configured to display a second picture in a communication interface of each of the M second objects, so that at least one second object that interacts with the first object performs a triggering operation on the second picture; wherein the picture content of the first picture is the same as the picture content of the second picture;
The first processing unit 902 is configured to update, according to a triggering operation performed by the at least one second object in the second picture, the communication interface of the first object, where the updated communication interface is used for displaying the heat of the N virtual resources in thermodynamic diagram form, so as to prompt the first object to perform resource selection; when the number of trigger operations performed on the j-th virtual resource of the N virtual resources is higher than the number of trigger operations performed on the other virtual resources of the N virtual resources, the heat of the j-th virtual resource is higher than the heat of the other virtual resources of the N virtual resources, where j ∈ [1, N].
In one embodiment, the first processing unit 902, when updating the communication interface of the first object according to the triggering operation performed by the at least one second object in the second screen, may be specifically configured to:
acquiring a thermodynamic diagram generated according to a triggering operation performed by the at least one second object in the second picture;
and displaying the thermodynamic diagram in a communication interface of the first object.
In another embodiment, the first processing unit 902 may be specifically configured to, when displaying the thermodynamic diagram in the communication interface of the first object:
Superposing and displaying the thermodynamic diagram as a mask layer on a first picture in a communication interface of the first object;
or outputting a sub-page in the communication interface of the first object, and displaying the second picture and the thermodynamic diagram in a superposition manner in the sub-page.
In another embodiment, after updating the communication interface of the first object according to the trigger operation performed by the at least one second object in the second picture, the first processing unit 902 is further configured to:

selecting a target virtual resource from the N virtual resources according to the updated communication interface, where the target virtual resource is the virtual resource with the highest heat among the N virtual resources;

and the first output unit 901 may be further configured to:

displaying, in the communication interface of the first object, a picture indicating that the target virtual resource has been selected.
In another embodiment, after updating the communication interface of the first object according to the trigger operation performed by the at least one second object in the second picture, the first processing unit 902 is further configured to:

determining a target virtual resource from the N virtual resources according to the updated communication interface, where the target virtual resource is the virtual resource with the highest heat among the N virtual resources;

and the first output unit 901 may be further configured to:

outputting a decision prompt in the communication interface of the first object, where the decision prompt is used to prompt the first object to select the target virtual resource from the N virtual resources.
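A minimal sketch of the shared logic behind these two embodiments, assuming the per-resource counts from the earlier example: pick the index with the highest heat, then either auto-select it or build a decision prompt. The function names and the prompt wording are illustrative only, not taken from the patent.

```typescript
// Hypothetical sketch: choosing the hottest virtual resource and producing a
// decision prompt. Return shapes and wording are assumptions.

function hottestResource(countsPerResource: number[]): number {
  let target = 0;
  for (let j = 1; j < countsPerResource.length; j++) {
    if (countsPerResource[j] > countsPerResource[target]) target = j;
  }
  return target; // index of the virtual resource with the highest heat
}

function decisionPrompt(countsPerResource: number[], resourceNames: string[]): string {
  const j = hottestResource(countsPerResource);
  return `Viewers tapped "${resourceNames[j]}" most (${countsPerResource[j]} times); consider selecting it.`;
}
```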
In another embodiment, the first output unit 901 may be further configured to:

in response to detecting an interaction mode opening operation in the communication interface of the first object, displaying interaction prompt information in the communication interface of each second object;

wherein the interaction prompt information is used to prompt the corresponding second object whether to participate in the interaction with the first object; and after any second object performs a confirmation operation on the interaction prompt information, that second object is determined to participate in the interaction with the first object.
In another embodiment, the first output unit 901 may be further configured to: display an interaction component in the communication interface of the first object; and the first processing unit 902 is further configured to: determine, when the interaction component is detected to be triggered, that an interaction mode opening operation is detected in the communication interface of the first object.
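For illustration, the sketch below assumes the first object's client reacts to the interaction component being triggered by broadcasting an invitation to every connected second object. The sendToViewer transport and the message shape are stand-ins; the patent does not prescribe a particular messaging channel.

```typescript
// Hypothetical sketch of the interaction-mode opening flow on the first
// object's client. The transport is an assumed stand-in.

type SendFn = (viewerId: string, message: object) => void;

function onInteractionComponentTriggered(viewerIds: string[], sendToViewer: SendFn): void {
  for (const viewerId of viewerIds) {
    sendToViewer(viewerId, {
      type: "interaction-invite",
      text: "The host opened interaction mode. Join and tap the resource you prefer?",
    });
  }
}
```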
In another embodiment, the first output unit 901 may be further configured to:

when the interaction component is detected to be triggered, displaying a target element in the communication interface of the first object; the target element includes a switch button for turning the thermodynamic diagram on or off, and the switch button is initially in an on state;

wherein the thermodynamic diagram is displayed while the switch button is in the on state, and display of the thermodynamic diagram is canceled when the switch button is switched from the on state to the off state.
In another embodiment, when displaying the target element in the communication interface of the first object, the first output unit 901 may be specifically configured to:

determining a display position of the interaction component in the communication interface of the first object; and displaying the target element at the display position of the interaction component, so that the target element covers or replaces the interaction component.
In another embodiment, the first output unit 901 may be further configured to:

after the first object opens the interaction mode, displaying an interaction duration in the communication interface of the first object;

wherein the target element further includes an interaction mode closing component, and the interaction duration is displayed in a display area of the interaction mode closing component.
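The following sketch illustrates, under assumed names, the state the target element might hold: a heatmap switch that starts in the on state and an elapsed interaction duration shown in the close component's display area. The mm:ss format and the visibility toggle are implementation choices of this example, not requirements of the patent.

```typescript
// Hypothetical sketch of the target element state: heatmap on/off switch
// (initially on) and an interaction-duration label. Names are assumed.

interface TargetElementState {
  heatmapSwitchOn: boolean; // on by default when the element is first shown
  startedAt: number;        // ms timestamp when interaction mode was opened
}

function toggleHeatmap(state: TargetElementState, overlay: HTMLCanvasElement): void {
  state.heatmapSwitchOn = !state.heatmapSwitchOn;
  // Switching from on to off cancels display of the thermodynamic diagram.
  overlay.style.visibility = state.heatmapSwitchOn ? "visible" : "hidden";
}

function interactionDurationLabel(state: TargetElementState, now = Date.now()): string {
  const seconds = Math.floor((now - state.startedAt) / 1000);
  const mm = String(Math.floor(seconds / 60)).padStart(2, "0");
  const ss = String(seconds % 60).padStart(2, "0");
  return `${mm}:${ss}`; // shown inside the interaction-mode close component
}
```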
In another embodiment, the first object is a host object that is live streaming in a live-game scenario;

wherein the N virtual resources include game resources involved in the game session of the first object;

the first picture includes a game picture involved in the game session of the first object; and the second picture includes a live broadcast picture obtained by the terminal device of the first object capturing the first picture.
On the other hand, an embodiment of the present application further provides another object interaction device, which may be a computer program (including program code) running in a computer device, where the computer device refers to a terminal device of a second object. As shown in fig. 9b, the object interaction device may include a second output unit 903 and a second processing unit 904. The object interaction device may perform the object interaction method shown in fig. 6, that is, the object interaction device may run the following units:
a second output unit 903, configured to display a second picture in a communication interface of a second object during a process in which the second object communicates with a first object; the picture content of the second picture is the same as the picture content of the first picture displayed in the communication interface of the first object; the first picture includes: N virtual resources to be selected by the first object; M and N are positive integers;

a second processing unit 904, configured to, in a state where the second object is interacting with the first object, notify, in response to a trigger operation of the second object on the second picture, the terminal device of the first object to update the communication interface of the first object according to the trigger operation performed by the second object;

where the updated communication interface is used to display the heat of the N virtual resources in the form of a thermodynamic diagram, so as to prompt the first object to perform resource selection; and when the number of trigger operations performed on the j-th virtual resource of the N virtual resources is higher than the number of trigger operations performed on the other virtual resources of the N virtual resources, the heat of the j-th virtual resource is higher than the heat of the other virtual resources, j ∈ [1, N].
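A minimal sketch of the second object's side, assuming a session object that knows whether the viewer has joined the interaction and a notify() channel to the first object's terminal. Both names and the message fields are illustrative; the patent only specifies that the trigger operation is forwarded so the first object's interface can be updated.

```typescript
// Hypothetical sketch: while in the interaction state, a tap on the second
// picture is forwarded to the first object's terminal. Names are assumed.

interface ViewerSession {
  interacting: boolean;              // whether this second object joined the interaction
  notify: (message: object) => void; // channel to the first object's terminal
}

function onSecondPictureTap(session: ViewerSession, x: number, y: number, resourceIndex: number): void {
  if (!session.interacting) return; // taps only count while in the interaction state
  session.notify({ type: "trigger-operation", x, y, resourceIndex, at: Date.now() });
}
```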
In one embodiment, the second output unit 903 may be further configured to:
in response to a trigger operation of the second object on the second picture, performing visual highlighting on the operation position of the trigger operation in the second picture;

wherein the visual highlighting includes at least one of: highlighting the operation position, and marking the operation position with a preset mark.
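As one possible rendering of the preset mark, the sketch below drops a short-lived ring at the tap position inside the second picture's container (assumed here to be relatively positioned). The marker style, its 800 ms lifetime, and the showTapMark name are illustrative assumptions.

```typescript
// Hypothetical sketch of the visual highlighting: a brief preset mark at the
// tap position. Styling and lifetime are illustrative choices.

function showTapMark(container: HTMLElement, x: number, y: number): void {
  const mark = document.createElement("div");
  mark.style.cssText =
    "position:absolute;width:24px;height:24px;border-radius:50%;" +
    "border:2px solid #ffd700;transform:translate(-50%,-50%);pointer-events:none;";
  mark.style.left = `${x}px`; // x, y are pixel offsets within the container
  mark.style.top = `${y}px`;
  container.appendChild(mark);
  setTimeout(() => mark.remove(), 800); // remove the mark after a brief flash
}
```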
In another embodiment, the second output unit 903 may be further configured to:
after receiving interaction prompt information, displaying the interaction prompt information in the communication interface of the second object, where the interaction prompt information is used to prompt the corresponding second object whether to participate in the interaction with the first object;

and in response to a confirmation operation for the interaction prompt information, outputting a state prompt message in the communication interface of the second object, where the state prompt message is used to prompt that the second object is in an interaction state with the first object.
In another embodiment, after displaying the interaction prompt information, the second output unit 903 may be further configured to:

if no confirmation operation for the interaction prompt information is detected within a preset time, canceling display of the interaction prompt information in the communication interface of the second object;

or, if a closing operation for the interaction prompt information is detected, canceling display of the interaction prompt information in the communication interface of the second object.
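The sketch below shows one way the prompt lifecycle could be wired on the second object's client: a confirmation joins the interaction, while a close action or an assumed 10-second timeout dismisses the prompt without joining. The helper name and the timeout value are examples, not values taken from the patent.

```typescript
// Hypothetical sketch of the prompt lifecycle: confirm joins the interaction;
// close or a timeout dismisses the prompt. The 10 s default is assumed.

function showInteractionPrompt(
  onConfirm: () => void,
  onDismiss: () => void,
  timeoutMs = 10_000,
): { confirm: () => void; close: () => void } {
  const timer = setTimeout(onDismiss, timeoutMs); // no confirmation in time: cancel display
  return {
    confirm: () => { clearTimeout(timer); onConfirm(); }, // viewer joins the interaction
    close: () => { clearTimeout(timer); onDismiss(); },   // viewer declines; prompt is dismissed
  };
}
```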
According to one embodiment of the present application, the steps involved in the method shown in fig. 2 or fig. 4 may be performed by the units in the object interaction device shown in fig. 9a. For example, steps S201 and S202 shown in fig. 2 may both be performed by the first output unit 901 shown in fig. 9a, and step S203 may be performed by the first processing unit 902 shown in fig. 9a. As another example, steps S401 and S402 shown in fig. 4 may each be performed by the first output unit 901 shown in fig. 9a, steps S403 and S404 may each be performed by the first processing unit 902 shown in fig. 9a, and so on.
According to another embodiment of the present application, the steps involved in the method shown in fig. 6 may be performed by the units in the object interaction device shown in fig. 9 b. For example, step S601 shown in fig. 6 may be performed by the second output unit 903 shown in fig. 9b, step S602 may be performed by the second processing unit 904 shown in fig. 9b, and so on.
According to another embodiment of the present application, the units in the object interaction devices shown in fig. 9a and fig. 9b may be separately or completely combined into one or several other units, or one or more of these units may be further split into functionally smaller units, which can achieve the same operations without affecting the technical effects of the embodiments of the present application. The above units are divided based on logical functions; in practical applications, the function of one unit may be implemented by a plurality of units, or the functions of a plurality of units may be implemented by one unit. In other embodiments of the present application, either object interaction device may also include other units, and in practical applications these functions may also be implemented with the assistance of other units or through the cooperation of multiple units.
According to another embodiment of the present application, the object interaction device shown in fig. 9a may be constructed by running, on a general-purpose computing device (such as a computer including processing elements and storage elements, for example a central processing unit (CPU), random access memory (RAM) and read-only memory (ROM)), a computer program (including program code) capable of executing the steps involved in the corresponding method shown in fig. 2 or fig. 4, thereby implementing the object interaction method of the embodiments of the present application; similarly, the object interaction device shown in fig. 9b may be constructed by running, on such a general-purpose computing device, a computer program (including program code) capable of executing the steps involved in the corresponding method shown in fig. 6, thereby implementing the object interaction method of the embodiments of the present application. The computer program may be recorded on, for example, a computer storage medium, and loaded into and run on the above computing device through the computer storage medium.
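Purely as an illustration of how the logical units of fig. 9a could be composed in a single program, the sketch below wires an output unit and a processing unit into one device object. The class name, the interfaces of the injected units, and the method names are assumptions of this example rather than structures defined by the patent.

```typescript
// Hypothetical sketch: composing the logical units of the fig. 9a device.
// Unit interfaces and names are assumed for illustration.

class FirstObjectInteractionDevice {
  constructor(
    private readonly output: { showFirstPicture(): void; showHeatmap(grid: number[][]): void },
    private readonly processing: { accumulate(ev: { x: number; y: number; resourceIndex: number }): number[][] },
  ) {}

  // First output unit: display the first picture in the communication interface.
  start(): void {
    this.output.showFirstPicture();
  }

  // First processing unit: update the interface from a viewer's trigger operation.
  onTriggerOperation(ev: { x: number; y: number; resourceIndex: number }): void {
    this.output.showHeatmap(this.processing.accumulate(ev));
  }
}
```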
According to the embodiments of the present application, during the communication between the first object and the M second objects, a first picture can be displayed in the communication interface of the first object, and a second picture can be displayed in the communication interface of each second object; the first picture includes the N virtual resources to be selected by the first object, and the picture content of the second picture is the same as that of the first picture. The terminal device of at least one second object interacting with the first object can then respond to the trigger operation of the corresponding second object on the second picture; correspondingly, the terminal device of the first object can update the communication interface of the first object according to the trigger operations performed by the at least one second object in the second picture, and the updated communication interface displays the heat of the N virtual resources in the form of a thermodynamic diagram, which improves the interaction efficiency between the first object and the at least one second object and intuitively feeds back how the at least one second object would select among the N virtual resources. Therefore, the embodiments of the present application enable the first object to obtain the feedback of the at least one second object more efficiently, accurately and intuitively, and make the interaction process more interesting. In addition, the embodiments of the present application support the second object actively participating in the interaction with the first object, thereby providing the first object with a basis for deciding which of the N virtual resources to select; this improves the participation of the second object in the interaction process, gives the second object a more immersive experience, and improves object stickiness (that is, user stickiness). Moreover, for a second object that cannot interact with the first object by voice (for example, a user with a hearing or speech impairment), a simple trigger operation performed in a silent state provides a smoother and barrier-free experience.
Based on the description of the method embodiments and the device embodiments, an embodiment of the present application further provides a computer device. Referring to fig. 10, the computer device includes at least a processor 1001, an input interface 1002, an output interface 1003, and a computer storage medium 1004, where the processor 1001, the input interface 1002, the output interface 1003, and the computer storage medium 1004 within the computer device may be connected by a bus or in another manner.
The computer storage medium 1004 may be stored in the memory of the computer device and is used for storing a computer program, where the computer program includes program instructions, and the processor 1001 is configured to execute the program instructions stored in the computer storage medium 1004. The processor 1001 (or CPU, Central Processing Unit) is the computing core and control core of the computer device; it is adapted to implement one or more instructions, and specifically to load and execute one or more instructions so as to implement the corresponding method flow or corresponding function.
In one embodiment, when the computer device is the terminal device of the first object, the processor 1001 in this embodiment of the present application may be configured to perform a series of object interaction operations, specifically including: displaying a first picture in a communication interface of the first object during a process in which the first object communicates with M second objects, where the first picture includes N virtual resources to be selected by the first object, and M and N are positive integers; displaying a second picture in a communication interface of each of the M second objects, so that at least one second object interacting with the first object performs a trigger operation on the second picture, where the picture content of the first picture is the same as the picture content of the second picture; and updating the communication interface of the first object according to the trigger operation performed by the at least one second object in the second picture, where the updated communication interface is used to display the heat of the N virtual resources in the form of a thermodynamic diagram, so as to prompt the first object to perform resource selection; when the number of trigger operations performed on the j-th virtual resource of the N virtual resources is higher than the number of trigger operations performed on the other virtual resources of the N virtual resources, the heat of the j-th virtual resource is higher than the heat of the other virtual resources, j ∈ [1, N], and so on.
In another embodiment, when the computer device is the terminal device of the second object, the processor 1001 in this embodiment of the present application may be configured to perform a series of object interaction operations, including:

displaying a second picture in a communication interface of the second object during a process in which the second object communicates with the first object, where the picture content of the second picture is the same as the picture content of the first picture displayed in the communication interface of the first object, the first picture includes N virtual resources to be selected by the first object, and M and N are positive integers; and, in a state where the second object is interacting with the first object, in response to a trigger operation of the second object on the second picture, notifying the terminal device of the first object to update the communication interface of the first object according to the trigger operation performed by the second object, where the updated communication interface is used to display the heat of the N virtual resources in the form of a thermodynamic diagram, so as to prompt the first object to perform resource selection; when the number of trigger operations performed on the j-th virtual resource of the N virtual resources is higher than the number of trigger operations performed on the other virtual resources of the N virtual resources, the heat of the j-th virtual resource is higher than the heat of the other virtual resources, j ∈ [1, N], and so on.
An embodiment of the present application further provides a computer storage medium (memory), which is a memory device in the computer device and is used for storing programs and data. It can be understood that the computer storage medium here may include both a built-in storage medium of the computer device and an extended storage medium supported by the computer device. The computer storage medium provides storage space that stores the operating system of the computer device. One or more instructions, which may be one or more computer programs (including program code), are also stored in the storage space and are adapted to be loaded and executed by the processor. The computer storage medium here may be a high-speed RAM memory or a non-volatile memory, such as at least one magnetic disk memory; optionally, it may also be at least one computer storage medium located remotely from the aforementioned processor. In one embodiment, one or more instructions stored in the computer storage medium may be loaded and executed by the processor to implement the corresponding steps of the object interaction method embodiments shown in fig. 2 or fig. 4 described above; in another embodiment, one or more instructions stored in the computer storage medium may be loaded and executed by the processor to implement the corresponding steps of the object interaction method embodiment shown in fig. 6 described above.
According to the embodiments of the present application, during the communication between the first object and the M second objects, a first picture can be displayed in the communication interface of the first object, and a second picture can be displayed in the communication interface of each second object; the first picture includes the N virtual resources to be selected by the first object, and the picture content of the second picture is the same as that of the first picture. The terminal device of at least one second object interacting with the first object can then respond to the trigger operation of the corresponding second object on the second picture; correspondingly, the terminal device of the first object can update the communication interface of the first object according to the trigger operations performed by the at least one second object in the second picture, and the updated communication interface displays the heat of the N virtual resources in the form of a thermodynamic diagram, which improves the interaction efficiency between the first object and the at least one second object and intuitively feeds back how the at least one second object would select among the N virtual resources. Therefore, the embodiments of the present application enable the first object to obtain the feedback of the at least one second object more efficiently, accurately and intuitively, and make the interaction process more interesting. In addition, the embodiments of the present application support the second object actively participating in the interaction with the first object, thereby providing the first object with a basis for deciding which of the N virtual resources to select; this improves the participation of the second object in the interaction process, gives the second object a more immersive experience, and improves object stickiness (that is, user stickiness). Moreover, for a second object that cannot interact with the first object by voice (for example, a user with a hearing or speech impairment), a simple trigger operation performed in a silent state provides a smoother and barrier-free experience.
It should be noted that, according to an aspect of the present application, a computer program product or a computer program is further provided, the computer program product or computer program including computer instructions stored in a computer storage medium. When the computer device is the terminal device of the first object, the processor of the computer device reads the computer instructions from the computer storage medium and executes them, so that the computer device performs the methods provided in the various optional implementations of the object interaction method embodiments shown in fig. 2 or fig. 4; when the computer device is the terminal device of the second object, the processor of the computer device reads the computer instructions from the computer storage medium and executes them, so that the computer device performs the methods provided in the various optional implementations of the object interaction method embodiment shown in fig. 6.
It is also to be understood that the foregoing is merely illustrative of the present application and is not to be construed as limiting the scope of the application, which is defined by the appended claims.

Claims (20)

1. An object interaction method, comprising:
displaying a first picture in a communication interface of a first object during a process in which the first object communicates with M second objects; the first picture comprises: N virtual resources to be selected by the first object; M and N are positive integers;

displaying a second picture in a communication interface of each second object of the M second objects, so that at least one second object interacting with the first object performs a trigger operation on the second picture; wherein the picture content of the first picture is the same as the picture content of the second picture;

updating the communication interface of the first object according to the trigger operation performed by the at least one second object in the second picture, wherein the updated communication interface is used for displaying the heat of the N virtual resources in the form of a thermodynamic diagram, so as to prompt the first object to perform resource selection; and when the number of trigger operations performed on the j-th virtual resource of the N virtual resources is higher than the number of trigger operations performed on the other virtual resources of the N virtual resources, the heat of the j-th virtual resource is higher than the heat of the other virtual resources of the N virtual resources, j ∈ [1, N].
2. The method of claim 1, wherein updating the communication interface of the first object according to the trigger operation performed by the at least one second object in the second picture comprises:
acquiring a thermodynamic diagram generated according to a triggering operation performed by the at least one second object in the second picture;
and displaying the thermodynamic diagram in a communication interface of the first object.
3. The method of claim 2, wherein displaying the thermodynamic diagram in the communication interface of the first object comprises:
displaying the thermodynamic diagram as a mask layer superimposed on the first picture in the communication interface of the first object;

or outputting a sub-page in the communication interface of the first object, and displaying the second picture and the thermodynamic diagram in a superimposed manner in the sub-page.
4. The method of claim 1, wherein after updating the communication interface of the first object according to the trigger operation performed by the at least one second object in the second picture, the method further comprises:
selecting a target virtual resource from the N virtual resources according to the updated communication interface, wherein the target virtual resource is the virtual resource with the highest heat in the N virtual resources;
And displaying a picture for indicating that the target virtual resource is selected in the communication interface of the first object.
5. The method of claim 1, wherein after updating the communication interface of the first object according to the trigger operation performed by the at least one second object in the second picture, the method further comprises:
determining a target virtual resource from the N virtual resources according to the updated communication interface, wherein the target virtual resource is the virtual resource with the highest heat in the N virtual resources;
and outputting a decision prompt at the communication interface of the first object, wherein the decision prompt is used for prompting the first object to select a target virtual resource in the N virtual resources.
6. The method according to claim 1, wherein the method further comprises:
in response to detecting an interaction mode opening operation in the communication interface of the first object, displaying interaction prompt information in the communication interface of each second object;

wherein the interaction prompt information is used for prompting the corresponding second object whether to participate in the interaction with the first object; and after any second object performs a confirmation operation on the interaction prompt information, that second object is determined to participate in the interaction with the first object.
7. The method of claim 6, wherein the method further comprises:
displaying an interaction component in a communication interface of the first object;
and when the interaction component is detected to be triggered, determining that an interaction mode opening operation is detected in the communication interface of the first object.
8. The method of claim 7, wherein the method further comprises:
when the interaction component is detected to be triggered, displaying a target element in the communication interface of the first object; the target element comprises a switch button for turning the thermodynamic diagram on or off, and the switch button is initially in an on state;

wherein the thermodynamic diagram is displayed while the switch button is in the on state, and display of the thermodynamic diagram is canceled when the switch button is switched from the on state to the off state.
9. The method of claim 8, wherein displaying the target element in the communication interface of the first object comprises:
determining a display position of the interaction component in the communication interface of the first object; and displaying the target element at the display position of the interaction component, so that the target element covers or replaces the interaction component.
10. The method of claim 8, wherein the method further comprises:
after the first object opens the interaction mode, displaying an interaction duration in the communication interface of the first object;

wherein the target element further comprises an interaction mode closing component, and the interaction duration is displayed in a display area of the interaction mode closing component.
11. The method of claim 1, wherein the first object is a host object that is live streaming in a live-game scenario;

wherein the N virtual resources comprise game resources involved in a game session of the first object;

the first picture comprises a game picture involved in the game session of the first object; and the second picture comprises a live broadcast picture obtained by a terminal device of the first object capturing the first picture.
12. An object interaction method, comprising:
displaying a second picture in a communication interface of a second object during a process in which the second object communicates with a first object; the picture content of the second picture is the same as the picture content of a first picture displayed in a communication interface of the first object; the first picture comprises: N virtual resources to be selected by the first object; M and N are positive integers;
in a state where the second object is interacting with the first object, in response to a trigger operation of the second object on the second picture, notifying a terminal device of the first object to update the communication interface of the first object according to the trigger operation performed by the second object;

wherein the updated communication interface is used for displaying the heat of the N virtual resources in the form of a thermodynamic diagram, so as to prompt the first object to perform resource selection; and when the number of trigger operations performed on the j-th virtual resource of the N virtual resources is higher than the number of trigger operations performed on the other virtual resources of the N virtual resources, the heat of the j-th virtual resource is higher than the heat of the other virtual resources of the N virtual resources, j ∈ [1, N].
13. The method according to claim 12, wherein the method further comprises:
in response to a trigger operation of the second object on the second picture, performing visual highlighting on an operation position of the trigger operation in the second picture;

wherein the visual highlighting comprises at least one of: highlighting the operation position, and marking the operation position with a preset mark.
14. The method according to claim 12, wherein the method further comprises:
after receiving interaction prompt information, displaying the interaction prompt information in a communication interface of the second object, wherein the interaction prompt information is used for prompting the corresponding second object whether to participate in the interaction with the first object;

and in response to a confirmation operation for the interaction prompt information, outputting a state prompt message in the communication interface of the second object, wherein the state prompt message is used for prompting that the second object is in an interaction state with the first object.
15. The method of claim 14, wherein after displaying the interactive alert message, the method further comprises:
if no confirmation operation for the interaction prompt information is detected within a preset time, canceling display of the interaction prompt information in the communication interface of the second object;

or, if a closing operation for the interaction prompt information is detected, canceling display of the interaction prompt information in the communication interface of the second object.
16. An object interaction device, comprising:
a first output unit, configured to display a first picture in a communication interface of a first object during a process in which the first object communicates with M second objects; the first picture comprises: N virtual resources to be selected by the first object; M and N are positive integers;

the first output unit is further configured to display a second picture in a communication interface of each second object of the M second objects, so that at least one second object interacting with the first object performs a trigger operation on the second picture; wherein the picture content of the first picture is the same as the picture content of the second picture;

a first processing unit, configured to update the communication interface of the first object according to the trigger operation performed by the at least one second object in the second picture, wherein the updated communication interface is used for displaying the heat of the N virtual resources in the form of a thermodynamic diagram, so as to prompt the first object to perform resource selection; and when the number of trigger operations performed on the j-th virtual resource of the N virtual resources is higher than the number of trigger operations performed on the other virtual resources of the N virtual resources, the heat of the j-th virtual resource is higher than the heat of the other virtual resources of the N virtual resources, j ∈ [1, N].
17. An object interaction device, comprising:
a second output unit, configured to display a second picture in a communication interface of a second object during a process in which the second object communicates with a first object; the picture content of the second picture is the same as the picture content of a first picture displayed in a communication interface of the first object; the first picture comprises: N virtual resources to be selected by the first object; M and N are positive integers;

a second processing unit, configured to, in a state where the second object is interacting with the first object, notify, in response to a trigger operation of the second object on the second picture, a terminal device of the first object to update the communication interface of the first object according to the trigger operation performed by the second object;

wherein the updated communication interface is used for displaying the heat of the N virtual resources in the form of a thermodynamic diagram, so as to prompt the first object to perform resource selection; and when the number of trigger operations performed on the j-th virtual resource of the N virtual resources is higher than the number of trigger operations performed on the other virtual resources of the N virtual resources, the heat of the j-th virtual resource is higher than the heat of the other virtual resources of the N virtual resources, j ∈ [1, N].
18. A computer device comprising a processor and a memory for storing a computer program which, when executed by the processor, implements the method of any of claims 1-11; alternatively, the computer program, when executed by the processor, implements the method of any of claims 12-15.
19. A computer storage medium storing a computer program which, when executed by a processor, implements the method of any one of claims 1-11; alternatively, the computer program, when executed by the processor, implements the method of any of claims 12-15.
20. A computer program product comprising a computer program which, when executed by a processor, implements the method of any of claims 1-11; alternatively, the computer program, when executed by the processor, implements the method of any of claims 12-15.
CN202210527088.7A 2022-05-16 2022-05-16 Object interaction method, device, equipment and storage medium Pending CN117119205A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210527088.7A CN117119205A (en) 2022-05-16 2022-05-16 Object interaction method, device, equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210527088.7A CN117119205A (en) 2022-05-16 2022-05-16 Object interaction method, device, equipment and storage medium

Publications (1)

Publication Number Publication Date
CN117119205A true CN117119205A (en) 2023-11-24

Family

ID=88798898

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210527088.7A Pending CN117119205A (en) 2022-05-16 2022-05-16 Object interaction method, device, equipment and storage medium

Country Status (1)

Country Link
CN (1) CN117119205A (en)

Similar Documents

Publication Publication Date Title
CN110784752B (en) Video interaction method and device, computer equipment and storage medium
CN112565804B (en) Live broadcast interaction method, equipment, storage medium and system
EP4145841A1 (en) Method for interacting in live streaming and electronic device
WO2020248711A1 (en) Display device and content recommendation method
US20210350482A1 (en) Systems, methods, and media for providing an interactive presentation to remote participants
CN113411656B (en) Information processing method, information processing device, computer equipment and storage medium
CN113573092B (en) Live broadcast data processing method and device, electronic equipment and storage medium
CN109660854A (en) Video recommendation method, device, equipment and storage medium
WO2023109037A1 (en) Interaction method based on live-streaming room, and electronic device
CN106559312A (en) Group management and device based on controlled plant
CN113269585A (en) Method for acquiring virtual currency on live broadcast platform and terminal equipment
CN114663188A (en) Interactive data processing method and device, electronic equipment and storage medium
CN112073740A (en) Information display method, device, server and storage medium
KR20220002850A (en) Method and apparatus for displaying an interface for providing a social network service through an anonymous based profile
CN112947819B (en) Message display method, device, storage medium and equipment for interactive narrative work
KR101891155B1 (en) Composed notice function apparatas and method of using for chatting application in a portable terminal
CN113177759A (en) Logistics information display method and device and projection equipment
CN117119205A (en) Object interaction method, device, equipment and storage medium
CN115767112A (en) Information processing method and device
CN113515336B (en) Live room joining method, creation method, device, equipment and storage medium
CN115639927A (en) Virtual resource identifier display method, virtual resource identifier configuration information processing method, virtual resource identifier display device, virtual resource configuration information processing device and virtual resource identifier configuration information processing equipment
CN112073302B (en) User management method, device and computer readable medium
CN113891123A (en) Method, device and system for pushing virtual space information
CN114666643A (en) Information display method and device, electronic equipment and storage medium
US20160005321A1 (en) Systems, methods, and media for providing virtual mock trials

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination