CN114895787A - Multi-person interaction method and device, electronic equipment and storage medium - Google Patents
- Publication number
- CN114895787A CN114895787A CN202210556342.6A CN202210556342A CN114895787A CN 114895787 A CN114895787 A CN 114895787A CN 202210556342 A CN202210556342 A CN 202210556342A CN 114895787 A CN114895787 A CN 114895787A
- Authority
- CN
- China
- Prior art keywords
- resource
- interaction
- user
- target
- real
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Links
- 230000003993 interaction Effects 0.000 title claims abstract description 542
- 238000000034 method Methods 0.000 title claims abstract description 167
- 230000008569 process Effects 0.000 claims abstract description 80
- 230000000694 effects Effects 0.000 claims abstract description 58
- 230000002452 interceptive effect Effects 0.000 claims description 135
- 230000001960 triggered effect Effects 0.000 claims description 34
- 238000012545 processing Methods 0.000 claims description 12
- 230000033001 locomotion Effects 0.000 claims description 8
- 238000009877 rendering Methods 0.000 claims description 7
- 230000003068 static effect Effects 0.000 claims description 7
- 230000006870 function Effects 0.000 abstract description 14
- 238000010586 diagram Methods 0.000 description 12
- 230000006399 behavior Effects 0.000 description 9
- 238000004590 computer program Methods 0.000 description 8
- 239000000725 suspension Substances 0.000 description 7
- 238000004891 communication Methods 0.000 description 6
- 230000003287 optical effect Effects 0.000 description 6
- 230000000670 limiting effect Effects 0.000 description 5
- 230000008859 change Effects 0.000 description 4
- 238000005516 engineering process Methods 0.000 description 3
- 230000002829 reductive effect Effects 0.000 description 3
- 238000013475 authorization Methods 0.000 description 2
- 230000009286 beneficial effect Effects 0.000 description 2
- 230000002860 competitive effect Effects 0.000 description 2
- 239000013307 optical fiber Substances 0.000 description 2
- 230000000644 propagated effect Effects 0.000 description 2
- 230000004044 response Effects 0.000 description 2
- 239000004065 semiconductor Substances 0.000 description 2
- 238000003491 array Methods 0.000 description 1
- 238000006243 chemical reaction Methods 0.000 description 1
- 238000006073 displacement reaction Methods 0.000 description 1
- 230000014509 gene expression Effects 0.000 description 1
- 239000004973 liquid crystal related substance Substances 0.000 description 1
- 238000012986 modification Methods 0.000 description 1
- 230000004048 modification Effects 0.000 description 1
- 230000000704 physical effect Effects 0.000 description 1
- 230000002441 reversible effect Effects 0.000 description 1
- 238000012546 transfer Methods 0.000 description 1
- 238000009966 trimming Methods 0.000 description 1
- 230000000007 visual effect Effects 0.000 description 1
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F9/00—Arrangements for program control, e.g. control units
- G06F9/06—Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
- G06F9/44—Arrangements for executing specific programs
- G06F9/451—Execution arrangements for user interfaces
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Software Systems (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
Embodiments of the present disclosure provide a multi-person interaction method and device, an electronic device, and a storage medium. The method includes: determining a target resource when a target operation is detected while a user interacts based on a real-time interaction interface, the target resource being a resource with which at least one user interacts; displaying the target resource in the real-time interaction interface corresponding to the user, so that the user can perform resource interaction based on the target resource; and, upon detecting that a resource-interaction pause or termination condition is met, displaying the interaction pause or termination picture corresponding to the target resource. The technical solution of the embodiments of the present disclosure realizes rich interaction functions, improves the interactivity and fun between users during interaction, and thereby improves the user experience.
Description
Technical Field
The disclosed embodiments relate to multimedia interaction technologies, and in particular, to a multi-user interaction method and apparatus, an electronic device, and a storage medium.
Background
At present, with the popularization of terminal devices, various kinds of application software can be developed and installed on them so that users can interact through the software, for example, holding a session through multi-person session software.
However, existing multi-person session software only supports interactions such as voice, text, emoticons, and sharing or forwarding documents. The richness of interaction is low and interactivity is poor, which degrades the user experience and causes user churn.
Disclosure of Invention
The present disclosure provides a multi-person interaction method and device, an electronic device, and a storage medium, so as to realize rich interaction functions, improve interactivity and fun between users during interaction, and thereby improve the user experience.
In a first aspect, an embodiment of the present disclosure provides a multi-person interaction method, where the method includes:
determining a target resource when a target operation is detected while a user interacts based on a real-time interaction interface, where the target resource is a resource with which at least one user interacts;
displaying the target resource in the real-time interaction interface corresponding to the user, so that the user can perform resource interaction based on the target resource;
and, upon detecting that a resource-interaction pause or termination condition is met, displaying the interaction pause or termination picture corresponding to the target resource.
In a second aspect, an embodiment of the present disclosure further provides a multi-person interaction device, where the device includes:
a target resource determining module, configured to determine a target resource when a target operation is detected while a user interacts based on a real-time interaction interface, where the target resource is a resource with which at least one user interacts;
a target resource display module, configured to display the target resource in the real-time interaction interface corresponding to the user, so that the user can perform resource interaction based on the target resource;
and a picture display module, configured to display the interaction pause or termination picture corresponding to the target resource upon detecting that a resource-interaction pause or termination condition is met.
In a third aspect, an embodiment of the present disclosure further provides an electronic device, where the electronic device includes:
one or more processors;
a storage device for storing one or more programs,
when executed by the one or more processors, cause the one or more processors to implement a multi-person interaction method as in any of the embodiments of the disclosure.
In a fourth aspect, the embodiments of the present disclosure further provide a storage medium containing computer-executable instructions, which when executed by a computer processor, are used for performing the multi-person interaction method according to any one of the embodiments of the present disclosure.
According to the technical solution of this embodiment, a target resource is determined when a target operation is detected while users interact based on a real-time interaction interface, and the target resource is displayed in the real-time interaction interface corresponding to each user so that interaction can be performed based on it, achieving the purpose of multi-user interaction. Upon detecting that a resource-interaction pause or termination condition is met, the interaction pause or termination picture corresponding to the target resource is displayed. This solves the problems of low interaction richness, poor interactivity, and poor user experience in multi-person sessions, realizes rich interaction functions, improves the interactivity and fun between users during interaction, and thereby improves the user experience.
Drawings
The above and other features, advantages and aspects of various embodiments of the present disclosure will become more apparent by referring to the following detailed description when taken in conjunction with the accompanying drawings. Throughout the drawings, the same or similar reference numbers refer to the same or similar elements. It should be understood that the drawings are schematic and that elements and features are not necessarily drawn to scale.
Fig. 1 is a schematic flowchart of a multi-user interaction method according to an embodiment of the disclosure;
fig. 2 is a schematic diagram of a real-time interactive interface with a special attribute superimposition effect according to an embodiment of the present disclosure;
fig. 3 is a schematic diagram of a real-time interactive interface when multi-user interaction ends according to an embodiment of the disclosure;
fig. 4 is a schematic structural diagram of a multi-user interaction device according to an embodiment of the disclosure;
fig. 5 is a schematic structural diagram of an electronic device according to an embodiment of the disclosure.
Detailed Description
Embodiments of the present disclosure will be described in more detail below with reference to the accompanying drawings. While certain embodiments of the present disclosure are shown in the drawings, it is to be understood that the present disclosure may be embodied in various forms and should not be construed as limited to the embodiments set forth herein, but rather are provided for a more thorough and complete understanding of the present disclosure. It should be understood that the drawings and embodiments of the disclosure are for illustration purposes only and are not intended to limit the scope of the disclosure.
It should be understood that the various steps recited in the method embodiments of the present disclosure may be performed in a different order, and/or performed in parallel. Moreover, method embodiments may include additional steps and/or omit performing the illustrated steps. The scope of the present disclosure is not limited in this respect.
The term "include" and variations thereof as used herein are open-ended, i.e., "including but not limited to". The term "based on" is "based, at least in part, on". The term "one embodiment" means "at least one embodiment"; the term "another embodiment" means "at least one additional embodiment"; the term "some embodiments" means "at least some embodiments". Relevant definitions for other terms will be given in the following description.
It should be noted that the terms "first", "second", and the like in the present disclosure are only used for distinguishing different devices, modules or units, and are not used for limiting the order or interdependence relationship of the functions performed by the devices, modules or units.
It is noted that references to "a", "an", and "the" modifications in this disclosure are intended to be illustrative rather than limiting, and that those skilled in the art will recognize that "one or more" may be used unless the context clearly dictates otherwise.
The names of messages or information exchanged between devices in the embodiments of the present disclosure are for illustrative purposes only, and are not intended to limit the scope of the messages or information.
It is understood that before the technical solutions disclosed in the embodiments of the present disclosure are used, the type, the use range, the use scene, etc. of the personal information related to the present disclosure should be informed to the user and obtain the authorization of the user through a proper manner according to the relevant laws and regulations.
For example, in response to receiving an active request from a user, a prompt message is sent to the user to explicitly prompt the user that the requested operation to be performed would require the acquisition and use of personal information to the user. Thus, the user can autonomously select whether to provide personal information to software or hardware such as an electronic device, an application program, a server, or a storage medium that performs the operations of the disclosed technical solution, according to the prompt information.
As an optional but non-limiting implementation manner, in response to receiving an active request from the user, the manner of sending the prompt information to the user may be, for example, a pop-up window, and the prompt information may be presented in a text manner in the pop-up window. In addition, a selection control for providing personal information to the electronic device by the user's selection of "agreeing" or "disagreeing" can be carried in the pop-up window.
It is understood that the above notification and user authorization process is only illustrative and not limiting, and other ways of satisfying relevant laws and regulations may be applied to the implementation of the present disclosure.
It will be appreciated that the data involved in the subject technology, including but not limited to the data itself, the acquisition or use of the data, should comply with the requirements of the corresponding laws and regulations and related regulations.
Before the technical solution is introduced, an application scenario may be described by way of example. The technical solution of the present disclosure may be applied to any multi-person session scenario, such as a multi-person text, voice, and/or video session, and the solution provided by the embodiments of the present disclosure may be adopted to realize such a scenario. The solution may also be embedded in an existing live-streaming scenario, for example, a scenario in which multiple hosts stream together over a connected session. It can be understood that the solution may be integrated into any existing session or conference software: for example, it may be integrated into instant-messaging software, or embedded into a multi-person session scenario in the form of an external link, i.e., a web page.
Fig. 1 is a schematic flowchart of a multi-person interaction method provided by an embodiment of the present disclosure. The embodiment is applicable to displaying the corresponding interaction in an internet-supported real-time interactive application scenario. The method may be executed by a multi-person interaction device, which may be implemented in the form of software and/or hardware, optionally by an electronic device such as a mobile terminal, a PC, or a server. The method provided by the embodiment of the present disclosure may be executed by a client, by a server, or by both in cooperation.
As shown in fig. 1, the method includes:
s110, in the process that a user interacts based on the real-time interaction interface, when target operation is detected, target resources are determined.
The real-time interactive interface is any interactive interface in a real-time interactive application scene. The real-time interactive application scenario may be implemented through the internet and computer means, such as an interactive application implemented through a native program or a web program, etc. In a real-time interactive interface, multiple users may be allowed to interact with various forms of interactive behavior, such as entering text, voice, video, or sharing of content objects. For example, a multi-person chat interface may be used as a real-time interactive interface. The users may be users that interact in real-time, such as users that participate in a real-time conversation, and so on. The target operation is a behavior that triggers an interaction. The target resource may be a resource required for a subsequent interaction, i.e. a resource for at least one user to interact with. The target resource can be a game resource, a book resource, a music resource, a video resource, and the like. If the game resource is provided, a competitive game or the like can be performed based on the game resource. Music resources can share music and the like, video resources can share videos, book resources can share book contents and the like.
Specifically, each user may perform resource interaction based on a real-time interaction interface, optionally, competitive interaction, and whether each user performs a target operation or not may be detected during the interaction process, and if so, a resource corresponding to the current target operation, that is, a target resource may be determined.
Optionally, the target operation may include at least one of:
firstly, triggering a target interaction control.
The target interaction control may be a control for multi-person interaction in the real-time interaction interface; it may be a physical button or a virtual button, for example a "play together" button.
Specifically, when the user triggers the target interaction control, e.g., clicks the "play together" button, it may be determined that the target operation has occurred.
Secondly, the user's voice information triggers an interactive wake-up word.
The voice information may be a voice stream transmitted by the user in real time. The interactive wake-up word may be a preset word for multi-person interaction, for example: words such as "start playing", "start interacting", "listen to music together", etc.
Specifically, when the voice message includes an interactive wake-up word, it is considered that the target operation is generated.
Illustratively, suppose the interactive wake-up word is "start a game" and the user's voice message is "let us start a game"; the current user's voice message may then be considered to have triggered the interactive wake-up word.
Triggering the target operation in at least one of these ways enriches the triggering modes of interaction, provides the user with more triggering choices, and improves the user experience.
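The patent itself contains no code; as an illustrative sketch only, the two trigger modes described above (control trigger, or a wake-up word appearing in the user's speech) could be combined as follows. All names (`WAKE_WORDS`, `detect_target_operation`) are assumptions introduced here, not part of the patent.

```python
# Hypothetical sketch of the two target-operation triggers: a tap on the
# target interaction control, or an interactive wake-up word found in the
# user's real-time speech transcript.
WAKE_WORDS = ("start playing", "start interacting", "listen to music together")

def detect_target_operation(control_tapped: bool, transcript: str) -> bool:
    """Return True when either trigger condition for the target operation holds."""
    if control_tapped:  # e.g. the user clicked the "play together" button
        return True
    text = transcript.lower()
    return any(word in text for word in WAKE_WORDS)
```

In practice the transcript would come from a speech-recognition service on the real-time voice stream; substring matching is only the simplest possible stand-in for wake-word detection.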
Optionally, in order to provide a plurality of interaction types to the user to enrich the form of the interaction, the target resource may be determined by the following steps:
step one, a resource list comprising at least one resource to be selected is displayed in a real-time interactive interface of a user.
The resource to be selected may be a resource required for various interactions, for example: game resources, book resources, music resources, video resources, or the like. The resource list can be a display list formed by interaction of various resources.
Specifically, the resource list may be generated in advance according to various resources to be selected. And after the target operation is detected, displaying the resource list in a real-time interactive interface of the user for the user to select from.
And step two, determining the target resource based on the triggering operation of at least one resource to be selected.
Specifically, users may select and trigger at least one resource to be selected in the resource list, and the target resource is then determined according to the triggering situation of each resource to be selected, for example the triggering order or the triggering frequency.
In order to increase the triggering diversity of the target resource and enhance the participation of each user, the target resource can be determined based on the triggering operation on at least one resource to be selected by at least one of the following ways:
the first mode is that the target resource is determined according to the trigger time stamp of the user on at least one resource to be selected.
The trigger timestamp may be time information for the user to select and trigger each resource to be selected.
Specifically, each user may selectively trigger resources to be selected in the resource list; a trigger timestamp may be determined for each selection behavior, and the resource to be selected corresponding to the earliest trigger timestamp may be taken as the target resource.
And secondly, determining the target resource according to the triggered frequency of at least one resource to be selected in the preset selection duration.
The preset selection duration may be a preset period reserved for the users' selection interaction, for example 3 s or 5 s; it may be set according to user requirements and is not specifically limited in the embodiments of the present disclosure. The triggered frequency may be the number of times each resource to be selected is triggered.
Specifically, the triggered frequency of each resource to be selected in the preset selection duration is recorded, and when the preset selection duration is reached, the resource to be selected with the highest triggered frequency is used as the target resource.
And thirdly, the resource to be selected that is triggered by the acting user is taken as the target resource.
Here, the acting user is the user who triggered the target operation.
It should be noted that, when the target operation is detected, it may be determined that all users in the real-time interaction interface participate in the resource interaction; alternatively, an inquiry dialog box may be pushed to all users in the real-time interaction interface to determine the participating users, for example "Participate in the interaction?", and the users who select "yes" are determined to participate in the resource interaction.
It should be further noted that a resource list including at least one resource to be selected may be displayed on the real-time interactive interface of each user, and each user may determine the interaction that each user wants to participate in by making a selection.
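As an illustrative sketch (not part of the patent), the first two selection rules above can be expressed directly: the earliest trigger timestamp wins, or the resource triggered most often within the preset selection duration wins. Function names are assumptions.

```python
from collections import Counter

def pick_by_timestamp(triggers):
    """triggers: list of (timestamp, resource) pairs; the earliest trigger decides."""
    return min(triggers, key=lambda t: t[0])[1]

def pick_by_frequency(selections):
    """selections: resources triggered within the preset selection duration;
    the most frequently triggered resource becomes the target resource."""
    return Counter(selections).most_common(1)[0][0]
```

A tie-breaking rule (e.g. falling back to the earliest timestamp when two resources share the highest frequency) would be needed in a real implementation; the patent does not specify one.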
And S120, displaying the target resource in the real-time interaction interface corresponding to the user so as to perform resource interaction based on the target resource.
Specifically, the target resource can be displayed in a real-time interaction interface of each user participating in the interaction, so that each user can perform resource interaction based on the target resource.
Illustratively, if the target resource is a game resource, the game may be formally started with a countdown before the resource interaction begins, for example: "3", "2", "1", "start".
In order to enable each user to participate in the resource interaction smoothly, the target resource may be loaded before being displayed in the real-time interaction interface corresponding to the user, so that it is delivered to that interface.
Specifically, after the target resource is determined and before the target resource is displayed, the client requests to load the target resource from the server, or the server issues the target resource to the client, so as to display the loaded target resource on a real-time interaction interface for the user to perform subsequent resource interaction.
When the target resource is a game resource and the game resource includes at least one first element and one second element, in order to enrich the display elements for game interaction in the real-time interaction interface and enhance interactivity, the target resource is displayed in the real-time interaction interface corresponding to the user as follows:
the basic identifier of the user is displayed in a first area of the real-time interaction interface, and the first element and the second element in the target resource are displayed in a second area of the real-time interaction interface.
The first element may be a user-controllable element for resource interaction, such as the basketball element in a shooting game. The second element may be the target element of the user's resource interaction, such as the basket element in a shooting game. The basic identifier may be an identifier used to distinguish different users. The first area may be the area for displaying users' basic identifiers, for example the upper region of the real-time interaction interface. The second area may be the area for displaying the first element and the second element during resource interaction, for example the remaining area excluding the first area.
Specifically, the basic identifier of the user may be displayed in the first area of the real-time interaction interface to prompt each user participating in the current resource interaction. The first element and the second element in the target resource can be displayed in a second area of the real-time interaction interface, so that each user can interact with the resource based on the first element and the second element in the second area.
Optionally, in the second area, in addition to displaying the first element and the second element, a scene element may be displayed, for example: interactive background elements, etc.
Optionally, in order to enrich interest and interactivity of resource interaction performed according to the first element and the second element, the resource interaction may be performed based on the target resource through the following steps, including:
step one, when a first element is detected to be triggered in a current real-time interactive interface, determining a target running track of the first element based on position information and speed information of the released first element.
The speed information includes a speed magnitude and/or a speed direction. The target trajectory may be a motion trajectory of the first element after releasing the first element, for example: parabolic trajectories, etc.
Specifically, the user may trigger the first element in the real-time interaction interface, and the position of the first element when triggered may be taken as the position information. The speed magnitude at which the first element is released may be determined from the trigger duration, the sliding distance, and the like when the user releases it, and the speed direction may be determined from the sliding direction when the user releases it. The target running track of the released first element can then be determined from its release position information and speed information.
It should be noted that, if the speed information of the released first element includes only a speed magnitude, the speed direction may be fixed, for example straight ahead; if it includes only a speed direction, a preset speed magnitude may be used to form the target running track.
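As a minimal sketch (not part of the patent), the target running track determined from release position and speed can be modeled as a parabola under constant downward acceleration. The gravity constant and parameter names are assumptions for illustration.

```python
G = 9.8  # downward acceleration, arbitrary screen-space units (assumed)

def position_at(t, x0, y0, vx, vy):
    """Point on the target running track t seconds after release, given the
    release position (x0, y0) and release velocity (vx, vy)."""
    x = x0 + vx * t
    y = y0 + vy * t - 0.5 * G * t * t
    return x, y
```

Sampling this function at successive times yields the points used to render the parabola animation mentioned later in the description.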
And secondly, determining the resource interaction attribute characteristics in the interaction process based on the target running track and the position information of the second element.
The resource interaction attribute feature may be a numerical value used to measure the degree of success of the current resource interaction, for example a score.
Specifically, the relative position of the target running track and the second element can be determined based on the track and the position information of the second element, and the resource interaction attribute feature in the resource interaction process can then be determined according to this relative position.
In order to more accurately determine the resource interaction attribute characteristics in the resource interaction process and enhance the interest and fairness of the interaction, the resource interaction attribute characteristics of each user can be determined in the following ways:
it should be noted that the same manner may be used for determining the resource interaction attribute characteristics for each user, and therefore, one of the users is taken as an example for description.
If it is determined, based on the target running track and the position information of the second element, that the first element falls into the second element, the resource interaction attribute feature is accumulated for the current user.
Here, the current user is the user corresponding to the current real-time interaction interface, i.e., the user performing resource interaction through it.
Specifically, the first element moves along the target running track, and when the first element falls into the second element, the success of the resource interaction through the first element at this time can be determined, so that the resource interaction attribute characteristics can be accumulated for the current user.
Illustratively, the target resource is a game resource, specifically a basketball-game resource; the first element is a basketball element, the second element is a basket element, and the resource interaction attribute feature corresponds to the shooting hit score. When it is detected that the basketball element is triggered in the current real-time interaction interface, the target running track of the basketball element is determined based on the position information and speed information of the released basketball element. When the target running track indicates that the basketball element falls into the basket element, the shooting hit score is increased; otherwise, the score is unchanged. It should be noted that the basketball element can be selected by clicking it, only one basketball element can be selected at a time, and while the user keeps holding it the basketball element moves with the finger touch point. Selecting a basketball element is treated as taking a shot; after the basketball element is released, the horizontal point at which it reaches the basket element can be determined from the vector of its last displacement, so as to calculate and display the parabola animation. If there is no displacement while holding, the basketball element is shot straight upwards. It should also be noted that the basketball element, the basket element, or other boundary elements may be given collision and physical properties, so that the basketball element can bounce off the rim or pass into the basket, improving the realism of the interaction.
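As a hedged sketch of the scoring rule described above (not the patent's actual implementation), a hit can be detected by checking whether the sampled trajectory crosses the basket's height while descending near the basket's horizontal position; the hit score then accumulates. The geometry test and all names are simplifying assumptions.

```python
def crosses_basket(track, basket_x, basket_y, basket_half_width):
    """track: list of (x, y) samples along the target running track.
    True if a descending segment passes the basket height within its opening."""
    for (x1, y1), (x2, y2) in zip(track, track[1:]):
        descending = y2 < y1
        if descending and min(y1, y2) <= basket_y <= max(y1, y2):
            if abs(x2 - basket_x) <= basket_half_width:
                return True
    return False

def update_score(score, track, basket_x, basket_y, half_width=1.0):
    """Accumulate the shooting hit score for the current user on a hit."""
    return score + 1 if crosses_basket(track, basket_x, basket_y, half_width) else score
```

A physics-enabled implementation (with rim collisions, as the description suggests) would replace this point-in-opening test with the engine's collision callbacks.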
In order to increase the difficulty of resource interaction and improve its interest, the resource interaction may be realized in the following manner:
When it is detected that the interaction duration corresponding to the target resource reaches a first duration threshold, the second element is controlled to move according to a preset motion track, and when it is detected that the first element is triggered, the step of determining the resource interaction attribute feature in the resource interaction process is repeatedly executed.
The interaction duration may be the duration of the resource interaction. The first duration threshold may be the duration for which the position of the second element remains fixed; it may be determined according to the current resource interaction condition or preset in advance, and the specific value is not limited in the embodiment of the present disclosure.
Specifically, when it is detected that the interaction duration corresponding to the target resource reaches the first duration threshold, the difficulty of resource interaction may be increased by making the second element move, for example by controlling the second element to move according to a preset motion track. When it is detected that the first element is triggered, the step of determining the resource interaction attribute feature in the resource interaction process is repeatedly executed to complete the statistics of the resource interaction.
Illustratively, taking the first element as a basketball element and the second element as a basket element, the basket element enters a side-to-side swinging state when the interaction duration corresponding to the target resource reaches 30 seconds (the first duration threshold), and continues until the resource interaction is finished.
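The side-to-side swinging of the basket element after the first duration threshold can be sketched with a sinusoidal preset track. The threshold, amplitude, and period values here are hypothetical; the disclosure only requires some preset motion track.

```python
import math

# Sketch: after the interaction duration passes the first duration
# threshold, the basket element swings side to side (hypothetical
# sinusoidal preset track; constants are illustrative only).
FIRST_DURATION_THRESHOLD = 30.0  # seconds, per the example above
SWING_AMPLITUDE = 50.0           # horizontal extent in pixels
SWING_PERIOD = 2.0               # seconds per full oscillation


def basket_x(t, center_x):
    """Basket stays fixed until the threshold, then oscillates about center_x."""
    if t < FIRST_DURATION_THRESHOLD:
        return center_x
    phase = 2 * math.pi * (t - FIRST_DURATION_THRESHOLD) / SWING_PERIOD
    return center_x + SWING_AMPLITUDE * math.sin(phase)
```

Evaluating `basket_x` each rendered frame keeps the basket fixed for the first 30 seconds and swinging until the interaction ends.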
On the basis of the above-mentioned disclosed embodiment, the resource interaction process of other users can be displayed in the resource interaction process of the current user, which specifically may be:
and displaying the resource interaction process corresponding to other users according to the corresponding transparency parameter.
The transparency parameter may be a preset parameter or a parameter adjusted according to actual conditions, and is used for transparency adjustment, for example: the smaller the transparency parameter, the more opaque. The other users may be users other than the user corresponding to the current real-time interactive interface.
Specifically, in the process of interaction of the current user, the resource interaction processes corresponding to other users can be displayed through the transparency parameter in the current real-time interaction interface, so that the interaction participation of the user is increased, and the interaction experience of the current user is not influenced.
In order to display the resource interaction process corresponding to other users in the real-time interaction interface of the current user, the resource interaction data streams of other users can be pulled, the corresponding transparency parameter is determined according to the resource interaction attribute characteristics of the user, and the resource interaction process of the corresponding user is rendered on the current real-time interaction interface.
Wherein the other users include all users or users triggering selection. The resource interaction data stream may be a data stream for representing a resource interaction process, and the resource interaction data stream is generated according to an operation of a user, so that the resource interaction data streams of the other users are generally different.
Specifically, all or part of the resource interaction data streams of other users can be pulled, and the resource interaction attribute characteristics of the other users can be obtained, so that different transparency parameters can be determined according to different resource interaction attribute characteristics, and the resource interaction process of the users is rendered to the current real-time interaction interface. Optionally, the higher the value corresponding to the resource interaction attribute feature is, the larger the transparency parameter is.
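The mapping from attribute feature to transparency parameter can be sketched as a linear scaling, following the optional rule above that a higher attribute value yields a larger transparency parameter. The range bounds and function name are hypothetical.

```python
# Sketch: map each user's resource interaction attribute feature (score)
# to a transparency parameter, following the stated optional rule that a
# higher value corresponds to a larger transparency parameter
# (hypothetical linear mapping; the bounds are illustrative).
def transparency_for(score, max_score, min_alpha=0.2, max_alpha=0.8):
    if max_score <= 0:
        return min_alpha
    ratio = min(score / max_score, 1.0)
    return min_alpha + (max_alpha - min_alpha) * ratio
```

The returned value would then drive the alpha channel value used when rendering that user's resource interaction process onto the current real-time interaction interface.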
Optionally, the transparency parameter is obtained by rendering based on a preset alpha channel value, and different transparency parameters correspond to different transparency masks.
Optionally, the alpha channel value may be set to a default value, and when a change in the transparency parameter is detected, the alpha channel value is adjusted for rendering. Alternatively, the alpha channel value may be left unset in advance; when a transparency parameter is detected, the alpha channel value is set so that rendering is performed based on the alpha channel value during the rendering process.
Optionally, the current user may trigger, in a selected manner, another user who wants to display the resource interaction process in the current real-time interaction interface, so that the current user can actively select the display content in the current real-time interaction interface, which may specifically be implemented in the following manner:
and pulling the resource interaction data stream of at least one other user triggered by the current user.
Specifically, before the resource interaction starts, the basic identifier of each user currently participating in the resource interaction can be provided to the current user, so that the current user can select the users they want to watch during the resource interaction. Alternatively, during the resource interaction, the current user can trigger the basic identifiers of one or more other users in the current real-time interaction interface by selection. Further, during the resource interaction, the triggered resource interaction data streams of other users are pulled for semi-transparent display.
It should be noted that, if the current user does not trigger the basic identifier of any other user, the resource interaction data stream of the other user may not be pulled, that is, only the resource interaction of the current user is displayed in the current real-time interaction interface. The use of network bandwidth can be reduced by means of current user triggering.
Optionally, in order to reduce the pressure of the processor and the network occupancy rate, when the target resource is a game resource, the resource interaction process corresponding to other users may be displayed according to the corresponding transparency parameter in the following manner:
periodically acquiring element position information of a first element of other users;
and determining the interaction process of other corresponding users based on the interpolation processing of the element position information, and displaying according to the corresponding transparency parameters.
Specifically, the element position information of the first element of the other users in the resource interaction process may be periodically obtained according to a preset period. Furthermore, resource interaction can be displayed in an interpolation processing mode based on the position information of two adjacent elements, and the consistency of the first resource element displayed in the current real-time interaction interface based on the transparency parameter is improved.
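The interpolation step above can be sketched as linear interpolation between two consecutive periodic samples, so the semi-transparent rendering of another user's first element stays smooth between network updates. The helper names are hypothetical.

```python
# Sketch: positions of another user's first element arrive periodically;
# frames between two samples are filled by linear interpolation so the
# semi-transparent display stays smooth (hypothetical helpers).
def lerp_position(p0, p1, alpha):
    """Interpolate between two sampled element positions, 0 <= alpha <= 1."""
    return tuple(a + (b - a) * alpha for a, b in zip(p0, p1))


def frame_positions(p0, p1, frames):
    """Positions for `frames` rendered frames between two network samples."""
    return [lerp_position(p0, p1, i / frames) for i in range(1, frames + 1)]
```

For example, with a 200 ms sampling period and 60 fps rendering, roughly 12 interpolated frames are displayed between each pair of received samples.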
It should be noted that, if the first element is a basketball element and the second element is a basket element, a real-world basketball shooting machine can be physically simulated in 3D in the above manner.
On the basis of the above disclosed embodiment, if the target resource is an audiovisual resource, the method may include:
and displaying the triggering operation of other users on the audio-visual resources and the audio-visual resources corresponding to the triggering operation in the real-time interactive interface according to the corresponding transparency parameters.
Where an audiovisual resource may be an audio or visual resource, for example: music, movies, books, etc.
Exemplarily, if the audiovisual resource is a book or the like, the current pages of the book of other users can be displayed in the real-time interactive interface according to the corresponding transparency parameter; if the audio-visual resource is music or a film, the progress information of other users on the audio-visual resource can be displayed in the real-time interactive interface according to the corresponding transparency parameter.
It is understood that if the target resource is an audiovisual resource, the same processing as for the game resource can be performed to implement book interface sharing; for example, when a user underlines a certain position of a shared book, the underline can be displayed on the book interfaces of other users.
Optionally, in the process of resource interaction based on the target resource, the interactive tension and interactivity can be improved by displaying the target resource interaction attribute characteristics of other users, which specifically may be:
acquiring target resource interaction attribute characteristics of other users; and determining attribute display information in the current real-time interactive interface based on at least one target resource interaction attribute characteristic and the current resource interaction attribute characteristic of the current user.
And the current real-time interactive interface corresponds to the current user. The target resource interaction attribute feature may be a resource interaction attribute feature corresponding to each of the other users. The current resource interaction attribute feature may be a resource interaction attribute feature corresponding to the current user. The attribute display information may be resource interaction attribute characteristics of each user displayed in the current real-time interaction interface, and may include target resource interaction attribute characteristics of one or more other users and current resource interaction attribute characteristics of the current user.
Specifically, the target resource interaction attribute features of other users may be obtained in real time or periodically, and the target resource interaction attribute features to be displayed subsequently are determined, for example: all target resource interaction attribute features, or a preset number of top-ranked target resource interaction attribute features. The determined target resource interaction attribute features are then integrated with the current resource interaction attribute feature to obtain the attribute display information displayed in the current real-time interaction interface.
Illustratively, the attribute display information may be displayed according to changes in the real-time resource interaction attribute features, or may be displayed in descending order as a ranking list; if two resource interaction attribute features are equal, the one belonging to the user who reached that value first is displayed first.
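The ranking rule above (descending by attribute feature, ties broken by whoever reached the value first) can be sketched with a single sort key. The entry layout is hypothetical.

```python
# Sketch: build the attribute display information as a descending
# leaderboard; equal scores are broken by who reached the score first,
# i.e. the earlier timestamp ranks higher (entry layout is hypothetical).
def leaderboard(entries):
    """entries: list of (user_id, score, reached_at_timestamp) tuples."""
    return sorted(entries, key=lambda e: (-e[1], e[2]))
```

Because Python's sort is stable and the key encodes both criteria, a single call yields the display order for the current real-time interaction interface.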
When the target resource is a game resource, in order to improve the sense of participation and resource interaction of users whose resource interaction attribute features lag behind, the process of resource interaction based on the target resource may include the following steps:
and when the interaction time reaches a second preset time threshold, determining the target user according to the resource interaction attribute characteristics of at least one user.
The second preset duration threshold may be a preset duration for determining the user whose resource interaction attribute feature lags behind, and may be determined according to the total duration of the resource interaction, or set according to actual requirements; it is not specifically limited in this embodiment. The target user may be the user with the lowest resource interaction attribute feature.
Specifically, when it is detected that the interaction duration reaches a second preset duration threshold, the resource interaction attribute characteristics of each user are obtained, and the user with the lowest resource interaction attribute characteristics is taken as the target user.
It should be noted that, if there are at least two users whose resource interaction attribute features are the lowest, all the users whose resource interaction attribute features are the lowest are taken as target users.
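The target-user selection above, including the tie case, can be sketched as follows (hypothetical data layout):

```python
# Sketch: when the interaction duration reaches the second preset
# threshold, every user holding the lowest resource interaction
# attribute feature becomes a target user, ties included
# (hypothetical data layout).
def target_users(scores):
    """scores: dict mapping user_id to its resource interaction attribute feature."""
    if not scores:
        return []
    lowest = min(scores.values())
    return [user for user, score in scores.items() if score == lowest]
```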
And issuing the attribute superposition special effect for the target user, and displaying in a real-time interactive interface corresponding to the target user.
The attribute superposition special effect may be a special effect provided for the target user to rapidly improve the resource interaction attribute feature, and may be a dynamic special effect or a static special effect. A static special effect may be one whose absolute position in the real-time interactive interface does not change, for example, a question mark bubble special effect at a fixed position. A dynamic special effect may be one whose absolute position in the real-time interactive interface changes, for example, a question mark bubble special effect that reciprocates within a preset region.
Specifically, the attribute superposition special effect is issued to the target user, and the attribute superposition special effect is displayed in a real-time interaction interface of the target user, so that the target user can rapidly improve the resource interaction attribute characteristics by triggering the attribute superposition special effect.
It should be noted that, if the attribute superposition special effect is a static special effect, the difficulty of obtaining attribute superposition is reduced; if the attribute superposition special effect is a dynamic special effect, the interactivity and the interestingness of attribute addition are enhanced.
In order to enable the target user to determine more clearly whether the attribute superposition special effect is triggered, the attribute conversion may be performed on the second element, and in order to improve the fairness of the interaction, the use of the attribute superposition special effect should have a certain number or time limit, which specifically may be:
and when detecting that the first element triggers the attribute superposition special effect, configuring superposition attributes for at least one first element to obtain a first superposition element.
The superposition attribute may be a special effect attribute added to the first element, that is, an addition configured for the resource interaction attribute feature, for example: the addition may be 2 times or 3 times the resource interaction attribute feature corresponding to the first element. The first superposition element may be an element that adds a special effect on the basis of the first element, for example: if the first element is a basketball element, the first superposition element may be a burning basketball element, a luminous basketball element, and the like.
Specifically, when the target motion trajectory of the first element passes through the attribute superposition special effect, the first element is considered to trigger the attribute superposition special effect, at this time, the superposition attribute may be configured for one or more first elements, and the first element configured with the superposition attribute is determined as the first superposition element.
It should be noted that the number of the first overlay elements may be a preset number, for example: 4, 5, etc., or the number of first elements appearing in the refresh within a preset time.
If the target running track of a first superposition element coincides with the position information of the second element, the resource interaction attribute feature of the target user is superposed, and this continues until the last first superposition element has been used.
Illustratively, taking the target resource as a game resource, specifically a basketball game resource, the first element is a basketball element, the second element is a basket element, and the resource interaction attribute feature corresponds to a shooting hit score. For example, as shown in fig. 2, in the real-time interaction interface with the attribute superposition special effect shown in fig. 2, the basic identifier A of the current user and the shooting hit score of the current user may be displayed, and the basic identifiers B, C and D of other users and the shooting hit score of each other user may also be displayed. The user initially has 4 basketball elements, and each time a basketball element is dropped into the basket element, the shooting hit score is incremented by 1. Every second preset duration threshold, such as 15 seconds, the target user with the lowest shooting hit score is provided with an attribute superposition special effect, for example: a question mark bubble is refreshed near the basket element of the real-time interactive interface of the target user, and the question mark bubble may float randomly. When the target user hits the attribute superposition special effect by launching a basketball element, the basketball element is upgraded to a fireball element, and when the fireball element is thrown into the basket element, the shooting hit score is incremented by 2. Optionally, when the target user hits the attribute superposition special effect by launching the basketball element, that is, when a shot hits the question mark bubble, the bubble may be shown exploding and scattering 4 groups of flames onto the basketball elements, at which point the 4 basketball elements become fireball elements.
When a fireball element is thrown into the basket element, a goal special effect corresponding to the fireball element (different from the goal special effect of the basketball element, and capable of displaying a "+2" character) may be played. After each fireball element is thrown, it changes back into a basketball element whether or not it goes in, that is, the attribute superposition special effect disappears after the fireball element lands. If multiple users are tied for the lowest shooting hit score at the moment the target user is determined, all of those users obtain the attribute superposition special effect.
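The scoring walk-through above can be condensed into a small state transition: a normal basketball element scores +1 on a hit, a fireball (superposition) element scores +2, and every fireball reverts to a normal basketball after being thrown, whether or not it goes in. The state model and names are hypothetical.

```python
# Sketch of the scoring example: basketball hit = +1, fireball hit = +2,
# and the superposition attribute disappears once the element is thrown,
# regardless of the outcome (hypothetical state model).
def resolve_throw(element, score, hit):
    """Return (next_element_kind, new_score) after one throw."""
    if hit:
        score += 2 if element == "fireball" else 1
    # the superposition special effect disappears after the element lands
    return "basketball", score
```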
And S130, when the condition that the interaction pause or termination of the resource is met is detected, displaying an interaction pause or termination picture corresponding to the target resource.
The resource interaction pause or termination condition may be a condition for stopping the interaction. The interaction pause or termination picture may be a picture displayed when the resource interaction is paused or terminated, and may display the resource interaction attribute feature of the current user, or the resource interaction attribute features of all the users, when the resource interaction pause or termination condition is met. The interaction pause condition may be a condition for a temporary pause, for example, a pause control manually triggered by a user during the interaction to show the corresponding picture; the system may also pause when a highlight is detected, or during the last few seconds of the countdown. The interaction termination condition may be a condition for ending the interaction. Correspondingly, the content of the interaction pause picture and that of the interaction termination picture may be the same or different, as set according to actual requirements. For example, the interaction termination condition may be that the interaction duration reaches the termination duration, and the picture displayed at that moment is used as the interaction termination picture.
Specifically, when it is detected that a resource interaction suspension or termination condition is satisfied, it is determined to stop or suspend the resource interaction. At this time, when stopping or pausing the resource interaction, the resource interaction attribute feature of the user can be obtained, and the resource interaction attribute feature, namely the interaction pause or termination picture, is displayed in the real-time interaction interface of the user.
To more accurately determine when to stop resource interaction, the resource interaction suspension or termination condition includes at least one of:
firstly, the interactive accumulated time length reaches the interactive pause or termination time length.
The accumulated interaction duration may be the duration of the resource interaction. The interaction pause or termination duration may be a preset duration of resource interaction, for example: 30s, 60s, and the like; the specific duration may be set according to actual requirements and is not limited in this embodiment. For example, the pause duration may be 30s and the termination duration 60s.
And secondly, triggering an interaction pause or termination control.
The interaction pause or termination control may be a control for pausing or terminating the interaction, and may be a physical control or a virtual control, such as a "stop game" button. Alternatively, if any area in the display interface is triggered, the pause control may be considered triggered.
It should be noted that the resource interaction may be stopped or paused when it is detected that any user triggers the interaction pause or termination control.
Thirdly, the voice information of the user triggers an interaction pause or termination wake-up word.
The interaction pause or termination wake-up word may be a preset word for stopping multi-user resource interaction, for example: words such as "stop game", "stop interaction", "pause game", etc.
Specifically, the user may send the voice message in the real-time interactive interface, and when the voice message includes an interactive pause or termination wake-up word, the resource interactive pause or termination condition may be considered to be reached.
Illustratively, if the interaction pause or termination wake-up word is "stop the game" and the user's voice message is "let us stop the game", the user's current voice message can be considered to trigger the interaction pause or termination wake-up word.
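A minimal sketch of the wake-up-word check: once the voice information is recognized as text, a substring match against the preset word list suffices, so "let us stop the game" triggers the wake-up word "stop the game". The word list and helper name are hypothetical.

```python
# Sketch: check recognized speech text against the preset pause or
# termination wake-up words via substring matching (hypothetical
# word list; speech recognition itself is out of scope here).
WAKE_WORDS = ("stop game", "stop the game", "stop interaction", "pause game")


def triggers_wake_word(speech_text):
    text = speech_text.lower()
    return any(word in text for word in WAKE_WORDS)
```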
Optionally, in order to improve the display effect of the resource interaction attribute feature and the interactivity of multi-person interaction, when the resource interaction attribute feature corresponding to the user is displayed, the method may include:
acquiring resource interaction attribute characteristics corresponding to a user when the resource interaction is suspended or terminated; and determining an interactive pause or termination picture according to the descending information of the resource interactive attribute characteristics.
The descending information may be information of descending order of the interactive attribute characteristics of each resource.
Specifically, when the resource interaction suspension or termination condition is met, the interaction suspension or termination is determined, and at this time, the resource interaction attribute characteristics corresponding to each user can be acquired. And then, according to the resource interaction attribute characteristics, performing descending order arrangement to obtain descending information, and displaying the resource interaction attribute characteristics in descending order arrangement and the users corresponding to the resource interaction attribute characteristics on the real-time interaction interface corresponding to the users.
According to the technical scheme of this embodiment, the target resource is determined when a target operation is detected while the user interacts through the real-time interaction interface, and the target resource is displayed in the real-time interaction interface corresponding to the user so that interaction is performed based on the target resource, achieving multi-user interaction. When it is detected that the resource interaction pause or termination condition is met, the interaction pause or termination picture corresponding to the target resource is displayed. This solves the problems of low interaction richness, poor interactivity, and poor user experience in multi-user sessions, realizes rich interaction functions, improves the interactivity and interest among users during the interaction, and further improves the technical effect of user experience.
The above-mentioned disclosed embodiment is a case where each user performs resource interaction in the belonging real-time interaction interface, that is, each user does not interfere with each other when performing resource interaction. Therefore, on the basis of the disclosed embodiment, if the target resource is a game resource, the game resource comprises a first element and a second element, and the interactivity and the interest of the multi-player interaction can be enhanced through the following steps:
step one, when a control triggering multiple persons to interact in the same real-time interaction interface is detected, determining participating target interaction users.
The target interactive user may be a user participating in the interaction among multiple people in the same real-time interactive interface. Illustratively, a dialog box "Participate in the interaction in the same real-time interactive interface?" may be displayed with "yes" and "no" options, and the users who trigger the "yes" option are taken as the target interactive users.
Specifically, when a control triggering multiple persons to interact in the same real-time interaction interface is detected, multiple users triggering the control are used as target interaction users.
And step two, determining a hit interactive user hitting the second element according to the target running track of the first element corresponding to the target interactive user.
The target interactive users comprise hit interactive users. The hit interactive user may be a target interactive user corresponding to the first element of the hit second element.
Specifically, when each target interaction user performs resource interaction, the target running track of the first element corresponding to each target interaction user is determined, and the target interaction user corresponding to the first element which hits the second element is used as the hit interaction user.
It should be noted that the first elements of different target interaction users can change the target moving track when colliding, so as to increase the difficulty and interactivity of the interaction. If there are multiple target interactive users' first elements that hit the second element, there may be more than one hit interactive user.
And step three, accumulating the resource interaction attribute characteristics of the hit interaction users to update the resource interaction attribute characteristics.
Specifically, when it is determined that the interactive user is hit, the resource interaction attribute features of the hit interactive user may be accumulated, the next hit interactive user may be determined, the resource interaction attribute features may be gradually accumulated until the resource interaction pause or termination condition is satisfied, and when the resource interaction pause or termination condition is satisfied, the current resource interaction attribute feature of each target interactive user may be determined.
On the basis of the above-mentioned disclosed embodiment, in the process of resource interaction, there is a situation that another user joins in the current session, and the user can be used as an observer role, and the current resource interaction is displayed on the real-time interaction interface of the user, so that the user can watch the resource interaction of each user and participate in the interaction of the current session in time, specifically, the method includes:
and when a newly added user to be interacted is detected, displaying the resource interaction process of each user in the real-time interaction interface of the user to be interacted.
The user to be interacted may be a user who watches the resource interaction of each user.
Specifically, the user to be interacted can enter the resource interaction in a role of a spectator in the resource interaction process, and the current resource interaction is watched by displaying the resource interaction process of each user in the real-time interaction interface of the user to be interacted.
It should be noted that the user to be interacted can select any user as the main viewing angle, and the real-time interaction interface of the main viewing angle is displayed in the real-time interaction page of the user to be interacted.
On the basis of the above-mentioned disclosed embodiment, optionally, when the trigger resource release control is detected, the target resource is released, and the real-time interactive interface is displayed.
The resource release control may be a control for releasing the target resource, and may be a physical control or a virtual control, for example, an "end interaction" button.
Specifically, if the user wants to quit the current resource interaction, the resource release control can be triggered. When the triggering of the resource release control is detected, the target resource is released and the real-time interactive interface is displayed so that the user can continue the subsequent session. In this way, the cached target resource can be cleared in a timely manner and the storage pressure of the memory is reduced, so that the method of the disclosed embodiment is suitable for terminal devices with various processing capacities.
On the basis of the above-described embodiment, optionally, when the control triggering the resource interaction again is detected, the target resource is kept from being released, and the operation of performing the resource interaction based on the target resource is repeatedly performed.
The control for performing the resource interaction again may be a control for repeatedly performing the same resource interaction, and may be a physical control or a virtual control, such as a "next" button.
Specifically, if the user wants to perform the current resource interaction again, the control for performing the resource interaction again may be triggered. When the control triggering the resource interaction again is detected, the target resource is kept not to be released, and the operation of performing the resource interaction based on the target resource is repeatedly executed, so that the user can conveniently perform the resource interaction again, the problems of overlarge processor pressure and overlong user waiting time caused by frequent release and loading of the target resource are avoided, and the real-time performance of the interaction is improved.
For example, the real-time interaction interface at the end of multi-user interaction shown in fig. 3 may include a user and corresponding resource interaction attribute features, may include resource release control "exit game" and "change game" buttons, and may further include a control "come round" button for performing resource interaction again.
Fig. 4 is a schematic structural diagram of a multi-person interaction device according to an embodiment of the disclosure. As shown in fig. 4, the device includes: a target resource determination module 210, a target resource display module 220, and a picture display module 230.
The target resource determination module 210 is configured to determine a target resource when a target operation is detected while users interact based on a real-time interaction interface, where the target resource is a resource for the at least one user to interact with. The target resource display module 220 is configured to display the target resource in the real-time interaction interface corresponding to a user, so that the user performs resource interaction based on the target resource. The picture display module 230 is configured to display an interaction pause or termination picture corresponding to the target resource when it is detected that a resource interaction pause or termination condition is met.
Optionally, the target operation includes at least one of: triggering a target interaction control; the voice information of the user triggers an interactive wake-up word.
Optionally, the target resource determining module 210 is further configured to display a resource list including at least one resource to be selected in the real-time interactive interface of the user; and determining the target resource based on the triggering operation of the at least one resource to be selected.
Optionally, the target resource determining module 210 is further configured to determine the target resource according to a trigger timestamp of the user on the at least one resource to be selected; and/or determining the target resource according to the triggered frequency of at least one resource to be selected in a preset selection duration; and/or taking the resource to be selected triggered by the behavior user as the target resource; wherein the behavior user corresponds to a user who triggers the target operation.
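The three alternative selection rules above (the behavior user's own trigger, the trigger frequency within the preset selection duration, and the trigger timestamp) could be combined in a selector along the following lines. The function name, input format, and tie-breaking order are illustrative assumptions, not specified by the disclosure.

```python
def pick_target_resource(triggers, behavior_user=None):
    """Pick a target resource from trigger events within the selection window.

    triggers: list of (user, resource, timestamp) tuples.
    If the behavior user (who triggered the target operation) made a choice,
    that choice wins; otherwise the most frequently triggered resource wins,
    with earlier first-trigger timestamps breaking ties (an assumption).
    """
    if behavior_user is not None:
        for user, resource, _ in triggers:
            if user == behavior_user:
                return resource
    counts, earliest = {}, {}
    for _, resource, ts in triggers:
        counts[resource] = counts.get(resource, 0) + 1
        earliest[resource] = min(earliest.get(resource, ts), ts)
    # higher count wins; earlier timestamp wins among equal counts
    return max(counts, key=lambda r: (counts[r], -earliest[r]))


triggers = [("a", "quiz", 1), ("b", "ball", 2), ("c", "ball", 3)]
assert pick_target_resource(triggers) == "ball"                     # most triggered
assert pick_target_resource(triggers, behavior_user="a") == "quiz"  # behavior user's pick
```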
Optionally, the apparatus further comprises: and the target resource loading module is used for loading the target resource so as to issue the target resource to the real-time interaction interface corresponding to the user.
Optionally, the target resource is a game resource, where the game resource includes at least one first element and one second element, and the target resource display module 220 is further configured to display a basic identifier of a user in a first area of the real-time interactive interface, and display the first element and the second element in the target resource in a second area of the real-time interactive interface.
Optionally, the target resource display module 220 is further configured to, when it is detected that the first element is triggered in the current real-time interactive interface, determine a target running track of the first element based on the position information and the speed information for releasing the first element, where the speed information includes a speed magnitude and/or a speed direction; and determine the resource interaction attribute characteristics in the interaction process based on the target running track and the position information of the second element.
Optionally, the target resource display module 220 is further configured to accumulate the resource interaction attribute feature for the current user if it is determined that the first element falls into the second element based on the target running track and the position information of the second element; and the current user corresponds to the current real-time interactive interface.
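A minimal sketch of the trajectory determination and hit check described by these two modules, assuming simple 2-D projectile motion computed from the release position and the speed magnitude/direction. The gravity constant, sampling step, and hit radius are illustrative choices, not values from the disclosure.

```python
import math

def trajectory_point(x0, y0, speed, angle_deg, t, g=9.8):
    """Position of the released first element at time t (simple projectile model)."""
    vx = speed * math.cos(math.radians(angle_deg))
    vy = speed * math.sin(math.radians(angle_deg))
    return x0 + vx * t, y0 + vy * t - 0.5 * g * t * t

def hits_target(x0, y0, speed, angle_deg, target_x, target_y, radius, steps=200):
    """Sample the target running track and report whether the first element
    passes within `radius` of the second element's position."""
    for i in range(steps):
        t = i * 0.02  # illustrative sampling step
        x, y = trajectory_point(x0, y0, speed, angle_deg, t)
        if math.hypot(x - target_x, y - target_y) <= radius:
            return True
    return False


# A 45-degree release at speed 10 passes near (5, 2.55); a basket at height 10 is missed.
assert hits_target(0, 0, 10, 45, 5, 2.55, 0.5)
assert not hits_target(0, 0, 10, 45, 5, 10, 0.5)
```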
Optionally, the apparatus further comprises: and the second element motion module is used for controlling the second element to move according to a preset motion track when detecting that the interaction duration corresponding to the target resource reaches a first duration threshold value, and repeatedly executing the step of determining the resource interaction attribute characteristics in the resource interaction process when detecting that the first element is triggered.
Optionally, the apparatus further comprises: and the other resource interaction display module is used for displaying the resource interaction process corresponding to the other users according to the corresponding transparency parameter.
Optionally, the other resource interaction display module is further configured to pull resource interaction data streams of other users; wherein the other users comprise all users or users triggering selection; and determining a corresponding transparency parameter according to the resource interaction attribute characteristics of the user, and rendering the resource interaction process of the corresponding user to the current real-time interaction interface.
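One plausible way to derive the transparency parameter from each user's resource interaction attribute feature is a linear score-to-opacity mapping, so that higher-scoring users render more opaquely. The mapping, its bounds, and the function name are assumptions for illustration only.

```python
def transparency_params(scores, alpha_min=0.2, alpha_max=0.8):
    """Map each user's attribute feature (e.g. score) to an opacity in
    [alpha_min, alpha_max]; the linear form is an illustrative assumption."""
    lo, hi = min(scores.values()), max(scores.values())
    span = (hi - lo) or 1  # avoid division by zero when all scores are equal
    return {u: alpha_min + (s - lo) / span * (alpha_max - alpha_min)
            for u, s in scores.items()}


alphas = transparency_params({"a": 0, "b": 10})
assert abs(alphas["a"] - 0.2) < 1e-9  # lowest score: most transparent
assert abs(alphas["b"] - 0.8) < 1e-9  # highest score: most opaque
```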
Optionally, the other resource interaction display module is further configured to pull a resource interaction data stream of at least one other user triggered by the current user.
Optionally, the other resource interaction display module is further configured to periodically acquire element position information of the first element of other users; and determine the interaction process of the corresponding other users based on interpolation processing of the element position information, and display it according to the corresponding transparency parameters.
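The interpolation processing of the periodically pulled element positions might look like the following linear interpolation between adjacent samples; the sample format and clamping behavior past the newest sample are assumptions.

```python
def interpolate_position(samples, t):
    """samples: list of (timestamp, (x, y)) periodically pulled for another
    user's first element; linearly interpolate its position at render time t."""
    samples = sorted(samples)
    for (t0, p0), (t1, p1) in zip(samples, samples[1:]):
        if t0 <= t <= t1:
            f = (t - t0) / (t1 - t0)
            return (p0[0] + f * (p1[0] - p0[0]),
                    p0[1] + f * (p1[1] - p0[1]))
    return samples[-1][1]  # clamp to the newest sample (assumption)


samples = [(0.0, (0.0, 0.0)), (1.0, (10.0, 20.0))]
assert interpolate_position(samples, 0.5) == (5.0, 10.0)
```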
Optionally, the apparatus further comprises: the attribute display information determining module is used for acquiring target resource interaction attribute characteristics of other users; determining attribute display information in the current real-time interactive interface based on at least one target resource interaction attribute feature and the current resource interaction attribute feature of the current user; and the current real-time interactive interface corresponds to the current user.
Optionally, the target resource is a game resource, and the apparatus further includes: the attribute superposition special effect display module is used for determining a target user according to the resource interaction attribute characteristics of at least one user when the interaction duration is detected to reach a second preset duration threshold; issuing an attribute superposition special effect for the target user, and displaying in a real-time interaction interface corresponding to the target user; wherein, the attribute superposition special effect is a dynamic special effect or a static special effect.
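The target-user determination for the attribute superposition special effect can be sketched as: once the interaction duration reaches the preset threshold, the user with the highest attribute feature is selected to receive the effect. The highest-score rule and tie-breaking are illustrative readings of the text, not guaranteed by it.

```python
def select_effect_target(scores, elapsed, threshold):
    """Pick the user to receive the attribute superposition special effect once
    the interaction duration reaches the threshold; highest score wins
    (an assumption). Returns None before the threshold is reached."""
    if elapsed < threshold:
        return None
    return max(scores, key=scores.get)


assert select_effect_target({"a": 3, "b": 7}, elapsed=60, threshold=30) == "b"
assert select_effect_target({"a": 3, "b": 7}, elapsed=10, threshold=30) is None
```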
Optionally, the apparatus further comprises: the first superposition element determining module is used for configuring superposition attributes for at least one first element when the first element triggering the attribute superposition special effect is detected, so as to obtain a first superposition element; and if the target running track of the first superposition element is coincident with the position information of the second element, superposing the resource interaction attribute characteristics of the target user until the first superposition element is the last first superposition element.
Optionally, the apparatus further comprises: the hit interactive user determining module is used for determining participating target interactive users when detecting a control triggering multiple persons to interact in the same real-time interactive interface; determining a hit interactive user hitting a second element according to a target running track of a first element corresponding to the target interactive user; the target interactive users comprise the hit interactive users; and accumulating the resource interaction attribute characteristics of the hit interaction users to update the resource interaction attribute characteristics.
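The shared-interface hit determination and score accumulation could be sketched as follows, with each participant's target running track given as sampled points; the names, the sample representation, and the one-point-per-hit rule are assumptions.

```python
def settle_round(trajectories, basket, radius, scores):
    """trajectories: {user: [(x, y), ...]} sampled target running tracks of the
    first elements in the shared interface. A user hits when any sample falls
    within `radius` of the basket (second element); hit users' interaction
    attribute features are accumulated by one point (an assumption)."""
    bx, by = basket
    hit_users = [u for u, pts in trajectories.items()
                 if any((x - bx) ** 2 + (y - by) ** 2 <= radius ** 2
                        for x, y in pts)]
    for u in hit_users:
        scores[u] = scores.get(u, 0) + 1
    return hit_users, scores


hits, scores = settle_round({"a": [(0, 0), (1, 1)], "b": [(5, 5)]},
                            basket=(1, 1), radius=0.5, scores={})
assert hits == ["a"] and scores == {"a": 1}
```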
Optionally, the resource interaction suspension or termination condition includes at least one of the following: the accumulated interaction duration reaches the interaction pause or termination duration; an interaction pause or termination control is triggered; the user's voice information triggers an interaction pause or termination wake-up word.
Optionally, the picture display module 230 is further configured to acquire the resource interaction attribute feature corresponding to the user when the resource interaction is paused or terminated; and determine the interaction pause or termination picture according to the descending-order information of the resource interaction attribute features.
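The descending-order information used to build the pause or termination picture is, in effect, a leaderboard sorted by attribute feature. A minimal sketch, with the function name as an assumption:

```python
def termination_ranking(scores):
    """Order users by resource interaction attribute feature, descending,
    for display in the interaction pause or termination picture."""
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)


assert termination_ranking({"a": 2, "b": 5, "c": 3}) == [("b", 5), ("c", 3), ("a", 2)]
```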
Optionally, the apparatus further comprises: and the resource interaction display module is used for displaying the resource interaction process of each user in the real-time interaction interface of the user to be interacted when detecting that the user to be interacted newly joins.
Optionally, the apparatus further comprises: and the target resource release module is used for releasing the target resource and displaying the real-time interaction interface when detecting that the resource release control is triggered.
Optionally, the apparatus further comprises: and the interaction repeating module is used for keeping the target resource not released when detecting the control triggering the resource interaction again, and repeatedly executing the operation of the resource interaction based on the target resource.
Optionally, the target resource is a basketball game resource, the first element is a basketball element, the second element is a basket element, and the resource interaction attribute feature corresponds to a shooting hit score.
Optionally, the target resource is an audiovisual resource, and the target resource display module 220 is further configured to display, in the real-time interaction interface and according to the corresponding transparency parameters, the audiovisual-resource trigger operations performed by other users and the audiovisual resources corresponding to those trigger operations.
According to the technical solution of this embodiment, the target resource is determined when a target operation is detected while users interact based on the real-time interaction interface, and the target resource is displayed in the real-time interaction interface corresponding to each user so that the users interact based on the target resource, thereby achieving multi-user interaction. When it is detected that the resource interaction pause or termination condition is met, the interaction pause or termination picture corresponding to the target resource is displayed. This solves the problems of low interaction richness, poor interactivity, and poor user experience in multi-user conversation, realizes rich interaction functions, improves the interactivity and interest among users during interaction, and further improves the user experience.
The multi-person interaction device provided by the embodiment of the disclosure can execute the multi-person interaction method provided by any embodiment of the disclosure, and has corresponding functional modules and beneficial effects of the execution method.
It should be noted that, the units and modules included in the apparatus are merely divided according to functional logic, but are not limited to the above division as long as the corresponding functions can be implemented; in addition, specific names of the functional units are only used for distinguishing one functional unit from another, and are not used for limiting the protection scope of the embodiments of the present disclosure.
Fig. 5 is a schematic structural diagram of an electronic device according to an embodiment of the disclosure. Referring to fig. 5, a schematic diagram of an electronic device 500 (e.g., a terminal device or a server) suitable for implementing embodiments of the present disclosure is shown. The terminal device in the embodiments of the present disclosure may include, but is not limited to, mobile terminals such as a mobile phone, a notebook computer, a digital broadcast receiver, a PDA (personal digital assistant), a PAD (tablet computer), a PMP (portable multimedia player), and a vehicle-mounted terminal (e.g., a car navigation terminal), and fixed terminals such as a digital TV and a desktop computer. The electronic device shown in fig. 5 is only an example and should not impose any limitation on the functions and scope of use of the embodiments of the present disclosure.
As shown in fig. 5, electronic device 500 may include a processing means (e.g., central processing unit, graphics processor, etc.) 501 that may perform various appropriate actions and processes in accordance with a program stored in a Read Only Memory (ROM) 502 or a program loaded from a storage means 508 into a Random Access Memory (RAM) 503. In the RAM 503, various programs and data necessary for the operation of the electronic device 500 are also stored. The processing device 501, the ROM 502, and the RAM 503 are connected to each other through a bus 504. An input/output (I/O) interface 505 is also connected to the bus 504.
Generally, the following devices may be connected to the I/O interface 505: input devices 506 including, for example, a touch screen, touch pad, keyboard, mouse, camera, microphone, accelerometer, gyroscope, etc.; output devices 507 including, for example, a Liquid Crystal Display (LCD), speakers, vibrators, and the like; storage devices 508 including, for example, magnetic tape, hard disk, etc.; and a communication device 509. The communication means 509 may allow the electronic device 500 to communicate with other devices wirelessly or by wire to exchange data. While fig. 5 illustrates an electronic device 500 having various means, it is to be understood that not all illustrated means are required to be implemented or provided. More or fewer devices may alternatively be implemented or provided.
In particular, according to an embodiment of the present disclosure, the processes described above with reference to the flowcharts may be implemented as computer software programs. For example, embodiments of the present disclosure include a computer program product comprising a computer program carried on a non-transitory computer readable medium, the computer program containing program code for performing the method illustrated by the flow chart. In such an embodiment, the computer program may be downloaded and installed from a network via the communication means 509, or installed from the storage means 508, or installed from the ROM 502. The computer program performs the above-described functions defined in the methods of the embodiments of the present disclosure when executed by the processing device 501.
The names of messages or information exchanged between devices in the embodiments of the present disclosure are for illustrative purposes only, and are not intended to limit the scope of the messages or information.
The electronic device provided by this embodiment of the disclosure and the multi-person interaction method provided by the above embodiments belong to the same inventive concept; for technical details not described in detail in this embodiment, reference may be made to the above embodiments, and this embodiment has the same beneficial effects as the above embodiments.
The disclosed embodiments provide a computer storage medium having a computer program stored thereon, which when executed by a processor implements the multi-person interaction method provided by the above embodiments.
It should be noted that the computer readable medium in the present disclosure can be a computer readable signal medium or a computer readable storage medium or any combination of the two. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples of the computer readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the present disclosure, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. In contrast, in the present disclosure, a computer readable signal medium may comprise a propagated data signal with computer readable program code embodied therein, either in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: electrical wires, optical cables, RF (radio frequency), etc., or any suitable combination of the foregoing.
In some embodiments, the clients and servers may communicate using any currently known or future developed network protocol, such as HTTP (HyperText Transfer Protocol), and may interconnect with digital data communication in any form or medium (e.g., a communication network). Examples of communication networks include a local area network ("LAN"), a wide area network ("WAN"), an internetwork (e.g., the Internet), and peer-to-peer networks (e.g., ad hoc peer-to-peer networks), as well as any currently known or future developed network.
The computer readable medium may be embodied in the electronic device; or may exist separately without being assembled into the electronic device.
The computer readable medium carries one or more programs which, when executed by the electronic device, cause the electronic device to:
determining target resources when target operation is detected in the interaction process of a user based on a real-time interaction interface; the target resource is a resource for the interaction of the at least one user;
displaying target resources in a real-time interaction interface corresponding to a user so that the user can perform resource interaction based on the target resources;
and when detecting that the resource interaction pause or termination condition is met, displaying an interaction pause or termination picture corresponding to the target resource.
Computer program code for carrying out operations of the present disclosure may be written in any combination of one or more programming languages, including but not limited to object-oriented programming languages such as Java, Smalltalk, and C++, as well as conventional procedural programming languages such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on a remote computer or server. In the case of a remote computer, the remote computer may be connected to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet service provider).
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The units described in the embodiments of the present disclosure may be implemented by software or hardware. Where the name of an element does not in some cases constitute a limitation on the element itself.
The functions described herein above may be performed, at least in part, by one or more hardware logic components. For example, without limitation, exemplary types of hardware logic components that may be used include: field Programmable Gate Arrays (FPGAs), Application Specific Integrated Circuits (ASICs), Application Specific Standard Products (ASSPs), systems on a chip (SOCs), Complex Programmable Logic Devices (CPLDs), and the like.
In the context of this disclosure, a machine-readable medium may be a tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. The machine-readable medium may be a machine-readable signal medium or a machine-readable storage medium. A machine-readable medium may include, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. More specific examples of a machine-readable storage medium would include an electrical connection based on one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
According to one or more embodiments of the present disclosure [ example one ], there is provided a multi-person interaction method, comprising:
determining target resources when target operation is detected in the interaction process of a user based on a real-time interaction interface; the target resource is a resource for the interaction of the at least one user;
displaying target resources in a real-time interaction interface corresponding to a user so that the user can perform resource interaction based on the target resources;
and when detecting that the resource interaction pause or termination condition is met, displaying an interaction pause or termination picture corresponding to the target resource.
According to one or more embodiments of the present disclosure, [ example two ] there is provided a multi-person interaction method, further comprising:
optionally, the target operation includes at least one of:
triggering a target interaction control;
the voice information of the user triggers an interactive wake-up word.
According to one or more embodiments of the present disclosure, [ example three ] there is provided a multi-person interaction method, further comprising:
optionally, the determining the target resource includes:
displaying a resource list comprising at least one resource to be selected in a real-time interactive interface of a user;
and determining the target resource based on the triggering operation of the at least one resource to be selected.
According to one or more embodiments of the present disclosure, [ example four ] there is provided a multi-person interaction method, further comprising:
optionally, the determining the target resource based on the trigger operation on the at least one resource to be selected includes:
determining the target resource according to the trigger time stamp of the user on the at least one resource to be selected; and/or,
determining the target resource according to the triggered frequency of at least one resource to be selected in a preset selection duration; and/or,
taking the resource to be selected triggered by the behavior user as the target resource; wherein the behavior user corresponds to a user who triggers the target operation.
According to one or more embodiments of the present disclosure, [ example five ] there is provided a multi-person interaction method, further comprising:
optionally, before the target resource is displayed in the real-time interactive interface corresponding to the user, the method further includes:
and loading the target resource to send the target resource to a real-time interactive interface corresponding to the user.
According to one or more embodiments of the present disclosure, [ example six ] there is provided a multi-person interaction method, further comprising:
optionally, the target resource is a game resource, where the game resource includes at least one first element and one second element, and the displaying the target resource in the real-time interactive interface corresponding to the user includes:
and displaying the basic identification of the user in a first area of the real-time interactive interface, and displaying the first element and the second element in the target resource in a second area of the real-time interactive interface.
According to one or more embodiments of the present disclosure, [ example seven ] there is provided a multi-person interaction method, further comprising:
optionally, the performing resource interaction based on the target resource includes:
when the first element is detected to be triggered in the current real-time interactive interface, determining a target running track of the first element based on position information and speed information for releasing the first element; the speed information comprises speed magnitude and/or speed direction;
and determining the resource interaction attribute characteristics in the interaction process based on the target running track and the position information of the second element.
According to one or more embodiments of the present disclosure, [ example eight ] there is provided a multi-person interaction method, further comprising:
optionally, the determining, based on the target trajectory and the position information of the second element, a resource interaction attribute characteristic in an interaction process includes:
if the first element is determined to fall into the second element based on the target running track and the position information of the second element, accumulating the resource interaction attribute characteristics for the current user;
and the current user corresponds to the current real-time interactive interface.
According to one or more embodiments of the present disclosure, [ example nine ] there is provided a multi-person interaction method, further comprising:
optionally, the method further includes:
and when detecting that the interaction duration corresponding to the target resource reaches a first duration threshold, controlling the second element to move according to a preset motion track, and repeatedly executing the step of determining the resource interaction attribute characteristics in the resource interaction process when detecting that the first element is triggered.
According to one or more embodiments of the present disclosure, [ example ten ] there is provided a multi-person interaction method, further comprising:
optionally, in the process of performing resource interaction based on the target resource, the method further includes:
and displaying the resource interaction process corresponding to other users according to the corresponding transparency parameter.
According to one or more embodiments of the present disclosure, [ example eleven ] there is provided a multi-person interaction method, further comprising:
optionally, the displaying the resource interaction process corresponding to the other user according to the corresponding transparency parameter includes:
pulling resource interaction data streams of other users; wherein the other users comprise all users or users triggering selection;
and determining a corresponding transparency parameter according to the resource interaction attribute characteristics of the user, and rendering the resource interaction process of the corresponding user to the current real-time interaction interface.
According to one or more embodiments of the present disclosure, [ example twelve ] there is provided a multi-person interaction method, further comprising:
optionally, the pulling the resource interaction data stream with at least one other user includes:
and pulling the resource interaction data stream of at least one other user triggered by the current user.
According to one or more embodiments of the present disclosure, [ example thirteen ] provides a multi-person interaction method, further comprising:
optionally, the displaying the resource interaction process corresponding to the other user according to the corresponding transparency parameter includes:
periodically acquiring element position information of a first element of other users;
and determining the interaction process of other corresponding users based on the interpolation processing of the element position information, and displaying according to the corresponding transparency parameters.
According to one or more embodiments of the present disclosure, [ example fourteen ] there is provided a multi-person interaction method, further comprising:
optionally, in the process of performing resource interaction based on the target resource, the method further includes:
acquiring target resource interaction attribute characteristics of other users;
determining attribute display information in the current real-time interactive interface based on at least one target resource interaction attribute feature and the current resource interaction attribute feature of the current user;
and the current real-time interactive interface corresponds to the current user.
According to one or more embodiments of the present disclosure, [ example fifteen ] there is provided a multi-person interaction method, further comprising:
optionally, the target resource is a game resource, and in the process of performing interaction based on the target resource, the method further includes:
when the interaction duration reaches a second preset duration threshold value, determining a target user according to the resource interaction attribute characteristics of at least one user;
issuing an attribute superposition special effect for the target user, and displaying in a real-time interaction interface corresponding to the target user;
the attribute superposition special effect is a dynamic special effect or a static special effect.
According to one or more embodiments of the present disclosure, [ example sixteen ] there is provided a multi-person interaction method, further comprising:
optionally, the method further includes:
when detecting that the first element triggers the attribute superposition special effect, configuring superposition attributes for at least one first element to obtain a first superposition element;
and if the target running track of the first superposition element is coincident with the position information of the second element, superposing the resource interaction attribute characteristics of the target user until the first superposition element is the last first superposition element.
According to one or more embodiments of the present disclosure, [ example seventeen ] there is provided a multi-person interaction method, further comprising:
optionally, the target resource is a game resource, and the method further comprises:
when a control triggering multiple persons to interact in the same real-time interaction interface is detected, determining participating target interaction users;
determining a hit interactive user hitting a second element according to a target running track of a first element corresponding to the target interactive user; the target interactive users comprise the hit interactive users;
and accumulating the resource interaction attribute characteristics of the hit interaction users to update the resource interaction attribute characteristics.
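Example seventeen's shared-interface accumulation might be sketched as follows (the event format and hit test are assumptions):

```python
def update_shared_scores(scores, shot_events, basket_pos, tolerance=0.5):
    """shot_events: (user_id, landing_position) pairs gathered from every
    target interaction user in the same real-time interaction interface.
    Hit interaction users have their attribute feature accumulated in place."""
    for user, (x, y) in shot_events:
        dist = ((x - basket_pos[0]) ** 2 + (y - basket_pos[1]) ** 2) ** 0.5
        if dist <= tolerance:  # this user hit the second element
            scores[user] = scores.get(user, 0) + 1
    return scores
```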
According to one or more embodiments of the present disclosure, [ example eighteen ] there is provided a multi-person interaction method, further comprising:
optionally, the resource interaction pause or termination condition includes at least one of the following:
the interactive accumulated time length reaches the interactive pause or termination time length;
triggering an interaction pause or termination control;
the voice message of the user triggers an interaction pause or termination wake-up word.
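The three conditions of example eighteen combine with a simple OR; a minimal sketch (the wake words and parameter names are assumed):

```python
PAUSE_WAKE_WORDS = ("pause game", "stop game")  # assumed wake words

def should_pause_or_stop(elapsed, limit, control_triggered, voice_text=""):
    """True if the accumulated interaction duration, a pause/termination
    control trigger, or a wake word in the user's voice message is met."""
    duration_hit = elapsed >= limit
    wake_word_hit = any(w in voice_text.lower() for w in PAUSE_WAKE_WORDS)
    return duration_hit or control_triggered or wake_word_hit
```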
According to one or more embodiments of the present disclosure, [ example nineteen ] there is provided a multi-person interaction method, further comprising:
optionally, the displaying an interactive pause or termination picture corresponding to the target resource includes:
acquiring resource interaction attribute characteristics corresponding to a user when the resource interaction is suspended or terminated;
and determining the interaction pause or termination picture according to the resource interaction attribute characteristics sorted in descending order.
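Example nineteen's termination picture is essentially a descending ranking of the users' attribute features; a sketch under assumed names:

```python
def termination_screen(scores):
    """Render a pause/termination picture as ranked lines, ordered by the
    resource interaction attribute feature, descending."""
    ranking = sorted(scores.items(), key=lambda kv: kv[1], reverse=True)
    return ["{}. {}: {}".format(i + 1, user, score)
            for i, (user, score) in enumerate(ranking)]
```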
In accordance with one or more embodiments of the present disclosure, [ example twenty ] there is provided a multi-person interaction method, further comprising:
optionally, in the process of performing interaction based on the target resource, the method further includes:
and when a newly added user to be interacted with is detected, displaying the resource interaction process of each user in the real-time interaction interface of the newly added user.
According to one or more embodiments of the present disclosure, [ example twenty-one ] there is provided a multi-person interaction method, further comprising:
optionally, the method further includes:
and when the triggering of the resource release control is detected, releasing the target resource and displaying the real-time interactive interface.
According to one or more embodiments of the present disclosure, [ example twenty-two ] provides a multi-person interaction method, further comprising:
optionally, the method further includes:
and when the control for triggering resource interaction again is detected, keeping the target resource unreleased, and repeatedly executing the operation of performing resource interaction based on the target resource.
According to one or more embodiments of the present disclosure, [ example twenty-three ] provides a multi-person interaction method, further comprising:
optionally, the target resource is a basketball game resource, the first element is a basketball element, the second element is a basket element, and the resource interaction attribute feature corresponds to a shooting hit point.
In accordance with one or more embodiments of the present disclosure, [ example twenty-four ] there is provided a multi-person interaction method, further comprising:
optionally, the target resource is an audio-visual resource, and the resource interaction based on the target resource includes:
and displaying the triggering operation of other users on the audio-visual resources and the audio-visual resources corresponding to the triggering operation in a real-time interactive interface according to the corresponding transparency parameters.
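Example twenty-four's rendering could map each user's attribute feature to an alpha value before overlaying their operations; the linear mapping below is one illustrative choice, not the patent's:

```python
def transparency_for(score, max_score, min_alpha=0.2, max_alpha=0.9):
    """Linearly map a user's attribute feature into [min_alpha, max_alpha]
    so more active users are overlaid more opaquely."""
    if max_score <= 0:
        return min_alpha
    t = min(score / max_score, 1.0)  # clamp to the top of the range
    return min_alpha + t * (max_alpha - min_alpha)
```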
According to one or more embodiments of the present disclosure, [ example twenty-five ] there is provided a multi-person interaction device, comprising:
the target resource determining module is used for determining target resources when target operation is detected in the real-time interaction interface-based interaction process of a user; the target resource is a resource for the interaction of the at least one user;
the target resource display module is used for displaying the target resources in a real-time interaction interface corresponding to the user so that the user can perform resource interaction based on the target resources;
and the picture display module is used for displaying the interactive pause or termination picture corresponding to the target resource when the condition that the resource interactive pause or termination condition is met is detected.
The foregoing description is merely illustrative of the preferred embodiments of the present disclosure and of the technical principles employed. It will be appreciated by those skilled in the art that the scope of the disclosure is not limited to the particular combinations of features described above, but also covers other technical solutions formed by any combination of the above features or their equivalents without departing from the spirit of the disclosure, for example, technical solutions in which the above features are replaced with (but not limited to) features disclosed in this disclosure having similar functions.
Further, while operations are depicted in a particular order, this should not be understood as requiring that such operations be performed in the particular order shown or in sequential order. Under certain circumstances, multitasking and parallel processing may be advantageous. Likewise, while several specific implementation details are included in the above discussion, these should not be construed as limitations on the scope of the disclosure. Certain features that are described in the context of separate embodiments can also be implemented in combination in a single embodiment. Conversely, various features that are described in the context of a single embodiment can also be implemented in multiple embodiments separately or in any suitable subcombination.
Although the subject matter has been described in language specific to structural features and/or methodological acts, it is to be understood that the subject matter defined in the appended claims is not necessarily limited to the specific features or acts described above. Rather, the specific features and acts described above are disclosed as example forms of implementing the claims.
Claims (20)
1. A method for multi-person interaction, comprising:
determining target resources when target operation is detected in the interaction process of a user based on a real-time interaction interface; the target resource is a resource for the interaction of the at least one user;
displaying target resources in a real-time interaction interface corresponding to a user so that the user can perform resource interaction based on the target resources;
and when detecting that the resource interaction pause or termination condition is met, displaying an interaction pause or termination picture corresponding to the target resource.
2. The method of claim 1, wherein determining the target resource comprises:
displaying a resource list comprising at least one resource to be selected in a real-time interactive interface of a user;
and determining the target resource based on the triggering operation of the at least one resource to be selected.
3. The method of claim 2, wherein the determining the target resource based on the triggering operation on the at least one resource to be selected comprises:
determining the target resource according to the trigger time stamp of the user on the at least one resource to be selected; and/or,
determining the target resource according to the triggered frequency of at least one resource to be selected in a preset selection duration; and/or,
taking the resource to be selected triggered by the behavior user as the target resource; wherein the behavior user corresponds to a user who triggers the target operation.
4. The method of claim 1, wherein the target resource is a game resource, the game resource includes at least one first element and a second element, and the displaying the target resource in the real-time interactive interface corresponding to the user includes:
and displaying the basic identification of the user in a first area of the real-time interactive interface, and displaying the first element and the second element in the target resource in a second area of the real-time interactive interface.
5. The method of claim 4, wherein the resource interaction based on the target resource comprises:
when the first element is detected to be triggered in the current real-time interactive interface, determining a target running track of the first element based on position information and speed information for releasing the first element; the speed information comprises speed magnitude and/or speed direction;
and determining the resource interaction attribute characteristics in the interaction process based on the target running track and the position information of the second element.
6. The method of claim 5, wherein determining the resource interaction attribute characteristics during the interaction process based on the target trajectory and the position information of the second element comprises:
if the first element is determined to fall into the second element based on the target running track and the position information of the second element, accumulating the resource interaction attribute characteristics for the current user;
and the current user corresponds to the current real-time interactive interface.
7. The method of claim 5, further comprising:
and when detecting that the interaction duration corresponding to the target resource reaches a first duration threshold, controlling the second element to move according to a preset motion track, and repeatedly executing the step of determining the resource interaction attribute characteristics in the resource interaction process when detecting that the first element is triggered.
8. The method of claim 1, wherein during the resource interaction based on the target resource, further comprising:
and displaying the resource interaction process corresponding to other users according to the corresponding transparency parameter.
9. The method of claim 8, wherein displaying the resource interaction process corresponding to the other user according to the corresponding transparency parameter comprises:
pulling resource interaction data streams of other users; wherein the other users comprise all users or users triggering selection;
and determining a corresponding transparency parameter according to the resource interaction attribute characteristics of the user, and rendering the resource interaction process of the corresponding user to the current real-time interaction interface.
10. The method of claim 8, wherein displaying the resource interaction process corresponding to the other user according to the corresponding transparency parameter comprises:
periodically acquiring element position information of a first element of other users;
and determining the interaction process of other corresponding users based on the interpolation processing of the element position information, and displaying according to the corresponding transparency parameters.
11. The method of claim 1, wherein during the resource interaction based on the target resource, further comprising:
acquiring target resource interaction attribute characteristics of other users;
determining attribute display information in the current real-time interactive interface based on at least one target resource interaction attribute feature and the current resource interaction attribute feature of the current user;
and the current real-time interactive interface corresponds to the current user.
12. The method of claim 1, wherein the target resource is a game resource, and during the interaction based on the target resource, the method further comprises:
when the interaction duration reaches a second preset duration threshold value, determining a target user according to the resource interaction attribute characteristics of at least one user;
issuing an attribute superposition special effect for the target user, and displaying it in a real-time interaction interface corresponding to the target user;
wherein the attribute superposition special effect is a dynamic special effect or a static special effect.
13. The method of claim 12, further comprising:
when detecting that the first element triggers the attribute superposition special effect, configuring superposition attributes for at least one first element to obtain a first superposition element;
and if the target running track of the first superposition element coincides with the position information of the second element, superposing the resource interaction attribute characteristics for the target user, until the last first superposition element has been processed.
14. The method of claim 1, wherein the target resource is a game resource, the method further comprising:
when a control triggering multiple persons to interact in the same real-time interaction interface is detected, determining participating target interaction users;
determining a hit interactive user hitting a second element according to a target running track of a first element corresponding to the target interactive user; the target interactive users comprise the hit interactive users;
and accumulating the resource interaction attribute characteristics of the hit interaction users to update the resource interaction attribute characteristics.
15. The method of claim 1, wherein the presenting an interactive pause or termination screen corresponding to the target resource comprises:
acquiring resource interaction attribute characteristics corresponding to a user when the resource interaction is suspended or terminated;
and determining the interaction pause or termination picture according to the resource interaction attribute characteristics sorted in descending order.
16. The method of claim 1, further comprising, during the interaction based on the target resource:
and when a newly added user to be interacted with is detected, displaying the resource interaction process of each user in the real-time interaction interface of the newly added user.
17. The method of claim 1, wherein the target resource is an audiovisual resource, and wherein the interacting the resource based on the target resource comprises:
and displaying the triggering operation of other users on the audio-visual resources and the audio-visual resources corresponding to the triggering operation in a real-time interactive interface according to the corresponding transparency parameters.
18. A multi-person interaction device, comprising:
the target resource determining module is used for determining target resources when target operation is detected in the real-time interaction interface-based interaction process of a user; the target resource is a resource for the interaction of the at least one user;
the target resource display module is used for displaying the target resources in a real-time interaction interface corresponding to the user so that the user can perform resource interaction based on the target resources;
and the picture display module is used for displaying the interactive pause or termination picture corresponding to the target resource when the condition that the resource interactive pause or termination condition is met is detected.
19. An electronic device, characterized in that the electronic device comprises:
one or more processors;
a storage device for storing one or more programs,
wherein the one or more programs, when executed by the one or more processors, cause the one or more processors to implement the multi-person interaction method as recited in any one of claims 1-17.
20. A storage medium containing computer-executable instructions for performing the multi-person interaction method of any one of claims 1-17 when executed by a computer processor.
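Claims 5 and 6 recite deriving a target running track from a release position and speed information (magnitude and direction), then testing whether the first element falls into the second element. A hedged sketch assuming a plain ballistic model (the physics, sampling step, and tolerance are illustrative, not specified by the patent):

```python
import math

GRAVITY = 9.8  # assumed: a simple projectile model

def running_track(pos, speed, angle_deg, steps=20, dt=0.05):
    """Sample the first element's track from its release position and its
    speed magnitude/direction (claim 5's speed information)."""
    vx = speed * math.cos(math.radians(angle_deg))
    vy = speed * math.sin(math.radians(angle_deg))
    x, y = pos
    track = []
    for _ in range(steps):
        vy -= GRAVITY * dt  # gravity acts on the vertical component
        x += vx * dt
        y += vy * dt
        track.append((x, y))
    return track

def falls_into(track, basket_pos, tolerance=0.3):
    """Claim 6's hit test: the first element falls into the second element
    when the track passes within `tolerance` of its position."""
    return any(math.dist(p, basket_pos) <= tolerance for p in track)
```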
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202210556342.6A CN114895787A (en) | 2022-05-20 | 2022-05-20 | Multi-person interaction method and device, electronic equipment and storage medium |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202210556342.6A CN114895787A (en) | 2022-05-20 | 2022-05-20 | Multi-person interaction method and device, electronic equipment and storage medium |
Publications (1)
Publication Number | Publication Date |
---|---|
CN114895787A true CN114895787A (en) | 2022-08-12 |
Family
ID=82723628
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202210556342.6A Pending CN114895787A (en) | 2022-05-20 | 2022-05-20 | Multi-person interaction method and device, electronic equipment and storage medium |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN114895787A (en) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN115499674A (en) * | 2022-09-15 | 2022-12-20 | 广州方硅信息技术有限公司 | Live broadcast room interactive picture presentation method and device, electronic equipment and storage medium |
CN116896649A (en) * | 2023-09-11 | 2023-10-17 | 北京达佳互联信息技术有限公司 | Live interaction method and device, electronic equipment and storage medium |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN112891944A (en) * | 2021-03-26 | 2021-06-04 | 腾讯科技(深圳)有限公司 | Interaction method and device based on virtual scene, computer equipment and storage medium |
CN113633974A (en) * | 2021-09-01 | 2021-11-12 | 腾讯科技(深圳)有限公司 | Method, device, terminal and storage medium for displaying real-time game-checking information of user |
CN113908559A (en) * | 2021-10-13 | 2022-01-11 | 腾讯科技(深圳)有限公司 | Interface display method, device, terminal and storage medium |
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN112891944A (en) * | 2021-03-26 | 2021-06-04 | 腾讯科技(深圳)有限公司 | Interaction method and device based on virtual scene, computer equipment and storage medium |
CN113633974A (en) * | 2021-09-01 | 2021-11-12 | 腾讯科技(深圳)有限公司 | Method, device, terminal and storage medium for displaying real-time game-checking information of user |
CN113908559A (en) * | 2021-10-13 | 2022-01-11 | 腾讯科技(深圳)有限公司 | Interface display method, device, terminal and storage medium |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN115499674A (en) * | 2022-09-15 | 2022-12-20 | 广州方硅信息技术有限公司 | Live broadcast room interactive picture presentation method and device, electronic equipment and storage medium |
CN116896649A (en) * | 2023-09-11 | 2023-10-17 | 北京达佳互联信息技术有限公司 | Live interaction method and device, electronic equipment and storage medium |
CN116896649B (en) * | 2023-09-11 | 2024-01-19 | 北京达佳互联信息技术有限公司 | Live interaction method and device, electronic equipment and storage medium |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US11794102B2 (en) | Cloud-based game streaming | |
US11172012B2 (en) | Co-streaming within a live interactive video game streaming service | |
WO2022121557A1 (en) | Live streaming interaction method, apparatus and device, and medium | |
US9573057B2 (en) | Method and system for remote game display | |
CN108924661B (en) | Data interaction method, device, terminal and storage medium based on live broadcast room | |
CN102209273B (en) | Interactive and shared viewing experience | |
WO2018010682A1 (en) | Live broadcast method, live broadcast data stream display method and terminal | |
CN111857923B (en) | Special effect display method and device, electronic equipment and computer readable medium | |
WO2018086468A1 (en) | Method and apparatus for processing comment information of playback object | |
CN114895787A (en) | Multi-person interaction method and device, electronic equipment and storage medium | |
KR102669170B1 (en) | Methods, systems, and media for coordinating multiplayer game sessions | |
KR20120031168A (en) | Avatar integrated shared media experience | |
US20240154924A1 (en) | Methods, systems, and media for generating a notification in connection with a video content item | |
CN109495427B (en) | Multimedia data display method and device, storage medium and computer equipment | |
WO2023104102A1 (en) | Live broadcasting comment presentation method and apparatus, and device, program product and medium | |
WO2022267701A1 (en) | Method and apparatus for controlling virtual object, and device, system and readable storage medium | |
CN115175751A (en) | Driving virtual influencers based on predicted game activity and audience characteristics | |
CN113536147B (en) | Group interaction method, device, equipment and storage medium | |
CN113518240A (en) | Live broadcast interaction method, virtual resource configuration method, virtual resource processing method and device | |
CN114173173A (en) | Barrage information display method and device, storage medium and electronic equipment | |
CN112169319A (en) | Application program starting method, device, equipment and storage medium | |
CN114501054B (en) | Live interaction method, device, equipment and computer readable storage medium | |
JP7099753B2 (en) | Information processing equipment | |
WO2024114162A1 (en) | Animation processing method and apparatus, computer device, storage medium, and program product | |
CN115988279A (en) | Live broadcast method, medium, device, system and computing equipment |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||