CN117762539A - Application experience method and related device - Google Patents

Application experience method and related device

Info

Publication number
CN117762539A
CN117762539A (application CN202211137253.4A)
Authority
CN
China
Prior art keywords
target
experience
live
game
area
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202211137253.4A
Other languages
Chinese (zh)
Inventor
刘行
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Tencent Technology Shenzhen Co Ltd
Original Assignee
Tencent Technology Shenzhen Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tencent Technology Shenzhen Co Ltd filed Critical Tencent Technology Shenzhen Co Ltd
Priority to CN202211137253.4A
Publication of CN117762539A

Landscapes

  • User Interface Of Digital Computer (AREA)

Abstract

The embodiments of the present application disclose an application experience method and a related device. The method includes: displaying a live interface, where the live interface includes a live area, a comment area, and an experience entry control, the live area is used to play target live content, and the experience entry control is used to trigger an experience of a target application related to the target live content; and in response to a trigger operation on the experience entry control, switching the comment area to an experience area, where the experience area is used to support the live audience in experiencing the service content of the target application. The method can improve the promotional effect for the target application and more effectively attract users to download and use the promoted application.

Description

Application experience method and related device
Technical Field
The application relates to the technical field of computers, in particular to an application experience method and a related device.
Background
With the rapid development of computer technology, applications of all kinds (including mobile phone applications, computer applications, and the like) are emerging one after another, and these continuously emerging applications provide people with ever richer experiences.
How to promote and popularize an application so as to attract more users to download and use it is an important problem facing application providers. Taking a game application as an example, promotion of a game application is currently achieved mainly by having a relevant host live-stream the process of playing the game. However, the promotional effect obtained in this manner is often not ideal, and it is difficult to effectively attract users to download and use the promoted application.
Disclosure of Invention
The embodiments of the present application provide an application experience method and a related device, which can improve the promotional effect for an application and more effectively attract users to download and use the promoted application.
In view of this, a first aspect of the present application provides an application experience method, the method comprising:
displaying a live interface, where the live interface includes a live area, a comment area, and an experience entry control, the live area is used to play target live content, and the experience entry control is used to trigger an experience of a target application related to the target live content;
and in response to a trigger operation on the experience entry control, switching the comment area to an experience area, where the experience area is used to support the live audience in experiencing the service content of the target application.
A second aspect of the present application provides an application experience system, the system including a terminal device, a first server for providing a live service, and a second server for providing service content of a target application;
the terminal device is configured to execute the application experience method described in the first aspect;
the first server is used for providing target live broadcast content for the terminal equipment;
the second server is used to provide the service content of the target application to the live audience, based on the experience area displayed by the terminal device, when the terminal device responds to the trigger operation on the experience entry control in the live interface.
A third aspect of the present application provides an application experience apparatus, the apparatus comprising:
an interface display module, used to display a live interface, where the live interface includes a live area, a comment area, and an experience entry control, the live area is used to play target live content, and the experience entry control is used to trigger an experience of a target application related to the target live content;
and a region switching module, used to switch the comment area to an experience area in response to a trigger operation on the experience entry control, where the experience area is used to support the live audience in experiencing the service content of the target application.
A fourth aspect of the present application provides an electronic device comprising a processor and a memory:
the memory is used for storing a computer program;
the processor is configured to execute the steps of the application experience method according to the first aspect described above according to the computer program.
A fifth aspect of the present application provides a computer readable storage medium for storing a computer program for executing the steps of the application experience method of the first aspect described above.
A sixth aspect of the present application provides a computer program product or computer program including computer instructions stored in a computer-readable storage medium. A processor of a computer device reads the computer instructions from the storage medium and executes them, causing the computer device to perform the steps of the application experience method described in the first aspect.
From the above technical solutions, the embodiments of the present application have the following advantages:
The embodiments of the present application provide an application experience method, which includes: displaying a live interface, where the live interface includes a live area, a comment area, and an experience entry control, the live area is used to play target live content, and the experience entry control is used to trigger an experience of a target application related to the target live content; and in response to a trigger operation on the experience entry control, switching the comment area to an experience area, where the experience area is used to support the live audience in experiencing the service content of the target application. On the one hand, the method allows a live audience watching target live content related to the target application to trigger an experience of the target application's service content through the experience entry control, so that the audience can gain a deeper understanding of the target application through a trial experience, which improves the promotional effect for the target application. On the other hand, in response to the audience's trigger operation on the experience entry control, the comment area in the live interface is switched to the experience area, so that the audience can experience the service content of the target application in the experience area while continuing to watch the live stream; the viewing experience is not affected, and the convenience of experiencing the target application is improved.
Drawings
Fig. 1 is a schematic diagram of an operating principle of an application experience system provided in an embodiment of the present application;
fig. 2 is a flow chart of an application experience method according to an embodiment of the present application;
FIG. 3 is a schematic illustration of an exemplary live interface provided by an embodiment of the present application;
FIG. 4 is a flowchart of another application experience method according to an embodiment of the present application;
fig. 5 is a schematic diagram of a change process of a live interface according to an embodiment of the present application;
FIG. 6 is a schematic diagram of a team formation mode according to an embodiment of the present application;
FIG. 7 is a schematic diagram of another team organization mode provided in an embodiment of the present application;
fig. 8 is a schematic diagram illustrating implementation of a resource transfer function according to an embodiment of the present application;
FIG. 9 is a flowchart of another application experience method according to an embodiment of the present application;
fig. 10 is a schematic diagram of another live interface change process provided in an embodiment of the present application;
FIG. 11 is a schematic diagram illustrating another implementation of a resource transfer function according to an embodiment of the present application;
fig. 12 is a schematic diagram of a change process of a live interface according to an embodiment of the present application;
FIG. 13 is a background interaction signaling diagram of an implementation provided in an embodiment of the present application;
FIG. 14 is a background interaction signaling diagram of another implementation provided by an embodiment of the present application;
fig. 15 is a schematic structural diagram of an application experience device according to an embodiment of the present application;
fig. 16 is a schematic structural diagram of a terminal device provided in an embodiment of the present application;
fig. 17 is a schematic structural diagram of a server according to an embodiment of the present application.
Detailed Description
To help those skilled in the art better understand the solutions of the present application, the technical solutions in the embodiments of the present application are described below clearly and completely with reference to the accompanying drawings. Obviously, the described embodiments are only some, not all, of the embodiments of the present application. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments of the present application without undue burden shall fall within the protection scope of the present application.
The terms "first," "second," "third," "fourth" and the like in the description and in the claims of this application and in the above-described figures, if any, are used for distinguishing between similar objects and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used may be interchanged where appropriate such that embodiments of the present application described herein may be implemented in sequences other than those illustrated or otherwise described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
The application experience method provided by the embodiments of the present application may be executed by an electronic device. The electronic device may specifically be a terminal device, such as a smartphone, a tablet computer, a notebook computer, a desktop computer, a smart speaker, a smart watch, a smart voice interaction device, a smart home appliance, a vehicle-mounted terminal, or the like; the form of the electronic device is not limited in this application.
Implementing the application experience method provided by the embodiments of the present application requires the support of a background server. The server may be an independent physical server, a server cluster or distributed system composed of multiple physical servers, or a cloud server providing basic cloud computing services such as cloud services, cloud databases, cloud computing, cloud functions, cloud storage, network services, cloud communication, middleware services, domain name services, security services, content delivery networks (Content Delivery Network, CDN), big data, and artificial intelligence platforms. The terminal device and the server may be connected directly or indirectly through wired or wireless communication, which is not limited here.
To facilitate understanding of the application experience method provided by the embodiments of the present application, an application scenario of the method is described below by way of example, in connection with the application experience system provided by the embodiments of the present application.
Referring to fig. 1, fig. 1 is a schematic diagram illustrating an operating principle of an application experience system according to an embodiment of the present application. As shown in fig. 1, the application experience system includes a terminal device 110, a first server 120, and a second server 130, and the terminal device 110 may communicate with the first server 120 and the second server 130 through a network. The terminal device 110 is configured to execute the application experience method provided by the embodiment of the present application, and support the object trial experience target application; the first server 120 is a server for providing a live service; the second server 130 is a server for providing service contents of a target application.
In practical applications, the terminal device 110 may obtain target live content related to the target application from the first server 120, and may also obtain the live audience's comment content for the target live content from the first server 120. Accordingly, the terminal device 110 may display a live interface that includes a live area, a comment area, and an experience entry control, where the live area is used to play the target live content provided by the first server 120, the comment area is used to display the live audience's comment content for the target live content provided by the first server 120, and the experience entry control is used to trigger an experience of the target application related to the target live content.
In response to a trigger operation on the experience entry control, the terminal device 110 may generate an application experience request and send it to the second server 130; the application experience request is a request message for requesting to experience the target application. After receiving the application experience request, the second server 130 may send experience data related to the target application to the terminal device 110; the experience data enables the terminal device 110 to render and display an experience interface corresponding to the target application, supporting the live audience in experiencing the service content of the target application.
After receiving the experience data sent by the second server 130, the terminal device 110 may switch the comment area to an experience area according to the experience data and display the experience interface corresponding to the target application in that area. Through the experience interface, the live audience can experience the service content of the target application provided by the second server 130; that is, when the live audience triggers related operations through the experience area, the terminal device 110 interacts with the second server 130 accordingly, and the service content provided by the second server 130 is thereby experienced.
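The request/response flow described above can be sketched as follows. This is an illustrative sketch only: all names (`handle_experience_entry_tap`, the request fields, the region labels) are hypothetical and do not appear in the patent, and a real client would make an asynchronous network call rather than invoke a plain function.

```python
def handle_experience_entry_tap(interface: dict, fetch_experience) -> dict:
    """On a tap of the experience entry control: request experience data
    from the second server, then swap the comment area for an experience
    area rendered from that data (cf. terminal device 110 / server 130)."""
    request = {
        "type": "application_experience_request",  # hypothetical message shape
        "app_id": interface["target_app_id"],
        "viewer_id": interface["viewer_id"],
    }
    experience_data = fetch_experience(request)    # stands in for the network call
    updated = dict(interface)
    # Replace the comment area with the experience area; the live area stays.
    updated["regions"] = [r for r in interface["regions"] if r != "comment"]
    updated["regions"].append("experience")
    updated["experience_data"] = experience_data
    return updated
```

For testing, a stub second server can be passed in, e.g. `lambda req: {"app": req["app_id"], "ui": "trial"}`.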
It should be understood that the forms of the terminal device and the server in the application experience system shown in fig. 1 are only examples, and the terminal device and the server in the application experience system may also be other forms, which are not limited in this application.
The application experience method provided by the present application is described in detail below through method embodiments.
Referring to fig. 2, fig. 2 is a flowchart of an application experience method provided in an embodiment of the present application. For convenience of description, this embodiment is described by taking a terminal device as the execution body. As shown in fig. 2, the application experience method includes the following steps:
step 201: and displaying a live broadcast interface, wherein the live broadcast interface comprises a live broadcast area, a comment area and an experience entrance control, the live broadcast area is used for playing target live broadcast content, and the experience entrance control is used for triggering target applications related to the target live broadcast content to be experienced.
In this embodiment of the present application, the terminal device may display a live interface, where the live interface may be any interface supporting playing of live content, and may be, for example, an interface capable of playing live video in a video playing application, or may be, for example, a web interface capable of playing live video, which is not limited in this application.
The live interface includes a live area, a comment area, and an experience entry control. The live area is used to play target live content related to the target application; the target application may be any application requiring promotion, including but not limited to a game application, a social application, a multimedia playing application, a multimedia processing application, a shopping application, and the like; the target live content may be any live video that promotes the target application. The comment area is used to display the live audience's comment content for the target live content, which may include bullet-screen messages sent by the live audience, comments posted by the live audience, animation effects generated when the live audience gives a virtual gift to the host, and so on. The experience entry control is used to trigger an experience of the target application related to the target live content; that is, the live audience can trigger an experience of the target application through the experience entry control.
Fig. 3 is a schematic diagram of an exemplary live interface according to an embodiment of the present application. As shown in fig. 3, the live interface includes a live area 301, in which live video (i.e., the target live content) of the host playing a game of a game application (i.e., the target application) is played; the live interface also includes a comment area 302, which contains comment content posted by the live audience for the live video and animation effects generated by giving virtual gifts for the live video; the live interface further includes an experience entry control 303, and the live audience can trigger an experience of the game application by clicking the experience entry control 303. It should be understood that the representation of experience entry control 303 in fig. 3 is merely an example; the control may also take other forms, and the embodiments of the present application are not limited in this regard.
In one possible implementation, the experience entry control in the live interface may carry introduction information for the target application. For example, the introduction information may be displayed directly on the experience entry control. Alternatively, the introduction information may be displayed in response to an application-information viewing operation triggered via the experience entry control; for example, for a live viewer watching the target live content on a computer, the viewing operation may be triggered by moving the mouse cursor over the experience entry control, whereupon an information window containing the introduction information of the target application is displayed adjacent to the control.
It should be understood that the introduction information of the target application may include, but is not limited to, an application type of the target application, service contents provided by the target application, characteristics of the target application, and the like, which are not limited in this application.
It should be noted that the experience entry control may be a control for triggering a target mode of the target application. The target mode may be determined from the live audience's behavior data. The target mode is a usage mode supported by the target application; it may be a mode of the target application itself (i.e., a mode that can be experienced when the target application is used normally), or a mode specifically designed for the promotion scenario of the target application (i.e., a mode that cannot be experienced when the target application is used normally).
The target mode described above is generally determined by the server providing the live service (i.e., the first server 120 in fig. 1). The server may determine, according to each live viewer's behavior data, a target mode that meets that viewer's personalized needs; alternatively, the server may select some live viewers from all viewers of the target live content according to their respective behavior data and provide a preset target mode for those viewers. The specific manner of determining the target mode is described in detail in the method embodiments below.
It should be noted that the server can obtain a live viewer's behavior data only after obtaining the viewer's authorization. For example, the server may send authorization prompt information to the terminal device, requesting that the live viewer using the terminal device grant the server permission to acquire the viewer's behavior data. If it is detected, via the authorization prompt, that the viewer grants this permission, the server may subsequently acquire the viewer's behavior data and determine the target mode accordingly; if no such grant is detected, the server cannot acquire the behavior data and accordingly cannot provide the viewer with a target mode matched to the viewer's behavior.
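The consent gate described above can be sketched as a simple check before any behavior data is read. All names and the mode-selection rule here are hypothetical illustrations, not from the patent; a real system would persist consent server-side and apply its own mode-selection logic.

```python
from typing import Optional

def target_mode_for(viewer_id: str, consent: dict, behavior: dict) -> Optional[str]:
    """Return a target mode for the viewer, or None when the viewer has not
    granted permission to use their behavior data (no personalized mode)."""
    if not consent.get(viewer_id, False):
        return None                # no authorization: behavior data may not be used
    data = behavior.get(viewer_id, {})
    # Purely illustrative rule: active viewers get a special promo mode,
    # everyone else gets the application's normal mode.
    if data.get("gifts_given", 0) > 0 or data.get("watch_minutes", 0) >= 30:
        return "promo_mode"
    return "normal_mode"
```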
As an example, the terminal device may display the experience entry control corresponding to the target mode of the target application in the live interface after the live viewer's viewing duration for the target live content reaches a preset duration. In this example, the server may determine the target mode provided to the viewer based on at least one of the real-time behavior data generated by the viewer for the target live content during the preset duration and the historical behavior data generated by the viewer on the playback platform before watching the target live content.
As another example, when detecting that the live viewer has triggered playback of the target live content through the live interface, the terminal device may also display the experience entry control corresponding to the target mode of the target application in the live interface. In this example, the server may determine the target mode provided to the viewer based on the historical behavior data generated by the viewer on the playback platform before watching the target live content.
As yet another example, the terminal device may display the experience entry control corresponding to the target mode of the target application in the live interface when detecting that the host of the target live content has enabled the trial permission for the target application. In this example, the server may determine the target mode provided to the viewer based on at least one of the real-time behavior data generated by the viewer for the target live content and the historical behavior data generated by the viewer on the playback platform before watching the target live content.
Specifically, the host of the target live content may manually enable the trial permission for the target application through the host's own terminal device. After detecting that the host has enabled the trial permission, the server may notify the terminal devices used by the live audience of the target live content to display the experience entry control corresponding to the target mode in the live interface. In this way, the display timing of the experience entry control is controlled by the host, improving the flexibility of its display.
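The three display triggers in the examples above (viewing duration reached, playback started, host-enabled trial permission) can be sketched as a single policy check. The function name and policy strings are hypothetical, chosen only to mirror the three examples in the text.

```python
def should_show_entry_control(policy: str, *, watch_seconds: int = 0,
                              threshold_seconds: int = 0,
                              playback_started: bool = False,
                              host_enabled_trial: bool = False) -> bool:
    """Decide whether the experience entry control should be displayed,
    under one of the three example policies described in the text."""
    if policy == "after_duration":    # first example: watch-time threshold reached
        return watch_seconds >= threshold_seconds
    if policy == "on_play":           # second example: as soon as playback starts
        return playback_started
    if policy == "host_controlled":   # third example: host enables trial permission
        return host_enabled_trial
    return False
```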
Step 202: in response to a trigger operation on the experience entry control, switch the comment area to an experience area, where the experience area is used to support the live audience in experiencing the service content of the target application.
When the experience entry control of the target application is displayed in the live interface, the terminal device may, in response to a trigger operation on the control, switch the comment area in the live interface to the experience area. The live viewer can then experience the service content of the target application through the experience area; for example, when the target application is a game application, the viewer may try out a game provided by the game application through the experience area; for another example, when the target application is a social application, the viewer may try out related functions of the social application through the experience area, and so on.
As one example, in response to the trigger operation on the experience entry control, the terminal device may display the experience area overlaid on top of the comment area; that is, the comment area remains displayed on a lower layer (not perceived by the user) while the experience area is superimposed at the comment area's display position. As another example, the terminal device may first cancel the display of the comment area and then display the experience area at the comment area's original position. The display manner of the experience area is not limited in this application.
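The two display strategies described above (overlay vs. replace) amount to different edits of the interface's layer stack. A minimal sketch, with hypothetical layer names:

```python
def switch_to_experience_area(layers: list, mode: str = "overlay") -> list:
    """Show the experience area using one of the two example strategies:
    'overlay' keeps the comment area on a lower (hidden) layer,
    'replace' removes the comment area before showing the experience area."""
    layers = list(layers)              # do not mutate the caller's list
    if mode == "replace":
        layers = [l for l in layers if l != "comment"]
    layers.append("experience")        # topmost layer at the comment position
    return layers
```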
With the application experience method described above, a live viewer watching target live content related to the target application can trigger an experience of the target application's service content through the experience entry control, gaining a deeper understanding of the target application through a trial experience, which improves the promotional effect for the target application. In addition, since the comment area in the live interface is switched to the experience area in response to the viewer's trigger operation on the experience entry control, the viewer can experience the target application's service content in the experience area while continuing to watch the live stream; the viewing experience is not affected, and the convenience of experiencing the target application is improved.
In one possible implementation provided by the embodiments of the present application, when the target application is a game application, a target live viewer may be supported in participating in the game together with the host of the target live content. Fig. 4 is a flowchart of the application experience method in this implementation; as shown in fig. 4, the method includes the following steps:
Step 401: display a live interface, where the live interface includes a live area, a comment area, and an experience entry control, the live area is used to play target live content, and the experience entry control is used to trigger an experience of a target application related to the target live content.
The implementation of step 401 is similar to that of step 201 in the embodiment shown in fig. 2 and described above, see the introduction of step 201 for details.
Further, in this implementation, team invitation information may be displayed on the experience entry control, indicating that the host of the target live content invites the target live viewer to participate in the game together. In this case, the experience entry control opens the experience permission for the game application only to the target live viewer, not to other live viewers.
Fig. 5 is a schematic diagram of a change process of a live interface according to an embodiment of the present application. Fig. 5 (a) shows a live interface that includes a live area 501, a comment area 502, and an experience entry control 503 corresponding to a team mode of the game application; the experience entry control 503 shows that the host invites spectator A, the target live viewer, to participate in the game together.
Specifically, in this implementation, the server providing the live service (i.e., the first server 120 in fig. 1) may select some live viewers of the target live content as target live viewers and provide them, as the target mode, a team mode specifically designed for the game live scenario in which they team up with the host to participate in the game; this mode is not provided to live viewers other than the target live viewers.
The server can specifically select one or more target live audiences among the live audiences according to the respective behavior data of the live audiences of the target live content, and support the selected target live audiences in participating in the game together with the anchor during the anchor's live broadcast. In this way, the interest of the target mode provided for the target live audience can be improved, the stickiness between the target live audience and the anchor can be enhanced, and the enthusiasm of other live audiences for participating in live interaction can be improved.
As one example, the server may select the target live audience in the following manner: for each live audience of the target live content, determining the liveness corresponding to the live audience according to real-time behavior data generated for the target live content while the live audience watches the target live content, where the real-time behavior data includes at least one of the viewing duration of the live audience for the target live content, and virtual gift gifting data, barrage sending data, praise data and comment data generated by the live audience for the target live content; and selecting, from all the live audiences, a live audience whose liveness meets a preset liveness condition as the target live audience.
Specifically, for each live audience of the target live content, the server may determine the liveness corresponding to the live audience according to at least one of the viewing duration of the live audience for the target live content, and the virtual gift gifting data, barrage sending data, praise data and comment data generated by the live audience for the target live content. For example, the server may determine the liveness corresponding to the live audience based on the number and value of the virtual gifts presented by the live audience for the target live content. For another example, the server may determine the liveness corresponding to the live audience according to at least one of the number and quality of the barrages sent by the live audience for the target live content (e.g., the number of endorsements the barrages receive) and the number and quality of the comments posted by the live audience for the target live content.
For another example, the server may determine, according to the viewing duration of the live audience for the target live content and the virtual gift gifting data, barrage sending data, praise data and comment data generated by the live audience for the target live content, a first sub-liveness, a second sub-liveness, a third sub-liveness, a fourth sub-liveness and a fifth sub-liveness corresponding to the live audience, respectively, and then perform weighted processing on these five sub-liveness values to obtain the liveness corresponding to the live audience. Of course, the above manners of determining the liveness corresponding to the live audience are merely examples, and the present application does not limit this.
After determining the liveness corresponding to each live audience, the server can select, from all the live audiences, a live audience whose liveness meets a preset liveness condition as the target live audience. The preset liveness condition here is a condition set for selecting the target live audience; it may be, for example, having the highest liveness, or being among the top n (n is an integer greater than 1) in the liveness ranking.
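For illustration only (this sketch is not part of the patent text), the weighted sub-liveness computation and top-n selection described above could look as follows; the field names, weights, and normalization constants are assumptions, since the application does not fix concrete values:

```python
from dataclasses import dataclass

@dataclass
class BehaviorData:
    """Real-time behavior data a live audience generates for the target live content."""
    watch_seconds: float   # viewing duration
    gift_value: float      # total value of virtual gifts given
    barrage_count: int     # barrages (bullet comments) sent
    praise_count: int      # praise (like) operations triggered
    comment_count: int     # comments posted

# Assumed weights for the five sub-liveness scores.
WEIGHTS = (0.2, 0.3, 0.2, 0.1, 0.2)

def liveness(b: BehaviorData) -> float:
    """Weighted sum of the five normalized sub-liveness scores."""
    subs = (b.watch_seconds / 3600, b.gift_value / 100,
            b.barrage_count / 10, b.praise_count / 50, b.comment_count / 10)
    return sum(w * s for w, s in zip(WEIGHTS, subs))

def select_target_audience(audiences: dict, n: int = 1) -> list:
    """Return the n audience ids ranked highest by liveness (the preset condition)."""
    ranked = sorted(audiences, key=lambda a: liveness(audiences[a]), reverse=True)
    return ranked[:n]
```

A server realizing this scheme would recompute the scores as new behavior data arrives, so the selected target live audience can change over the course of the broadcast.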
Step 402: responding to a triggering operation of the target live audience for the experience entrance control, switching the comment area into a first game experience area, and displaying, in the target live content, a target control object controlled by the target live audience; the first game experience area includes game controls that support the target live audience in triggering game operations.
Step 403: and responding to a control operation triggered by the target live audience through a game control in the first game experience area, and controlling the target control object in the target live content to execute the action indicated by the control operation.
Since the implementation processes of step 402 and step 403 are closely related, they are described together below.
And responding to the triggering operation of the target live audience aiming at the experience entrance control, switching the comment area in the live interface into a first game experience area, wherein the first game experience area comprises a game control supporting the target live audience to trigger the game operation. And simultaneously, displaying a target control object controlled by the target live audience in the target live content.
Fig. 5 is a schematic diagram of a change process of a live interface according to an embodiment of the present application. In response to a click operation triggered by the target live audience with respect to experience entry control 503 in the live interface shown in fig. 5 (a), the terminal device will display the live interface shown in fig. 5 (b), which includes a first game experience area 504 that covers comment area 502, where the first game experience area 504 includes game controls that support the target live audience to trigger game operations, such as game controls for manipulating control objects in the game, including, but not limited to, movement controls and directional controls that manipulate movement of control objects, attack controls that manipulate control objects to perform attack operations, vehicle use controls that manipulate control objects to use virtual vehicles, control style adjustment controls for adjusting control display styles (e.g., control size, transparency, etc.), and so forth.
In the team mode, the control object controlled by the target live audience is displayed in the target live content played in the live broadcast area; therefore, the first game experience area provided for the target live audience generally only includes game controls for manipulating the control object, that is, the target live audience can conveniently manipulate the target control object in the target live content through the game controls in the first game experience area.
As one example, in the team mode described above, the target live audience and the anchor of the target live content may manipulate the same control object. That is, in the target live content, the control object manipulated by the anchor of the target live content may also serve as the target control object manipulated by the target live audience.
Fig. 6 is a schematic diagram of a team mode according to an embodiment of the present application. As shown in fig. 6, after detecting the triggering operation of the target live audience for the experience entrance control, the terminal device may generate a game participation request and send it to the server for providing the game service content, and the server may grant the target live audience control permission over the control object manipulated by the anchor; that is, the control object manipulated by the anchor is shared with the target live audience and may be simultaneously controlled by the anchor and the target live audience, responding both to control operations triggered by the anchor and to control operations triggered by the target live audience. Specifically, the server for providing the game service content for the game application (i.e., the second server 130 in fig. 1) may receive manipulation instructions of the anchor for the control object (generated according to control operations triggered by the anchor) and manipulation instructions of the target live audience for the control object (generated according to control operations triggered by the target live audience through the game controls in the first game experience area), and sequentially control the control object to execute the actions indicated by the manipulation instructions according to the receiving times of the respective manipulation instructions.
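As an illustrative model (not part of the patent text), the behavior of executing the anchor's and the target live audience's manipulation instructions in order of their receiving times can be sketched as follows; the class and method names are assumptions for illustration:

```python
import heapq

class SharedControlObject:
    """A control object jointly manipulated by the anchor and a target live audience.

    Manipulation instructions from both sources are queued by receive time
    and executed strictly in that order, as the described server does.
    """
    def __init__(self):
        self._queue = []   # entries: (receive_time, seq, source, action)
        self._seq = 0      # tie-breaker for instructions with equal timestamps
        self.executed = []

    def receive(self, receive_time: float, source: str, action: str):
        """Enqueue an instruction from 'anchor' or 'audience'."""
        heapq.heappush(self._queue, (receive_time, self._seq, source, action))
        self._seq += 1

    def run(self):
        """Execute all queued instructions in receive-time order."""
        while self._queue:
            _, _, source, action = heapq.heappop(self._queue)
            self.executed.append((source, action))  # the object performs the action
```

Under this model an instruction from the audience that arrives before one from the anchor is executed first, regardless of which client sent it.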
It should be appreciated that when the team mode is the above mode of jointly controlling the same control object with the anchor, the server for providing the live service typically only needs to select one target live audience, so as to avoid confusion in the manipulation of the control object. In addition, when the triggering operation of the target live audience for the experience entrance control is detected, a notification message may be sent to the anchor to notify the anchor that the target live audience will jointly control the control object with the anchor.
It should be understood that, in practical application, in order to avoid confusion in the control of the control object, only part of the control rights over the control object may be granted to the target live audience; for example, the target live audience may be supported only in controlling the movement of the control object. The present application does not limit the control rights of the target live audience over the target control object.
As another example, in the team mode described above, the target live audience and the anchor of the target live content may also manipulate different control objects. That is, a new control object may be added to the target live content as the target control object manipulated by the target live audience.
Fig. 7 is a schematic diagram of another team mode according to an embodiment of the present application. As shown in fig. 7, after detecting the triggering operation of the target live audience for the experience entrance control, the terminal device may generate a game participation request and send it to the server for providing the game service content; in response to the game participation request, the server may add a new control object, to be manipulated by the target live audience, to the game match in which the anchor is participating, that is, a target control object controlled by the target live audience is newly added to the target live content. After the target live audience triggers a control operation through the game controls in the first game experience area, the server can, in response to the control operation, control the target control object to execute the action indicated by the control operation.
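The flow of handling a game participation request by adding a new control object to the anchor's match (fig. 7) can be sketched as follows; this is a minimal illustration under assumed names, not the application's actual server implementation:

```python
class GameMatch:
    """Minimal model of a game match hosted by the game server."""

    def __init__(self, anchor_id: str):
        # The anchor's control object exists from the start of the match.
        self.control_objects = {anchor_id: {"pos": (0, 0)}}

    def handle_join_request(self, audience_id: str):
        """Add a new control object for the target live audience (fig. 7 flow)."""
        if audience_id not in self.control_objects:
            self.control_objects[audience_id] = {"pos": (0, 0)}

    def handle_control(self, player_id: str, dx: int, dy: int):
        """Execute the action indicated by a player's control operation (here: movement)."""
        x, y = self.control_objects[player_id]["pos"]
        self.control_objects[player_id]["pos"] = (x + dx, y + dy)
```

Whether the new object joins as a teammate or an opponent of the anchor is a game-rule decision layered on top of this join flow.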
It should be appreciated that when the team mode is the above mode in which the control object is different from the one controlled by the anchor, the server for providing the live service may select one or more target live audiences according to the game rules; accordingly, the number of target control objects newly added to the game match depends on the number of target live audiences that trigger experiencing the team mode. In addition, the newly added target control object controlled by the target live audience and the control object controlled by the anchor may be teammates or opponents, which is not limited in this application.
Optionally, in order to further improve the interest of the team mode, in the team mode the target live audience may also be supported in transferring a target resource to the anchor of the target live content. That is, a target game control for triggering use of the target resource may be included in the first game experience area; and in response to a resource transfer operation triggered by the target live audience through the target game control, the target resource is transferred to the anchor of the target live content.
Fig. 8 is a schematic diagram illustrating implementation of a resource transfer function according to an embodiment of the present application. As shown in fig. 8, a target game control 801 capable of triggering use of a target resource is included in the first game experience area. The target live audience may trigger the resource transfer operation for the target resource by pressing and holding the target game control 801 and dragging it; for example, the target live audience may press and hold the target game control 801 and drag it to a position 802 in the live area where the game control for using the target resource is displayed, thereby implementing the resource transfer operation for the target resource. The server for providing the game service may accordingly, in response to the resource transfer operation triggered by the target live audience, transfer the usage right of the target resource to the anchor, that is, allow the anchor to use the target resource in the game while no longer allowing the target live audience to use it.
It should be understood that, in practical applications, the target live audience may acquire the target resource in various manners. For example, the target resource may be configured directly for the target live audience by the server; for another example, the target live audience may obtain the target resource by defeating monsters or hunting for treasure in the game scene; for another example, the target live audience may obtain the target resource by completing a trial-play task. In addition, the target live audience can view the game prop resources they own by opening their virtual backpack, and any game prop resource therein can be used as the target resource and transferred to the anchor through the resource transfer operation. The present application does not set any limit to this.
It should be appreciated that, in practice, the target live audience may also transfer the target resource to the anchor in other ways. For example, a virtual backpack of the control object manipulated by the anchor may be displayed in the target live content played in the live broadcast area, the virtual backpack carrying the game prop resources that the anchor can use; in this case, the target live audience can transfer the target resource to the anchor by pressing and holding the target game control and dragging it to the display position of the virtual backpack. For another example, the target live audience may press and hold the target game control and drag it to a target scene area in the target live content, so that the target resource is displayed at the target scene area; the anchor may then manipulate their control object to go to the target scene area and pick up the target resource. The present application does not set any limit to this.
It should be appreciated that, in practice, the target live audience may also trigger use of the target resource through the target game control themselves, without transferring the target resource to the anchor.
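The server-side effect of the resource transfer operation, revoking the target live audience's usage right and granting it to the anchor, can be sketched as follows (an illustrative registry under assumed names, not the application's actual implementation):

```python
class ResourceRegistry:
    """Tracks which player currently holds the usage right for each game resource."""

    def __init__(self):
        self._owner = {}  # resource name -> id of the player allowed to use it

    def grant(self, resource: str, player: str):
        """Initially configure a resource for a player (e.g., by the server)."""
        self._owner[resource] = player

    def transfer(self, resource: str, from_player: str, to_player: str) -> bool:
        """Move the usage right; only the current holder may transfer it."""
        if self._owner.get(resource) != from_player:
            return False
        self._owner[resource] = to_player
        return True

    def can_use(self, resource: str, player: str) -> bool:
        """After a transfer, only the new holder may use the resource."""
        return self._owner.get(resource) == player
```

This captures the described semantics: once transferred, the anchor can use the target resource in the game while the target live audience no longer can.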
Through the above implementation, the embodiment of the present application can select, from all the live audiences of the target live content, a target live audience with higher liveness, and provide the target live audience with a team mode of teaming up with the anchor to participate in the game. Experiencing the team mode helps the target live audience experience the promoted game application more deeply and more authentically, which is more conducive to attracting the target live audience to download and use the game application, increases the stickiness between the target live audience and the anchor, and improves the enthusiasm of other live audiences in the live broadcast room for participating in live interaction.
In one possible implementation provided by the embodiment of the present application, when the target application is a game application, the live audience may be supported in independently experiencing the game provided by the game application. Fig. 9 is a flowchart of an application experience method provided in an embodiment of the present application in this implementation; as shown in fig. 9, the application experience method includes the following steps:
Step 901: displaying a live broadcast interface, where the live broadcast interface includes a live broadcast area, a comment area and an experience entrance control, the live broadcast area is used for playing target live broadcast content, and the experience entrance control is used for triggering experience of a target application related to the target live broadcast content.
The implementation of step 901 is similar to that of step 201 in the embodiment shown in fig. 2 described above; see the introduction of step 201 for details.
Further, in this implementation, the server for providing the live broadcast service (i.e., the first server 120 in fig. 1) may determine, for each live audience of the target live content, a target mode that meets the personalized requirement of that live audience, so as to control the terminal device to display, in the live interface, an experience entry control corresponding to the target mode.
Fig. 10 is a schematic diagram of a change process of a live interface according to an embodiment of the present application. Fig. 10 (a) shows a live interface, where the live interface includes a live area 1001, a comment area 1002, and an experience entry control 1003 corresponding to a target mode; here, the target mode is a battle mode determined from the candidate modes provided by the game application according to the behavior data of the live audience.
Specifically, the server can acquire the behavior data generated by each live audience for the target live content, analyze and mine the personalized requirement of the live audience according to the behavior data in a targeted manner, and then select, from all the candidate modes provided by the target application, a candidate mode matching the personalized requirement of the live audience as the target mode provided for that live audience. In this way, the differences among the live audiences of the target live content are taken into account, and each live audience is provided with a target mode of the target application that matches their personalized requirement, ensuring that the target mode provided for a live audience accords with that audience's preference, thereby attracting the live audience to try out the service content in the target mode of the target application, and further attracting them to download and use the target application.
As one example, the server may determine the first attention content of a live audience based on real-time behavior data generated for the target live content while the live audience views the target live content, where the real-time behavior data includes at least one of the viewing duration of the live audience for the target live content, and the virtual gift gifting data, barrage sending data, praise data and comment data generated by the live audience for the target live content; and then determine the target mode among the candidate modes of the target application based on the first attention content.
Specifically, for each live audience of the target live content, the server may determine the viewing duration of the live audience for the target live content; when it is determined that the viewing duration of the live audience for the target live content exceeds a preset duration, the server may determine the first attention content of the live audience according to at least one of the virtual gift gifting data, barrage sending data, praise data and comment data generated by the live audience for the target live content. For example, the server may extract keywords from at least one of the barrage content indicated by the barrage sending data and the comment content indicated by the comment data, each sent by the live audience for the target live content, and then use the extracted keywords as a basis for determining the first attention content of the live audience. For another example, the server may determine, according to the value of the virtual gifts presented by the live audience as indicated by the virtual gift gifting data and the times at which the virtual gifts were presented, the times at which the live audience presented virtual gifts of relatively high value, determine the play content at those times in the target live content, and then use that play content as a basis for determining the first attention content of the live audience. For another example, the server may determine, according to the times at which the live audience triggered praise operations as indicated by the praise data, the play content at those times in the target live content, and then use that play content as a basis for determining the first attention content of the live audience. Of course, the above manners of determining the first attention content are merely examples, and the present application does not limit this.
After determining the first attention content of the live audience according to at least one of the virtual gift gifting data, barrage sending data, praise data and comment data generated by the live audience for the target live content, the server determines, from all the candidate modes of the target application, a candidate mode matching the first attention content as the target mode provided for the live audience. The candidate modes of the target application can be divided according to the service content provided by the target application; when the target application is a game application, a plurality of candidate modes can be divided according to the game content of the game application.
As another example, the server may determine the second attention content of a live audience based on historical behavior data generated on the playback platform of the target live content before the live audience views the target live content, where the historical behavior data includes at least one of the video viewing history data, virtual gift gifting history data, barrage sending history data, praise history data and comment history data generated by the live audience on the playback platform; and then determine the target mode among the candidate modes of the target application based on the second attention content.
Specifically, for each live audience of the target live content, the server may acquire the historical behavior data generated by the live audience on the playback platform of the target live content, for example, the video viewing history data, virtual gift gifting history data, barrage sending history data, praise history data, comment history data and the like generated by the live audience through the playback platform, and then determine the second attention content of the live audience according to the acquired historical behavior data. For example, the server may determine, based on the video viewing history data of the live audience, the video content the live audience preferred to watch over the past month, and use that video content as a basis for determining the second attention content of the live audience. For another example, the server may extract keywords from at least one of the barrage content sent by the live audience within the past month as indicated by the barrage sending history data, and the comment content posted by the live audience within the past month as indicated by the comment history data, and then use the extracted keywords as a basis for determining the second attention content of the live audience. For another example, the server may determine, according to the value of the virtual gifts presented by the live audience as indicated by the virtual gift gifting history data and the live content for which the virtual gifts were presented, the live content for which the live audience presented virtual gifts of relatively high value, and then use that live content as a basis for determining the second attention content of the live audience. For another example, the server may take the live content for which the live audience triggered praise operations, as indicated by the praise history data, as a basis for determining the second attention content of the live audience.
Of course, the above manner of determining the second attention content is merely an example, and the present application does not limit this.
After determining the second attention content of the live audience according to at least one of the video viewing history data, virtual gift gifting history data, barrage sending history data, praise history data and comment history data generated by the live audience on the playback platform of the target live content, the server determines, from the candidate modes of the target application, a candidate mode matching the second attention content as the target mode provided for the live audience.
In practical applications, the server may also jointly consider the first attention content determined according to the real-time behavior data of the live audience and the second attention content determined according to the historical behavior data of the live audience, and determine, among the candidate modes of the target application, the target mode provided for the live audience. In this way, both the current preference and the historical preference of the live audience are referenced in determining the target mode provided for them, which can further improve the accuracy of the determined target mode, that is, better ensure that the determined target mode can attract the live audience to experience it.
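One possible way to realize the matching of attention content to candidate modes is simple keyword-overlap scoring; the sketch below is illustrative only, and the mode names, keyword sets, and default mode are assumptions not specified by the application:

```python
def select_target_mode(attention_keywords: set,
                       candidate_modes: dict,
                       default: str = "battle") -> str:
    """Pick the candidate mode whose keyword set overlaps most with the
    first/second attention content; fall back to a default mode on no overlap."""
    best_mode, best_score = default, 0
    for mode, keywords in candidate_modes.items():
        score = len(attention_keywords & keywords)
        if score > best_score:
            best_mode, best_score = mode, score
    return best_mode

# Assumed candidate modes of a game application and their associated keywords.
CANDIDATES = {
    "battle":   {"pvp", "duel", "arena"},
    "coop":     {"team", "raid", "boss"},
    "creative": {"build", "sandbox"},
}
```

Keywords extracted from the real-time data and the historical data can simply be merged into one `attention_keywords` set, realizing the joint consideration of current and historical preferences described above.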
Step 902: responding to a triggering operation of the live audience for the experience entrance control, switching the comment area into a second game experience area, where a game picture of the game application is displayed in the second game experience area, the game picture includes a control object manipulated by the live audience, and the second game experience area includes game controls that support the live audience in triggering game operations.
Step 903: responding to a control operation triggered by the live audience through a game control in the second game experience area, controlling the control object in the game picture to execute the action indicated by the control operation.
Since the implementation processes of step 902 and step 903 are closely related, they are described together below.
In response to the triggering operation of the live audience for the experience entrance control, the comment area in the live interface is switched into a second game experience area; a game picture of the game application is displayed in the second game experience area, the game picture includes a control object manipulated by the live audience, and the second game experience area includes game controls that support the live audience in triggering game operations.
Fig. 10 is a schematic diagram of a change process of a live interface according to an embodiment of the present application. In response to a click operation triggered by the live audience on the experience entry control 1003 in the live interface shown in fig. 10 (a), the terminal device displays the live interface shown in fig. 10 (b), where the live interface includes a second game experience area 1004 that covers the comment area 1002; a game picture corresponding to the battle mode (i.e., the target mode) is displayed in the second game experience area 1004, the game picture includes a control object that can be manipulated by the live audience, and the second game experience area 1004 includes game controls that support the live audience in triggering game operations in the battle mode.
In response to a control operation triggered by the live audience through a game control in the second game experience area 1004, the server for providing the service content of the game application (i.e., the second server 130 in fig. 1) may, according to the control operation, control the control object in the game picture displayed in the second game experience area 1004 to execute the action indicated by the control operation.
In this embodiment, the control object manipulated by the live audience and the control object currently manipulated by the anchor of the target live content may or may not be in the same game match. For example, when the method provided by this embodiment does not determine a corresponding target mode for the live audience, in response to the operation triggered by the live audience through the experience entry control, the game match in which the anchor of the target live content is currently participating may be directly provided for the live audience to experience. For another example, when the target mode determined for the live audience by the method provided by this embodiment is the same as the mode in which the anchor is currently located, in response to the operation triggered by the live audience through the experience entry control, the game match in which the anchor is currently participating may be directly provided for the live audience to experience. For another example, when the target mode determined for the live audience by the method provided by this embodiment differs from the mode in which the anchor is currently located, in response to the operation triggered by the live audience through the experience entry control, a game match in the target mode may be provided for the live audience to experience.
It should be understood that, in the embodiment of the present application, when the control object manipulated by the live audience and the control object manipulated by the anchor of the target live content are in the same game match, the two manipulate different control objects.
Optionally, to further increase the interest, the live audience may also be supported in transferring a target resource into the game match. That is, a target game control for triggering use of the target resource may be included in the second game experience area; and in response to a resource transfer operation triggered by the live audience through the target game control, the target resource is transferred to the game match corresponding to the target live content.
Specifically, the target resource may be transferred to the game match corresponding to the target live content in response to an operation of the live audience dragging the target game control from the second game experience area to the live area. That is, the live audience may transfer the target resource into the game match by pressing and holding the target game control in the second game experience area and dragging it to the live area. Transferring the target resource to the game match can be understood as transferring the target resource to any control object in the game match, where that control object may be the control object manipulated by the anchor of the target live content, or a control object manipulated by another player in the game match.
Fig. 11 is a schematic diagram illustrating the implementation of a resource transfer function according to an embodiment of the present application. As shown in Fig. 11, the second game experience area includes a target game control 1101 that can trigger use of a target resource. The live audience may trigger a resource transfer operation for the target resource by pressing and holding and dragging the target game control 1101; for example, the live audience may press and hold the target game control 1101 and drag it to the position 1102 in the live area, thus transferring the target resource to the anchor. In response to the resource transfer operation triggered by the live audience, the server for providing the game service may accordingly transfer the usage rights of the target resource to the anchor, that is, allow the anchor, rather than the live audience, to use the target resource in the game.
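The drag-to-transfer interaction can be sketched as a hit test on the drop point. The rectangle convention `(left, top, right, bottom)` and the `grant_usage` callback below are illustrative assumptions, not the disclosed implementation:

```python
def on_control_dropped(drop_point, live_area_rect, resource_id, grant_usage):
    """Transfer the resource's usage rights to the anchor when the target
    game control is dropped inside the live area; otherwise do nothing."""
    x, y = drop_point
    left, top, right, bottom = live_area_rect
    if left <= x <= right and top <= y <= bottom:
        grant_usage("anchor", resource_id)  # usage right moves to the anchor
        return True
    return False
```

A drop outside the live area leaves the resource with the viewer, matching the behavior where only a completed drag into the live area triggers the transfer.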
It should be understood that, in practical applications, the live audience may acquire the target resource in various ways. For example, the target resource may be configured directly for the live audience by the server; for another example, the live audience may obtain the target resource by defeating monsters or hunting for treasure in the game scene; for still another example, the live audience may obtain the target resource by completing a trial-play task. In addition, the live audience may view the game prop resources they own by opening their own virtual backpack, use any game prop resource therein as the target resource, and transfer that target resource to the game match through the resource transfer operation. The present application does not set any limit to this.
It should also be appreciated that, in practice, the live audience may transfer the target resource to the game match in a variety of ways. For example, a virtual backpack of the control object controlled by the anchor may be displayed in the target live content played in the live area, and the virtual backpack loads the game prop resources that the anchor can use; in this case, the live audience may transfer the target resource to the anchor by pressing and holding the target game control and dragging it to the display position of the virtual backpack. For another example, the live audience may press and hold the target game control and drag it to a target scene area in the target live content, so that the target resource is displayed at the target scene area; in this case, each control object in the game match may go to the target scene area to pick up the target resource. The present application does not set any limit to this.
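The two drop destinations described above (the anchor's virtual backpack versus a scene area open to all control objects) can be dispatched with a single resolver. All rectangles, identifiers, and return labels here are hypothetical:

```python
def resolve_drop_target(drop_point, backpack_rect, scene_rects):
    """Return ('backpack', None) when the drop lands on the anchor's virtual
    backpack, ('scene', area_id) when it lands in a scene area where any
    control object may pick the resource up, and (None, None) otherwise."""
    def inside(rect):
        left, top, right, bottom = rect
        x, y = drop_point
        return left <= x <= right and top <= y <= bottom

    if inside(backpack_rect):
        return ("backpack", None)
    for area_id, rect in scene_rects.items():
        if inside(rect):
            return ("scene", area_id)
    return (None, None)
```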
It should also be appreciated that, in practice, the live audience may trigger use of the target resource through the target game control without transferring the target resource to the game match.
According to the above implementation, the embodiment of the present application can determine, for each live audience of the target live content, a target mode in the promoted game application that meets that audience's personalized requirements, and provide the service content in the target mode for that audience to experience. Because the target mode provided for the live audience is determined according to behavior data and matches the audience's preferences, it can better attract the live audience to try the service content in the target mode, and in turn attract the live audience to download and use the game application.
In practical applications, the method provided by the embodiment of the present application may also be used to promote other types of applications besides game applications. When promoting another type of application, the terminal device switches the comment area into a function experience area in response to the triggering operation for the experience entry control; the function experience area is used to support the live audience in experiencing the service content of a target function in the target application, an interface of the target function is displayed in the function experience area, and the function experience area includes a function control for triggering experience of the target function.
It should be appreciated that, in this scenario, the server for providing the live broadcast service may likewise determine, according to the behavior data of the live audience, a target mode in the target application that meets the audience's personalized requirements for the audience to experience. The specific manner of determining the target mode is similar to that in the embodiment shown in Fig. 9; for details, refer to the related description above.
Fig. 12 is a schematic diagram of a change process of a live interface according to an embodiment of the present application. Taking the target application being a social application as an example, (a) in Fig. 12 shows a live interface that includes a live area 1201, a comment area 1202, and an experience entry control 1203, where the experience entry control 1203 corresponds to a target function of the social application, and the target function is a chat function determined according to the behavior data of the live audience. In response to the triggering operation of the live audience on the experience entry control 1203, the terminal device displays the live interface shown in (b) in Fig. 12, that is, switches the comment area 1202 into a function experience area 1204. The function experience area 1204 carries a chat interface in the social application, and the chat interface supports the live audience in trying out the chat function provided by the social application; the chat interface includes an information input control and the like for realizing the chat function, so that the live audience can watch the target live content and try out the related function at the same time.
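The comment-area-to-experience-area switch can be modeled as a small interface state machine in which live playback is unaffected by the panel swap. The class and field names below are illustrative only:

```python
class LiveInterface:
    """Minimal state model of the live interface described above."""

    def __init__(self):
        self.live_area = "playing target live content"
        self.panel = ("comments", None)  # comment area shown by default

    def on_experience_entry(self, target_function):
        # Switching replaces only the comment panel; live playback continues.
        self.panel = ("experience", target_function)

    def on_experience_exit(self):
        self.panel = ("comments", None)
```

The invariant worth noting is that `live_area` never changes during the switch, reflecting that the viewer keeps watching the live content while trying the target function.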
To further facilitate understanding of the method provided in the embodiment of the present application, taking the target application being a game application and the target live content being a live video of gameplay in that game application as an example, the background implementation processes of the two implementations provided by the embodiment of the present application (the implementation that provides each live audience with a target mode meeting their personalized requirements, and the implementation that selects a target live audience and provides a team-up mode for them) are respectively described below.
Referring to Fig. 13, Fig. 13 is a background interaction signaling diagram of the second implementation (the implementation that provides each live audience with a target mode meeting their personalized requirements) provided by an embodiment of the present application. As shown in Fig. 13, the interactive system includes a terminal device, a server of the live platform, and a server of the cloud game. The terminal device runs a live application or supports displaying a web interface for playing live video; the terminal devices in Fig. 13 cover both the terminal device facing the live audience and the terminal device facing the anchor. The server of the live platform corresponds to the first server 120 in Fig. 1 and is used to provide live services; the server of the cloud game corresponds to the second server 130 in Fig. 1 and is used to provide game services in the cloud.
As shown in Fig. 13, when a live audience watches, through the terminal device, the target live content that live-streams gameplay in the game application, the audience may generate various behaviors toward the target live content (such as presenting virtual gifts, sending bullet comments, posting comments, liking, etc.), and the terminal device transmits the corresponding behavior data to the server of the live platform. After receiving the behavior data transmitted by the terminal devices, the server of the live platform may analyze the personalized requirements of each live audience of the target live content according to each audience's behavior data, and then determine the target game mode in the game application that matches those personalized requirements. For example, the content of interest of a live audience is determined according to the bullet comment content, the comments posted by the live audience, the content of the target live content playing at the moment a virtual gift was presented, and the like; the target game mode corresponding to that content of interest in the game application is then determined.
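One simple way to map such behavior data to a target game mode is keyword counting over a viewer's bullet comments and posted comments. This is a minimal sketch; the keyword table and the default mode are invented for illustration and are not the method's actual analysis:

```python
from collections import Counter

MODE_KEYWORDS = {            # hypothetical mode -> keyword mapping
    "ranked": ("rank", "ladder", "competitive"),
    "team":   ("team", "co-op", "duo"),
}

def infer_target_mode(messages, default="casual"):
    """Pick the game mode whose keywords appear most often in a viewer's
    messages; fall back to a default mode when nothing matches."""
    counts = Counter()
    for text in messages:
        lowered = text.lower()
        for mode, keywords in MODE_KEYWORDS.items():
            if any(k in lowered for k in keywords):
                counts[mode] += 1
    return counts.most_common(1)[0][0] if counts else default
```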
During the live broadcast, the anchor of the target live content may trigger enabling of the trial permission for the game application through the terminal device being used; accordingly, the terminal device notifies the server of the live platform that the anchor has enabled the audience's trial permission for the game application. The server of the live platform may further notify the server of the cloud game to prepare, for each live audience of the target live content, the target game mode in the game application that matches that audience's personalized requirements. After the server of the cloud game completes the preparation of the target game modes, it may notify the server of the live platform; the server of the live platform may then control the terminal device facing each live audience to display, in the live interface playing the target live content, the experience entry control corresponding to the target game mode provided for that audience.
After the terminal device facing an audience detects that the audience clicks the experience entry control corresponding to the target game mode, it may correspondingly send a notification message to the server of the live platform. The server of the live platform may then notify the server of the cloud game to start the target game mode provided for the audience in the game application and allow the audience to enter the target game mode; after completing this operation, the server of the cloud game may send a corresponding notification message to the server of the live platform, where the notification message carries experience data related to the target game mode. The server of the live platform may then notify the terminal device facing the audience that the audience has entered the target game mode of the game application, and send the experience data related to the target game mode to the terminal device. Accordingly, the terminal device may switch the comment area in the live interface into the experience area, and render and display a game interface corresponding to the target game mode in the experience area according to the experience data; the audience may interact with the server of the cloud game through the game interface, thereby experiencing the game content in the target game mode.
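The click-to-enter signaling just walked through can be sketched on the live-platform-server side. Here `cloud_game.start_mode` and `notify_terminal` stand in for the real inter-server messages; both names and the message shape are assumptions, not the disclosed protocol:

```python
def handle_entry_click(viewer_id, target_mode, cloud_game, notify_terminal):
    """Live-platform-side handling of a viewer clicking the experience
    entry control: ask the cloud game server to start the target mode,
    then push the returned experience data back to the viewer's terminal
    so it can render the game interface in the experience area."""
    experience_data = cloud_game.start_mode(viewer_id, target_mode)
    notify_terminal(viewer_id, {
        "event": "mode_entered",
        "mode": target_mode,
        "experience_data": experience_data,  # used to render the game UI
    })
    return experience_data
```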
Referring to Fig. 14, Fig. 14 is a background interaction signaling diagram of the first implementation (the implementation that selects a target live audience and provides a team-up mode for them) provided in an embodiment of the present application. The execution bodies included in the interactive system shown in Fig. 14 are the same as those in the interactive system shown in Fig. 13, and will not be described again here.
As shown in Fig. 14, when a live audience watches, through the terminal device, the target live content that live-streams gameplay in the game application, the audience may generate various behaviors toward the target live content (such as presenting virtual gifts, sending bullet comments, posting comments, liking, etc.), and the terminal device transmits the corresponding behavior data to the server of the live platform. After receiving the behavior data transmitted by the terminal devices, the server of the live platform may perform behavior analysis according to the behavior data transmitted by each terminal device.
During the live broadcast, the anchor of the target live content may, through the terminal device being used, trigger inviting the most active live audience to team up and participate in the game together; accordingly, the terminal device notifies the server of the live platform that the anchor has enabled the trial permission for the team-up mode. Based on the behavior analysis previously performed according to the behavior data of the live audiences, the server of the live platform may select the most active target live audience among the live audiences watching the target live content. Then, the server of the live platform notifies the server of the cloud game to allow the target live audience to enter the cloud game room where the anchor is located and participate in the game together with the anchor; accordingly, the server of the cloud game may feed back the address of the cloud game room where the anchor is located to the server of the live platform. The server of the live platform may then control the terminal device facing the target live audience to display, in the live interface playing the target live content, the experience entry control corresponding to the team-up mode.
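Selecting the most active viewer reduces to an argmax over activity scores accumulated from the behavior analysis. The weighting below is a made-up example, not the method's actual scoring:

```python
ACTION_WEIGHTS = {"gift": 5, "bullet": 2, "comment": 2, "like": 1}  # assumed weights

def pick_most_active(behavior_log):
    """behavior_log: list of (viewer_id, action) pairs collected during the
    live broadcast. Returns the viewer with the highest weighted score."""
    scores = {}
    for viewer_id, action in behavior_log:
        scores[viewer_id] = scores.get(viewer_id, 0) + ACTION_WEIGHTS.get(action, 0)
    return max(scores, key=scores.get)
```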
After the terminal device facing the target live audience detects that the target live audience clicks the experience entry control corresponding to the team-up mode, it may correspondingly send a notification message to the server of the live platform. The server of the live platform may then notify the server of the cloud game so that the target live audience enters the cloud game room where the anchor is located; after completing this operation, the server of the cloud game may send a corresponding notification message to the server of the live platform, where the notification message carries experience data related to the team-up mode. The server of the live platform may then notify the terminal device facing the target live audience that the target live audience has entered the cloud game room where the anchor is located, and send the experience data related to the team-up mode to the terminal device. Accordingly, the terminal device may switch the comment area in the live interface into the experience area, and render and display, in the experience area according to the experience data, a game operation interface corresponding to the team-up mode; the game operation interface includes various game controls for controlling a control object, and the target live audience may interact with the server of the cloud game through the game controls in the game operation interface, thereby controlling a control object in the target live content being live-streamed.
In addition, after the target live audience enters the cloud game, a dedicated game prop resource may be provided for the target live audience. The target live audience may transfer the usage rights of the game prop resource to the anchor by triggering a resource transfer operation. After detecting that the target live audience has triggered the resource transfer operation for the game prop resource, the server of the live platform may notify the server of the cloud game; the server of the cloud game may then correspondingly transfer the usage rights of the game prop to the anchor, and notify the server of the live platform after completing the operation. The server of the live platform may then notify the terminal device facing the anchor to complete the transfer operation of the game prop resource and display a game picture of receiving the game prop resource.
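The usage-rights transfer on the cloud game server can be sketched as a guarded ownership update. The dict-based registry is an illustrative stand-in for the server's real state, not the disclosed implementation:

```python
def transfer_prop_rights(registry, prop_id, from_user, to_user):
    """Move usage rights of a game prop from one user to another, refusing
    the transfer if the requester does not currently hold the rights."""
    if registry.get(prop_id) != from_user:
        raise PermissionError("requester does not hold this prop's usage rights")
    registry[prop_id] = to_user
    return registry
```

The guard matches the described behavior: once the rights move to the anchor, the viewer can no longer use (or re-transfer) the prop.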
Corresponding to the application experience method described above, the present application further provides an application experience apparatus, so that the application experience method can be applied and implemented in practice.
Referring to fig. 15, fig. 15 is a schematic structural diagram of an application experience apparatus 1500 corresponding to the application experience method shown in fig. 2 above. As shown in fig. 15, the application experience apparatus 1500 includes:
an interface display module 1501, configured to display a live interface, where the live interface includes a live area, a comment area, and an experience entry control, the live area is used to play target live content, and the experience entry control is used to trigger experiencing a target application associated with the target live content;
a region switching module 1502, configured to switch the comment area into an experience area in response to a triggering operation for the experience entry control, where the experience area is used to support the live audience in experiencing the service content of the target application.
Optionally, when the target application is a game application, the area switching module 1502 is specifically configured to:
switch the comment area into a first game experience area in response to a triggering operation of a target live audience on the experience entry control, where the first game experience area includes a game control supporting the target live audience in triggering game operations;
the apparatus further comprises:
an object display module, configured to display, in the target live content, a target control object controlled by the target live audience, in response to the triggering operation of the target live audience on the experience entry control.
Optionally, the object display module is specifically configured to:
indicate, in the target live content, that the control object controlled by the anchor of the target live content is the target control object.
Optionally, the object display module is specifically configured to:
add a new control object into the target live content as the target control object.
Optionally, the apparatus further includes:
an object control module, configured to control, in response to a control operation triggered by the target live audience through the game control in the first game experience area, the target control object in the target live content to execute the action indicated by the control operation.
Optionally, the experience entry control displays team invitation information, where the team invitation information is used to indicate that the anchor of the target live content invites the target live audience to participate in the game together;
the experience entry control does not open experience permission for the game application to live audiences other than the target live audience.
Optionally, when the target application is a game application, the area switching module 1502 is specifically configured to:
switch the comment area into a second game experience area in response to a triggering operation of the live audience on the experience entry control, where a game picture of the game application is displayed in the second game experience area, the game picture includes a control object controlled by the live audience, and the second game experience area includes a game control supporting the live audience in triggering game operations.
Optionally, the apparatus further includes:
an object control module, configured to control, in response to a control operation triggered by the live audience through the game control in the second game experience area, the control object in the game picture to execute the action indicated by the control operation.
Optionally, when the target application is a game application, the experience area includes a target game control for triggering use of a target resource; the apparatus further includes:
a resource transfer module, configured to transfer, in response to a resource transfer operation triggered through the target game control, the target resource to the game match corresponding to the target live content.
Optionally, the resource transfer module is specifically configured to:
transfer the target resource to the game match corresponding to the target live content in response to an operation of dragging the target game control from the experience area to the live area.
Optionally, a virtual backpack of the control object controlled by the anchor of the target live content is displayed in the target live content, and the virtual backpack is used to load game prop resources that the anchor can use; the resource transfer module is specifically configured to:
transfer the target resource to the anchor in response to an operation of dragging the target game control from the experience area to the display position of the virtual backpack.
Optionally, the resource transfer module is specifically configured to:
display the target resource in the target scene area in response to an operation of dragging the target game control from the experience area to a target scene area in the target live content, where the target resource supports being picked up by control objects in the game match.
Optionally, the area switching module 1502 is specifically configured to:
switch the comment area into a function experience area in response to a triggering operation for the experience entry control, where the function experience area is used to support the live audience in experiencing the service content of a target function in the target application, an interface of the target function is displayed in the function experience area, and the function experience area includes a function control for triggering experience of the target function.
Optionally, the experience entry control carries introduction information of the target application.
Optionally, the apparatus further includes:
and the information display module is used for responding to the application information viewing operation triggered by the experience entry control and displaying the introduction information of the target application.
According to the application experience apparatus described above, a live audience watching target live content associated with a target application is supported in triggering experience of the service content of the target application through the experience entry control; through the trial experience, the live audience can conveniently gain a deeper understanding of the target application, improving the promotion effect of the target application. In addition, in response to the triggering operation of the live audience on the experience entry control, the apparatus switches the comment area in the live interface into the experience area and supports the live audience in experiencing the service content of the target application in the experience area, so that the live audience can watch the live broadcast and experience the target application at the same time; this does not affect the live viewing experience and improves the convenience of experiencing the target application.
The embodiment of the present application also provides an electronic device for application experience, which may specifically be a terminal device. The terminal device provided by the embodiment of the present application is described below from the perspective of hardware implementation.
Referring to Fig. 16, Fig. 16 is a schematic structural diagram of a terminal device provided in an embodiment of the present application. As shown in Fig. 16, for convenience of explanation, only the portions related to the embodiments of the present application are shown; for specific technical details not disclosed, please refer to the method portions of the embodiments of the present application. The terminal may be any terminal device including a mobile phone, a tablet computer, a personal digital assistant (Personal Digital Assistant, PDA), a point of sale (Point of Sales, POS) terminal, a vehicle-mounted computer, and the like. The following takes the terminal being a smartphone as an example:
Fig. 16 is a block diagram illustrating part of the structure of a smartphone related to the terminal provided in an embodiment of the present application. Referring to Fig. 16, the smartphone includes: a radio frequency (Radio Frequency, RF) circuit 1610, a memory 1620, an input unit 1630 (including a touch panel 1631 and other input devices 1632), a display unit 1640 (including a display panel 1641), a sensor 1650, an audio circuit 1660 (which may connect a speaker 1661 and a microphone 1662), a wireless fidelity (wireless fidelity, WiFi) module 1670, a processor 1680, and a power supply 1690. Those skilled in the art will appreciate that the smartphone structure shown in Fig. 16 does not constitute a limitation on the smartphone, which may include more or fewer components than shown, combine certain components, or use a different arrangement of components.
The memory 1620 may be used to store software programs and modules, and the processor 1680 performs various functional applications and data processing of the smartphone by running the software programs and modules stored in the memory 1620. The memory 1620 may mainly include a program storage area and a data storage area, where the program storage area may store an operating system, an application program required for at least one function (such as a sound playing function, an image playing function, etc.), and the like; the data storage area may store data (such as audio data, phonebooks, etc.) created according to the use of the smartphone. In addition, the memory 1620 may include high-speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, or another non-volatile solid-state storage device.
The processor 1680 is the control center of the smartphone; it connects various parts of the entire smartphone using various interfaces and lines, and performs various functions of the smartphone and processes data by running or executing the software programs and/or modules stored in the memory 1620 and invoking the data stored in the memory 1620. Optionally, the processor 1680 may include one or more processing units; preferably, the processor 1680 may integrate an application processor and a modem processor, where the application processor mainly handles the operating system, user interfaces, applications, etc., and the modem processor mainly handles wireless communication. It will be appreciated that the modem processor may alternatively not be integrated into the processor 1680.
In the embodiment of the present application, the processor 1680 included in the terminal is further configured to perform the steps of any implementation manner of the application experience method provided in the embodiment of the present application.
The embodiment of the application also provides a server, referring to fig. 17, and fig. 17 is a schematic structural diagram of a server 1700 provided in the embodiment of the application. The server 1700 may vary considerably in configuration or performance and may include one or more central processing units (central processing units, CPU) 1722 (e.g., one or more processors) and memory 1732, one or more storage media 1730 (e.g., one or more mass storage devices) that store applications 1742 or data 1744. Wherein the memory 1732 and storage medium 1730 may be transitory or persistent storage. The program stored on the storage medium 1730 may include one or more modules (not shown), each of which may include a series of instruction operations on a server. Further, the central processor 1722 may be arranged to communicate with a storage medium 1730 to execute a series of instruction operations in the storage medium 1730 on the server 1700.
The server 1700 may also include one or more power supplies 1726, one or more wired or wireless network interfaces 1750, one or more input/output interfaces 1758, and/or one or more operating systems, such as Windows Server™, Mac OS X™, Unix™, Linux™, FreeBSD™, etc.
The steps performed by the server in the above embodiments may be based on the server structure shown in fig. 17.
The CPU 1722 may also be configured to perform a method performed by the first server or the second server provided in the embodiments of the present application.
The embodiments of the present application further provide a computer readable storage medium storing a computer program for executing any one of the application experience methods described in the foregoing embodiments.
Embodiments of the present application also provide a computer program product or computer program comprising computer instructions stored in a computer-readable storage medium. The processor of the computer device reads the computer instructions from the computer-readable storage medium, and the processor executes the computer instructions to cause the computer device to perform any one of the application experience methods described in the foregoing respective embodiments.
It will be clear to those skilled in the art that, for convenience and brevity of description, specific working procedures of the above-described systems, apparatuses and units may refer to corresponding procedures in the foregoing method embodiments, which are not repeated herein.
In the several embodiments provided in this application, it should be understood that the disclosed systems, apparatuses, and methods may be implemented in other ways. For example, the apparatus embodiments described above are merely illustrative, e.g., the division of the units is merely a logical function division, and there may be additional divisions when actually implemented, e.g., multiple units or components may be combined or integrated into another system, or some features may be omitted or not performed. Alternatively, the coupling or direct coupling or communication connection shown or discussed with each other may be an indirect coupling or communication connection via some interfaces, devices or units, which may be in electrical, mechanical or other form.
The units described as separate units may or may not be physically separate, and units shown as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, each functional unit in each embodiment of the present application may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit. The integrated units may be implemented in hardware or in software functional units.
The integrated units, if implemented in the form of software functional units and sold or used as stand-alone products, may be stored in a computer-readable storage medium. Based on such an understanding, the technical solution of the present application, in essence, or the part contributing to the prior art, or all or part of the technical solution, may be embodied in the form of a software product stored in a storage medium, including several instructions to cause a computer device (which may be a personal computer, a server, a network device, etc.) to perform all or part of the steps of the methods described in the embodiments of the present application. The aforementioned storage medium includes: a USB flash drive, a removable hard disk, a read-only memory (Read-Only Memory, ROM), a random access memory (Random Access Memory, RAM), a magnetic disk, an optical disk, or various other media in which a computer program can be stored.
It should be understood that in this application, "at least one" means one or more, and "a plurality" means two or more. "And/or" describes the association relationship between associated objects and indicates that three relationships may exist; for example, "A and/or B" may mean: only A exists, only B exists, or both A and B exist, where A and B may be singular or plural. The character "/" generally indicates that the associated objects before and after it are in an "or" relationship. "At least one of the following items" or a similar expression refers to any combination of these items, including any combination of single items or plural items. For example, at least one of a, b, or c may mean: a, b, c, "a and b", "a and c", "b and c", or "a and b and c", where a, b, and c may be singular or plural.
The above embodiments are merely intended to illustrate the technical solution of the present application, not to limit it. Although the present application has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art should understand that the technical solutions described in the foregoing embodiments may still be modified, or some of their technical features may be replaced by equivalents, and such modifications and substitutions do not depart from the spirit and scope of the corresponding technical solutions.

Claims (20)

1. An application experience method, the method comprising:
displaying a live interface, wherein the live interface comprises a live area, a comment area, and an experience entry control, the live area is used for playing target live content, and the experience entry control is used for triggering experience of a target application associated with the target live content; and
in response to a trigger operation on the experience entry control, switching the comment area to an experience area, wherein the experience area is used for supporting a live audience in experiencing service content of the target application.
2. The method of claim 1, wherein when the target application is a game application, the switching the comment area to an experience area in response to a trigger operation on the experience entry control comprises:
in response to a trigger operation by a target live audience on the experience entry control, switching the comment area to a first game experience area, wherein the first game experience area comprises a game control supporting the target live audience in triggering a game operation; and
the method further comprises:
in response to the trigger operation by the target live audience on the experience entry control, displaying, in the target live content, a target control object manipulated by the target live audience.
3. The method of claim 2, wherein the displaying, in the target live content, a target control object manipulated by the target live audience comprises:
designating, in the target live content, a control object manipulated by an anchor of the target live content as the target control object.
4. The method of claim 2, wherein the displaying, in the target live content, a target control object manipulated by the target live audience comprises:
adding a new control object to the target live content as the target control object.
5. The method of any one of claims 2 to 4, wherein the method further comprises:
in response to a control operation triggered by the target live audience through a game control in the first game experience area, controlling the target control object in the target live content to perform an action indicated by the control operation.
6. The method of claim 2, wherein the experience entry control displays team invitation information, and the team invitation information is used for indicating that an anchor of the target live content invites the target live audience to participate in a game together; and
the experience entry control does not open experience permission for the game application to live viewers other than the target live audience.
7. The method of claim 1, wherein when the target application is a game application, the switching the comment area to an experience area in response to a trigger operation on the experience entry control comprises:
in response to a trigger operation by the live audience on the experience entry control, switching the comment area to a second game experience area, wherein a game screen of the game application is displayed in the second game experience area, the game screen comprises a control object manipulated by the live audience, and the second game experience area comprises a game control supporting the live audience in triggering a game operation.
8. The method of claim 7, wherein the method further comprises:
in response to a control operation triggered by the live audience through a game control in the second game experience area, controlling the control object in the game screen to perform an action indicated by the control operation.
9. The method of claim 1, wherein when the target application is a game application, the experience area comprises a target game control for triggering use of a target resource; and the method further comprises:
transferring the target resource to a game match corresponding to the target live content in response to a resource transfer operation triggered through the target game control.
10. The method of claim 9, wherein the transferring the target resource to a game match corresponding to the target live content in response to a resource transfer operation triggered through the target game control comprises:
transferring the target resource to the game match corresponding to the target live content in response to an operation of dragging the target game control from the experience area to the live area.
11. The method of claim 10, wherein a virtual backpack of a control object manipulated by an anchor of the target live content is displayed in the target live content, and the virtual backpack is used for holding game prop resources usable by the anchor; and
the transferring the target resource to the game match corresponding to the target live content in response to an operation of dragging the target game control from the experience area to the live area comprises:
transferring the target resource to the anchor in response to an operation of dragging the target game control from the experience area to a display position of the virtual backpack.
12. The method of claim 10, wherein the transferring the target resource to the game match corresponding to the target live content in response to an operation of dragging the target game control from the experience area to the live area comprises:
displaying the target resource in a target scene area in the target live content in response to an operation of dragging the target game control from the experience area to the target scene area, wherein the target resource supports being picked up by a control object in the game match.
13. The method of claim 1, wherein the switching the comment area to an experience area in response to a trigger operation on the experience entry control comprises:
in response to the trigger operation on the experience entry control, switching the comment area to a function experience area, wherein the function experience area is used for supporting the live audience in experiencing service content of a target function in the target application, an interface of the target function is displayed in the function experience area, and the function experience area comprises a function control for triggering experience of the target function.
14. The method of claim 1, wherein the experience entry control carries introduction information of the target application.
15. The method of claim 14, wherein the method further comprises:
displaying the introduction information of the target application in response to an application information viewing operation triggered through the experience entry control.
16. An application experience system, wherein the system comprises a terminal device, a first server for providing a live service, and a second server for providing service content of a target application;
the terminal device is configured to perform the application experience method of any one of claims 1 to 15;
the first server is configured to provide target live content to the terminal device; and
the second server is configured to, when the terminal device responds to a trigger operation on the experience entry control in the live interface, provide the service content of the target application to the live audience through the experience area displayed by the terminal device.
17. An application experience apparatus, the apparatus comprising:
an interface display module, configured to display a live interface, wherein the live interface comprises a live area, a comment area, and an experience entry control, the live area is used for playing target live content, and the experience entry control is used for triggering experience of a target application associated with the target live content; and
an area switching module, configured to switch the comment area to an experience area in response to a trigger operation on the experience entry control, wherein the experience area is used for supporting a live audience in experiencing service content of the target application.
18. An electronic device, wherein the device comprises a processor and a memory;
the memory is configured to store a computer program; and
the processor is configured to perform, according to the computer program, the application experience method of any one of claims 1 to 15.
19. A computer-readable storage medium, wherein the computer-readable storage medium is configured to store a computer program, and the computer program is used for performing the application experience method of any one of claims 1 to 15.
20. A computer program product, comprising a computer program or instructions which, when executed by a processor, implement the application experience method of any one of claims 1 to 15.
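The interface flow recited in claim 1 can be pictured, purely as an informal sketch outside the claims, with a toy state model of the live interface: the lower panel starts as a comment area and is switched in place to an experience area on a trigger operation. All identifiers here (`LiveInterface`, `trigger_experience_entry`, the panel state strings) are hypothetical illustration names, not part of the disclosure:

```python
class LiveInterface:
    """Toy model of the live interface of claim 1: a live area playing
    target live content, plus a lower panel that starts as a comment
    area and is switched to an experience area on a trigger operation."""

    def __init__(self, live_content, target_app):
        self.live_content = live_content   # played in the live area
        self.target_app = target_app       # application associated with the content
        self.lower_panel = "comment_area"  # initial state per claim 1

    def trigger_experience_entry(self):
        """Trigger operation on the experience entry control: the comment
        area is switched to an experience area, so the viewer experiences
        the target application without leaving the live interface."""
        self.lower_panel = "experience_area"
        return self.target_app  # service content now available to the viewer

ui = LiveInterface("game stream", "target game app")
assert ui.lower_panel == "comment_area"
ui.trigger_experience_entry()
assert ui.lower_panel == "experience_area"
```

The point of the in-place switch is that the viewer never navigates away from the live area, which keeps the target live content playing while the service content is experienced.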
CN202211137253.4A 2022-09-19 2022-09-19 Application experience method and related device Pending CN117762539A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211137253.4A CN117762539A (en) 2022-09-19 2022-09-19 Application experience method and related device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202211137253.4A CN117762539A (en) 2022-09-19 2022-09-19 Application experience method and related device

Publications (1)

Publication Number Publication Date
CN117762539A true CN117762539A (en) 2024-03-26

Family

ID=90322510

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211137253.4A Pending CN117762539A (en) 2022-09-19 2022-09-19 Application experience method and related device

Country Status (1)

Country Link
CN (1) CN117762539A (en)

Similar Documents

Publication Publication Date Title
CN110678239B (en) Distributed sample-based game profiling with game metadata and metrics and game API platform supporting third party content
JP6982684B2 (en) How and system to schedule gameplay for video games
CN109756787B (en) Virtual gift generation method and device and virtual gift presentation system
JP6231006B2 (en) System and method for interactive experience, and controller for the same
US9526989B2 (en) Method and apparatus for receiving game streaming data, and method and server for transmitting game streaming data
CN106846040A (en) Virtual present display methods and system in a kind of direct broadcasting room
US20130303288A1 (en) Method and apparatus for providing content to a user device
US9186587B2 (en) Distribution of electronic game elements
US20220258048A1 (en) Method and apparatus for executing interaction event
KR20130126557A (en) Network system and method of operation thereof
KR102590492B1 (en) Method, system, and computer program for providing ruputation badge for video chat
US8799788B2 (en) Providing a single instance of a virtual space represented in either two dimensions or three dimensions via separate client computing devices
US20130321451A1 (en) Information processing system, computer readable medium, information processing device, and display method
CN113269585A (en) Method for acquiring virtual currency on live broadcast platform and terminal equipment
TW202227173A (en) Method for displaying voting result, device, apparatus, storage medium and program product
CN114286161B (en) Method, device, equipment and storage medium for recommending articles during live event
CN110602543A (en) Method and apparatus for displaying material, storage medium, and electronic apparatus
CN111950670A (en) Virtual interaction task execution method and device, storage medium and electronic device
CN113382277B (en) Network live broadcast method, device and system
Punt et al. An integrated environment and development framework for social gaming using mobile devices, digital TV and Internet
JP6660549B2 (en) Program and recording medium
CN114210071A (en) Game live broadcast display method and device, storage medium and electronic equipment
WO2021000737A1 (en) Neighbor group interaction method and system, client and server
CN117762539A (en) Application experience method and related device
US20220164825A1 (en) Information processing apparatus and system and non-transitory computer readable medium for outputting information to user terminals

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination