CN111182355A - Interaction method, special effect display method and related device

Interaction method, special effect display method and related device

Info

Publication number
CN111182355A
CN111182355A (application number CN202010010766.3A)
Authority
CN
China
Prior art keywords
target
game
special effect
live broadcast
broadcast room
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202010010766.3A
Other languages
Chinese (zh)
Other versions
CN111182355B (en)
Inventor
王宏明
何阳
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Tencent Technology Shenzhen Co Ltd
Original Assignee
Tencent Technology Shenzhen Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tencent Technology Shenzhen Co Ltd
Priority to CN202010010766.3A
Publication of CN111182355A
Application granted
Publication of CN111182355B
Legal status: Active (current)
Anticipated expiration

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43 Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/442 Monitoring of processes or resources, e.g. detecting the failure of a recording device, monitoring the downstream bandwidth, the number of times a movie has been viewed, the storage space available from the internal hard disk
    • H04N21/44213 Monitoring of end-user related data
    • H04N21/44218 Detecting physical presence or behaviour of the user, e.g. using sensors to detect if the user is leaving the room or changes his face expression during a TV program
    • H04N21/47 End-user applications
    • H04N21/478 Supplemental services, e.g. displaying phone caller identification, shopping application
    • H04N21/4781 Games
    • H04N21/4788 Supplemental services, e.g. displaying phone caller identification, shopping application communicating with other users, e.g. chatting
    • H04N21/488 Data services, e.g. news ticker
    • H04N21/4884 Data services, e.g. news ticker for displaying subtitles

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Social Psychology (AREA)
  • General Engineering & Computer Science (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Databases & Information Systems (AREA)
  • Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)

Abstract

The embodiments of the present application disclose an interaction method, a special effect display method and a related apparatus. The interaction method includes: acquiring user behavior data related to a target game in a target live broadcast room, where the target live broadcast room is used for live broadcasting the target game and the user behavior data is determined according to the viewing behavior information of user accounts that have entered the target live broadcast room; if the user behavior data satisfies a behavior condition corresponding to a target special effect, determining object information corresponding to the target special effect in a target game play of the target game, the target special effect being used to change the game content of the target game play; and generating a special effect request according to the target special effect and the object information, the special effect request being used to indicate that the target special effect is to be triggered in the target game play according to the object information. By triggering the corresponding special effect in the anchor's game content based on user behavior data, the method realizes special-effect interaction that can change the game content during a game live broadcast, enhances the user's sense of immersion, and satisfies the user's desire to participate.

Description

Interaction method, special effect display method and related device
Technical Field
The present application relates to the field of data processing, and in particular, to an interaction method, a special effect display method, and a related apparatus.
Background
Live video streaming over the network is currently a popular form of broadcasting; a user can watch an anchor's live broadcast by entering a live broadcast room on a live broadcast platform. There are many types of live video broadcasts, and one common form is the game live broadcast, which generally displays the real-time game screen of a game played by the anchor. Users watching a game live broadcast can interact with the anchor and with other viewers according to how exciting the real-time game screen is, how the game content is developing, and so on.
However, in the related art, the interaction that can be realized in a game live broadcast is mainly limited to social-level interaction such as sending virtual gifts and submitting bullet-screen comments, which makes it difficult to satisfy users' desire to participate while watching the game live broadcast.
Disclosure of Invention
To solve the above technical problems, the present application provides an interaction method, a special effect display method and a related apparatus, which can change the anchor's live game content through special effects based on user behavior data. A user can therefore interact with the anchor within the live content through the user's own behavior, which enhances the competitiveness and unpredictability of the game live broadcast, improves the user's sense of immersion, satisfies the user's desire to participate, and realizes special-effect interaction that can change the game content during a game live broadcast.
The embodiment of the application discloses the following technical scheme:
in a first aspect, an embodiment of the present application provides an interaction method, where the method includes:
acquiring user behavior data related to a target game in a target live broadcast room, where the target live broadcast room is used for live broadcasting the target game and the user behavior data is determined according to the viewing behavior information of user accounts entering the target live broadcast room;
if the user behavior data satisfies a behavior condition corresponding to a target special effect, determining object information corresponding to the target special effect in a target game play of the target game, the target special effect being used to change the game content of the target game play; and
generating a special effect request according to the target special effect and the object information, the special effect request being used to indicate that the target special effect is triggered in the target game play according to the object information.
Optionally, the user behavior data includes one or more of the following:
the number of bullet-screen comments submitted by users who have entered the target live broadcast room;
the number of virtual feature values transferred by users who have entered the target live broadcast room; and
the dwell time of users who have entered the target live broadcast room.
In a second aspect, an embodiment of the present application provides an interactive device, where the device includes a first obtaining unit, a first determining unit, and a generating unit:
the first acquisition unit is used for acquiring user behavior data related to a target game in a target live broadcast room, where the target live broadcast room is used for live broadcasting the target game and the user behavior data is determined according to the viewing behavior information of user accounts entering the target live broadcast room;
the first determining unit is used for determining object information corresponding to a target special effect in a target game play of the target game if the user behavior data meets a behavior condition corresponding to the target special effect; the target special effect is used for changing the game content of the target game play;
the generating unit is used for generating a special effect request according to the target special effect and the object information, and the special effect request is used for indicating that the target special effect is triggered in the target game play according to the object information.
In a third aspect, an embodiment of the present application provides a special effect display method, applied during a target game play of a target game that is live broadcast through a target live broadcast room, the method including:
acquiring user behavior data related to the target game in the target live broadcast room, where the target live broadcast room is used for live broadcasting the target game and the user behavior data is determined according to the viewing behavior information of user accounts entering the target live broadcast room; and
if the user behavior data satisfies a behavior condition corresponding to a target special effect, displaying the target special effect in the target game play through the target live broadcast room, where the target special effect is used for changing the game content of the target game play.
In a fourth aspect, an embodiment of the present application provides a special effect display apparatus, applied during a target game play of a target game that is live broadcast through a target live broadcast room, the apparatus including an acquisition unit and a display unit:
the acquisition unit is used for acquiring user behavior data related to the target game in the target live broadcast room, where the target live broadcast room is used for live broadcasting the target game and the user behavior data is determined according to the viewing behavior information of user accounts entering the target live broadcast room;
the display unit is used for displaying the target special effect in the target game play through the target live broadcast room if the user behavior data meets the behavior condition corresponding to the target special effect, and the target special effect is used for changing the game content of the target game play.
In a fifth aspect, an embodiment of the present application provides an apparatus, including a processor and a memory:
the memory is used for storing program codes and transmitting the program codes to the processor;
the processor is configured to perform the method of the first aspect or the third aspect according to instructions in the program code.
In a sixth aspect, the present application provides a computer-readable storage medium for storing a computer program for executing the method of the first aspect or the third aspect.
According to the above technical solution, during the live broadcast of the target game in the target live broadcast room, the user behavior data required for content interaction is determined according to the viewing behavior information generated by the user accounts entering the target live broadcast room. When the user behavior data satisfies the behavior condition corresponding to the target special effect, the object information corresponding to the target special effect in the target game play of the target game can be determined, and a special effect request is generated according to the target special effect and the object information to indicate that the target special effect is triggered in the target game play according to the object information. In this way, special-effect interaction that can change the game content during a game live broadcast is realized, which enhances the user's sense of immersion and satisfies the user's desire to participate.
Drawings
To illustrate the embodiments of the present application or the technical solutions in the prior art more clearly, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings in the following description show only some embodiments of the present application, and those skilled in the art can derive other drawings from these drawings without inventive effort.
Fig. 1 is a schematic diagram of an interaction method in an actual application scenario according to an embodiment of the present application;
fig. 2 is a flowchart of an interaction method according to an embodiment of the present disclosure;
fig. 3 is a schematic diagram of an interaction method according to an embodiment of the present disclosure;
fig. 4 is a flowchart of an interaction method according to an embodiment of the present disclosure;
fig. 5 is a flowchart of a special effect display method according to an embodiment of the present application;
fig. 6 is a flowchart of an interaction method in an actual application scenario according to an embodiment of the present application;
fig. 7a is a schematic diagram of a special effect display method in an actual application scenario according to an embodiment of the present application;
fig. 7b is a schematic diagram of a special effect display method in an actual application scenario according to an embodiment of the present application;
fig. 7c is a schematic diagram of a special effect display method in an actual application scenario according to an embodiment of the present application;
fig. 8 is a flowchart of a special effect display method in an actual application scenario according to an embodiment of the present application;
fig. 9a is a block diagram of an interaction apparatus according to an embodiment of the present disclosure;
fig. 9b is a block diagram of an interaction apparatus according to an embodiment of the present disclosure;
fig. 9c is a block diagram of an interaction apparatus according to an embodiment of the present disclosure;
fig. 9d is a block diagram of an interaction apparatus according to an embodiment of the present disclosure;
fig. 9e is a block diagram of an interaction apparatus according to an embodiment of the present disclosure;
fig. 9f is a block diagram of an interaction apparatus according to an embodiment of the present disclosure;
fig. 9g is a block diagram of an interaction apparatus according to an embodiment of the present disclosure;
FIG. 10 is a block diagram of a special effect display apparatus according to an embodiment of the present disclosure;
FIG. 11 is a block diagram of an apparatus provided in an embodiment of the present application;
fig. 12 is a block diagram of a server according to an embodiment of the present application.
Detailed Description
Embodiments of the present application are described below with reference to the accompanying drawings.
Game live broadcasting is one of the popular live broadcast types. By entering an anchor's live broadcast room, viewers can watch the anchor's game operation in real time, or watch other people's game operation through content relayed by the anchor. In the related art, the interaction between viewers and the anchor is limited to communicating with the anchor through bullet-screen comments or giving virtual gifts to the anchor, so the user can only communicate with the anchor on a social level. Some users who watch game live broadcasts often want to actually participate in the anchor's live game content and interact with the anchor more concretely and vividly at the game level. If interaction is possible only on the social level, the user's sense of participation when watching the game live broadcast is low, the game enjoyment experienced is insufficient, and the user's participation and interaction needs cannot be met.
To solve the above technical problems, the present application provides an interaction method, a special effect display method and a related apparatus. The method can be applied to a live broadcast platform whose live content is game live broadcasting, and can change the game content live broadcast by the anchor through special effects based on user behavior data, so that a user can interact with the anchor within the live content through the user's own behavior, which enhances the competitiveness and unpredictability of the game live broadcast, improves the user's sense of immersion, and satisfies the user's desire to participate.
It is understood that the method may be applied to a processing device having the function of acquiring live broadcast information, which may be, for example, a terminal device or a server carrying the live broadcast platform. The method may be executed independently by the terminal device or the server, or applied in a network scenario in which the terminal device and the server communicate and executed by the two in cooperation. The terminal device may be a computer, a Personal Digital Assistant (PDA), a tablet computer, or the like. The server may be an application server or a Web server; in actual deployment, it may be an independent server or a server cluster. In terms of hardware environment, the technology has been implemented on ARM-architecture and X86-architecture processors; in terms of software environment, it has been implemented on the Android platform, the Windows XP operating system, or the Linux operating system.
In order to facilitate understanding of the technical solution of the present application, the interaction method provided in the embodiment of the present application will be described below with reference to an actual application scenario.
Referring to fig. 1, fig. 1 is a schematic view of an application scenario of the interaction method provided in an embodiment of the present application, where the processing device is a server 101. The server 101 may obtain user behavior data related to the target game from a target live broadcast room, where the target live broadcast room is used for live broadcasting the target game and the user behavior data is determined according to the viewing behavior information of user accounts entering the target live broadcast room. The viewing behavior information refers to information about the operations a user performs through a user account while watching the live broadcast. In this application scenario, the user behavior data represents the virtual feature values transferred by user accounts that have entered the target live broadcast room while the target game is being live broadcast there, for example the flower icon shown in fig. 1 (a virtual gift, which is one form of virtual feature value). From this user behavior data, the server 101 can determine the total number of virtual feature values transferred by the user accounts that have entered the target live broadcast room.
It can be understood that, for different target live broadcast rooms and for the different target games live broadcast in them, the server 101 may preset behavior conditions for triggering target special effects. A behavior condition is the user-behavior-data standard that can trigger the target special effect, and it embodies the correspondence between the user behavior data and the target special effect. For example, the server 101 may preset different special effects that can be triggered by different transfer quantities of virtual feature values, and notify users or the anchor of this correspondence in the form of a list, a live broadcast task, or the like, so that users can produce the user behavior corresponding to the target special effect they want to trigger, according to their desire to participate. For example, in this application scenario, the behavior condition corresponding to the target special effect "increase the attack power of the own-side game character by 30%" is that the number of virtual feature values transferred in the target live broadcast room reaches 150. When the anchor's side is at a disadvantage, users watching the live broadcast may, in order to rebalance the game situation, transfer a certain number of virtual feature values through their own user accounts, helping the number of virtual feature values transferred in the target live broadcast room reach the target value.
After acquiring the user behavior data, the server 101 may determine whether it satisfies the behavior condition corresponding to the target special effect. The target special effect is a special effect that can be triggered by user behavior performed through user accounts, and it can be used to change the game content of the target game within a game play; in this application scenario, the target special effect is "increase the attack power of the own-side character by 30%". The target game play is a play of a confrontation-type game and contains two game characters, 102 and 103, where 102 is the own-side game character and 103 is the enemy game character. After obtaining the number of virtual feature values transferred, the server 101 may compare it with the preset behavior condition and determine whether the condition "the number of virtual feature values transferred in the target live broadcast room reaches 150" is satisfied. As can be seen from fig. 1, the number of virtual feature values transferred is 212, which is greater than 150, so the behavior condition corresponding to the special effect "increase the attack power of the own-side character by 30%" is satisfied.
When the server 101 determines that the user behavior data satisfies the behavior condition corresponding to the target special effect, it determines that the content of the target game play of the target game needs to be changed according to the target special effect. At this point, the server 101 may determine the object information corresponding to the target special effect in the target game play. The object information is the basis for changing the game content according to the target special effect; it determines on which object the target special effect is triggered in the target game play. For example, in this application scenario, since the target special effect is "increase the attack power of the own-side character by 30%", the determined object information may be the own-side game character 102. It is understood that the own-side game character here refers to the game character controlled by the anchor of the target live broadcast room.
After determining the object information, the server 101 may generate a special effect request based on the target special effect and the object information. The special effect request is used to indicate that the target special effect is triggered in the target game play according to the object information, so the generated special effect request triggers the target special effect in the target game play. For example, in the application scenario shown in fig. 1, through the generated special effect request, the target special effect is triggered on the game character 102 in the game play, increasing its attack power by 30%.
According to the above technical solution, the server 101 can acquire user behavior data that, to a certain extent, represents the users' viewing experience and desire to participate, and judge whether a behavior condition is met according to that data. When a behavior condition is met, the server 101 can determine the object information corresponding to the target special effect in the target game play, based on the target special effect corresponding to that behavior condition and the target game play corresponding to the target live broadcast room, and generate a special effect request from the object information and the target special effect. The users' viewing experience and desire to participate are thereby ultimately reflected in the target game play, allowing users to interact with the anchor at the level of the live content and improving the users' sense of immersion. Moreover, when the target game is a confrontation-type game, users can influence the course of the game confrontation to a certain extent through their own behavior, which enhances the competitiveness and unpredictability of the game live broadcast.
Next, an interaction method provided by the embodiments of the present application will be described with reference to the drawings.
Referring to fig. 2, fig. 2 shows a flow chart of an interaction method, the method comprising:
s201: and acquiring user behavior data related to the target game in the target live broadcast room.
The target live broadcast room is a live broadcast room that live broadcasts the target game. It is to be understood that the target live broadcast room can also be used for live broadcasting other content and is not limited to live broadcasting only the target game. The interaction method provided by the embodiments of the present application can be applied to live broadcast rooms that live broadcast the target game to realize interaction at the level of the live content; the target live broadcast room is any one of the live broadcast rooms that live broadcast the target game.
After entering the target live broadcast room, the user can see live content related to the target game, which may be a game play operated by the anchor of the target live broadcast room or a game play relayed by the anchor. The target game may be any of various kinds of games, such as a confrontation-type game or a cooperation-type game.
To enable the user to interact with the anchor within the target game, the processing device needs to acquire a parameter that can reflect the user's desire to participate, such as user behavior data. The processing device determines the user behavior data according to the viewing behavior information of the user accounts entering the target live broadcast room; since viewing behavior is behavior the user performs spontaneously according to the user's desire to participate while watching the live content in the target live broadcast room, user behavior data determined from it meets the requirement of embodying the user's desire to participate. The processing device may directly use all of the user behavior data as the criterion for deciding whether to trigger the target special effect, for example by converting all of the user behavior data into a heat value of the target live broadcast according to preset ratios and determining whether to trigger the target special effect according to that heat value.
It can be understood that, for different live broadcast rooms, different live games, different user participation desires, and so on, the user behavior data may be different types of data. For example, the user behavior data may be any one or a combination of: the number of bullet-screen comments submitted by users entering the target live broadcast room, the number of virtual feature values transferred by users entering the target live broadcast room, and the dwell time of users entering the target live broadcast room. The virtual feature values may be virtual gifts, virtual tokens, or the like that users can transfer to the target live broadcast room.
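For illustration only, the following is a minimal sketch of how such user behavior data might be aggregated per live broadcast room and converted into a single heat value with preset weights, as described above. The structure, field names, and weight values are assumptions, not part of the patent.

```python
from dataclasses import dataclass

@dataclass
class UserBehaviorData:
    """Per-room aggregate of viewing behavior (hypothetical structure)."""
    bullet_screen_count: int = 0        # bullet-screen comments submitted
    virtual_value_transferred: int = 0  # virtual feature values (e.g. gifts) transferred
    total_dwell_seconds: int = 0        # cumulative dwell time of viewers in the room

# Hypothetical preset ratios for converting behavior data into one heat value.
HEAT_WEIGHTS = {"bullet_screen": 1.0, "virtual_value": 5.0, "dwell_minutes": 0.2}

def heat_value(data: UserBehaviorData) -> float:
    """Combine the three behavior dimensions into a single heat value."""
    return (data.bullet_screen_count * HEAT_WEIGHTS["bullet_screen"]
            + data.virtual_value_transferred * HEAT_WEIGHTS["virtual_value"]
            + (data.total_dwell_seconds / 60) * HEAT_WEIGHTS["dwell_minutes"])

print(heat_value(UserBehaviorData(80, 212, 3600)))  # 80 + 1060 + 12 = 1152.0
```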
S202: If the user behavior data satisfies the behavior condition corresponding to the target special effect, determine the object information corresponding to the target special effect in the target game play of the target game.
It can be understood that, to let the user behavior data express the user's desire to participate more clearly, in a possible implementation the processing device may preset the correspondence between user behavior data and special effects and let users learn of this correspondence through user tasks, anchor tasks, or the like, so that users can purposefully produce the corresponding viewing behavior through their user accounts according to their desire to participate, thereby triggering, in the target live broadcast room, the target special effect they want to see in the target game. A user task is a task whose completion is judged by counting the user behavior data of a single user in the target live broadcast room; an anchor task is a task whose completion is judged by counting the overall user behavior data of all users in the target live broadcast room. From the task conditions in a task, the user can learn which user behavior data can trigger the target special effect.
Based on the set correspondence, associations of at least one special effect with behavior conditions may be identified, where any one special effect may have at least one behavior condition. In the embodiment of the present application, if the user behavior data acquired in S201 currently satisfies a behavior condition of a special effect, the special effect may be used as the target special effect mentioned in the embodiment of the present application.
The target special effect can be used to change the game content of the target game play, so a target special effect determined from user behavior data reflects the users' intention to change the game content, and by triggering the target special effect in the target game play the users can interact with the anchor of the target live broadcast room at the level of game content. It is understood that target special effects can be divided into several types according to the changes they make to the game content, for example gain-type special effects (buffs), loss-type special effects (debuffs), neutral special effects, or summon-type special effects. A gain-type special effect makes a favorable change to a game object in the target game play, such as increasing the game object's attack power; a loss-type special effect makes an unfavorable change to a game object in the target game play, such as reducing the game object's defense; a neutral special effect has no directly harmful influence on the game objects in the target game play, such as an effect that changes the in-game environment or weather; and a summon-type special effect can summon one or more game objects into the target game play, for example a group of jungle monsters.
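As a sketch, the four special-effect categories described above could be modeled as a simple enumeration plus a record describing each effect; the concrete effect entries are illustrative assumptions, not effects defined by the patent.

```python
from dataclasses import dataclass
from enum import Enum

class EffectCategory(Enum):
    BUFF = "gain"        # favorable change, e.g. raise a game object's attack power
    DEBUFF = "loss"      # unfavorable change, e.g. lower a game object's defense
    NEUTRAL = "neutral"  # no direct harm, e.g. change the map weather
    SUMMON = "summon"    # spawns game objects, e.g. a group of jungle monsters

@dataclass
class SpecialEffect:
    effect_id: str
    category: EffectCategory
    description: str

# Hypothetical examples mirroring the categories above.
EFFECTS = [
    SpecialEffect("buff_attack_30", EffectCategory.BUFF, "increase attack power by 30%"),
    SpecialEffect("weather_rain", EffectCategory.NEUTRAL, "change the map environment to rain"),
]
```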
The behavior condition corresponding to the target special effect is the user-behavior-data standard that can trigger the target special effect, and it embodies the correspondence between user behavior data and special effects. For example, a behavior condition may be a standard value for the number of bullet-screen comments submitted by users in the target live broadcast room, a standard value for the number of virtual feature values transferred by users, and so on. After obtaining the user behavior data, in order to determine which type of special effect the users want to trigger and whether it can be triggered, the processing device may compare the acquired user behavior data with the behavior conditions; if the user behavior data satisfies a certain behavior condition, the processing device may determine that the special effect corresponding to that behavior condition is the target special effect the users want to trigger.
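A minimal sketch of this comparison step, assuming each behavior condition pairs one behavior-data field with a threshold and an effect becomes the target special effect once its condition is met; all field names, effect ids, and thresholds are hypothetical.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class BehaviorCondition:
    effect_id: str   # special effect triggered when the condition is met
    metric: str      # which behavior-data field is checked (hypothetical names)
    threshold: int   # standard value the metric must reach

CONDITIONS = [
    BehaviorCondition("buff_attack_30", "virtual_value_transferred", 150),
    BehaviorCondition("inspire", "bullet_screen_count", 100),
]

def match_target_effect(behavior: dict, conditions=CONDITIONS) -> Optional[str]:
    """Return the id of the first special effect whose behavior condition is satisfied."""
    for cond in conditions:
        if behavior.get(cond.metric, 0) >= cond.threshold:
            return cond.effect_id
    return None

# Example: 212 virtual feature values transferred satisfies the 150 threshold.
print(match_target_effect({"virtual_value_transferred": 212}))  # -> "buff_attack_30"
```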
After determining that the user behavior data satisfies the behavior condition corresponding to the target special effect, in order to trigger the target special effect in the target game play the processing device needs to determine the object information corresponding to the target special effect in the target game play. The object information is used to identify the game objects in the target game play that can be changed by the target special effect, where a game object may be a game scene, a Non-Player Character (NPC), a game character controlled by a player, or the like. For example, when the target special effect is "reduce the life value of the enemy jungle monster by half", the game object identified by the object information is the jungle monster that is hostile from the perspective of the anchor of the target live broadcast room.
It will be appreciated that some special effects take effect on all game objects in the game play; for example, when the target special effect is "change the map environment to a rainy environment", all game objects in the target game play are affected by the rainy environment, so the object information can directly identify all game objects without further screening. Other special effects take effect only on part of the game objects in the game play, in which case the conditions for screening the game objects need to be determined according to the specific content of the special effect.
In a possible implementation, in order to determine the object information corresponding to the target special effect more accurately, the processing device may determine the object type and the number of objects corresponding to the target special effect, and determine the object information from them. For example, for the target special effect "increase the attack power of the own-side game character by 30%" shown in fig. 1, the object type "own side" needs to be determined when determining the corresponding object information; when the target special effect is "increase the attack power of three game characters by 30%", the processing device needs to determine the object number, three; and when the target special effect is "increase the attack power of three own-side game characters by 30%", the processing device needs to determine both the object type "own side" and the object number, three, and can then determine the object information of the three own-side game characters from the object type and the object number.
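The selection described above might be sketched as the following filtering step: given an object type ("own", "enemy", or "all") and a number of objects, pick the game objects of the play, preferring the game character associated with the anchor when it qualifies. The data structures and names are assumptions for illustration.

```python
from dataclasses import dataclass

@dataclass
class GameObject:
    object_id: str
    side: str               # "own" (anchor's side), "enemy", or "neutral"
    is_anchor: bool = False # True for the game character controlled by the anchor

def select_object_info(objects, object_type: str, object_count: int):
    """Pick the game objects a target special effect should apply to."""
    if object_type == "all":
        candidates = list(objects)
    else:
        candidates = [o for o in objects if o.side == object_type]
    # Prefer the anchor's own character so viewers see the effect on the anchor directly.
    candidates.sort(key=lambda o: not o.is_anchor)
    return [o.object_id for o in candidates[:object_count]]

play_objects = [
    GameObject("hero_1", "own", is_anchor=True),
    GameObject("hero_2", "own"),
    GameObject("hero_3", "own"),
    GameObject("hero_6", "enemy"),
]
# "Increase attack power by 30% for three own-side characters, including the anchor's."
print(select_object_info(play_objects, "own", 3))  # ['hero_1', 'hero_2', 'hero_3']
```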
The table below is a special effect classification table for a certain confrontation game, organized by special effect type, special effect, the object type corresponding to the special effect, the number of objects, and prompt information. The prompt information corresponding to a special effect can be displayed in the game play to express the trigger of the special effect to the users more clearly. For example, when the target special effect is the "inspire" effect in the table below, at the moment the effect is triggered a text box may appear in the center of the operation interface of the anchor's game player in the target game play, with the content "the anchor's game character is inspired!".
(Table image not reproduced: special effect classification table listing, for each special effect of the confrontation game, its special effect type, the corresponding object type, the number of objects, and the prompt information displayed in the game play.)
Furthermore, it will be appreciated that, once the object type and the number of objects are determined, the other information the processing device can obtain may still differ between games and between game plays.
In one possible case, the processing device can only obtain the game object corresponding to the anchor of the target live broadcast room in the target game play, and cannot obtain the play information of the target game play.
In this case, when the anchor of the target live broadcast room participates in the live-broadcast target game play, some users may want to trigger the target special effect on the game object controlled by the anchor, so as to interact with the anchor more directly within the game. In a possible implementation, if the processing device determines the game object associated with the first anchor in the target game play and the type of that game object belongs to the object type corresponding to the target special effect, the processing device may determine the object information according to the object type, the number of objects, and that game object, so that the game object controlled by the first anchor can trigger the target special effect, further strengthening the users' sense of participation.
In another possible case, when determining the object information the processing device may determine not only the object type and the number of objects but also the play information of the target game play.
The play information includes the object identifiers in the target game play, which can be used to tag the game objects in the target game play. In this case, in order to change the game content of the target game play more precisely through the target special effect and give users a clearer participation target when they want to take part in the game content, the processing device may further determine the specific game objects corresponding to the target special effect. In one possible implementation, the processing device may determine, from the play information, the object type and the number of objects, the target object identifiers corresponding to the target special effect in the target game play, and determine the object information from those target object identifiers. Because the target object identifiers directly tag the game objects in the target game play on which the target special effect can be triggered, the processing device can directly determine the specific game objects, improving the users' degree of participation in the game content.
S203: Generate a special effect request according to the target special effect and the object information.
After determining the target special effect and the object information, in order to trigger the target special effect in the target game play, the processing device may generate a special effect request according to the target special effect and the object information, where the special effect request is used to indicate that the target special effect is triggered in the target game play according to the object information. It will be appreciated that for some games, for example a target game run under the control of the processing device itself, the processing device may trigger the target special effect in the target game play directly according to the special effect request. Other games are controlled and run by a corresponding game server, a server other than the processing device; in that case the processing device may send the special effect request to the game server to trigger the target special effect.
It will be appreciated that, in some games, the relevant interface may need to be invoked in order to trigger a special effect inside the game from the outside. In one possible implementation, after generating the special effect request, the processing device may invoke a special effect interface provided by the target game via the special effect request to trigger the target special effect in the target game play. For example, when the object information includes the game object associated with the first anchor, the processing device may determine the id of that game object from the game object associated with the first anchor, obtain the id of the target special effect from the target special effect, and call the game's special effect interface with these ids, thereby triggering the target special effect in the target game play.
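A sketch of the request-building and interface-invocation step. The patent does not specify a concrete API, so the special effect interface is represented here by a stand-in function, and all field names and ids are assumptions.

```python
import json
from dataclasses import dataclass, asdict

@dataclass
class EffectRequest:
    """Payload indicating that a target special effect should be triggered
    on the listed game objects of a given game play (field names hypothetical)."""
    play_id: str
    effect_id: str
    object_ids: list

def build_effect_request(play_id: str, effect_id: str, object_ids: list) -> EffectRequest:
    return EffectRequest(play_id=play_id, effect_id=effect_id, object_ids=object_ids)

def call_effect_interface(request: EffectRequest) -> None:
    """Stand-in for the special effect interface provided by the target game.
    In a real deployment this would be an RPC or HTTP call to the game server."""
    payload = json.dumps(asdict(request))
    print(f"invoking game special effect interface with {payload}")

request = build_effect_request("play_42", "buff_attack_30", ["hero_1", "hero_2", "hero_3"])
call_effect_interface(request)
```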
It will be appreciated that the live content in different live broadcast rooms may differ, so the way the target game play is determined may also differ. The embodiments of the present application provide the following two cases for determining the target game play:
First, in some live broadcast rooms, the live game play is the first anchor's own game play, for example a game play in which the first anchor participates directly in the target game. In this case, the processing device may determine, from the anchor identifier of the first anchor corresponding to the target live broadcast room, the account identifier of the first anchor in the target game, and determine the game play associated with that account identifier as the target game play.
Second, in other live broadcast rooms, the game play live broadcast by the first anchor may be another anchor's game play, with the first anchor not in the game play itself. For example, the first anchor may relay the live content of another live broadcast room without logging into the target game being live broadcast. In this case, so that users can still participate in the target game play, the processing device may determine, from the anchor identifier of the first anchor corresponding to the target live broadcast room, that the target live broadcast room is live broadcasting a game play of a second anchor in the target game, then determine the account identifier of the second anchor in the target game from the anchor identifier of the second anchor, and determine the game play associated with the second anchor's account identifier as the target game play. For example, when the first anchor live broadcasts a target game play, the processing device may determine the account identifier from the anchor identifier of the first anchor and detect whether the first anchor is in the target game play; if not, the live-broadcast game play is determined to be a game play of a second anchor. The processing device may then detect, through the anchor identifier of the first anchor, the live broadcast room that the first anchor's account has entered, which may be the second anchor's live broadcast room, so as to obtain the anchor identifier of the second anchor and determine the target game play.
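The two cases above can be sketched as a small lookup chain: map the anchor identifier of the live broadcast room to a game account; if that account is currently in a game play, use it, otherwise follow the relayed room to the second anchor and use that anchor's game play. All lookup tables here are hypothetical placeholders for platform- and game-side queries.

```python
from typing import Optional

# Hypothetical lookup tables standing in for platform- and game-side queries.
ANCHOR_TO_ACCOUNT = {"anchor_A": "acct_A", "anchor_B": "acct_B"}
ACCOUNT_TO_GAME_PLAY = {"acct_B": "play_42"}    # acct_A is not in a game play itself
RELAYED_ROOM_ANCHOR = {"anchor_A": "anchor_B"}  # anchor_A relays anchor_B's live broadcast room

def find_target_game_play(first_anchor: str) -> Optional[str]:
    """Return the target game play for the live broadcast room of `first_anchor`."""
    account = ANCHOR_TO_ACCOUNT.get(first_anchor)
    game_play = ACCOUNT_TO_GAME_PLAY.get(account)
    if game_play:                                          # case 1: the first anchor's own play
        return game_play
    second_anchor = RELAYED_ROOM_ANCHOR.get(first_anchor)  # case 2: relayed content
    if second_anchor:
        second_account = ANCHOR_TO_ACCOUNT.get(second_anchor)
        return ACCOUNT_TO_GAME_PLAY.get(second_account)
    return None

print(find_target_game_play("anchor_B"))  # play_42 (the anchor's own game play)
print(find_target_game_play("anchor_A"))  # play_42 (via the relayed second anchor)
```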
Fig. 4 shows a method for determining the target game play according to the anchor identifier and triggering the target special effect in it; fig. 4 is a flowchart of the method, which includes:
s401: and determining an account identifier corresponding to the anchor identifier according to the anchor identifier.
S402: and acquiring the latest login game zone service information of the anchor according to the account identification.
In order to determine the target game play of the anchor, the processing device first determines the zone service of the anchor, for example, the processing device may obtain the game zone service information that the anchor has recently logged in through an interface opened by the target game item group according to an account id corresponding to the anchor.
S403: and analyzing the regional service information corresponding to the trigger target special effect.
After the processing device obtains the game zone service information, in order to accurately position the game of the target game where the anchor is located, the processing device can analyze the game zone service information, and obtain more detailed zone service information such as a large zone, a small zone, a game platform and the like corresponding to the account identifier of the anchor. The community is a part of a large area, and the game platform refers to a platform on which a target game runs, such as an ios platform or an android platform.
S404: and acquiring a game role identifier of which the account identifier corresponds to the zone service information, and judging whether the target game play of the game role identifier meets a special effect triggering condition.
After obtaining the more detailed zone service information, the processing device may obtain, according to the account id of the anchor, the game character id corresponding to the zone service information under the account. In addition, the processing device may obtain, according to the game character identifier, a game match where the game character identifier is located, where the game match is a target game match. It will be appreciated that there may be some target game plays that do not have a target effect triggered, for example, there may be some game plays that require a high level of fairness for competitive playing, and in such game plays, it may be prohibited that live room users have an impact on the balance of the game plays, and thus, in one possible embodiment, the processing device may determine whether the target game play satisfies an effect trigger condition after determining the target game play
S405: and calling a target game special effect interface by utilizing information such as the anchor identification, the target special effect id and the like, and triggering the target special effect in the target game.
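Steps S401 to S405 could look roughly like the following pipeline, in which every lookup is a hypothetical stand-in for the interfaces opened by the game's project team (the anchor-to-account step S401 is assumed to have been done by the caller).

```python
# Hypothetical stand-ins for the game-side interfaces.
ZONE_BY_ACCOUNT = {"acct_B": "zone1|sub3|android"}
CHARACTER_BY_ACCOUNT_ZONE = {("acct_B", ("zone1", "sub3", "android")): "char_77"}
GAME_PLAY_BY_CHARACTER = {"char_77": {"id": "play_42", "effects_allowed": True}}

def trigger_effect_for_anchor(anchor_id: str, account_id: str, effect_id: str) -> bool:
    """Sketch of the S402-S405 flow under the hypothetical lookups above."""
    zone_info = ZONE_BY_ACCOUNT.get(account_id)                       # S402
    if zone_info is None:
        return False
    zone = tuple(zone_info.split("|"))                                # S403: major zone, sub-zone, platform
    character_id = CHARACTER_BY_ACCOUNT_ZONE.get((account_id, zone))  # S404
    game_play = GAME_PLAY_BY_CHARACTER.get(character_id)
    if not game_play or not game_play["effects_allowed"]:             # some plays forbid external effects
        return False
    print(f"S405: invoke effect interface: anchor={anchor_id}, "
          f"effect={effect_id}, play={game_play['id']}")
    return True

print(trigger_effect_for_anchor("anchor_B", "acct_B", "buff_attack_30"))  # True
```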
It can be understood that, in order to enable the user to clearly see the influence of the viewing behavior information made by the user through the user account on the target game play, the processing device may display the target special effect in the target game play through the target live broadcast room, so as to give the user participation feedback, and further improve the participation experience and the interaction degree of the user.
According to the above technical solution, during the live broadcast of the target game in the target live broadcast room, the user behavior data required for content interaction is determined according to the viewing behavior information generated by the user accounts entering the target live broadcast room. When the user behavior data satisfies the behavior condition corresponding to the target special effect, the object information corresponding to the target special effect in the target game play of the target game can be determined, and a special effect request is generated according to the target special effect and the object information to indicate that the target special effect is triggered in the target game play according to the object information.
In addition, an embodiment of the present application further provides a task system. As shown in fig. 3, fig. 3 is a schematic diagram of a task system that may be included in the processing device, and the task system may be used to judge whether the user behavior data satisfies the behavior condition for triggering a special effect. Through the background task system, the processing device can receive the user behavior data reported by the live platform terminal or front end. For different types of user behavior data, such as the number of bullet-screen comments submitted by users and the number of virtual feature values transferred by users, different modules may be provided in the task system to receive the data, such as a bullet-screen module and a virtual-feature-value module. Through these modules, the processing device reports the user behavior data to a message queue. It can be understood that, to identify the messages in the message queue efficiently and in real time, a daemon that executes in a loop may be created in the task system to scan the message queue cyclically; the loop interval can be configured as needed, for example set to 2 s. The daemon reads and parses the data in the message queue one item at a time.
Meanwhile, the task system includes a series of Lua scripts (Lua is a scripting language) that separately count the different types of user behavior data. The user behavior data reported to the message queue by the different modules carries different report types, such as bullet-screen comments or virtual feature values. The task system can filter the data by report type, distribute the filtered data to the corresponding Lua scripts according to data type, and, after the Lua scripts check and count it, store the data in a contiguous physical storage allocation module (cmem).
While counting the user behavior data, when a certain item of user behavior data reaches the behavior condition corresponding to a certain special effect in the condition configuration module, the task system sends the special effect identifier, the anchor identifier, and so on as parameters to the special-effect trigger module, so that the target special effect is triggered in the target game play of the target game. In addition, the task system can provide an interface on the front end of the live platform so that users or the anchor can see the completion status of various tasks.
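A rough sketch of the task system's core loop described above: behavior reports are pushed into a message queue with a report type, a daemon scans the queue at a configurable interval, dispatches each report to a per-type counter (the role played by the Lua scripts in the description), and fires the special-effect trigger once a configured behavior condition is reached. Queue contents, intervals, and thresholds are illustrative assumptions.

```python
import queue
from collections import defaultdict

message_queue = queue.Queue()
counters = defaultdict(int)  # per (room, report_type) statistics; stands in for the Lua scripts and cmem
CONDITIONS = {("room_1", "bullet_screen"): (100, "inspire")}  # behavior condition -> effect id

def report(room_id: str, report_type: str, amount: int = 1) -> None:
    """Front end or live platform terminal reporting one piece of user behavior."""
    message_queue.put({"room": room_id, "type": report_type, "amount": amount})

def scan_once() -> None:
    """One pass of the daemon: drain the queue, update counters, trigger effects."""
    while not message_queue.empty():
        msg = message_queue.get()
        key = (msg["room"], msg["type"])
        counters[key] += msg["amount"]
        if key in CONDITIONS and counters[key] >= CONDITIONS[key][0]:
            print(f"trigger effect {CONDITIONS[key][1]} in room {msg['room']}")
            counters[key] = 0  # reset after triggering

# Example: 100 bullet-screen comments are reported, then one daemon pass runs.
for _ in range(100):
    report("room_1", "bullet_screen")
scan_once()
# In a real daemon: while True: scan_once(); time.sleep(2)
```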
It can be understood that, to reflect the personalized characteristics of each anchor, different triggerable special effects can be configured for different anchors according to anchor type or anchor level. For example, for some high-level anchors, users in their live broadcast rooms may be allowed to trigger high-level special effects, while for lower-level anchors, users in their live broadcast rooms may be allowed to trigger only ordinary special effects.
In addition, to let an anchor manage the special effects in the anchor's own live broadcast room in a personalized way and increase the distinctiveness of the live broadcast, the anchor can decide, for some special effects, whether they may be triggered in the anchor's live broadcast room. For example, the anchor may choose whether to accept the anchor task corresponding to a special effect; if the anchor accepts it, the anchor is marked as one for whom the special effect can be triggered.
Based on this, in one possible implementation, a target anchor identification can be included in the behavior condition, and the target anchor identification is used for marking an anchor which is allowed to trigger the target special effect.
If the user behavior data meets the behavior condition corresponding to the target special effect, the processing device may determine whether an anchor identifier of a first anchor corresponding to the target live broadcast room is in the target anchor identifier of the behavior condition, and if so, the first anchor qualifies to trigger the target special effect, and at this time, the processing device may execute a step of determining object information corresponding to the target special effect in the target game play of the target game. The first anchor is an anchor corresponding to the target live broadcast room.
The table below is the detailed trigger-information table of a certain special effect B in a game A. The special effect id is the identifier of special effect B in game A; the task module indicates whether the special effect is triggered by an anchor task or a user task, and here it is an anchor task; the task type is the type of behavior condition corresponding to special effect B, here the bullet-screen comments submitted by users in the live broadcast room, and the specific content of the behavior condition is "the number of bullet-screen comments sent in the target live broadcast room reaches 100"; and the target anchor identifiers include anchor 1 through anchor 10, meaning that only these 10 anchors can trigger special effect B in game A by completing the anchor task. The start time and end time are the start and end of the anchor task, and the special-effect interval is the minimum gap between two special-effect trigger moments.
(Table image not reproduced: detailed trigger-information table for special effect B in game A, listing the special effect id, task module, task type, behavior condition, target anchor identifiers, start time, end time, and special-effect interval.)
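Based on the trigger-information table described above, a trigger entry and the anchor-permission check might be sketched as follows; every field value and name is a placeholder, not data from the patent.

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class EffectTrigger:
    """One row of the trigger-information table (all values are placeholders)."""
    effect_id: str
    task_module: str          # "anchor_task" or "user_task"
    condition_metric: str     # e.g. bullet-screen comments in the live broadcast room
    condition_threshold: int
    allowed_anchors: set      # target anchor identifiers permitted to trigger the effect
    start: datetime
    end: datetime
    min_interval_seconds: int # minimum gap between two triggers of this effect

EFFECT_B = EffectTrigger(
    effect_id="effect_B",
    task_module="anchor_task",
    condition_metric="bullet_screen_count",
    condition_threshold=100,
    allowed_anchors={f"anchor_{i}" for i in range(1, 11)},  # anchor 1 .. anchor 10
    start=datetime(2020, 1, 1),
    end=datetime(2020, 2, 1),
    min_interval_seconds=300,
)

def anchor_may_trigger(trigger: EffectTrigger, first_anchor_id: str, now: datetime) -> bool:
    """Check that the room's first anchor is whitelisted and the task is active."""
    return first_anchor_id in trigger.allowed_anchors and trigger.start <= now <= trigger.end

print(anchor_may_trigger(EFFECT_B, "anchor_3", datetime(2020, 1, 15)))  # True
```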
Next, the special effect display method provided by the embodiments of the present application will be described at the application level with reference to the drawings. It is understood that the special effect display method is performed based on the interaction method in the above embodiments, and the displayed special effect is the target special effect triggered in the target game play of the target game by the interaction method.
Referring to fig. 5, fig. 5 is a schematic diagram of a special effect displaying method provided in an embodiment of the present application, where the method includes:
s501: and acquiring user behavior data related to the target game in the target live broadcast room.
The target live broadcast room is used for live broadcasting the target game, and the user behavior data is determined according to the viewing behavior information of the user accounts entering the target live broadcast room.
When a user enters the target live broadcast room to watch the anchor live broadcast the target game, in order to satisfy the user's desire to participate in the target game, the processing device can acquire user behavior data related to the target game from the target live broadcast room; this data reflects, to a certain extent, the user's desire to participate in the target game.
It can be understood that the processing device may show the correspondence between user behavior data and special effects to the user through the target live broadcast room in the form of anchor tasks, user tasks, and the like, so that the user can complete the corresponding task through viewing behavior according to the user's desire to participate, thereby triggering the target special effect.
S502: If the user behavior data satisfies the behavior condition corresponding to the target special effect, display the target special effect in the target game play through the target live broadcast room.
The processing device may compare the acquired user behavior data with a behavior condition corresponding to a preset target special effect, and if the behavior condition is satisfied, the processing device determines that the user behavior data may trigger the target special effect. In order to enable a user to visually see the trigger effect of the target special effect in the target game play, the processing device can display the target special effect and prompt information corresponding to the target special effect in the target game play through the target live broadcast room.
Next, the interaction method provided in the embodiment of the present application will be described in conjunction with an actual application scenario. In this scenario, the target special effect is "increasing the attack power of three game characters on the anchor's side, including the anchor's own character", the target game is a 5v5 confrontation game, and the anchor live broadcasts his or her operations in the target game through the target live broadcast room. The behavior condition corresponding to the target special effect is that the total number of pieces of bullet screen information in the target live broadcast room reaches 100. Referring to fig. 6, fig. 6 is a flowchart of an interaction method in an actual application scenario, where the method includes:
S601: acquiring the number of pieces of bullet screen information in the target live broadcast room.
When the anchor live broadcasts the confrontation game through the target live broadcast room, the processing device may acquire the number of pieces of bullet screen information in the target live broadcast room.
S602: judging whether the number of pieces of bullet screen information reaches the preset standard value of bullet screen information quantity.
In order to determine whether the target special effect can be triggered, the processing device may compare the acquired number of pieces of bullet screen information with the preset standard value and determine whether that standard value is reached. Here, the standard value of bullet screen information quantity is 100.
S603: if so, determining the account identifier of the first anchor in the target game according to the anchor identifier of the first anchor corresponding to the target live broadcast room.
After determining that the number of pieces of bullet screen information reaches the preset standard value, the processing device starts to trigger the target special effect. First, the processing device needs to determine, according to the anchor identifier, the target game play in which the anchor is located in the target game.
S604: determining the game play associated with the account identifier as the target game play.
According to the account identifier of the first anchor, the processing device may acquire the region-server information in which the first anchor has recently logged in, analyze, according to the region-server information, the game play in which the game character controlled by the first anchor is located, and determine that game play as the target game play.
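A minimal sketch of S603-S604 follows, under the assumption that the platform exposes lookups from anchor identifier to game account and from account to its most recent region-server session; none of these interfaces is specified by the embodiment.

```python
def find_target_game_play(anchor_id: str, anchor_to_account: dict, region_service) -> str:
    """Sketch of S603-S604: locate the game play being broadcast by the first anchor.

    anchor_to_account and region_service are hypothetical; the embodiment only states
    that the anchor identifier is mapped to an account identifier, whose recently
    logged-in region-server information is analyzed to find the ongoing game play.
    """
    account_id = anchor_to_account[anchor_id]                  # anchor identifier -> account identifier
    session = region_service.latest_login(account_id)          # region-server recently logged in to
    return region_service.current_match(session, account_id)   # identifier of the target game play
```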
S605: determining the game character associated with the first anchor.
Since the target special effect specifies game characters on the anchor's side as its objects, the processing device needs to determine the game character associated with the first anchor.
S606: determining the object type and the number of objects corresponding to the target special effect.
In this actual application scenario, the object type corresponding to the target special effect is game characters on the anchor's own side, and the number of objects is 3.
S607: determining object information based on the object type, the number of objects, and the game character.
According to the object type (the anchor's own side), the number of objects (3), and the game character associated with the first anchor, the processing device determines the object information of the 3 own-side game characters, including the game character associated with the anchor.
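The selection of objects in S605-S607 might look like the following sketch; match_roster is an assumed structure standing in for the match information that lists the object identifiers of the game play.

```python
def determine_object_info(object_type: str, object_count: int,
                          anchor_character: str, match_roster: dict) -> list:
    """Sketch of S605-S607: pick the game characters the target special effect acts on."""
    if object_type == "ally":
        # The anchor's own character first, then teammates, up to the required count (here 3).
        allies = match_roster.get("ally", [])
        ordered = [anchor_character] + [c for c in allies if c != anchor_character]
        return ordered[:object_count]
    if object_type == "enemy":
        return match_roster.get("enemy", [])[:object_count]
    return []
```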
S608: generating a special effect request according to the target special effect and the object information.
S609: calling the special effect interface provided by the target game through the special effect request.
The processing device calls, through the special effect request, the special effect interface provided by the target game, thereby triggering the target special effect in the target game play.
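S608-S609 could be realized as a single call to whatever special effect interface the target game exposes. The JSON payload and HTTP endpoint below are pure assumptions; the embodiment only says that a special effect interface provided by the target game is called with the special effect request.

```python
import json
import urllib.request

def send_special_effect_request(effect_id: str, object_info: list, interface_url: str) -> None:
    """Sketch of S608-S609: package the special effect request and call the game's interface."""
    payload = json.dumps({"effect_id": effect_id, "targets": object_info}).encode("utf-8")
    request = urllib.request.Request(interface_url, data=payload,
                                     headers={"Content-Type": "application/json"})
    with urllib.request.urlopen(request) as response:
        response.read()  # the target game then triggers the effect in the target game play
```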
Next, the special effect display method provided in the embodiment of the present application will be introduced in conjunction with an actual application scenario. In this scenario, the target game is a fighting game. As shown in fig. 7a, fig. 7a is a live broadcast picture of the target game being live broadcast in the target live broadcast room, where game character A is the game character controlled by the anchor and game character B is an enemy game character. The rose mark on the right side of the target live broadcast room represents the virtual characteristic values transferred by users in the target live broadcast room, and the number of roses is the number of transferred virtual characteristic values. The behavior condition is that the number of virtual characteristic values transferred by all users in the live broadcast room reaches the standard value of 200, and the target special effect corresponding to the behavior condition is "increasing the attack power of the own game character by 30%". The lower part of fig. 7a is a schematic diagram of the target game play, where the left diagram shows the game play before the target special effect is triggered and the right diagram shows it after the target special effect is triggered. Referring to fig. 8, fig. 8 is a flowchart of a special effect display method in an actual application scenario provided in the embodiment of the present application, where the method includes:
S801: acquiring the number of virtual characteristic values transferred by all users in the target live broadcast room.
S802: if the number of virtual characteristic values meets the behavior condition corresponding to the target special effect, displaying, through the target live broadcast room, the target special effect and the prompt information corresponding to the target special effect in the target game play.
When the processing device determines that the acquired number of virtual characteristic values reaches the standard value of 200, the special effect of "increasing the attack power of the own game character by 30%" is displayed in the target game play through the target live broadcast room, where the own game character refers to game character A controlled by the anchor. Meanwhile, in order to let the user observe the triggering of the target special effect more clearly, the processing device may display the prompt information corresponding to the target special effect in the target game play, that is, the prompt message "You are strengthened!" shown in the center of the interface in the right picture.
In addition, in this application scenario or other practical application scenarios, other types of special effects can also be displayed by the special effect display method. For example, as shown in fig. 7b, fig. 7b is a schematic diagram of a debuff-type special effect of "reducing the attack power of an enemy game character by 30%" in the same fighting game as in fig. 7a; as shown in fig. 7c, fig. 7c is a schematic diagram of a neutral special effect of "the weather environment changes to a rainy environment" in a certain game. It can be understood that fig. 7b and fig. 7c illustrate changes of the game screen, which can be displayed in the live broadcast room by the above-described special effect display method.
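Read together with claim 11, the examples of figs. 7a-7c suggest four broad effect categories, which a small enumeration such as the following could label. Interpreting "calling special effects" as summon-type effects is an assumption of this sketch.

```python
from enum import Enum

class EffectCategory(Enum):
    """Effect categories suggested by the embodiments and claim 11."""
    GAIN = "buff"                # e.g. +30% attack power for the own game character (fig. 7a)
    BENEFIT_REDUCING = "debuff"  # e.g. -30% attack power for an enemy game character (fig. 7b)
    NEUTRAL = "neutral"          # e.g. the weather environment changes to rain (fig. 7c)
    CALLING = "summon"           # a calling (summon-type) special effect
```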
Based on the interaction method provided in the foregoing embodiment, an interaction apparatus 900 is further provided in the embodiment of the present application, referring to fig. 9a, the apparatus 900 includes a first obtaining unit 901, a first determining unit 902, and a generating unit 903:
a first obtaining unit 901, configured to obtain user behavior data related to a target game in a target live broadcast room, where the target live broadcast room is used for live broadcasting the target game, and the user behavior data is determined according to viewing behavior information of a user account entering the target live broadcast room;
a first determining unit 902, configured to determine, if the user behavior data meets a behavior condition corresponding to the target special effect, object information corresponding to the target special effect in a target game play of the target game; the target special effect is used for changing the game content of the target game play;
a generating unit 903, configured to generate an effect request according to the target effect and the object information, where the effect request is used to instruct to trigger the target effect in the target game play according to the object information.
In a possible implementation manner, the first determining unit 902 is specifically configured to:
determining the object type and the object number of objects corresponding to the target special effect;
and determining object information according to the object type and the number of the objects.
In a possible implementation manner, if it is determined that the game object associated with the first anchor is in the target game play and the type of the game object belongs to the object type of the object corresponding to the target special effect, the first anchor is an anchor corresponding to the target live broadcast room;
the first determining unit 902 is specifically configured to:
object information is determined based on the object type, the number of objects, and the game object.
In a possible implementation, referring to fig. 9b, the apparatus 900 further comprises a second obtaining unit 904:
a second obtaining unit 904, configured to obtain match information of the target game match, where the match information includes an object identifier in the target game match;
the first determining unit 902 is specifically configured to:
determining, according to the match information, the object type and the number of objects, a target object identifier corresponding to the target special effect in the target game play;
and determining object information according to the target object identification.
In a possible implementation, the behavior condition includes a target anchor identifier; referring to fig. 9c, the apparatus 900 further includes a second determining unit 905:
a second determining unit 905, configured to determine whether the anchor identifier of the first anchor corresponding to the target live broadcast room is in the target anchor identifier;
and if so, executing the step of determining the object information corresponding to the target special effect in the target game play of the target game.
In one possible implementation, referring to fig. 9d, the apparatus 900 further includes a calling unit 906:
a calling unit 906, configured to call, through the special effect request, a special effect interface provided by the target game to trigger the target special effect in the target game play.
In one possible implementation, referring to fig. 9e, the device 900 further includes a display unit 907:
the display unit 907 is configured to display a target special effect in the target game play through the target live broadcast room.
In one possible embodiment, the user behavior data comprises a combination of one or more of:
the number of pieces of bullet screen information submitted by users entering the target live broadcast room;
the number of virtual characteristic values transferred by users entering the target live broadcast room;
and the dwell time of users entering the target live broadcast room.
In a possible implementation, as shown in fig. 9f, the apparatus 900 further comprises a third determining unit 908:
a third determining unit 908, configured to determine, according to the anchor identifier of the first anchor corresponding to the target live broadcast room, an account identifier of the first anchor in the target game;
and determining the game pair associated with the account identification as a target game pair.
In one possible implementation, as shown in fig. 9g, the apparatus 900 further includes a fourth determining unit 909:
a fourth determining unit 909, configured to determine, according to the anchor identifier of the first anchor corresponding to the target live broadcast room, that the target live broadcast room live broadcasts a game play of a second anchor in the target game;
determining account identification of the second anchor in the target game according to the anchor identification of the second anchor;
and determining the game play associated with the account identification of the second anchor in the target game as the target game play.
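The third and fourth determining units cover two ways of resolving the target game play: the room broadcasts the first anchor's own match, or it rebroadcasts a second anchor's match. The sketch below shows the branch; game_backend and its lookups are assumptions, not interfaces defined by the embodiment.

```python
def resolve_target_game_play(first_anchor_id: str, game_backend) -> str:
    """Sketch of the two resolution paths handled by units 908 and 909."""
    # If the target live broadcast room live broadcasts a second anchor's game play,
    # rebroadcast_source returns that anchor's identifier; otherwise it returns None.
    second_anchor_id = game_backend.rebroadcast_source(first_anchor_id)
    effective_anchor_id = second_anchor_id or first_anchor_id
    account_id = game_backend.account_of(effective_anchor_id)  # anchor id -> account id in the game
    return game_backend.match_of(account_id)                   # the associated target game play
```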
Based on the special effect display method provided by the above embodiment, an embodiment of the present application provides a special effect display apparatus 1000, and referring to fig. 10, the apparatus 1000 includes an obtaining unit 1001 and a display unit 1002:
an obtaining unit 1001, configured to obtain user behavior data related to a target game in a target live broadcast room, where the target live broadcast room is used for live broadcasting the target game, and the user behavior data is determined according to viewing behavior information of a user account entering the target live broadcast room;
the display unit 1002 is configured to display the target special effect in the target game play through the target live broadcast room if the user behavior data meets the behavior condition corresponding to the target special effect, where the target special effect is used to change the game content of the target game play.
In one possible embodiment, the target effect includes any one of:
gain special effects, benefit-reducing special effects, neutral special effects or calling special effects.
In a possible implementation, the display unit 1002 is specifically configured to:
and displaying the target special effect and prompt information corresponding to the target special effect in the target game play.
An embodiment of the present application further provides a device, which is described below with reference to the accompanying drawings. Referring to fig. 11, an apparatus 1100 is provided in this embodiment of the present application. The apparatus 1100 may be a terminal device, and the terminal device may be any intelligent terminal such as a mobile phone, a tablet computer, a Personal Digital Assistant (PDA), a Point of Sales (POS) terminal, or a vehicle-mounted computer. The following takes the terminal device being a mobile phone as an example:
fig. 11 is a block diagram illustrating a partial structure of a mobile phone related to a terminal device provided in an embodiment of the present application. Referring to fig. 11, the cellular phone includes: a Radio Frequency (RF) circuit 1110, a memory 1120, an input unit 1130, a display unit 1140, a sensor 1150, an audio circuit 1160, a wireless fidelity (WiFi) module 1170, a processor 1180, and a power supply 1190. Those skilled in the art will appreciate that the handset configuration shown in fig. 11 is not intended to be limiting and may include more or fewer components than those shown, or some components may be combined, or a different arrangement of components.
The following describes each component of the mobile phone in detail with reference to fig. 11:
RF circuit 1110 may be used for receiving and transmitting signals during a message transmission or call, and in particular, for receiving downlink information from a base station and then delivering it to the processor 1180 for processing; in addition, data related to the uplink is transmitted to the base station. In general, RF circuit 1110 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a Low Noise Amplifier (LNA), a duplexer, and the like. In addition, the RF circuit 1110 may also communicate with networks and other devices via wireless communication. The wireless communication may use any communication standard or protocol, including but not limited to Global System for Mobile communication (GSM), General Packet Radio Service (GPRS), Code Division Multiple Access (CDMA), Wideband Code Division Multiple Access (WCDMA), Long Term Evolution (LTE), email, Short Message Service (SMS), and the like.
The memory 1120 may be used to store software programs and modules, and the processor 1180 may execute various functional applications and data processing of the mobile phone by running the software programs and modules stored in the memory 1120. The memory 1120 may mainly include a storage program area and a storage data area, wherein the storage program area may store an operating system, an application program required by at least one function (such as a sound playing function, an image playing function, etc.), and the like; the storage data area may store data (such as audio data, a phonebook, etc.) created according to the use of the mobile phone, and the like. Further, the memory 1120 may include high speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other non-volatile solid-state storage device.
The input unit 1130 may be used to receive input numeric or character information and generate key signal inputs related to user settings and function control of the cellular phone. Specifically, the input unit 1130 may include a touch panel 1131 and other input devices 1132. Touch panel 1131, also referred to as a touch screen, can collect touch operations of a user on or near the touch panel 1131 (for example, operations of the user on or near touch panel 1131 by using any suitable object or accessory such as a finger or a stylus pen), and drive corresponding connection devices according to a preset program. Alternatively, the touch panel 1131 may include two parts, namely, a touch detection device and a touch controller. The touch detection device detects the touch direction of a user, detects a signal brought by touch operation and transmits the signal to the touch controller; the touch controller receives touch information from the touch sensing device, converts the touch information into touch point coordinates, sends the touch point coordinates to the processor 1180, and can receive and execute commands sent by the processor 1180. In addition, the touch panel 1131 can be implemented by using various types, such as resistive, capacitive, infrared, and surface acoustic wave. The input unit 1130 may include other input devices 1132 in addition to the touch panel 1131. In particular, other input devices 1132 may include, but are not limited to, one or more of a physical keyboard, function keys (e.g., volume control keys, switch keys, etc.), a trackball, a mouse, a joystick, and the like.
The display unit 1140 may be used to display information input by the user or information provided to the user and various menus of the mobile phone. The display unit 1140 may include a display panel 1141, and optionally, the display panel 1141 may be configured in the form of a Liquid Crystal Display (LCD), an Organic Light-Emitting Diode (OLED), or the like. Further, the touch panel 1131 can cover the display panel 1141, and when the touch panel 1131 detects a touch operation on or near it, the touch operation is transmitted to the processor 1180 to determine the type of the touch event, and then the processor 1180 provides a corresponding visual output on the display panel 1141 according to the type of the touch event. Although in fig. 11 the touch panel 1131 and the display panel 1141 are two independent components implementing the input and output functions of the mobile phone, in some embodiments the touch panel 1131 and the display panel 1141 may be integrated to implement the input and output functions of the mobile phone.
The handset may also include at least one sensor 1150, such as a light sensor, motion sensor, and other sensors. Specifically, the light sensor may include an ambient light sensor and a proximity sensor, wherein the ambient light sensor may adjust the brightness of the display panel 1141 according to the brightness of ambient light, and the proximity sensor may turn off the display panel 1141 and/or the backlight when the mobile phone moves to the ear. As one of the motion sensors, the accelerometer sensor can detect the magnitude of acceleration in each direction (generally, three axes), can detect the magnitude and direction of gravity when stationary, and can be used for applications of recognizing the posture of a mobile phone (such as horizontal and vertical screen switching, related games, magnetometer posture calibration), vibration recognition related functions (such as pedometer and tapping), and the like; as for other sensors such as a gyroscope, a barometer, a hygrometer, a thermometer, and an infrared sensor, which can be configured on the mobile phone, further description is omitted here.
Audio circuit 1160, speaker 1161, and microphone 1162 may provide an audio interface between the user and the mobile phone. The audio circuit 1160 may transmit the electrical signal converted from the received audio data to the speaker 1161, which converts the electrical signal into a sound signal for output; on the other hand, the microphone 1162 converts the collected sound signal into an electrical signal, which is received by the audio circuit 1160 and converted into audio data; the audio data is then output to the processor 1180 for processing and transmitted via the RF circuit 1110 to, for example, another mobile phone, or output to the memory 1120 for further processing.
WiFi belongs to short-distance wireless transmission technology, and the cell phone can help a user to receive and send e-mails, browse webpages, access streaming media and the like through the WiFi module 1170, and provides wireless broadband internet access for the user. Although fig. 11 shows the WiFi module 1170, it is understood that it does not belong to the essential constitution of the handset, and can be omitted entirely as needed within the scope not changing the essence of the invention.
The processor 1180 is a control center of the mobile phone, and is connected to various parts of the whole mobile phone through various interfaces and lines, and executes various functions of the mobile phone and processes data by operating or executing software programs and/or modules stored in the memory 1120 and calling data stored in the memory 1120, thereby performing overall monitoring of the mobile phone. Optionally, processor 1180 may include one or more processing units; preferably, the processor 1180 may integrate an application processor, which mainly handles operating systems, user interfaces, application programs, etc., and a modem processor, which mainly handles wireless communications. It will be appreciated that the modem processor described above may not be integrated within processor 1180.
The phone also includes a power supply 1190 (e.g., a battery) for powering the various components, and preferably, the power supply may be logically connected to the processor 1180 via a power management system, so that the power management system may manage charging, discharging, and power consumption management functions.
Although not shown, the mobile phone may further include a camera, a bluetooth module, etc., which are not described herein.
In this embodiment, the processor 1180 included in the terminal device further has the following functions:
acquiring user behavior data related to a target game in a target live broadcast room, wherein the target live broadcast room is used for live broadcasting the target game, and the user behavior data is determined according to watching behavior information of a user account number entering the target live broadcast room;
if the user behavior data meet the behavior conditions corresponding to the target special effect, determining object information corresponding to the target special effect in the target game play of the target game; the target special effect is used for changing the game content of the target game play;
and generating an effect request according to the target effect and the object information, wherein the effect request is used for indicating that the target effect is triggered in the target game play according to the object information.
And, the processor 1180 may be further configured to:
acquiring user behavior data related to a target game in the target live broadcast room, wherein the target live broadcast room is used for live broadcasting the target game, and the user behavior data is determined according to watching behavior information of a user account number entering the target live broadcast room;
and if the user behavior data meet the behavior conditions corresponding to the target special effects, displaying the target special effects in the target game play through the target live broadcast room, wherein the target special effects are used for changing the game contents of the target game play.
Referring to fig. 12, fig. 12 is a block diagram of a server 1200 provided in this embodiment, and the server 1200 may have a relatively large difference due to different configurations or performances, and may include one or more Central Processing Units (CPUs) 1222 (e.g., one or more processors) and a memory 1232, and one or more storage media 1230 (e.g., one or more mass storage devices) storing an application program 1242 or data 1244. Memory 1232 and storage media 1230 can be, among other things, transient storage or persistent storage. The program stored in the storage medium 1230 may include one or more modules (not shown), each of which may include a series of instruction operations for the server. Still further, the central processor 1222 may be configured to communicate with the storage medium 1230, to execute a series of instruction operations in the storage medium 1230 on the server 1200.
The server 1200 may also include one or more power supplies 1226, one or more wired or wireless network interfaces 1250, one or more input-output interfaces 1258, and/or one or more operating systems 1241, such as Windows Server™, Mac OS X™, Unix™, Linux™, FreeBSD™, and the like.
For the steps executed by the server in the above embodiments, an embodiment of the present application further provides a computer-readable storage medium for storing a computer program, where the computer program is used to execute any one implementation of the interaction method or the special effect display method described in the foregoing embodiments.
Those of ordinary skill in the art will understand that all or part of the steps for implementing the above method embodiments may be completed by hardware instructed by a program; the program may be stored in a computer-readable storage medium, and when the program is executed, the steps of the method embodiments are performed. The aforementioned storage medium may be at least one of the following media capable of storing program code: a read-only memory (ROM), a RAM, a magnetic disk, or an optical disk.
It should be noted that, in the present specification, all the embodiments are described in a progressive manner, and the same and similar parts among the embodiments may be referred to each other, and each embodiment focuses on the differences from the other embodiments. In particular, for the apparatus and system embodiments, since they are substantially similar to the method embodiments, they are described in a relatively simple manner, and reference may be made to some of the descriptions of the method embodiments for related points. The above-described embodiments of the apparatus and system are merely illustrative, and the units described as separate parts may or may not be physically separate, and the parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of the present embodiment. One of ordinary skill in the art can understand and implement it without inventive effort.
The above description is only one specific embodiment of the present application, but the scope of the present application is not limited thereto, and any changes or substitutions that can be easily conceived by those skilled in the art within the technical scope of the present application should be covered by the scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (15)

1. An interactive method, comprising:
acquiring user behavior data related to a target game in a target live broadcast room, wherein the target live broadcast room is used for live broadcasting the target game, and the user behavior data is determined according to watching behavior information of a user account number entering the target live broadcast room;
if the user behavior data meet the behavior conditions corresponding to the target special effect, determining object information corresponding to the target special effect in the target game play of the target game; the target special effect is used for changing the game content of the target game play;
and generating an effect request according to the target effect and the object information, wherein the effect request is used for indicating that the target effect is triggered in the target game play according to the object information.
2. The method of claim 1, wherein the determining object information corresponding to the target effect in the target game play of the target game comprises:
determining the object type and the object number of the object corresponding to the target special effect;
and determining the object information according to the object type and the object quantity.
3. The method of claim 2, wherein if it is determined that the game object associated with the first anchor is in the target game play and the type of the game object belongs to the object type of the object corresponding to the target special effect, the first anchor is an anchor corresponding to the target live broadcast room;
the determining the object information according to the object type and the object number includes:
determining the object information according to the object type, the number of objects, and the game object.
4. The method of claim 2, further comprising:
obtaining the game-matching information of the target game, wherein the game-matching information comprises an object identifier in the target game;
the determining the object information according to the object type and the object number includes:
determining a target object identification corresponding to the target special effect in the target game play according to the play information, the object type and the object number;
and determining the object information according to the target object identification.
5. The method of any of claims 1-4, wherein the target game play is determined by:
determining an account identifier of a first anchor in a target game according to an anchor identifier of the first anchor corresponding to the target live broadcast room;
and determining the game pair associated with the account identification as the target game pair.
6. The method of any of claims 1-4, wherein the target game play is determined by:
determining, according to the anchor identification of the first anchor corresponding to the target live broadcast room, that the target live broadcast room live broadcasts a game play of a second anchor in the target game;
determining an account identifier of the second anchor in a target game according to the anchor identifier of the second anchor;
and determining the game play associated with the account identification of the second anchor in the target game as the target game play.
7. The method of claim 1, wherein the behavioral condition comprises a target anchor identification, and wherein prior to the determining object information corresponding to the target special effect in a target game play of the target game, the method further comprises:
determining whether the anchor identification of a first anchor corresponding to the target live broadcast room is the target anchor identification;
and if so, executing the step of determining the object information corresponding to the target special effect in the target game play of the target game.
8. The method of claim 1, wherein after the generating a special effect request based on the target special effect and the object information, the method further comprises:
and calling an effect interface provided by the target game through an effect request so as to trigger the target effect in the target game play.
9. An interactive device is characterized in that the device comprises a first acquisition unit, a first determination unit and a generation unit:
the first acquisition unit is used for acquiring user behavior data related to a target game in a target live broadcast room, the target live broadcast room is used for live broadcasting the target game, and the user behavior data is determined according to watching behavior information of a user account number entering the target live broadcast room;
the first determining unit is used for determining object information corresponding to a target special effect in a target game play of the target game if the user behavior data meets a behavior condition corresponding to the target special effect; the target special effect is used for changing the game content of the target game play;
the generating unit is used for generating a special effect request according to the target special effect and the object information, and the special effect request is used for indicating that the target special effect is triggered in the target game play according to the object information.
10. A special effect display method, characterized in that, during a game play of a target game live broadcast through a target live broadcast room, the method comprises:
acquiring user behavior data related to a target game in the target live broadcast room, wherein the target live broadcast room is used for live broadcasting the target game, and the user behavior data is determined according to watching behavior information of a user account number entering the target live broadcast room;
and if the user behavior data meet the behavior conditions corresponding to the target special effects, displaying the target special effects in the target game play through the target live broadcast room, wherein the target special effects are used for changing the game contents of the target game play.
11. The method of claim 10, wherein the target effect comprises any one of:
gain special effects, benefit-reducing special effects, neutral special effects or calling special effects.
12. The method of claim 10 or 11, wherein said presenting the target special effect in the target game play comprises:
and displaying the target special effect and prompt information corresponding to the target special effect in the target game play.
13. A special effect display apparatus, characterized in that, during a game play of a target game live broadcast through a target live broadcast room, the apparatus comprises an acquisition unit and a display unit:
the acquisition unit is used for acquiring user behavior data related to a target game in the target live broadcast room, the target live broadcast room is used for live broadcasting the target game, and the user behavior data is determined according to watching behavior information of a user account number entering the target live broadcast room;
the display unit is used for displaying the target special effect in the target game play through the target live broadcast room if the user behavior data meets the behavior condition corresponding to the target special effect, and the target special effect is used for changing the game content of the target game play.
14. An apparatus, comprising a processor and a memory:
the memory is used for storing program codes and transmitting the program codes to the processor;
the processor is configured to perform the method of any of claims 1-8 or 10-12 according to instructions in the program code.
15. A computer-readable storage medium, characterized in that the computer-readable storage medium is used to store a computer program for performing the method of any one of claims 1-8 or claims 10-12.
CN202010010766.3A 2020-01-06 2020-01-06 Interaction method, special effect display method and related device Active CN111182355B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010010766.3A CN111182355B (en) 2020-01-06 2020-01-06 Interaction method, special effect display method and related device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010010766.3A CN111182355B (en) 2020-01-06 2020-01-06 Interaction method, special effect display method and related device

Publications (2)

Publication Number Publication Date
CN111182355A true CN111182355A (en) 2020-05-19
CN111182355B CN111182355B (en) 2021-05-04

Family

ID=70654573

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010010766.3A Active CN111182355B (en) 2020-01-06 2020-01-06 Interaction method, special effect display method and related device

Country Status (1)

Country Link
CN (1) CN111182355B (en)

Patent Citations (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8719857B1 (en) * 2005-08-24 2014-05-06 Rovi Guides, Inc. Systems and methods for providing parental control features in video mosaic environments
CN101436226A (en) * 2007-11-12 2009-05-20 雷爵网络科技股份有限公司 Network game system with platform for providing feedback and method for providing network game feedback
CN104415536A (en) * 2013-09-05 2015-03-18 株式会社万代南梦宫游戏 Game system
CN105245546A (en) * 2015-10-28 2016-01-13 广州华多网络科技有限公司 Information display method and system
CN106851427A (en) * 2015-12-04 2017-06-13 腾讯科技(深圳)有限公司 A kind of transfer control method and device of live video of playing
CN107659559A (en) * 2017-08-24 2018-02-02 网易(杭州)网络有限公司 A kind of games system
CN109568963A (en) * 2017-09-29 2019-04-05 腾讯科技(深圳)有限公司 Virtual resource data processing method, device, computer equipment and storage medium
CN108579090A (en) * 2018-04-16 2018-09-28 腾讯科技(深圳)有限公司 Article display method, apparatus in virtual scene and storage medium
CN109040849A (en) * 2018-07-20 2018-12-18 广州虎牙信息科技有限公司 A kind of live streaming platform exchange method, device, equipment and storage medium
CN108848394A (en) * 2018-07-27 2018-11-20 广州酷狗计算机科技有限公司 Net cast method, apparatus, terminal and storage medium
CN108900858A (en) * 2018-08-09 2018-11-27 广州酷狗计算机科技有限公司 A kind of method and apparatus for giving virtual present
CN109327709A (en) * 2018-11-23 2019-02-12 网易(杭州)网络有限公司 Stage property put-on method and device, computer storage medium, electronic equipment
CN109348248A (en) * 2018-11-27 2019-02-15 网易(杭州)网络有限公司 A kind of data processing method, system and the device of game live streaming
CN109756747A (en) * 2019-03-25 2019-05-14 广州华多网络科技有限公司 The interaction live broadcasting method and system of more main broadcasters
CN110213612A (en) * 2019-07-10 2019-09-06 广州酷狗计算机科技有限公司 Living broadcast interactive method, apparatus and storage medium
CN110312145A (en) * 2019-08-09 2019-10-08 厦门星海无限科技有限公司 The method of interactive game, storage medium are carried out based on live streaming barrage and spectators

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
王星雨: "中国游戏直播平台的发展研究", 《中国优秀硕士学位论文全文数据库 信息科技辑》 *

Cited By (20)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111787015B (en) * 2020-07-03 2022-07-22 珠海金山网络游戏科技有限公司 Game live broadcast interaction system, data processing method and device
CN111787015A (en) * 2020-07-03 2020-10-16 珠海金山网络游戏科技有限公司 Game live broadcast interaction system, data processing method and device
CN111770356A (en) * 2020-07-23 2020-10-13 网易(杭州)网络有限公司 Interaction method and device based on live game
CN111770356B (en) * 2020-07-23 2023-02-03 网易(杭州)网络有限公司 Interaction method and device based on live game
CN111773702A (en) * 2020-07-30 2020-10-16 网易(杭州)网络有限公司 Control method and device for live game
CN112040264B (en) * 2020-09-06 2023-04-21 北京字节跳动网络技术有限公司 Interactive system, method, device, computer equipment and storage medium
CN112040264A (en) * 2020-09-06 2020-12-04 北京字节跳动网络技术有限公司 Interactive system, method, apparatus, computer device and storage medium
CN112156478A (en) * 2020-10-19 2021-01-01 腾讯科技(深圳)有限公司 Interaction method and device in live broadcast and computer readable storage medium
CN112156478B (en) * 2020-10-19 2022-04-12 腾讯科技(深圳)有限公司 Interaction method and device in live broadcast and computer readable storage medium
CN112256128A (en) * 2020-10-22 2021-01-22 武汉科领软件科技有限公司 Interactive effect development platform
WO2022096017A1 (en) * 2020-11-09 2022-05-12 北京达佳互联信息技术有限公司 Content display method and apparatus
CN112616061A (en) * 2020-12-04 2021-04-06 Oppo广东移动通信有限公司 Live broadcast interaction method and device, live broadcast server and storage medium
CN112616061B (en) * 2020-12-04 2023-11-10 Oppo广东移动通信有限公司 Live interaction method and device, live server and storage medium
CN112717375A (en) * 2021-01-04 2021-04-30 厦门梦加网络科技股份有限公司 Game special effect realization method
WO2022242119A1 (en) * 2021-05-17 2022-11-24 北京达佳互联信息技术有限公司 Resource allocation method and apparatus
CN113269584A (en) * 2021-05-17 2021-08-17 北京达佳互联信息技术有限公司 Resource allocation method, device, electronic equipment and storage medium
CN113485617A (en) * 2021-07-02 2021-10-08 广州博冠信息科技有限公司 Animation display method and device, electronic equipment and storage medium
CN113485617B (en) * 2021-07-02 2024-05-03 广州博冠信息科技有限公司 Animation display method and device, electronic equipment and storage medium
CN113873284A (en) * 2021-09-30 2021-12-31 广州方硅信息技术有限公司 Interaction method and device for live webcasting, terminal equipment and storage medium
CN114339438A (en) * 2021-11-24 2022-04-12 腾讯科技(深圳)有限公司 Interaction method and device based on live broadcast picture, electronic equipment and storage medium

Also Published As

Publication number Publication date
CN111182355B (en) 2021-05-04

Similar Documents

Publication Publication Date Title
CN111182355B (en) Interaction method, special effect display method and related device
CN111773696B (en) Virtual object display method, related device and storage medium
CN111294622B (en) Interaction method and related device
KR20190103307A (en) Information processing method and apparatus and server
CN110711380B (en) State processing method and related device
CN106303733B (en) Method and device for playing live special effect information
CN108379834B (en) Information processing method and related equipment
CN111318026B (en) Team forming method, device, equipment and storage medium for competitive game
CN111491197A (en) Live content display method and device and storage medium
CN113117331B (en) Message sending method, device, terminal and medium in multi-person online battle program
CN111327914A (en) Interaction method and related device
CN112691367B (en) Data processing method and related device
CN113058264A (en) Virtual scene display method, virtual scene processing method, device and equipment
CN107754316B (en) Information exchange processing method and mobile terminal
CN113350783A (en) Game live broadcast method and device, computer equipment and storage medium
CN112169327A (en) Control method of cloud game and related device
WO2023005234A1 (en) Virtual resource delivery control method and apparatus, computer device, and storage medium
CN111158624A (en) Application sharing method, electronic equipment and computer readable storage medium
CN109529335B (en) Game role sound effect processing method and device, mobile terminal and storage medium
WO2022083451A1 (en) Skill selection method and apparatus for virtual object, and device, medium and program product
CN110743167A (en) Method and device for realizing interactive function
CN114221810B (en) Live broadcast platform-based peep-proof screen method, device, medium and equipment
CN111589113B (en) Virtual mark display method, device, equipment and storage medium
CN113018857B (en) Game operation data processing method, device, equipment and storage medium
CN113599825A (en) Method and related device for updating virtual resources in game match

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant