CN116920389A - Auxiliary information prompting method, device, electronic equipment and storage medium - Google Patents
Auxiliary information prompting method, device, electronic equipment and storage medium
- Publication number
- CN116920389A CN116920389A CN202210347855.6A CN202210347855A CN116920389A CN 116920389 A CN116920389 A CN 116920389A CN 202210347855 A CN202210347855 A CN 202210347855A CN 116920389 A CN116920389 A CN 116920389A
- Authority
- CN
- China
- Prior art keywords
- information
- progress
- target
- auxiliary
- node
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Links
- 238000000034 method Methods 0.000 title claims abstract description 80
- 230000002452 interceptive effect Effects 0.000 claims abstract description 38
- 238000012545 processing Methods 0.000 claims description 16
- 238000004458 analytical method Methods 0.000 claims description 14
- 230000006870 function Effects 0.000 claims description 13
- 230000008569 process Effects 0.000 claims description 10
- 238000004590 computer program Methods 0.000 claims description 9
- 230000004044 response Effects 0.000 claims description 3
- 238000010586 diagram Methods 0.000 description 19
- 239000008280 blood Substances 0.000 description 12
- 210000004369 blood Anatomy 0.000 description 12
- 239000000463 material Substances 0.000 description 9
- 239000003814 drug Substances 0.000 description 8
- 239000013589 supplement Substances 0.000 description 6
- 230000005540 biological transmission Effects 0.000 description 4
- 238000004891 communication Methods 0.000 description 4
- 238000005516 engineering process Methods 0.000 description 4
- 239000003795 chemical substances by application Substances 0.000 description 3
- 230000000295 complement effect Effects 0.000 description 3
- 230000008878 coupling Effects 0.000 description 3
- 238000010168 coupling process Methods 0.000 description 3
- 238000005859 coupling reaction Methods 0.000 description 3
- 230000009471 action Effects 0.000 description 2
- 230000003993 interaction Effects 0.000 description 2
- 239000000047 product Substances 0.000 description 2
- 230000035945 sensitivity Effects 0.000 description 2
- 238000013473 artificial intelligence Methods 0.000 description 1
- 230000006399 behavior Effects 0.000 description 1
- 230000009286 beneficial effect Effects 0.000 description 1
- 230000008859 change Effects 0.000 description 1
- 238000010276 construction Methods 0.000 description 1
- 125000004122 cyclic group Chemical group 0.000 description 1
- 238000013500 data storage Methods 0.000 description 1
- 230000003111 delayed effect Effects 0.000 description 1
- 238000013461 design Methods 0.000 description 1
- 238000001514 detection method Methods 0.000 description 1
- 238000011161 development Methods 0.000 description 1
- 238000010191 image analysis Methods 0.000 description 1
- 238000010295 mobile communication Methods 0.000 description 1
- 238000012986 modification Methods 0.000 description 1
- 230000004048 modification Effects 0.000 description 1
- 230000003287 optical effect Effects 0.000 description 1
- 238000007781 pre-processing Methods 0.000 description 1
- 238000009877 rendering Methods 0.000 description 1
- 238000006467 substitution reaction Methods 0.000 description 1
- 230000001502 supplementing effect Effects 0.000 description 1
- 238000012546 transfer Methods 0.000 description 1
- 230000001960 triggered effect Effects 0.000 description 1
Classifications
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/50—Controlling the output signals based on the game progress
- A63F13/53—Controlling the output signals based on the game progress involving additional visual information provided to the game scene, e.g. by overlay to simulate a head-up display [HUD] or displaying a laser sight in a shooting game
- A63F13/533—Controlling the output signals based on the game progress involving additional visual information provided to the game scene, e.g. by overlay to simulate a head-up display [HUD] or displaying a laser sight in a shooting game for prompting the player, e.g. by displaying a game menu
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/50—Controlling the output signals based on the game progress
- A63F13/53—Controlling the output signals based on the game progress involving additional visual information provided to the game scene, e.g. by overlay to simulate a head-up display [HUD] or displaying a laser sight in a shooting game
- A63F13/537—Controlling the output signals based on the game progress involving additional visual information provided to the game scene, e.g. by overlay to simulate a head-up display [HUD] or displaying a laser sight in a shooting game using indicators, e.g. showing the condition of a game character on screen
- A63F13/5378—Controlling the output signals based on the game progress involving additional visual information provided to the game scene, e.g. by overlay to simulate a head-up display [HUD] or displaying a laser sight in a shooting game using indicators, e.g. showing the condition of a game character on screen for displaying an additional top view, e.g. radar screens or maps
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/60—Generating or modifying game content before or while executing the game program, e.g. authoring tools specially adapted for game development or game-integrated level editor
- A63F13/69—Generating or modifying game content before or while executing the game program, e.g. authoring tools specially adapted for game development or game-integrated level editor by enabling or updating specific game elements, e.g. unlocking hidden features, items, levels or versions
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Physics & Mathematics (AREA)
- Optics & Photonics (AREA)
- Radar, Positioning & Navigation (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
The application discloses an auxiliary information prompting method and device, an electronic device, and a storage medium. The method includes: acquiring a current task operation performed by a target object on an interactive interface provided by a target application; acquiring, according to the current task operation, task execution progress information corresponding to the target object in the target application; acquiring, according to that task execution progress information, task execution auxiliary prompt information corresponding to the current task operation; and displaying the task execution auxiliary prompt information to the target object on the interactive interface provided by the target application. The method can avoid problems such as the target object relying on the auxiliary prompt information, or no longer advancing the target application because of the auxiliary prompt information, which improves the effectiveness of information display and the user experience.
Description
Technical Field
The present application relates to the field of data processing technologies, and in particular, to a method and apparatus for prompting auxiliary information, an electronic device, and a storage medium.
Background
With the development of internet technology, the number of games on various electronic device platforms keeps increasing, and users may encounter difficulties while playing a game; faced with such situations, some users choose to obtain game walkthrough information in order to continue advancing the game.
In the prior art, users typically obtain game walkthrough information by searching various gaming websites. However, the situation described by the walkthrough information on those websites usually deviates from the user's actual situation at that moment, so the user cannot be prompted effectively and the effectiveness of displaying the walkthrough information is low; moreover, the user has to leave the current game to look for the walkthrough information, which interrupts the overall game experience.
Disclosure of Invention
The application provides an auxiliary information prompting method, an auxiliary information prompting device, electronic equipment and a storage medium, which can improve the effectiveness of information display.
In one aspect, the present application provides an auxiliary information prompting method, which includes:
acquiring current task operation executed by a target object on an interactive interface provided by a target application;
acquiring task execution progress information corresponding to the target object in the target application according to the current task operation; the task execution progress information corresponding to the target object in the target application comprises the current task execution scene characteristics and current task execution state characteristics of the target object in the target application, or the current task execution state characteristics of the target object in the target application;
Acquiring task execution auxiliary prompt information corresponding to current task operation executed on an interactive interface of the target application according to task execution progress information corresponding to the target object in the target application;
and displaying the task execution auxiliary prompt information to a target object on an interactive interface provided by the target application.
Another aspect provides an auxiliary information prompting apparatus, the apparatus comprising:
the task operation acquisition module is used for acquiring the current task operation executed by the target object on the interactive interface provided by the target application;
the execution progress acquisition module is used for acquiring task execution progress information corresponding to the target object in the target application according to the current task operation; the task execution progress information corresponding to the target object in the target application comprises the current task execution scene characteristics and current task execution state characteristics of the target object in the target application, or the current task execution state characteristics of the target object in the target application;
the prompt information acquisition module is used for acquiring task execution auxiliary prompt information corresponding to the current task operation executed on the interactive interface of the target application according to the task execution progress information corresponding to the target object in the target application;
And the prompt information display module is used for displaying the task execution auxiliary prompt information to the target object on the interactive interface provided by the target application.
In another aspect, an electronic device is provided, where the electronic device includes a processor and a memory, where at least one instruction or at least one program is stored, where the at least one instruction or the at least one program is loaded and executed by the processor to implement the auxiliary information prompting method as described above.
Another aspect provides a computer readable storage medium having stored therein at least one instruction or at least one program, the at least one instruction or the at least one program being loaded and executed by a processor to implement the auxiliary information prompting method as described above.
In another aspect, a computer program product is provided, comprising a computer program, which when executed by a processor implements the auxiliary information prompting method described above.
The application provides an auxiliary information prompting method and device, an electronic device, and a storage medium. The method includes: acquiring a current task operation performed by a target object on an interactive interface provided by a target application; acquiring, according to the current task operation, task execution progress information corresponding to the target object in the target application; acquiring, according to that task execution progress information, task execution auxiliary prompt information corresponding to the current task operation; and displaying the task execution auxiliary prompt information to the target object on the interactive interface provided by the target application. The method can avoid problems such as the target object relying on the auxiliary prompt information, or no longer advancing the target application because of the auxiliary prompt information, which improves the effectiveness of information display and the user experience.
Drawings
In order to more clearly illustrate the embodiments of the application or the technical solutions in the prior art, the drawings that are required in the embodiments or the description of the prior art will be briefly described, it being obvious that the drawings in the following description are only some embodiments of the application, and that other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
Fig. 1 is a schematic diagram of an application scenario of an auxiliary information prompting method provided by an embodiment of the present application;
FIG. 2 is a flowchart of an auxiliary information prompting method according to an embodiment of the present application;
Fig. 3 is a schematic diagram of prompting a scenario game scene by ring-shaped identification information in the auxiliary information prompting method according to an embodiment of the present application;
Fig. 4 is a schematic diagram of prompting an action game scene by ring-shaped identification information in the auxiliary information prompting method according to an embodiment of the present application;
Fig. 5 is a flowchart of determining a target node in the case that the target application has a corresponding scene, in the auxiliary information prompting method according to an embodiment of the present application;
fig. 6 is a schematic diagram of determining a target node corresponding to a position of a target object in the auxiliary information prompting method according to an embodiment of the present application;
Fig. 7 is a flowchart of determining a target node without scene feature information in the auxiliary information prompting method according to an embodiment of the present application;
FIG. 8 is a flowchart of a method for constructing a target progress node tree in the auxiliary information prompting method according to an embodiment of the present application;
fig. 9 is a schematic diagram of a graph structure formed by task progress information, key progress information and object progress information in the auxiliary information prompting method provided by the embodiment of the application;
FIG. 10 is a schematic diagram of a tree structure corresponding to a game task in a game scene according to an auxiliary information prompting method provided by an embodiment of the present application;
FIG. 11 is a schematic diagram of a content association relationship between progress nodes and task auxiliary information in an auxiliary information prompting method according to an embodiment of the present application;
FIG. 12 is a schematic diagram of text analysis results in the auxiliary information prompting method according to the embodiment of the present application;
fig. 13 is a schematic diagram showing auxiliary information in a game scene in the auxiliary information prompting method according to the embodiment of the present application;
fig. 14 is a schematic structural diagram of an auxiliary information prompting device according to an embodiment of the present application;
fig. 15 is a schematic hardware structure of an apparatus for implementing the method provided by the embodiment of the present application.
Detailed Description
The present application will be described in further detail with reference to the accompanying drawings, for the purpose of making the objects, technical solutions and advantages of the present application more apparent. It will be apparent that the described embodiments are only some, but not all, embodiments of the application. All other embodiments, which can be made by those skilled in the art based on the embodiments of the application without making any inventive effort, are intended to be within the scope of the application.
In the description of the present application, it should be understood that the terms "first," "second," and the like are used for descriptive purposes only and are not to be construed as indicating or implying a relative importance or number of technical features indicated. Thus, a feature defining "a first" or "a second" may explicitly or implicitly include one or more such feature. Moreover, the terms "first," "second," and the like, are used to distinguish between similar objects and do not necessarily describe a particular order or precedence. It is to be understood that the data so used may be interchanged where appropriate such that the embodiments of the application described herein may be implemented in sequences other than those illustrated or otherwise described herein.
It will be appreciated that in the specific embodiments of the present application, related data such as user information is involved, and when the above embodiments of the present application are applied to specific products or technologies, user permissions or consents need to be obtained, and the collection, use and processing of related data need to comply with related laws and regulations and standards of related countries and regions.
Referring to fig. 1, which is a schematic diagram of an application scenario of the auxiliary information prompting method provided by an embodiment of the present application, the application scenario includes a client 110 and a server 120. The server 120 obtains object association information of a target object from the client 110, and obtains a pre-stored target progress node tree corresponding to the target application. The server 120 determines, based on the object association information of the target object, a target node corresponding to the target object in the target progress node tree, then acquires the task execution auxiliary prompt information corresponding to the target node and displays the task execution auxiliary prompt information. In a cloud game scenario, the server 120 may also directly record the object association information of the target object corresponding to the client 110.
In an embodiment of the present application, the client 110 includes, but is not limited to, a mobile phone, a computer, an intelligent voice interaction device, an intelligent home appliance, a vehicle-mounted terminal, an aircraft, and the like. The server 120 may be a cloud server, and the server 120 may include a server that operates independently, or a distributed server, or a server cluster that is composed of a plurality of servers. The embodiment of the application can be applied to various scenes, including but not limited to cloud technology, artificial intelligence, intelligent transportation, auxiliary driving and the like.
Referring to fig. 2, an auxiliary information prompting method is shown, which can be applied to a server side, and the method includes:
s210, acquiring current task operation executed by a target object on an interactive interface provided by a target application;
in some embodiments, the current task operation performed by the target object on the interactive interface provided by the target application may be obtained based on an operation instruction input by the target object. The operation instruction input by the target object may be input through an external device such as a keyboard or a mouse. In a cloud game application scenario, the operation instruction input by the user can be recorded by the cloud server, so that the current task operation is obtained.
S220, acquiring task execution progress information corresponding to the target object in the target application according to the current task operation; the task execution progress information corresponding to the target object in the target application comprises current task execution scene characteristics and current task execution state characteristics of the target object in the target application or current task execution state characteristics of the target object in the target application;
in some embodiments, if the target application includes a scene, the task execution progress information may include the current task execution scene feature and the current task execution state feature. If the target application does not include a scene, the task execution progress information may include only the current task execution state feature.
S230, acquiring task execution auxiliary prompt information corresponding to current task operation executed on an interactive interface of the target application according to task execution progress information corresponding to the target object in the target application;
in some embodiments, based on preset association information between task execution progress information and task execution auxiliary prompt information, the task execution auxiliary prompt information corresponding to the current task operation executed on the interactive interface of the target application can be determined according to the task execution progress information corresponding to the target object in the target application. The association information may be established based on the target progress node tree.
S240, displaying task execution auxiliary prompt information to the target object on an interactive interface provided by the target application.
In some embodiments, the target application may be a game application, a game screen may be displayed on the interactive interface, and the task execution auxiliary prompt information may be directly added to the game screen corresponding to the game application. Referring to fig. 3 and fig. 4, fig. 3 is a schematic diagram of a scenario game scene being prompted by the ring-shaped identification information, and fig. 4 is a schematic diagram of an action game scene being prompted by the ring-shaped identification information.
In some embodiments, according to task execution progress information corresponding to a target object in a target application, obtaining task execution auxiliary prompt information corresponding to a current task operation executed on an interactive interface of the target application includes:
determining target nodes corresponding to task execution progress information in a preset target progress node tree, wherein the target progress node tree is generated based on the execution sequence of a plurality of progress nodes corresponding to target applications;
and taking the auxiliary prompt information corresponding to the target node as task execution auxiliary prompt information corresponding to the current task operation.
In some embodiments, the task flow corresponding to the target application may be divided into a plurality of progress nodes; in a game scenario, this means the game flow in the game application is divided into a plurality of micro-levels, where a micro-level is independent of the levels set in the game and represents progress information in the process of advancing the game flow.
The target progress node tree can be generated based on the execution order of the plurality of progress nodes corresponding to the target application, so each progress node corresponds to a piece of progress information in the task flow of the target application. After the task execution progress information is obtained, the progress node matching the task execution progress information can be found based on the target progress node tree; the matched progress node is the target node.
Auxiliary prompt information corresponds to the progress nodes in the target progress node tree, so the auxiliary content to be displayed can be determined from the auxiliary prompt information corresponding to the progress node, and that auxiliary content is displayed based on its corresponding prompt mode information. In this way, the task execution auxiliary prompt information corresponding to the current task operation can be determined.
And determining target nodes corresponding to the task execution progress information based on a preset target progress node tree, so as to obtain task execution auxiliary prompt information, and unifying task flows of different target applications based on the target progress node tree, thereby improving the universality of the auxiliary information prompt method.
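For illustration only, the node lookup described above might be sketched as follows; the ProgressNode structure, its field names, and the matching rules are assumptions introduced here for readability and are not specified by the application.

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class ProgressNode:
    node_id: str
    position: Optional[tuple] = None      # in-scene position bound to this node, if any
    task_branch: Optional[str] = None     # task branch bound to this node, if any
    hint: Optional[dict] = None           # auxiliary prompt information bound to this node
    children: list = field(default_factory=list)

def lookup_hint(root: ProgressNode, progress: dict) -> Optional[dict]:
    """Find the progress node matching the current task execution progress information
    and return its auxiliary prompt information (None if nothing matches)."""
    stack = [root]
    while stack:
        node = stack.pop()
        if progress.get("position") is not None and node.position == progress["position"]:
            return node.hint
        if progress.get("task_branch") is not None and node.task_branch == progress["task_branch"]:
            return node.hint
        stack.extend(node.children)
    return None
```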
In some embodiments, before obtaining the current task operation performed by the target object on the interactive interface provided by the target application, the method further includes:
acquiring an auxiliary prompt starting instruction corresponding to an auxiliary prompt function in a target application;
and in response to the auxiliary prompt starting instruction, preloading a target progress node tree corresponding to the target application.
In some embodiments, the auxiliary prompt starting instruction corresponding to the auxiliary prompt function in the target application may be obtained at any point in the task flow corresponding to the target application, or before the task corresponding to the target application is started. In response to the auxiliary prompt starting instruction, the target progress node tree corresponding to the target application is preloaded, and the subsequent steps of determining the task execution auxiliary prompt information and displaying the task execution auxiliary prompt information are then executed based on this target progress node tree.
Under the condition that an auxiliary prompt starting instruction corresponding to an auxiliary prompt function in the target application is not acquired, the operation of preloading the target progress node tree corresponding to the target application is not executed.
Whether the auxiliary prompt function is started or not is determined based on the selection of the user, so that the auxiliary prompt function can be set in a personalized mode according to the user requirement, and user experience can be improved.
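A minimal sketch of this preloading step, assuming a per-application cache and a hypothetical load_progress_node_tree loader (stubbed here), might look like this:

```python
def load_progress_node_tree(app_id: str) -> dict:
    # Stub: in practice this would fetch the prebuilt target progress node tree
    # for the given application from storage.
    return {"app_id": app_id, "root": None}

_loaded_trees: dict = {}

def on_assist_prompt_enabled(app_id: str) -> None:
    """Preload the target progress node tree when the auxiliary prompt start instruction arrives."""
    if app_id not in _loaded_trees:
        _loaded_trees[app_id] = load_progress_node_tree(app_id)

def get_preloaded_tree(app_id: str):
    """Return the preloaded tree; None means the auxiliary prompt function was never enabled,
    so no tree was loaded and no hints will be produced."""
    return _loaded_trees.get(app_id)
```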
In some embodiments, referring to fig. 5, the task execution progress information includes a current task execution scenario feature and a current task execution status feature, and determining a target node corresponding to the task execution progress information in the target progress node tree includes:
s510, determining object position information of a target object based on the current task execution scene characteristics and the current task execution state characteristics;
s520, matching position information corresponding to progress nodes in the target progress node tree with object position information to obtain a position matching result;
s530, taking a progress node corresponding to the position matching result as a target node under the condition that the position matching result indicates matching.
In some embodiments, the current task execution state feature may be derived from operation information associated with the task movement state in the current task operation. The current task execution scene feature may be derived from operation information associated with a scene in the current task operation; it may be map information corresponding to that operation information, the map information does not need to include rendering materials, and the map information may be the map the target object is currently moving through or the target map to which the target object is transferred.
The method can be applied to cloud games. A cloud game is a game mode based on cloud computing: in the cloud game operating mode, all games run on the server side, and the rendered game pictures are compressed and transmitted to the user over the network. The server can therefore record the movement operation instructions input by the user in the cloud game operating mode, so as to obtain the current task execution state feature.
In the case that the target application has a corresponding scene, the movement of the target object within its scene can be determined based on the current task execution scene feature and the current task execution state feature, thereby determining the object position information of the target object. Some progress nodes in the target progress node tree are progress nodes corresponding to position information in the scene; matching the position information corresponding to such a progress node with the object position information of the target object yields a position matching result. When the position matching result indicates no match, it can be determined that the target object has not moved to the position corresponding to the progress node, and that progress node is not the target node. When the position matching result indicates a match, it can be determined that the target object has moved to the position corresponding to a progress node, and that progress node is the target node.
Referring to fig. 6, fig. 6 is a schematic diagram of determining a target node corresponding to the position of the target object. The progress node identified as 'locked door' is a progress node whose corresponding position information is 'in front of the door' and whose scene feature information is map information. The movement operation information input through the keyboard is acquired, and based on the map information and the movement operation information it can be determined that the target object has moved to the position in front of the door in the scene, so that the object position information is obtained. Matching the object position information with the position information shows that the two match, and therefore the progress node identified as 'locked door' can be determined as the target node.
Under the condition that the target application has a corresponding scene, the scene characteristic information can be referred to, and the accuracy of determining the object position information is improved, so that the accuracy of determining the target node is improved.
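The position matching of S510-S530 could be sketched roughly as follows; the coordinate representation, the start point, and the tolerance are illustrative assumptions, not details given by the application.

```python
from typing import Optional

def derive_object_position(map_info: dict, move_ops: list) -> tuple:
    """S510: accumulate the recorded movement operations on top of the map's start point."""
    x, y = map_info.get("start", (0.0, 0.0))
    for dx, dy in move_ops:
        x, y = x + dx, y + dy
    return (x, y)

def match_node_by_position(nodes: list, obj_pos: tuple, tolerance: float = 1.0) -> Optional[dict]:
    """S520/S530: compare each node's position information with the object position;
    return the first node whose position matches within the tolerance."""
    for node in nodes:
        pos = node.get("position")
        if pos is None:
            continue
        if abs(pos[0] - obj_pos[0]) <= tolerance and abs(pos[1] - obj_pos[1]) <= tolerance:
            return node
    return None

# Example: the target object has moved to the position bound to the "locked door" node.
nodes = [{"node_id": "locked door", "position": (12.0, 3.0)}]
obj_pos = derive_object_position({"start": (10.0, 3.0)}, [(1.0, 0.0), (1.0, 0.0)])
print(match_node_by_position(nodes, obj_pos))   # -> the "locked door" node
```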
In some embodiments, referring to fig. 7, the task execution progress information includes a current task execution status feature, and determining a target node corresponding to the task execution progress information in the target progress node tree includes:
s710, determining object task branch information of a target object based on the current task execution state characteristics;
S720, matching task branch information corresponding to progress nodes in the target progress node tree with object task branch information to obtain task branch matching results;
s730, taking a progress node corresponding to the task branch matching result as a target node under the condition that the task branch matching result indicates matching.
In some embodiments, in the case that the target application has no corresponding scene, or it has a corresponding scene but has progressed to a non-scene stage, the target node may be determined only from the current task execution state feature corresponding to the target object. The current task execution state feature may be derived from operation information associated with the task interaction state in the current task operation, and the object task branch information may be determined based on the current task execution state feature.
The method can be applied to cloud games. A cloud game is a game mode based on cloud computing: in the cloud game operating mode, all games run on the server side, and the rendered game pictures are compressed and transmitted to the user over the network. The server can therefore record the operation steps of the user in the cloud game operating mode, so as to obtain the interactive operation information.
Progress nodes in the target progress node tree may correspond to task branch information in the target application. The object task branch information derived from the current task execution state feature is matched with the task branch information corresponding to the progress nodes to obtain a task branch matching result; when the task branch matching result indicates no match, the corresponding progress node is not taken as the target node, and when the task branch matching result indicates a match, the corresponding progress node can be taken as the target node.
For example, when the target application is a choice-based game, the option operation input by the target object can be obtained to yield the current task execution state feature; based on that feature, the scenario branch the target object is currently in can be located, yielding the object task branch information, and the progress node whose task branch information matches the object task branch information is taken as the target node. Referring again to fig. 6, the progress node identified as 'replenish blood volume' is not a scene-related progress node and does not correspond to position information. An interactive operation input by the user is acquired, which may be picking up a medicine bottle. Based on the current task execution state feature corresponding to this interactive operation, the object task branch information of the target object can be determined to be replenishing blood volume; if this object task branch information matches the task branch information corresponding to the progress node identified as 'replenish blood volume', that progress node can be determined as the target node.
Under the condition that the target application does not have a corresponding scene, the target node corresponding to the target object can be determined directly based on the current task execution state characteristics, and the efficiency of determining the target node can be improved.
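The branch matching of S710-S730 might be sketched as below; the mapping from interactive operations to branch labels is an assumption used only to make the example concrete.

```python
from typing import Optional

# Illustrative mapping from recorded interactive operations to task branch labels.
OPERATION_TO_BRANCH = {
    "pick_up_medicine_bottle": "replenish blood volume",
    "choose_dialogue_option_2": "story branch 2",
}

def derive_object_branch(interactive_ops: list) -> Optional[str]:
    """S710: derive the object task branch information from the latest recognizable operation."""
    for op in reversed(interactive_ops):
        if op in OPERATION_TO_BRANCH:
            return OPERATION_TO_BRANCH[op]
    return None

def match_node_by_branch(nodes: list, object_branch: Optional[str]) -> Optional[dict]:
    """S720/S730: return the progress node whose task branch information matches."""
    if object_branch is None:
        return None
    for node in nodes:
        if node.get("task_branch") == object_branch:
            return node
    return None

nodes = [{"node_id": "replenish blood volume", "task_branch": "replenish blood volume"}]
branch = derive_object_branch(["move_forward", "pick_up_medicine_bottle"])
print(match_node_by_branch(nodes, branch))   # -> the "replenish blood volume" node
```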
In some embodiments, referring to fig. 8, before obtaining the target progress node tree corresponding to the target application, the method further includes:
s810, acquiring first type node information corresponding to a target application and second type node information corresponding to the target application;
in some embodiments, the first type node information and the second type node information may each represent progress information corresponding to the target application at any task stage, where the first type node information is task progress information and the second type node information is object progress information and key progress information.
S820, determining third type node information corresponding to progress execution conditions of the first type node information from the second type node information;
in some embodiments, the second type node information includes third type node information and fourth type node information. The third type node information is the progress execution condition corresponding to the task progress contained in the first type node information, and is a progress node preceding the first type node information. For example, the progress node corresponding to the first role state and the progress node corresponding to the key are third type node information within the second type node information, the progress node corresponding to the locked door is first type node information, and the task progress of opening the locked door can only be executed when the first role state and the key are both satisfied; that is, the progress node corresponding to the first role state and the progress node corresponding to the key are the progress execution conditions of the progress node corresponding to the locked door.
S830, determining fourth type node information corresponding to the progress execution result of the first type node information from the second type node information;
in some embodiments, the fourth type node information is the progress execution result corresponding to the task progress contained in the first type node information, and is a progress node following the first type node information. For example, the progress node corresponding to 'find a sword' is fourth type node information within the second type node information, the progress node corresponding to the locked door is first type node information, and the sword can only be obtained after the task progress of opening the locked door is completed; that is, the progress node corresponding to 'find a sword' is the progress execution result of the progress node corresponding to the locked door.
S840, determining a first association relationship between the first type node information and the third type node information;
in some embodiments, the first association relationship indicates an execution order between the first type node information and a progress execution condition corresponding to the first type node information.
S850, determining a second association relationship between the first type node information and the fourth type node information;
in some embodiments, the second association relationship indicates an execution order between the first type node information and a progress execution result corresponding to the first type node information.
S860, constructing a target progress node tree by taking the first type node information and the second type node information as progress nodes and taking the first association relation and the second association relation as edges.
In some embodiments, the first type node information and the second type node information are taken as progress nodes, and the progress nodes are combined according to the execution sequence corresponding to the first association relation and the execution sequence corresponding to the second association relation, so that a target progress node tree is constructed.
In some embodiments, first type node information corresponding to the target application and second type node information corresponding to the target application are obtained. The first type node information is task progress information in the target application, and the second type node information may include object progress information and key progress information. In a game scene, the first type node information is micro-level progress information in the game, representing the game progress advanced by the user, where a micro-level may be defeating an enemy, acquiring a prop, or the like. The object progress information in the second type node information may be the current state of the target object, such as blood volume and equipment in a Role-Playing Game (RPG), the information the user currently owns in an Adventure Game (AVG), the current operating state in a Strategy Game (SLG), and so on. The key progress information in the second type node information may be an element required to pass the current level, such as a key, a key prop, or a key enemy.
The progress execution condition of a piece of first type node information may correspond to at least one piece of second type node information. Referring to fig. 9, fig. 9 is a schematic diagram of the graph structure formed by the task progress information, key progress information and object progress information in the target application, in which the task progress information can be reached when the key progress information and the object progress information are satisfied. Multiple groups of this same graph structure are associated based on the progress execution order, thereby obtaining the target progress node tree. In addition, the progress execution result of a piece of first type node information may correspond to one piece of second type node information. After a certain piece of task progress information is completed, a certain piece of object progress information may be reached; for example, after replenishing blood volume the object may reach a full-blood-volume state. Alternatively, certain key progress information may be obtained after certain task progress information is completed; for example, a key prop may be obtained after opening a locked door.
Therefore, from the second type node information, the third type node information corresponding to the progress execution condition of the first type node information can be determined, and the fourth type node information corresponding to the progress execution result of the first type node information can be determined. After determining the first association relationship between the first type node information and the third type node information and determining the second association relationship between the first type node information and the fourth type node information, the target progress node tree can be constructed by taking the first type node information and the second type node information as progress nodes and taking the first association relationship and the second association relationship as edges. Because the progress nodes in the target progress node tree are arranged according to the execution sequence, the third type node information corresponding to each first type node information can be the fourth type node information corresponding to the previous first type node information.
Referring to fig. 10, fig. 10 is a schematic diagram of the tree structure corresponding to a game task in a game scene. The progress node 'monster', the progress node 'locked door' and the progress node 'blood volume supplement' are first type node information; the progress node 'key', the progress node 'find a sword', the progress node 'medicament', the progress node 'first role state', the progress node 'second role state' and the progress node 'third role state' are second type node information.
When the progress node 'key' and the progress node 'first role state' are both satisfied, the progress node 'locked door' can be reached; there is a first association relationship between the progress node 'key' and the progress node 'locked door', and a first association relationship between the progress node 'first role state' and the progress node 'locked door'.
When the progress node 'medicament' and the progress node 'second role state' are both satisfied, the progress node 'blood volume supplement' can be reached; there is a first association relationship between the progress node 'medicament' and the progress node 'blood volume supplement', and a first association relationship between the progress node 'second role state' and the progress node 'blood volume supplement'.
When the progress node 'find a sword' and the progress node 'third role state' are both satisfied, the progress node 'monster' can be reached; there is a first association relationship between the progress node 'find a sword' and the progress node 'monster', and a first association relationship between the progress node 'third role state' and the progress node 'monster'.
After the execution of the progress node 'locked door' is completed, the progress node 'find a sword' can be reached, so there is a second association relationship between the progress node 'locked door' and the progress node 'find a sword'. After the execution of the progress node 'blood volume supplement' is completed, the progress node 'third role state' can be reached, so there is a second association relationship between the progress node 'blood volume supplement' and the progress node 'third role state'.
The target progress node tree is constructed based on the graph structure formed by the task progress information, key progress information and object progress information, and on the progress execution order. This smooths out the differences among different game types and different level types, generates node trees with a uniform structure, and lets the user's current progress correspond to a node in the node tree, which makes determining the target node convenient and thereby improves the universality of auxiliary information prompting.
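A compact sketch of the construction in S810-S860, using part of the Figure 10 example; the Node class and the edge lists are assumed representations of the first and second association relationships, not a form prescribed by the application.

```python
from dataclasses import dataclass, field

@dataclass
class Node:
    name: str
    kind: str                                        # "task" (first type) or "object"/"key" (second type)
    conditions: list = field(default_factory=list)   # first association: conditions before this task
    results: list = field(default_factory=list)      # second association: results after this task

def build_progress_tree(all_nodes, condition_edges, result_edges):
    """S840-S860: wire the nodes together using the first and second association relationships."""
    index = {n.name: n for n in all_nodes}
    for cond_name, task_name in condition_edges:     # first association relationship
        index[task_name].conditions.append(index[cond_name])
    for task_name, result_name in result_edges:      # second association relationship
        index[task_name].results.append(index[result_name])
    return index

# Part of the Figure 10 example expressed with this structure:
nodes = [Node("key", "key"), Node("first role state", "object"),
         Node("locked door", "task"), Node("find a sword", "key")]
tree = build_progress_tree(
    nodes,
    condition_edges=[("key", "locked door"), ("first role state", "locked door")],
    result_edges=[("locked door", "find a sword")],
)
print([c.name for c in tree["locked door"].conditions])   # ['key', 'first role state']
```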
In some embodiments, before obtaining the target progress node tree corresponding to the target application, the method further comprises:
task auxiliary information corresponding to a target application is obtained;
and determining auxiliary prompt information corresponding to each progress node in the target progress node tree based on the content association relation between the progress nodes in the target progress node tree and the auxiliary content in the task auxiliary information.
In some embodiments, after the target progress node tree is constructed, task auxiliary information corresponding to the target application may be acquired. The task auxiliary information can be preset task auxiliary information, and can also be obtained by performing content analysis processing on the original auxiliary information.
The task auxiliary information can comprise auxiliary identifications, auxiliary contents, prompt positions, prompt mode information, preset prompt time and the like, the auxiliary identifications can comprise identifications of different types such as circles, arrows and the like, the auxiliary contents can comprise props, clues and the like, the prompt positions can be represented by coordinates on an interactive interface, and the prompt mode information can comprise voice prompt information, flash prompt information and the like. The progress node in the target progress node tree has a content association relationship with auxiliary content in the task auxiliary information, and the task auxiliary information with the content association relationship with the progress node can be used as auxiliary prompt information of the progress node.
Referring to fig. 11, fig. 11 is a schematic diagram of the content association relationship between progress nodes and task auxiliary information. Task auxiliary information 1 includes auxiliary identifier 1, the auxiliary content 'key', the prompt position 'x-axis coordinate and y-axis coordinate', the prompt mode information 'aperture', and the preset prompt time 'not found within 10 minutes'. The auxiliary content 'key' in task auxiliary information 1 corresponds to the progress node 'key' and has a content association relationship with it, so task auxiliary information 1 is the auxiliary prompt information corresponding to the progress node 'key'.
Task auxiliary information 2 includes auxiliary identifier 2, the auxiliary content 'keyhole', the prompt position 'x-axis coordinate and y-axis coordinate', the prompt mode information 'aperture', and the preset prompt time 'not found within 5 minutes in front of the door'. The auxiliary content 'keyhole' in task auxiliary information 2 corresponds to the progress node 'locked door' and has a content association relationship with it, so task auxiliary information 2 is the auxiliary prompt information corresponding to the progress node 'locked door'.
Task auxiliary information 3 includes auxiliary identifier 3, the auxiliary content 'monster weak point', the prompt position 'x-axis coordinate and y-axis coordinate', the prompt mode information 'aperture and sound', and the preset prompt time 'during combat'. The auxiliary content 'monster weak point' in task auxiliary information 3 corresponds to the progress node 'monster' and has a content association relationship with it, so task auxiliary information 3 is the auxiliary prompt information corresponding to the progress node 'monster'.
Task auxiliary information 4 includes auxiliary identifier 4, the auxiliary content 'medicament', the prompt position 'x-axis coordinate', the prompt mode information 'flash', and the preset prompt time 'after entering the target place'. The auxiliary content 'medicament' in task auxiliary information 4 corresponds to the progress node 'medicament' and has a content association relationship with it, so task auxiliary information 4 is the auxiliary prompt information corresponding to the progress node 'medicament'.
The task auxiliary information is associated with the progress nodes in the target progress node tree, so that after the target node is determined, the auxiliary prompt information corresponding to the target node, namely the task execution auxiliary prompt information, can be displayed to the target object to prompt it. The prompt can thus be displayed directly during the execution of the target application, which improves the intuitiveness of the auxiliary information prompt and improves the user experience.
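A sketch of binding task auxiliary information to progress nodes via an explicit content association table, using Figure 11-style records; the field names and values are illustrative assumptions.

```python
def attach_hints(nodes: dict, aux_records: list, content_assoc: dict) -> None:
    """Bind each task auxiliary record to the progress node its auxiliary content is associated with."""
    for record in aux_records:
        node_name = content_assoc.get(record["auxiliary_content"])
        if node_name in nodes:
            nodes[node_name].setdefault("hints", []).append(record)

content_assoc = {"key": "key", "keyhole": "locked door", "monster weak point": "monster"}
nodes = {"key": {}, "locked door": {}, "monster": {}}
attach_hints(nodes, [
    {"auxiliary_id": 2, "auxiliary_content": "keyhole",
     "prompt_position": (300, 80), "prompt_mode": "aperture",
     "preset_prompt_time": "not found within 5 minutes in front of the door"},
], content_assoc)
print(nodes["locked door"]["hints"][0]["auxiliary_id"])   # 2
```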
In some embodiments, obtaining task assistance information corresponding to a target application includes:
performing content analysis processing on original auxiliary information corresponding to a target application to obtain key auxiliary content and operation auxiliary information corresponding to the key auxiliary content;
task assistance information is generated based on the key assistance content and the operation assistance information.
In some embodiments, in the case that the target application has no preset task auxiliary information, the original auxiliary information corresponding to the target application may be obtained; the original auxiliary information may be auxiliary information for the target application written by users. The original auxiliary information corresponding to the target application can be input into a preset content analysis model, which performs content analysis processing on it to obtain key auxiliary content and the operation auxiliary information corresponding to the key auxiliary content. The key auxiliary content is the core content for advancing the task progress in the target application. The operation auxiliary information is descriptive information associated with the key auxiliary content, and may include a place, a person, and the like. Based on the key auxiliary content and the operation auxiliary information, information such as the auxiliary content and the prompt position can be determined, so that the task auxiliary information is generated.
The content analysis model may include a semantic analysis model that performs content analysis on text information in the original auxiliary information and an image analysis model that performs content analysis on image information in the original auxiliary information.
Referring to fig. 12, fig. 12 is a schematic diagram of a text content analysis result. The 'marked tree' is key auxiliary content, while 'return home' and 'target person 1' are operation auxiliary information; the key auxiliary content and the operation auxiliary information are labeled with different marks and added to the task auxiliary information.
The original auxiliary information can be analyzed in a content analysis processing mode, so that task auxiliary information is generated based on the original auxiliary information, and the efficiency of acquiring the task auxiliary information can be improved.
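The content analysis model itself is not specified here; the following is a deliberately crude, rule-based stand-in that only illustrates the shape of the output (key auxiliary content plus its operation auxiliary information). A real system would use a trained semantic or image analysis model instead of keyword rules.

```python
import re

# Keyword rules standing in for the semantic analysis model (illustrative only).
KEY_CONTENT_PATTERNS = [r"marked tree", r"key", r"sword"]
OPERATION_PATTERNS = [r"return home", r"target person \d+"]

def analyze_walkthrough(text: str) -> dict:
    """Extract key auxiliary content and the operation auxiliary information describing it."""
    key_content = [m.group(0) for p in KEY_CONTENT_PATTERNS
                   for m in re.finditer(p, text, flags=re.IGNORECASE)]
    operations = [m.group(0) for p in OPERATION_PATTERNS
                  for m in re.finditer(p, text, flags=re.IGNORECASE)]
    return {"key_auxiliary_content": key_content,
            "operation_auxiliary_information": operations}

print(analyze_walkthrough("Return home and talk to target person 1 near the marked tree."))
```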
In some embodiments, presenting task execution assistance information to a target object on an interactive interface provided by a target application includes:
continuously detecting the operation time corresponding to the target object in the execution process of the target node;
and under the condition that the operation time reaches the preset prompting time in the task execution auxiliary prompting information, displaying the task execution auxiliary prompting information.
In some embodiments, the auxiliary prompt information includes a preset prompt time. The operation time corresponding to the target object during the execution of the target node can be continuously detected and compared with the preset prompt time; when the operation time matches the preset prompt time, that is, when the operation time is detected to have reached the preset prompt time in the task execution auxiliary prompt information, the task execution auxiliary prompt information is displayed to the target object.
By detecting the operation time, the display of the task execution auxiliary prompt information is delayed, which can improve the interactivity experienced by the user when using the target application and thus improve the user experience.
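A minimal sketch of the delayed display: poll the elapsed operation time for the current target node and show the hint once the preset prompt time is reached. The polling loop and the numeric preset_time_seconds field are assumptions; a real implementation would hook into the game loop rather than sleep.

```python
import time

def show_hint_when_due(hint: dict, node_entered_at: float, display) -> None:
    """Continuously compare the elapsed operation time with the hint's preset prompt time."""
    preset_delay = hint.get("preset_time_seconds", 0)
    while True:
        elapsed = time.monotonic() - node_entered_at
        if elapsed >= preset_delay:
            display(hint)        # e.g. draw the aperture/flash marker at the prompt position
            return
        time.sleep(0.5)
```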
In some embodiments, referring to fig. 13, which is a schematic diagram of displaying auxiliary information in a game scene, the target object may be a user. In a preprocessing stage, the task flow of the game level tasks in the target game can be divided, so as to obtain a plurality of micro game levels corresponding to each game level task. A micro game level represents a stage task in the execution of a game level task; when one micro game level is completed, the execution progress of the corresponding game level task changes. A stage task may be performed based on the key information when the stage task is performed and the state information of the user when the stage task is performed.
For example, a user receives game task 1 from the non-player character (NPC) who issues the task. The task flow of game task 1 is to obtain a certain material A, trade it with the NPC at a target trading place to obtain material B, and then return material B to the task-issuing NPC to obtain a target clue. Game task 1 can accordingly be divided into three micro game levels: an acquisition stage task, a trading stage task, and a clue acquisition stage task.
The acquisition stage task is to acquire material A; its corresponding key information is the place where material A is acquired, and its corresponding state information is the user state related to the acquisition. The trading stage task is to acquire material B; its corresponding key information is trading material A with the NPC at the target trading place, and its corresponding state information is the user state related to the target trading place. The clue acquisition stage task is to obtain the target clue; its corresponding key information is giving material B to the task-issuing NPC, and its corresponding state information is the user state related to the task-issuing NPC.
Each micro game level is determined to be first type node information corresponding to the game; the first type node information can be the task progress information in the game's task flow. Based on the key progress information and the user's state progress information when each micro game level is executed, the second type node information can be determined; the second type node information can be the key progress information and corresponding state progress information that serve as the progress execution conditions or progress execution results of the task progress information.
According to the task flow of the game, the execution sequence among the nodes can be determined. Based on this execution sequence, the first type node information and the second type node information can be combined to generate a target progress node tree corresponding to the game level task. The game level tasks may include a main line task and branch line tasks, and the main line task and a branch line task may be unrelated, so there may be multiple target progress node trees. When determining the task execution auxiliary prompt information, the current game level task being executed by the user may be determined first, and the target node is then determined from the target progress node tree corresponding to the current game level task based on that task and the corresponding current game progress.
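A minimal sketch of how such a tree could be assembled is shown below; the data layout (ids, "condition"/"result" keys) is an assumption for illustration and not prescribed by the application:

```python
def build_target_progress_node_tree(second_type_nodes, execution_order):
    """Combine first/second type node information into a progress node tree.

    second_type_nodes: dict mapping a first-type node id to
        {"condition": [...third-type ids...], "result": [...fourth-type ids...]}
    execution_order: list of first-type node ids in execution sequence
    """
    tree = {"nodes": set(), "edges": []}
    for node in execution_order:
        tree["nodes"].add(node)
        # third type: second-type nodes describing the progress execution condition
        for cond in second_type_nodes.get(node, {}).get("condition", []):
            tree["nodes"].add(cond)
            tree["edges"].append((cond, node))   # first association relationship
        # fourth type: second-type nodes describing the progress execution result
        for res in second_type_nodes.get(node, {}).get("result", []):
            tree["nodes"].add(res)
            tree["edges"].append((node, res))    # second association relationship
    # chain the stage tasks according to the execution sequence
    for prev, nxt in zip(execution_order, execution_order[1:]):
        tree["edges"].append((prev, nxt))
    return tree
```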
When the task auxiliary information corresponding to the game is acquired, the task auxiliary information can be a target game walkthrough. For a newly added game, the target game walkthrough associated with the target progress node tree can be obtained directly from the game producer; for a game released before the newly added game, the content of an existing original game walkthrough can be analyzed to obtain the target game walkthrough. A stage task execution strategy corresponding to the task progress information, the key progress information or the state progress information can then be determined based on the target game walkthrough. The stage task execution strategy is associated with the corresponding progress node in the target progress node tree, so that auxiliary prompt information corresponding to each progress node can be obtained. The auxiliary prompt information is displayed to the user before the user triggers the stage task corresponding to the progress node.
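One way to associate walkthrough content with progress nodes is simple keyword matching during content analysis; the sketch below assumes the guide has already been split into (keywords, strategy text) pairs, which is an assumption rather than the application's stated method:

```python
def attach_auxiliary_prompts(tree, guide_entries):
    """Associate stage task execution strategies from a game walkthrough with progress nodes.

    guide_entries: list of (keywords, strategy_text) pairs extracted by content analysis.
    Returns {node_id: strategy_text} for nodes whose id matches any keyword.
    """
    prompts = {}
    for node in tree["nodes"]:
        for keywords, strategy in guide_entries:
            if any(k.lower() in str(node).lower() for k in keywords):
                prompts[node] = strategy
                break
    return prompts
```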
The auxiliary prompt function is started based on an auxiliary prompt starting instruction input by the user during or before game play, and the user can close the auxiliary prompt function at any time while playing. When the auxiliary prompt function is started, the target progress node tree may be preloaded so that the current game progress of the user can be determined during play based on the target progress node tree.
When there are multiple target progress node trees, the target progress node tree to be loaded may be determined based on the task execution progress information at the time the user starts the auxiliary prompt function. If the auxiliary prompt function is started before the user plays the game, that is, when the task execution progress is 0, the target progress node tree corresponding to the main line task can be loaded by default. If the auxiliary prompt function is started during game play, the current game level task corresponding to the user is determined based on the task execution progress information corresponding to the current task operation of the user, and the target progress node tree corresponding to the current game level task is then loaded.
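This loading rule can be condensed into a short sketch (the function and parameter names are hypothetical):

```python
def select_tree_to_load(progress, main_line_tree, trees_by_task, current_task=None):
    """Choose which target progress node tree to preload when the prompt function starts."""
    if progress == 0 or current_task is None:
        return main_line_tree              # before play: default to the main line task tree
    return trees_by_task[current_task]     # during play: tree of the current game level task
```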
The current game progress of the user can be determined through a background thread; in a cloud game scenario, this is a background thread of the cloud server. When determining the current game progress of the user, the background thread can obtain the game type of the game for which the auxiliary prompt function is started and determine, based on the game type, whether map data needs to be loaded. For example, if the game type is RPG, map data needs to be loaded; if the game type is AVG, map data does not need to be loaded.
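A minimal sketch of that decision, assuming a simple lookup table keyed by game type (the table contents are an assumption beyond the RPG/AVG examples given):

```python
# Hypothetical mapping from game type to whether map data must be loaded
MAP_REQUIRED_BY_TYPE = {"RPG": True, "AVG": False}

def needs_map_data(game_type: str) -> bool:
    """Background-thread decision: load map data only for scene-based game types."""
    return MAP_REQUIRED_BY_TYPE.get(game_type, False)
```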
The current task operation of the target object on the interactive interface provided by the target application can be obtained by recording the operation information input through external devices of the user, such as a mouse, a keyboard, a gamepad or virtual reality equipment. The current task execution scene feature may be derived from the operation information related to the scene in the current task operation, such as a movement operation or a teleport operation input by the user. The current task execution state feature may be derived from the operation information related to the task execution state in the current task operation. When the target application includes a scene, the task execution state may be a movement execution state, such as a movement direction or a movement distance. When the target application does not include a scene, the task execution state may be an interactive execution state, such as target branch information selected by the user in a branch selection.
When the map data is loaded, the background thread can obtain the map loading information corresponding to the movement operation or teleport operation in the current task operation and read, from the loaded map data, the map data corresponding to the map loading information, so that the current task execution scene feature can be determined based on that map data. The current task execution state feature can be obtained based on the operation information related to the task execution state in the current task operation. The current task execution scene feature and the current task execution state feature constitute the task execution progress information in the game and can represent the current game progress.
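The sketch below illustrates one way the raw operation records could be split into scene features and state features; the record layout is assumed for illustration only:

```python
def derive_progress_features(operations, loaded_map_data):
    """Split raw operation records into scene features and state features.

    operations: list of dicts such as {"kind": "move", "map_chunk": "area_3", "direction": "N"}
    loaded_map_data: dict mapping map chunk id -> map data preloaded by the background thread
    """
    scene_features, state_features = [], []
    for op in operations:
        if op["kind"] in ("move", "teleport"):
            chunk = loaded_map_data.get(op.get("map_chunk"))
            if chunk is not None:
                scene_features.append(chunk)  # current task execution scene feature
            state_features.append((op["kind"], op.get("direction"), op.get("distance")))
        elif op["kind"] == "select_branch":
            state_features.append(("branch", op["choice"]))  # interactive execution state
    return scene_features, state_features
```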
The user does not need to be prompted at every progress node, so the process of determining the current game progress can be executed cyclically, and the task execution progress information can be updated in real time; that is, the current game progress can be updated in real time. When the prompt condition is satisfied, the target node corresponding to the current task execution progress information is determined, and the auxiliary prompt information corresponding to the target node is taken as the task execution auxiliary prompt information. The prompt condition can be a preset prompt time; when the operation time of the user reaches the preset prompt time, the task execution auxiliary prompt information is determined and displayed. The user may change the preset prompt time.
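A hedged sketch of this cyclic check follows; the polling interval, callbacks and return behavior are assumptions made to keep the example self-contained:

```python
import time

def prompt_loop(get_progress, find_target_node, prompts, preset_prompt_time, poll_interval=1.0):
    """Cyclically refresh the game progress and return a prompt once the prompt condition holds."""
    start = time.monotonic()
    while True:
        progress = get_progress()                 # task execution progress updated in real time
        operation_time = time.monotonic() - start
        if operation_time >= preset_prompt_time:  # prompt condition: operation time reached
            node = find_target_node(progress)     # target node for the current progress
            return prompts.get(node)              # task execution auxiliary prompt information
        time.sleep(poll_interval)
```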
After the current game progress is obtained, a corresponding target node can be determined in the target progress node tree based on the current task execution state characteristics, or the current task execution scene characteristics and the current task execution state characteristics.
When the map data is loaded, the server may also read the scene operation setting information in the operation setting information, such as mouse sensitivity and field of view, which serves as auxiliary information for the current task execution scene feature and the current task execution state feature. For example, if the user adjusts the speed setting in the mouse sensitivity, the actual operation speed of the user in the game may be determined based on that speed setting, so that the target node is determined in the target progress node tree based on the actual operation speed, the current task execution scene feature corresponding to the current task operation, and the current task execution state feature corresponding to the current task operation.
Accordingly, when determining the current position of the user, object position information corresponding to the target object may be determined based on the scene operation setting information, the current task execution scene feature, and the current task execution state feature, and a target node corresponding to the object position information may be determined in the target progress node tree.
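A minimal sketch of the position matching step is given below, assuming each scene progress node carries 2D coordinates and a tolerance radius; both are illustrative assumptions:

```python
def find_target_node_by_position(tree_nodes, object_position, tolerance=1.0):
    """Match the object position against scene progress nodes carrying position information.

    tree_nodes: list of dicts like {"id": ..., "position": (x, y)}  (hypothetical layout)
    object_position: (x, y) derived from scene/state features and scene operation settings
    """
    ox, oy = object_position
    for node in tree_nodes:
        px, py = node["position"]
        if abs(px - ox) <= tolerance and abs(py - oy) <= tolerance:
            return node["id"]   # position matching result indicates a match
    return None
```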
When the map data is not loaded, the object task branch information of the user, which may correspond to a branching storyline in the game, may be determined based on the current task execution state feature. For example, some games enter different routes, and thus different branching storylines, based on the options the user selects; the current task execution state feature can then be determined from the selection operation of the user, and the branching storyline entered by the user can be determined from the current task execution state feature.
The server may also read the device operation setting information in the operation setting information, for example the key settings of a gamepad or a keyboard, which can assist in determining the object task branch information. For example, in both game A and game B the user may select different options to enter different branching storylines; button 1 of the gamepad in game A is used to view held items, while button 2 of the gamepad in game B is used to view held items. When a game stage task of game A is executed, the device operation setting information "button 1 is used to view held items" may be read, and when a game stage task of game B is executed, the device operation setting information "button 2 is used to view held items" may be read.
Accordingly, it is possible to determine object task branch information of the user based on the device operation setting information and the current task execution state characteristics, and determine a target node corresponding to the object task branch information in the target progress node tree.
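The branch matching step could be sketched as below; the shape of the device settings and state features is assumed purely for illustration:

```python
def find_target_node_by_branch(tree_nodes, device_settings, state_features):
    """Map raw inputs to branch choices via device operation settings, then match branch nodes.

    device_settings: e.g. {"button_1": "view_held_items"}            (hypothetical)
    state_features:  e.g. [("key", "button_1"), ("branch", "route_B")]
    """
    # translate raw key events into semantic actions using the device operation settings
    actions = [device_settings.get(value, value)
               for kind, value in state_features if kind in ("key", "branch")]
    for node in tree_nodes:
        if node.get("branch") in actions:
            return node["id"]   # task branch matching result indicates a match
    return None
```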
Based on the auxiliary prompt information corresponding to each progress node in the target progress node tree, the task execution auxiliary prompt information corresponding to the target node can be determined. The task execution auxiliary prompt information can be a game strategy to be displayed; it is the auxiliary prompt information displayed before the target stage task is triggered and may be presented to the user. The task execution auxiliary prompt information can include an auxiliary identification, auxiliary content, a prompt position, prompt mode information, a preset prompt time and the like. The auxiliary identification can include identifications of different types such as circles and arrows, the auxiliary content can include props, clues and the like, the prompt position can be represented by coordinates on the interactive interface, and the prompt mode information can include voice prompt information, flash prompt information and the like.
When the task execution auxiliary prompt information is displayed, the auxiliary content corresponding to the prompt position on the interactive interface can be marked based on the auxiliary identification, and the user is prompted at the preset prompt time based on the preset prompt mode information.
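The fields listed above might be bundled into a single prompt record; the sketch below assumes a hypothetical `ui` rendering interface and field names not defined by the application:

```python
from dataclasses import dataclass

@dataclass
class TaskExecutionPrompt:
    """Illustrative container for the prompt fields described above."""
    marker: str            # auxiliary identification, e.g. "circle" or "arrow"
    content: str           # auxiliary content, e.g. a prop or clue description
    position: tuple        # prompt position as interactive-interface coordinates (x, y)
    mode: str              # prompt mode information, e.g. "voice" or "flash"
    prompt_time: float     # preset prompt time, in seconds of operation time

def show_prompt(ui, prompt: TaskExecutionPrompt):
    """Mark the auxiliary content at the prompt position and trigger the configured prompt mode."""
    ui.draw_marker(prompt.marker, at=prompt.position)  # ui is a hypothetical display interface
    ui.notify(prompt.content, mode=prompt.mode)
```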
The embodiment of the application provides an auxiliary information prompting method, which can construct a node tree with a unified structure aiming at different types of target applications, determine the current task progress of a target object based on the node tree and object association information, thereby displaying task execution auxiliary prompting information corresponding to the current task progress.
The embodiment of the application also provides an auxiliary information prompting device, referring to fig. 14, the device comprises:
the task operation obtaining module 1410 is configured to obtain a current task operation that is executed by the target object on an interactive interface provided by the target application;
The execution progress obtaining module 1420 is configured to obtain task execution progress information corresponding to the target object in the target application according to the current task operation; the task execution progress information corresponding to the target object in the target application comprises current task execution scene characteristics and current task execution state characteristics of the target object in the target application or current task execution state characteristics of the target object in the target application;
the prompt information obtaining module 1430 is configured to obtain task execution auxiliary prompt information corresponding to a current task operation executed on an interactive interface of the target application according to task execution progress information corresponding to the target object in the target application;
the prompt information display module 1440 is configured to display task execution auxiliary prompt information to a target object on an interactive interface provided by the target application.
In some embodiments, the hint information acquisition module 1430 includes:
the target node determining unit is used for determining target nodes corresponding to task execution progress information in a preset target progress node tree, and the target progress node tree is generated based on the execution sequence of a plurality of progress nodes corresponding to the target application;
and the prompt information determining unit is used for taking the auxiliary prompt information corresponding to the target node as task execution auxiliary prompt information corresponding to the current task operation.
In some embodiments, the apparatus further comprises:
the node information acquisition module is used for acquiring first type node information corresponding to the target application and second type node information corresponding to the target application;
a first node determining module, configured to determine, from the second type node information, third type node information corresponding to a progress execution condition of the first type node information;
the second node determining module is used for determining fourth type node information corresponding to the progress execution result of the first type node information from the second type node information;
the first association relation determining module is used for determining a first association relation between the first type node information and the third type node information;
the second association relation determining module is used for determining a second association relation between the first type node information and the fourth type node information;
the node tree construction module is used for constructing a target progress node tree by taking the first type node information and the second type node information as progress nodes and taking the first association relation and the second association relation as edges.
In some embodiments, the apparatus further comprises:
the task auxiliary information acquisition module is used for acquiring task auxiliary information corresponding to the target application;
And the node content association module is used for determining auxiliary prompt information corresponding to each progress node in the target progress node tree based on the content association relation between the progress nodes in the target progress node tree and the auxiliary content in the task auxiliary information.
In some embodiments, the task execution progress information includes a current task execution scene feature and a current task execution state feature, the progress nodes in the target progress node tree include scene progress nodes corresponding to position information in the scene, and the target node determining unit includes:
the object position determining unit is used for determining object position information of the target object based on the current task execution scene characteristics and the current task execution state characteristics;
the position matching unit is used for matching the position information corresponding to the progress node in the target progress node tree with the position information of the object to obtain a position matching result;
and the first target node determining unit is used for taking the progress node corresponding to the position matching result as a target node when the position matching result indicates matching.
In some embodiments, the task execution progress information includes current task execution status characteristics, and the target node determination module includes:
the object task branch determining unit is used for determining object task branch information of a target object based on the current task execution state characteristics;
the branch matching unit is used for matching task branch information corresponding to the progress node in the target progress node tree with object task branch information to obtain a task branch matching result;
and the second target node determining unit is used for taking the progress node corresponding to the task branch matching result as a target node when the task branch matching result indicates matching.
In some embodiments, the task assistance information acquisition module includes:
the content analysis unit is used for carrying out content analysis processing on the original auxiliary information corresponding to the target application to obtain key auxiliary content and operation auxiliary information corresponding to the key auxiliary content;
and an auxiliary information generating unit for generating task auxiliary information based on the key auxiliary content and the operation auxiliary information.
In some embodiments, the prompt information display module comprises:
an operation time detection unit, configured to continuously detect an operation time corresponding to a target object in an execution process of the target node;
The auxiliary information prompt unit is used for displaying the task execution auxiliary prompt information under the condition that the operation time is detected to reach the preset prompt time in the task execution auxiliary prompt information.
In some embodiments, the apparatus further comprises:
the starting instruction acquisition module is used for acquiring an auxiliary prompt starting instruction corresponding to the auxiliary prompt function in the target application;
and the node tree loading module is used for pre-loading a target progress node tree corresponding to the target application in response to the auxiliary prompt starting instruction.
The device provided in the above embodiment can execute the method provided in any embodiment of the present application, and has the corresponding functional modules and beneficial effects of executing the method. Technical details not described in detail in the above embodiments may be referred to an auxiliary information prompting method provided in any embodiment of the present application.
The embodiment also provides a computer readable storage medium, in which computer executable instructions are stored, the computer executable instructions being loaded by a processor and executing a method for prompting auxiliary information according to the embodiment.
The present embodiments also provide a computer program product or computer program comprising computer instructions stored in a computer readable storage medium. The computer instructions are read from the computer-readable storage medium by a processor of a computer device, and executed by the processor, cause the computer device to perform the auxiliary information prompting method provided in the various alternative implementations described above.
The present embodiment also provides an electronic device, which includes a processor and a memory, where the memory stores a computer program, and the computer program is adapted to be loaded by the processor and execute a method for presenting auxiliary information according to the present embodiment.
The device may be a computer terminal, a mobile terminal or a server, and the device may also participate in constructing an apparatus or a system provided by an embodiment of the present application. As shown in fig. 15, the server 15 may include one or more processors 1502 (shown in the figure as 1502a, 1502b, ..., 1502n; the processor 1502 may include, but is not limited to, a processing device such as a microprocessor MCU or a programmable logic device FPGA), a memory 1504 for storing data, and a transmission device 1506 for communication functions. In addition, the server 15 may further include an input/output interface (I/O interface). It will be appreciated by those of ordinary skill in the art that the configuration shown in fig. 15 is merely illustrative and is not intended to limit the configuration of the electronic device described above. For example, the server 15 may also include more or fewer components than shown in fig. 15, or have a different configuration than shown in fig. 15.
It should be noted that the one or more processors 1502 and/or other data processing circuits described above may be referred to herein generally as "data processing circuits". The data processing circuit may be embodied in whole or in part in software, hardware, firmware, or any other combination. Furthermore, the data processing circuitry may be a single stand-alone processing module, or incorporated in whole or in part into any of the other elements in the server 15.
The memory 1504 may be used to store software programs and modules of application software, such as the program instructions/data storage device corresponding to the auxiliary information prompting method in the embodiments of the present application; the processor 1502 executes the software programs and modules stored in the memory 1504 to perform various functional applications and data processing, i.e., to implement the auxiliary information prompting method described above. The memory 1504 may include high-speed random access memory, and may also include non-volatile memory, such as one or more magnetic storage devices, flash memory, or other non-volatile solid-state memory. In some examples, the memory 1504 may further include memory remotely located relative to the processor 1502, which may be connected to the server 15 via a network. Examples of such networks include, but are not limited to, the internet, intranets, local area networks, mobile communication networks, and combinations thereof.
The transmission device 1506 is used to receive or transmit data via a network. The specific example of the network described above may include a wireless network provided by a communication provider of the server 15. In one example, the transmission device 1506 includes a network adapter (Network Interface Controller, NIC) that may be connected to other network devices through a base station to communicate with the internet.
The present specification provides method operational steps as described in the examples or flowcharts, but more or fewer operational steps may be included based on conventional or non-inventive labor. The steps and sequences recited in the embodiments are merely one manner of performing the steps and are not meant to be the only order of execution. In an actual system or product, the methods illustrated in the embodiments or figures may be performed sequentially or in parallel (e.g., in the context of parallel processors or multi-threaded processing).
The structures shown in this embodiment are only partial structures related to the present application and do not constitute limitations of the apparatus to which the present application is applied, and a specific apparatus may include more or less components than those shown, or may combine some components, or may have different arrangements of components. It should be understood that the methods, apparatuses, etc. disclosed in the embodiments may be implemented in other manners. For example, the apparatus embodiments described above are merely illustrative, and the division of the modules is merely a division of one logic function, and may be implemented in other manners, such as multiple units or components may be combined or integrated into another system, or some features may be omitted, or not performed. Alternatively, the coupling or direct coupling or communication connection shown or discussed with each other may be an indirect coupling or communication connection via some interfaces, devices or unit modules.
Based on such understanding, the technical solution of the present application may be embodied essentially or in part or all of the technical solution or in part in the form of a software product stored in a storage medium, including instructions for causing a computer device (which may be a personal computer, a server, or a network device, etc.) to perform all or part of the steps of the method according to the embodiments of the present application. And the aforementioned storage medium includes: a U-disk, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, an optical disk, or other various media capable of storing program codes.
Those of skill would further appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, computer software, or combinations of both, and that the various illustrative elements and steps are described above generally in terms of functionality in order to clearly illustrate the interchangeability of hardware and software. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
The above embodiments are only for illustrating the technical solution of the present application, and not for limiting the same; although the application has been described in detail with reference to the foregoing embodiments, it will be understood by those of ordinary skill in the art that: the technical scheme described in the foregoing embodiments can be modified or some technical features thereof can be replaced by equivalents; such modifications and substitutions do not depart from the spirit and scope of the technical solutions of the embodiments of the present application.
Claims (13)
1. An auxiliary information prompting method, characterized in that the method comprises:
acquiring current task operation executed by a target object on an interactive interface provided by a target application;
acquiring task execution progress information corresponding to the target object in the target application according to the current task operation; the task execution progress information corresponding to the target object in the target application comprises current task execution scene characteristics and current task execution state characteristics of the target object in the target application or current task execution state characteristics of the target object in the target application;
acquiring task execution auxiliary prompt information corresponding to current task operation executed on an interactive interface of the target application according to task execution progress information corresponding to the target object in the target application;
And displaying the task execution auxiliary prompt information to a target object on an interactive interface provided by the target application.
2. The auxiliary information prompting method according to claim 1, wherein the obtaining, according to task execution progress information corresponding to the target object in the target application, task execution auxiliary prompting information corresponding to a current task operation executed on an interactive interface of the target application includes:
determining target nodes corresponding to the task execution progress information in a preset target progress node tree, wherein the target progress node tree is generated based on the execution sequence of a plurality of progress nodes corresponding to the target application;
and taking the auxiliary prompt information corresponding to the target node as task execution auxiliary prompt information corresponding to the current task operation.
3. The auxiliary information prompting method according to claim 2, wherein before the obtaining a target progress node tree corresponding to the target application, the method further comprises:
acquiring first type node information corresponding to the target application and second type node information corresponding to the target application;
determining third type node information corresponding to the progress execution condition of the first type node information from the second type node information;
Determining fourth type node information corresponding to a progress execution result of the first type node information from the second type node information;
determining a first association relationship between the first type node information and the third type node information;
determining a second association relationship between the first type node information and the fourth type node information;
and constructing the target progress node tree by taking the first type node information and the second type node information as progress nodes and the first association relation and the second association relation as edges.
4. The auxiliary information prompting method according to claim 3, wherein after said target progress node tree is constructed by using said first type node information and said second type node information as progress nodes and said first association relationship and said second association relationship as edges, said method further comprises:
acquiring task auxiliary information corresponding to the target application;
and determining auxiliary prompt information corresponding to each progress node in the target progress node tree based on the content association relation between the progress nodes in the target progress node tree and the auxiliary content in the task auxiliary information.
5. The auxiliary information prompting method according to claim 2, wherein the task execution progress information includes a current task execution scenario feature and a current task execution status feature, and the determining a target node corresponding to the task execution progress information in the target progress node tree includes:
determining object position information of the target object based on the current task execution scene characteristics and the current task execution state characteristics;
matching the position information corresponding to the progress node in the target progress node tree with the object position information to obtain a position matching result;
and under the condition that the position matching result indicates matching, taking a progress node corresponding to the position matching result as the target node.
6. The auxiliary information prompting method according to claim 2, wherein the task execution progress information includes a current task execution status feature, and the determining a target node corresponding to the task execution progress information in the target progress node tree includes:
determining object task branch information of the target object based on the current task execution state characteristics;
Matching task branch information corresponding to progress nodes in the target progress node tree with the object task branch information to obtain a task branch matching result;
and under the condition that the task branch matching result indicates matching, taking a progress node corresponding to the task branch matching result as the target node.
7. The method for presenting auxiliary information according to claim 4, wherein the obtaining task auxiliary information corresponding to the target application includes:
performing content analysis processing on the original auxiliary information corresponding to the target application to obtain key auxiliary content and operation auxiliary information corresponding to the key auxiliary content;
the task assistance information is generated based on the key assistance content and the operation assistance information.
8. The method of claim 1, wherein the presenting the task execution assistance information comprises:
continuously detecting the operation time corresponding to the target object in the execution process of the target node;
and under the condition that the operation time reaches the preset prompting time in the task execution auxiliary prompting information, displaying the task execution auxiliary prompting information.
9. The auxiliary information prompting method according to claim 1, wherein before the acquiring a current task operation executed by a target object on an interactive interface provided by the target application, the method further comprises:
acquiring an auxiliary prompt starting instruction corresponding to an auxiliary prompt function in the target application;
and pre-loading a target progress node tree corresponding to the target application in response to the auxiliary prompt starting instruction.
10. An auxiliary information presentation apparatus, the apparatus comprising:
the task operation acquisition module is used for acquiring the current task operation executed by the target object on the interactive interface provided by the target application;
the execution progress acquisition module is used for acquiring task execution progress information corresponding to the target object in the target application according to the current task operation; the task execution progress information corresponding to the target object in the target application comprises current task execution scene characteristics and current task execution state characteristics of the target object in the target application or current task execution state characteristics of the target object in the target application;
the prompt information acquisition module is used for acquiring task execution auxiliary prompt information corresponding to the current task operation executed on the interactive interface of the target application according to the task execution progress information corresponding to the target object in the target application;
And the prompt information display module is used for displaying the task execution auxiliary prompt information to the target object on the interactive interface provided by the target application.
11. An electronic device comprising a processor and a memory, wherein the memory stores at least one instruction or at least one program, the at least one instruction or the at least one program being loaded and executed by the processor to implement the auxiliary information prompting method of any one of claims 1-9.
12. A computer readable storage medium, characterized in that at least one instruction or at least one program is stored in the storage medium, and the at least one instruction or the at least one program is loaded and executed by a processor to implement the auxiliary information prompting method according to any one of claims 1-9.
13. A computer program product comprising a computer program, characterized in that the computer program, when executed by a processor, implements the auxiliary information prompting method of any one of claims 1-9.